
Math 54. Selected Solutions for Week 5

Section 4.2 (Page 194)

28. Consider the following two systems of equations:

5x1 + x2 − 3x3 = 0        5x1 + x2 − 3x3 = 0
−9x1 + 2x2 + 5x3 = 1      −9x1 + 2x2 + 5x3 = 5
4x1 + x2 − 6x3 = 9        4x1 + x2 − 6x3 = 45

It can be shown that the first system has a solution. Use this fact and the theory from this section to explain why the second system must also have a solution. (Make no row operations.)

Let A be the (common) coefficient matrix of the systems, and let ~b = (0, 1, 9) . If ~x is a solution of the first system, then A~x = ~b , so A(5~x) = 5(A~x) = 5~b , and therefore 5~x is a solution of the second system.

Wait a minute. This doesn't use methods from this section. Instead, let A and ~b be as before. If the first system has a solution, then ~b lies in Col A . Since Col A is a linear subspace, 5~b also lies in Col A , and therefore the second system has a solution, too.
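As a sanity check (not part of the required argument), the claim that 5~x solves the second system can be verified numerically. This is a sketch in plain Python with exact fractions; the helper names `solve3` and `matvec` are my own:

```python
from fractions import Fraction

A = [[5, 1, -3], [-9, 2, 5], [4, 1, -6]]
b1, b2 = [0, 1, 9], [0, 5, 45]

def solve3(A, b):
    # Gaussian elimination with exact fractions on the augmented matrix.
    M = [[Fraction(x) for x in row] + [Fraction(c)] for row, c in zip(A, b)]
    for c in range(3):
        piv = next(i for i in range(c, 3) if M[i][c] != 0)
        M[c], M[piv] = M[piv], M[c]
        M[c] = [x / M[c][c] for x in M[c]]
        for i in range(3):
            if i != c:
                M[i] = [a - M[i][c] * p for a, p in zip(M[i], M[c])]
    return [M[i][3] for i in range(3)]

def matvec(M, v):
    return [sum(Fraction(a) * x for a, x in zip(row, v)) for row in M]

x = solve3(A, b1)                             # a solution of the first system
assert matvec(A, x) == b1                     # it really solves system 1
assert matvec(A, [5 * xi for xi in x]) == b2  # 5x solves system 2
```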

34. (Calculus required) Define T : C[0, 1] → C[0, 1] as follows. For ~f in C[0, 1] , let T(~f) be the antiderivative ~F of ~f such that ~F(0) = 0 . Show that T is a linear transformation, and describe the kernel of T . (See the notation in Exercise 20 of Section 4.1.)

From calculus, we see that

T(~f)(x) = ∫_0^x ~f(t) dt

(strictly speaking, T(~f) is the function from [0, 1] to R whose value at x ∈ [0, 1] is the above integral). The first thing we need to check is that T(~f) ∈ C[0, 1] for all ~f ∈ C[0, 1] . This is true because the above antiderivative is defined for all x ∈ [0, 1] , and is continuous on [0, 1] (by the Fundamental Theorem of Calculus). To check that T is a linear transformation, we check addition and scalar multiplication:

T(~f + ~g)(x) = ∫_0^x (~f(t) + ~g(t)) dt = ∫_0^x ~f(t) dt + ∫_0^x ~g(t) dt = T(~f)(x) + T(~g)(x)

and

T(c~f)(x) = ∫_0^x c~f(t) dt = c ∫_0^x ~f(t) dt = cT(~f)(x) .

Therefore T is a linear transformation. The kernel of T is {~0} : if T(~f) = ~0 , then ~f = (T(~f))′ = ~0 as well.
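The linearity computation can be illustrated concretely on polynomials (a convenient subfamily of C[0,1]), where the antiderivative vanishing at 0 is easy to compute exactly. A minimal sketch with hypothetical helper names, using coefficient lists:

```python
from fractions import Fraction

def T(coeffs):
    # Antiderivative of c0 + c1*t + c2*t^2 + ... with value 0 at t = 0,
    # returned as a coefficient list (constant term 0).
    return [Fraction(0)] + [Fraction(c, k + 1) for k, c in enumerate(coeffs)]

def add(p, q):
    # Coefficientwise sum of two polynomials.
    n = max(len(p), len(q))
    p = p + [Fraction(0)] * (n - len(p))
    q = q + [Fraction(0)] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

f, g, c = [1, 2, 3], [4, 0, -6], 7      # sample polynomials and scalar
assert T(add(f, g)) == add(T(f), T(g))                  # T(f + g) = T(f) + T(g)
assert T([c * a for a in f]) == [c * y for y in T(f)]   # T(cf) = cT(f)
```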

Section 4.3 (Page 201)

6. Determine whether the set

[  1 ]   [ −4 ]
[  2 ] , [  3 ]
[ −4 ]   [  6 ]

is a basis for R3 . If the set is not a basis, determine whether it is linearly independent and whether it spans R3 . Justify your answers.

This set is linearly independent because it has two elements and neither is a scalar multiple of the other. It does not span R3 , though. This is because the matrix

 1 −4 1   2 3 0  −4 6 0

with nonzero determinant 24 has linearly independent columns (by the Invertible Matrix Theorem). Therefore the first two columns are not a maximal linearly independent set, so they cannot be a basis of R3 (see the second paragraph of "Two Views of a Basis" on page 200).
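The determinant value 24 is easy to confirm by cofactor expansion; a quick check in Python (the helper name is illustrative):

```python
def det3(M):
    # Cofactor expansion along the first row of a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

M = [[1, -4, 1],
     [2, 3, 0],
     [-4, 6, 0]]
assert det3(M) == 24   # nonzero, so the three columns are linearly independent
```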

14. Assume that A is row equivalent to B . Find bases for Nul A and Col A .

 1 2 3 −4 8   1 2 0 2 5  1 2 0 2 8 0 0 3 −6 3 A =   ,B =   .  2 4 −3 10 9   0 0 0 0 −7  3 6 0 6 9 0 0 0 0 0

The homogeneous system A~x = ~0 has the same solution set as B~x = ~0 , which has free variables x2 and x4 . From the rows of B we have

x5 = 0 ; 3x3 = 6x4 − 3x5 , so x3 = 2x4 ; and x1 = −2x2 − 2x4 − 5x5 = −2x2 − 2x4 .

Therefore the solution space in parametric vector form is

 −2   −2   1   0      x2  0  + x4  2  ,  0   1  0 0

so a basis for Nul A is the two vectors in the above expression.

A basis for Col A consists of the pivot columns of A (columns 1, 3, and 5); namely,

[ 1 ]   [  3 ]   [ 8 ]
[ 1 ] , [  0 ] , [ 8 ] .
[ 2 ]   [ −3 ]   [ 9 ]
[ 3 ]   [  0 ]   [ 9 ]

15. Find a basis for the space spanned by the given vectors ~v1, . . . ,~v5 :

 1   0   2   2   3  0 1 −2 −1 −1   ,   ,   ,   ,    −2   2   −8   10   −6  3 3 0 3 9

Row reduce the matrix whose columns are the given vectors:

 1 0 2 2 3   1 0 2 2 3   1 0 2 2 3  0 1 −2 −1 −1 0 1 −2 −1 −1 0 1 −2 −1 −1   ∼   ∼   .  −2 2 −8 10 −6   0 2 −4 14 0   0 0 0 16 2  3 3 0 3 9 0 3 −6 −3 0 0 0 0 0 3

The first, second, fourth, and fifth columns are the pivot columns, so ~v1 , ~v2 , ~v4 , and ~v5 form a basis for Span{~v1, ~v2, ~v3, ~v4, ~v5} .
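The pivot columns can be confirmed with exact row reduction; a sketch in Python using fractions (the function name is my own):

```python
from fractions import Fraction

def pivot_cols(M):
    # Return the (0-indexed) columns containing pivots after exact row reduction.
    R = [[Fraction(x) for x in row] for row in M]
    pivots, r = [], 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        for i in range(len(R)):
            if i != r and R[i][c] != 0:
                factor = R[i][c] / R[r][c]
                R[i] = [a - factor * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    return pivots

M = [[1, 0, 2, 2, 3],
     [0, 1, -2, -1, -1],
     [-2, 2, -8, 10, -6],
     [3, 3, 0, 3, 9]]
assert pivot_cols(M) == [0, 1, 3, 4]   # columns 1, 2, 4, 5 (1-indexed)
```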

26. In the vector space of all real-valued functions, find a basis for the subspace spanned by {sin t, sin 2t, sin t cos t} .

We have sin 2t = 2 sin t cos t , so sin 2t and sin t cos t are constant (scalar) multiples of each other. So, a basis will be {sin t, sin 2t} or {sin t, sin t cos t} (in each case, neither element of the set is a constant multiple of the other, so they are linearly independent).

34. Consider the polynomials ~p1(t) = 1 + t , ~p2(t) = 1 − t , and ~p3(t) = 2 (for all t ). By inspection, write a linear dependence relation among ~p1 , ~p2 , and ~p3 . Then find a basis for Span{~p1, ~p2, ~p3} .

~p1 + ~p2 − ~p3 = ~0 . We can eliminate any one of the three. The remaining two will be linearly independent since neither is a constant multiple of the other, so any two of the three will be a basis for Span{~p1, ~p2, ~p3} .

Section 4.4 (Page 210)

11. Use an inverse matrix to find [~x]B for

B = { [  1 ] , [ −3 ] } ,   ~x = [  2 ] .
      [ −2 ]   [  5 ]           [ −5 ]

Let A = PB = [  1  −3 ] . Then det A = −1 , so
             [ −2   5 ]

A−1 = − [ 5  3 ] = [ −5  −3 ] .
        [ 2  1 ]   [ −2  −1 ]

Since A[~x]B = ~x , we have

[~x]B = A−1~x = [ −5  −3 ] [  2 ] = [ 5 ] .
                [ −2  −1 ] [ −5 ]   [ 1 ]
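The 2×2 inverse and the resulting coordinate vector can be double-checked numerically; an illustrative Python sketch:

```python
from fractions import Fraction

P = [[1, -3],
     [-2, 5]]                      # columns are the basis vectors
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
assert det == -1

# 2x2 inverse formula: (1/det) * [[d, -b], [-c, a]]
Pinv = [[Fraction(P[1][1], det), Fraction(-P[0][1], det)],
        [Fraction(-P[1][0], det), Fraction(P[0][0], det)]]
assert Pinv == [[-5, -3], [-2, -1]]

x = [2, -5]
coords = [sum(a * b for a, b in zip(row, x)) for row in Pinv]
assert coords == [5, 1]            # [x]_B = (5, 1)
```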

13. The set B = {1 + t^2, t + t^2, 1 + 2t + t^2} is a basis for P2 . Find the coordinate vector of ~p(t) = 1 + 4t + 7t^2 relative to B .

As in Practice Problem #2 (on page 210; solution on page 212), we note that the coordinates of ~p(t) relative to B satisfy

c1(1 + t^2) + c2(t + t^2) + c3(1 + 2t + t^2) = 1 + 4t + 7t^2 .

Simplifying the left-hand side gives

(c1 + c3) + (c2 + 2c3)t + (c1 + c2 + c3)t^2 = 1 + 4t + 7t^2 .

This gives the linear system

[ 1  0  1 ] [ c1 ]   [ 1 ]
[ 0  1  2 ] [ c2 ] = [ 4 ] .
[ 1  1  1 ] [ c3 ]   [ 7 ]

Row reduce the augmented matrix of the system:

[ 1  0  1  1 ]   [ 1  0  1  1 ]   [ 1  0   1  1 ]
[ 0  1  2  4 ] ∼ [ 0  1  2  4 ] ∼ [ 0  1   2  4 ]
[ 1  1  1  7 ]   [ 0  1  0  6 ]   [ 0  0  −2  2 ]

  [ 1  0  1   1 ]   [ 1  0  0   2 ]
∼ [ 0  1  2   4 ] ∼ [ 0  1  0   6 ] .
  [ 0  0  1  −1 ]   [ 0  0  1  −1 ]

Therefore

           [  2 ]
[~p(t)]B = [  6 ] .
           [ −1 ]
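The answer can be confirmed by expanding the corresponding combination of the basis polynomials; a small Python check (the helper name is hypothetical):

```python
def combo(polys, weights):
    # Weighted sum of polynomials given as coefficient lists [c0, c1, c2].
    return [sum(w * p[k] for w, p in zip(weights, polys)) for k in range(3)]

basis = [[1, 0, 1],    # 1 + t^2
         [0, 1, 1],    # t + t^2
         [1, 2, 1]]    # 1 + 2t + t^2
assert combo(basis, [2, 6, -1]) == [1, 4, 7]   # = 1 + 4t + 7t^2
```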

18. Let B = {~b1, . . . , ~bn} be a basis for a vector space V . Explain why the B-coordinate vectors of ~b1, . . . , ~bn are the columns ~e1, . . . , ~en of the n × n identity matrix.

For each i = 1, . . . , n , we have ~bi = 0~b1 + ··· + 0~bi−1 + 1~bi + 0~bi+1 + ··· + 0~bn , so [~bi]B = (0, . . . , 0, 1, 0, . . . , 0) = ~ei , since the 1 is in the ith coordinate.

20. Suppose {~v1, . . . , ~v4} is a linearly dependent spanning set for a vector space V . Show that each ~w in V can be expressed in more than one way as a linear combination of ~v1, . . . , ~v4 . [Hint: Let ~w = k1~v1 + ··· + k4~v4 be an arbitrary vector in V . Use the linear dependence of {~v1, . . . , ~v4} to produce another representation of ~w as a linear combination of ~v1, . . . , ~v4 .]

Since ~v1, . . . , ~v4 span V , given any ~w ∈ V we can write ~w = k1~v1 + ··· + k4~v4 for some k1, . . . , k4 ∈ R . Since ~v1, . . . , ~v4 are linearly dependent, they satisfy a linear dependence relation c1~v1 + ··· + c4~v4 = ~0 in which not all of the ci are zero. Adding this to the above equation, we get

~w = ~w + ~0 = k1~v1 + ··· + k4~v4 + c1~v1 + ··· + c4~v4 = (k1 + c1)~v1 + ··· + (k4 + c4)~v4 .

This is a different linear combination from k1~v1 + ··· + k4~v4 , because ci ≠ 0 for some i , so ki ≠ ki + ci for that value of i .

24. Show that the coordinate mapping is onto Rn . That is, given any ~y in Rn , with entries y1, . . . , yn , produce ~u in V such that [~u]B = ~y .

Given ~y = (y1, . . . , yn) ∈ Rn , let ~u = y1~b1 + ··· + yn~bn . Then (by definition of coordinate vector) we have [~u]B = (y1, . . . , yn) = ~y , and so the coordinate mapping maps ~u to ~y .

28. Use coordinate vectors to test the linear independence of the set of polynomials 1 − 2t^2 − t^3 , t + 2t^3 , 1 + t − 2t^2 . Explain your work.

As in Example 6, these vectors give a matrix A , for which the augmented matrix [ A ~0 ] for the system A~x = ~0 is row reduced as follows:

[  1  0   1  0 ]   [ 1  0  1  0 ]   [ 1  0   1  0 ]   [ 1  0   1  0 ]
[  0  1   1  0 ] ∼ [ 0  1  1  0 ] ∼ [ 0  1   1  0 ] ∼ [ 0  1   1  0 ] .
[ −2  0  −2  0 ]   [ 0  0  0  0 ]   [ 0  0   0  0 ]   [ 0  0  −1  0 ]
[ −1  2   0  0 ]   [ 0  2  1  0 ]   [ 0  0  −1  0 ]   [ 0  0   0  0 ]

This system has no free variables, hence no nontrivial solutions, so the set is linearly independent.
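Linear independence is equivalent to the coefficient matrix having a pivot in every column; a quick exact-arithmetic check (the `rank` helper is my own):

```python
from fractions import Fraction

def rank(M):
    # Number of pivots found by exact forward elimination.
    R = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        for i in range(r + 1, len(R)):
            if R[i][c] != 0:
                f = R[i][c] / R[r][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        r += 1
    return r

# Columns: coordinate vectors of 1 - 2t^2 - t^3, t + 2t^3, 1 + t - 2t^2
A = [[1, 0, 1],
     [0, 1, 1],
     [-2, 0, -2],
     [-1, 2, 0]]
assert rank(A) == 3   # a pivot in every column: linearly independent
```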

Section 4.5 (Page 217)

5. For the subspace

     [  p − 2q  ]
     [ 2p + 5r  ]
   { [ −2q + 2r ] : p , q , r in R } ,
     [ −3p + 6r ]

(a) find a basis for the subspace, and (b) state the dimension.

In parametric vector form, this subspace equals

 1   −2   0  2 0 5 p   + q   + r   .  0   −2   2  −3 0 6

If we row reduce the matrix whose columns are the vectors occurring above, we find:

 1 −2 0   1 −2 0   1 −2 0  2 0 5 0 4 5 0 4 5   ∼   ∼   .  0 −2 2   0 −2 2   0 0 9/2  −3 0 6 0 −6 6 0 0 27/2

We can stop here, since it is clear that all columns are pivot columns, and therefore the vectors are linearly independent. Thus a basis for the subspace is

 1   −2   0     2   0   5    ,   ,   ,  0 −2 2   −3 0 6 

and the dimension of the space is 3.
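That the three spanning vectors are linearly independent (so the dimension is indeed 3) can be rechecked by counting pivots with exact arithmetic; an illustrative Python sketch (the `rank` helper is my own):

```python
from fractions import Fraction

def rank(M):
    # Number of pivots found by exact forward elimination.
    R = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        for i in range(r + 1, len(R)):
            if R[i][c] != 0:
                f = R[i][c] / R[r][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        r += 1
    return r

V = [[1, -2, 0],
     [2, 0, 5],
     [0, -2, 2],
     [-3, 0, 6]]          # columns are the three spanning vectors
assert rank(V) == 3       # all columns are pivot columns
```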

7. For the subspace

{(a, b, c): a − 3b + c = 0, b − 2c = 0, 2b − c = 0} ,

(a) find a basis for the subspace, and (b) state the dimension. This subspace is the solution set of the linear system

a − 3b + c = 0
    b − 2c = 0
   2b −  c = 0 ,

which is the null space of the matrix

 1 −3 1   0 1 −2  , 0 2 −1

which row reduces as follows:

[ 1  −3   1 ]   [ 1  −3   1 ]
[ 0   1  −2 ] ∼ [ 0   1  −2 ] .
[ 0   2  −1 ]   [ 0   0   3 ]

Since all columns are pivot columns, there are no free variables, and therefore the subspace is the trivial subspace. Therefore a basis for the subspace is

∅ ,

and the dimension of the subspace is 0 .

11. Find the dimension of the subspace spanned by

 1   3   −2   5   0  ,  1  ,  −1  ,  2  . 2 1 1 2

The subspace spanned by these vectors is the column space of the matrix whose columns are the given vectors, and we row reduce it as follows:

 1 3 −2 5   1 3 −2 5   1 3 −2 5   0 1 −1 2  ∼  0 1 −1 2  ∼  0 1 −1 2  . 2 1 1 2 0 −5 5 −8 0 0 0 2

This matrix has three pivot columns (columns 1, 2, and 4), and so the subspace has dimension 3 .
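The pivot count can be verified with exact elimination; a short sketch (the `rank` helper is my own):

```python
from fractions import Fraction

def rank(M):
    # Number of pivots found by exact forward elimination.
    R = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        for i in range(r + 1, len(R)):
            if R[i][c] != 0:
                f = R[i][c] / R[r][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        r += 1
    return r

M = [[1, 3, -2, 5],
     [0, 1, -1, 2],
     [2, 1, 1, 2]]        # columns are the four given vectors
assert rank(M) == 3       # the spanned subspace has dimension 3
```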

14. Determine the dimensions of Nul A and Col A for the matrix

[ 1  2  −4  3  −2   6   0 ]
[ 0  0   0  1   0  −3   7 ]
[ 0  0   0  0   1   4  −2 ] .
[ 0  0   0  0   0   0   1 ]

This matrix is already in echelon form, so we can read off the answers directly from the matrix. There are four pivot columns, so dim(Col A) = 4 . There are three free variables (x2 , x3 , and x6), so dim(Nul A) = 3 .

26. Let H be an n-dimensional subspace of an n-dimensional vector space V . Show that H = V . The subspace H has a basis with n vectors. These are n linearly independent vectors in V , so by the Basis Theorem, they constitute a basis for V as well. So H = V , since both H and V are equal to the set spanned by that basis.

31. Let V and W be finite-dimensional vector spaces, and let T : V → W be a linear transformation. Let H be a nonzero subspace of V , and let T (H) be the set of images of vectors in H . Then T (H) is a subspace of W , by Exercise 35 in Section 4.2. Prove that dim T (H) ≤ dim H .

Let B = {~b1, . . . , ~bn} be a basis for H . Then the set

T (B) = {T (~b1), . . . , T (~bn)}

spans T (H) . This is true because, for any ~y ∈ T (H) , we can write ~y = T (~x) for some ~x ∈ H . Write

~x = c1~b1 + ··· + cn~bn ;

then

~y = T (~x) = T (c1~b1 + ··· + cn~bn) = c1T (~b1) + ··· + cnT (~bn) ,

and this lies in Span{T (~b1), . . . , T (~bn)} . Since all of T (~b1), . . . , T (~bn) lie in T (H) , these vectors span T (H) .

By the Spanning Set Theorem, some subset of {T (~b1), . . . , T (~bn)} is a basis for T (H) . The set {T (~b1), . . . , T (~bn)} has at most n elements, so the basis formed from a subset of it has at most n elements. Therefore T (H) has dimension at most n = dim H .