
Math 54. Selected Solutions for Week 8

Section 6.1 (Page 282)

22. Let ~u = (u1, u2, u3). Explain why ~u · ~u ≥ 0. When is ~u · ~u = 0?

We have ~u · ~u = u1^2 + u2^2 + u3^2, which is ≥ 0 because it is a sum of squares (all of which are ≥ 0). It is zero if and only if ~u = ~0. Indeed, if ~u = ~0 then ~u · ~u = 0, as can be seen directly from the formula. Conversely, if ~u · ~u = 0 then all the terms ui^2 must be zero, so each ui must be zero. This implies ~u = ~0.

26. Let ~u = (5, −6, 7), and let W be the set of all ~x in R^3 such that ~u · ~x = 0. What theorem in Chapter 4 can be used to show that W is a subspace of R^3? Describe W in geometric language.

The condition ~u · ~x = 0 is equivalent to ~x ∈ Nul ~u^T, and this is a subspace of R^3 by Theorem 2 on page 187. Geometrically, W is the plane perpendicular to ~u and passing through the origin.
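
As a quick numerical spot-check of both answers (a sketch, not part of the original solutions, assuming NumPy is available; x1 and x2 below are just sample vectors chosen to lie in W):

    import numpy as np

    u = np.array([5.0, -6.0, 7.0])

    # u . u is a sum of squares, hence nonnegative
    assert np.isclose(u @ u, 5**2 + (-6)**2 + 7**2)

    # Two vectors satisfying u . x = 0; they lie in the plane W = Nul(u^T)
    x1 = np.array([6.0, 5.0, 0.0])     # 5*6 + (-6)*5 + 7*0 = 0
    x2 = np.array([7.0, 0.0, -5.0])    # 5*7 + (-6)*0 + 7*(-5) = 0
    assert np.isclose(u @ x1, 0) and np.isclose(u @ x2, 0)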

30. Let W be a subspace of R^n, and let W⊥ be the set of all vectors orthogonal to W. Show that W⊥ is a subspace of R^n using the following steps.

(a). Take ~z ∈ W⊥, and let ~u represent any element of W. Then ~z · ~u = 0. Take any c and show that c~z is orthogonal to ~u. (Since ~u was an arbitrary element of W, this will show that c~z is in W⊥.)

(b). Take ~z1 and ~z2 in W⊥, and let ~u be any element of W. Show that ~z1 + ~z2 is orthogonal to ~u. What can you conclude about ~z1 + ~z2? Why?

(c). Finish the proof that W⊥ is a subspace of R^n.

a. We have (c~z) · ~u = c(~z · ~u) = c · 0 = 0, so c~z is orthogonal to ~u. This is true for all ~u ∈ W, so c~z lies in W⊥.

b. Let ~z1, ~z2, and ~u be as in the problem. Since ~z1 and ~z2 lie in W⊥, we have ~z1 · ~u = ~z2 · ~u = 0. Therefore

(~z1 + ~z2) · ~u = ~z1 · ~u + ~z2 · ~u = 0 ,

so ~z1 + ~z2 is orthogonal to ~u. Since this is true for all ~u ∈ W, it follows that ~z1 + ~z2 is in W⊥.

c. We also have ~0 · ~u = 0 for all ~u ∈ W (by properties of the inner product), so ~0 ∈ W⊥. Also, parts (a) and (b) show that W⊥ is closed under scalar multiplication and addition, respectively. Therefore W⊥ is a subspace of R^n.

Section 6.2 (Page 290)

16. Let ~y = (−3, 9) and ~u = (1, 2). Compute the distance from ~y to the line through ~u and the origin.

First of all, project ~y to the line L through ~u and the origin:

proj_L ~y = ((~y · ~u)/(~u · ~u)) ~u = (15/5)(1, 2) = (3, 6).

The distance from ~y to L is the distance from ~y to its projection to L, which is

‖~y − proj_L ~y‖ = ‖(−3, 9) − (3, 6)‖ = ‖(−6, 3)‖ = √(36 + 9) = √45 = 3√5.

See also Example 4.
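
As a numerical cross-check of this arithmetic (an added sketch, assuming NumPy is available):

    import numpy as np

    y = np.array([-3.0, 9.0])
    u = np.array([1.0, 2.0])

    proj = (y @ u) / (u @ u) * u          # projection of y onto the line through u
    dist = np.linalg.norm(y - proj)       # distance from y to the line

    assert np.allclose(proj, [3.0, 6.0])
    assert np.isclose(dist, 3 * np.sqrt(5))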

22. Determine whether the set  √   √    1/√18 1/ 2 −2/3  4/√18  ,  0√  ,  1/3  1/ 18 −1/ 2 −2/3

is orthonormal. If the set is only orthogonal, normalize the vectors to produce an orthonormal set.

This set is orthonormal: each vector has length 1, and the dot product of any two distinct vectors is 0.
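
One way to check this numerically (an added sketch, assuming NumPy): stack the three vectors as the columns of a matrix V; the set is orthonormal exactly when V^T V is the identity.

    import numpy as np

    V = np.column_stack([
        np.array([1, 4, 1]) / np.sqrt(18),
        np.array([1, 0, -1]) / np.sqrt(2),
        np.array([-2, 1, -2]) / 3.0,
    ])

    # Orthonormal columns  <=>  V^T V = I
    assert np.allclose(V.T @ V, np.eye(3))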

27. Let U be a square matrix with orthonormal columns. Explain why U is invertible. (Mention the theorems you use.)

Since the columns of U are orthonormal, they are all nonzero. Since they are nonzero and orthogonal, they are linearly independent (by Theorem 4 on page 284). Therefore U is invertible (by Theorem 8 on page 114, (e) ⇐⇒ (a)).

28. Let U be an n × n orthogonal matrix. Show that the rows of U form an orthonormal basis of R^n.

Since U is an orthogonal matrix, we have U^{-1} = U^T. Then U^T is also orthogonal, since (U^T)^{-1} = (U^{-1})^T = (U^T)^T. By Theorem 6, the rows of U are therefore orthonormal, since they are the columns of the orthogonal matrix U^T. These vectors are linearly independent (by Theorem 4) and there are n of them, so they form a basis for R^n.
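
A numerical illustration (an added sketch, assuming NumPy; the matrix U below is just a sample orthogonal matrix produced by a QR factorization):

    import numpy as np

    # Any orthogonal matrix will do; build one from a QR factorization
    U, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((4, 4)))

    assert np.allclose(U.T @ U, np.eye(4))   # columns are orthonormal
    assert np.allclose(U @ U.T, np.eye(4))   # rows are orthonormal (U^T is also orthogonal)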

Section 6.3 (Page 298)

24. Let W be a subspace of R^n with an orthogonal basis {~w1, ..., ~wp}, and let {~v1, ..., ~vq} be an orthogonal basis for W⊥.

(a). Explain why {~w1, ..., ~wp, ~v1, ..., ~vq} is an orthogonal set.

(b). Explain why the set in part (a) spans R^n.

(c). Show that dim W + dim W⊥ = n.

a. The set {~w1, ..., ~wp, ~v1, ..., ~vq} is an orthogonal set because any two distinct ~wi are orthogonal, any two distinct ~vj are orthogonal, and any ~wi is orthogonal to any ~vj since ~vj is in W⊥.

b. The set spans R^n because any ~y ∈ R^n can be written as ŷ + ~z with ŷ ∈ W and ~z ∈ W⊥, and these in turn can be written as linear combinations of ~w1, ..., ~wp and ~v1, ..., ~vq, respectively.

c. The set in part (a) is linearly independent because it is an orthogonal set of nonzero vectors (the vectors are nonzero because they are elements of bases). Therefore the set is a basis for R^n. This shows that dim W + dim W⊥ = p + q = dim R^n = n.
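
Part (c) can also be illustrated numerically (an added sketch, assuming NumPy; the matrix M below is just a sample matrix whose column space plays the role of W):

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 6, 2
    M = rng.standard_normal((n, p))           # W = Col M, dim W = p (full rank almost surely)

    U, s, Vt = np.linalg.svd(M)
    W_basis = U[:, :p]                        # orthonormal basis for W
    W_perp_basis = U[:, p:]                   # orthonormal basis for W_perp = Nul(M^T)

    # Every basis vector of W_perp is orthogonal to every basis vector of W,
    # and the dimensions add up to n
    assert np.allclose(W_basis.T @ W_perp_basis, 0)
    assert W_basis.shape[1] + W_perp_basis.shape[1] == n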

Section 6.4 (Page 304)

3. The set

{ (2, −5, 1), (4, −1, 2) }

is a basis for a subspace W. Use the Gram-Schmidt process to produce an orthogonal basis for W.

Let ~x1 and ~x2 be the given basis elements. The Gram-Schmidt process is

 2  ~v1 = ~x1 =  −5  ; 1  4   2   3  ~x · ~v 15 ~v = ~x − 2 1 ~v = −1 − −5 = 3/2 . 2 2 ~v · ~v 1   30     1 1 2 1 3/2

These two elements form an orthogonal basis for W.
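
The same two Gram-Schmidt steps can be reproduced numerically (an added sketch, assuming NumPy):

    import numpy as np

    x1 = np.array([2.0, -5.0, 1.0])
    x2 = np.array([4.0, -1.0, 2.0])

    v1 = x1
    v2 = x2 - (x2 @ v1) / (v1 @ v1) * v1   # subtract the projection of x2 onto v1

    assert np.allclose(v2, [3.0, 1.5, 1.5])
    assert np.isclose(v1 @ v2, 0)          # the new basis is orthogonal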

13. Let

A = [ 5 9 ; 1 7 ; −3 −5 ; 1 5 ]   and   Q = [ 5/6 −1/6 ; 1/6 5/6 ; −3/6 1/6 ; 1/6 3/6 ].

The columns of Q were obtained by applying the Gram-Schmidt process to the columns of A. Find an upper triangular matrix R such that A = QR. Check your work.

One could determine the entries of R by carrying out the Gram-Schmidt process to express the columns of Q as linear combinations of the columns of A; then combine those coefficients to give R^{-1}; and finally invert to get R. However, there's an easier way: as in Example 4,

 5 9   5/6 1/6 −3/6 1/6  1 7  6 12  R = QT A =   = . −1/6 5/6 1/6 3/6  −3 −5  0 6 1 5

To check:

 5/6 −1/6   5 −1   5 9  1/6 5/6  6 12  1 5  1 2  1 7 QR =   =   =   = A.  −3/6 1/6  0 6  −3 1  0 1  −3 −5  1/6 3/6 1 3 1 5

20. Suppose A = QR, where R is an invertible matrix. Show that A and Q have the same column space. [Hint: Given ~y in Col A, show that ~y = Q~x for some ~x. Also, given ~y in Col Q, show that ~y = A~x for some ~x.]

Following the hint, suppose ~y ∈ Col A. Then ~y = A~v, where the coordinates of ~v are the weights used to represent ~y as a linear combination of the columns of A. But then ~y = QR~v, so ~y is in the column space of Q, since the weights for writing ~y as a linear combination of the columns of Q are the coordinates of R~v.

Conversely, suppose that ~y is in Col Q. Then ~y = Q~u for some ~u whose coordinates are the weights used to express ~y as a linear combination of the columns of Q. But then we have Q = AR^{-1}, so ~y = AR^{-1}~u, and ~y lies in Col A since it can be expressed as a linear combination of the columns of A using weights given by the coordinates of R^{-1}~u.
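
For the concrete A and Q of Exercise 13, the conclusion can also be observed numerically (an added sketch, assuming NumPy): A and Q have the same rank, and adjoining the columns of Q to A does not increase the rank, so each column space contains the other.

    import numpy as np

    A = np.array([[5., 9.], [1., 7.], [-3., -5.], [1., 5.]])
    Q = np.array([[5., -1.], [1., 5.], [-3., 1.], [1., 3.]]) / 6.0

    rank = np.linalg.matrix_rank
    # rank([A Q]) = rank(A) = rank(Q) forces Col A = Col Q
    assert rank(A) == rank(Q) == rank(np.hstack([A, Q])) == 2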