Math 54. Selected Solutions for Week 8

Section 6.1 (Page 282)

22. Let $\vec u = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \end{bmatrix}$. Explain why $\vec u \cdot \vec u \ge 0$. When is $\vec u \cdot \vec u = 0$?

We have $\vec u \cdot \vec u = u_1^2 + u_2^2 + u_3^2$, which is $\ge 0$ because it is a sum of squares (all of which are $\ge 0$). It is zero if and only if $\vec u = \vec 0$. Indeed, if $\vec u = \vec 0$ then $\vec u \cdot \vec u = 0$, as can be seen directly from the formula. Conversely, if $\vec u \cdot \vec u = 0$ then all the terms $u_i^2$ must be zero, so each $u_i$ must be zero. This implies $\vec u = \vec 0$.

26. Let $\vec u = \begin{bmatrix} 5 \\ -6 \\ 7 \end{bmatrix}$, and let $W$ be the set of all $\vec x$ in $\mathbb{R}^3$ such that $\vec u \cdot \vec x = 0$. What theorem in Chapter 4 can be used to show that $W$ is a subspace of $\mathbb{R}^3$? Describe $W$ in geometric language.

The condition $\vec u \cdot \vec x = 0$ is equivalent to $\vec x \in \operatorname{Nul} \vec u^T$, and this is a subspace of $\mathbb{R}^3$ by Theorem 2 on page 187. Geometrically, it is the plane perpendicular to $\vec u$ and passing through the origin.

30. Let $W$ be a subspace of $\mathbb{R}^n$, and let $W^\perp$ be the set of all vectors orthogonal to $W$. Show that $W^\perp$ is a subspace of $\mathbb{R}^n$ using the following steps.

(a). Take $\vec z \in W^\perp$, and let $\vec u$ represent any element of $W$. Then $\vec z \cdot \vec u = 0$. Take any scalar $c$ and show that $c\vec z$ is orthogonal to $\vec u$. (Since $\vec u$ was an arbitrary element of $W$, this will show that $c\vec z$ is in $W^\perp$.)

(b). Take $\vec z_1$ and $\vec z_2$ in $W^\perp$, and let $\vec u$ be any element of $W$. Show that $\vec z_1 + \vec z_2$ is orthogonal to $\vec u$. What can you conclude about $\vec z_1 + \vec z_2$? Why?

(c). Finish the proof that $W^\perp$ is a subspace of $\mathbb{R}^n$.

a. We have $(c\vec z) \cdot \vec u = c(\vec z \cdot \vec u) = c \cdot 0 = 0$, so $c\vec z$ is orthogonal to $\vec u$. This is true for all $\vec u \in W$, so $c\vec z$ lies in $W^\perp$.

b. Let $\vec z_1$, $\vec z_2$, and $\vec u$ be as in the problem. Since $\vec z_1$ and $\vec z_2$ lie in $W^\perp$, we have $\vec z_1 \cdot \vec u = \vec z_2 \cdot \vec u = 0$. Therefore
$$(\vec z_1 + \vec z_2) \cdot \vec u = \vec z_1 \cdot \vec u + \vec z_2 \cdot \vec u = 0,$$
so $\vec z_1 + \vec z_2$ is orthogonal to $\vec u$. Since this is true for all $\vec u \in W$, it follows that $\vec z_1 + \vec z_2$ is in $W^\perp$.

c. We also have $\vec 0 \cdot \vec u = 0$ for all $\vec u \in W$ (by properties of the inner product). Since this is true for all $\vec u \in W$, it follows that $\vec 0 \in W^\perp$.
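The three membership facts just verified (scalar multiples, sums, and the zero vector) can also be illustrated numerically. The sketch below assumes numpy is available; it reuses $\vec u$ from #26, and the vectors `z1` and `z2` are hand-picked sample elements of $W^\perp$ (they are not part of the proof, which must hold for arbitrary vectors).

```python
import numpy as np

# Take u from #26; W-perp = {x : u . x = 0} is the plane through the
# origin perpendicular to u.  z1 and z2 are sample vectors chosen by
# hand so that u . z = 0.
u = np.array([5.0, -6.0, 7.0])
z1 = np.array([6.0, 5.0, 0.0])   # 5*6 - 6*5 + 7*0 = 0
z2 = np.array([7.0, 0.0, -5.0])  # 5*7 - 6*0 - 7*5 = 0

# (a) scalar multiples stay in W-perp
assert np.isclose(u @ (3.0 * z1), 0.0)
# (b) sums stay in W-perp
assert np.isclose(u @ (z1 + z2), 0.0)
# (c) the zero vector is in W-perp
assert np.isclose(u @ np.zeros(3), 0.0)
```

Of course, such a check only tests particular vectors; the algebraic argument above is what establishes the result for all of $W^\perp$.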
Also, parts (a) and (b) show that $W^\perp$ is closed under scalar multiplication and addition, respectively. Therefore $W^\perp$ is a subspace of $\mathbb{R}^n$.

Section 6.2 (Page 290)

16. Let $\vec y = \begin{bmatrix} -3 \\ 9 \end{bmatrix}$ and $\vec u = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$. Compute the distance from $\vec y$ to the line through $\vec u$ and the origin.

First of all, project $\vec y$ to the line $L$ through $\vec u$ and the origin:
$$\operatorname{proj}_L \vec y = \frac{\vec y \cdot \vec u}{\vec u \cdot \vec u}\,\vec u = \frac{15}{5}\begin{bmatrix} 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 3 \\ 6 \end{bmatrix}.$$
The distance from $\vec y$ to $L$ is the distance from $\vec y$ to its projection to $L$, which is
$$\|\vec y - \operatorname{proj}_L \vec y\| = \|(-3, 9) - (3, 6)\| = \|(-6, 3)\| = \sqrt{36 + 9} = \sqrt{45} = 3\sqrt{5}.$$
See also Example 4.

22. Determine whether the set
$$\begin{bmatrix} 1/\sqrt{18} \\ 4/\sqrt{18} \\ 1/\sqrt{18} \end{bmatrix}, \quad \begin{bmatrix} 1/\sqrt{2} \\ 0 \\ -1/\sqrt{2} \end{bmatrix}, \quad \begin{bmatrix} -2/3 \\ 1/3 \\ -2/3 \end{bmatrix}$$
is orthonormal. If the set is only orthogonal, normalize the vectors to produce an orthonormal set.

This set is orthonormal.

27. Let $U$ be a square matrix with orthonormal columns. Explain why $U$ is invertible. (Mention the theorems you use.)

Since the columns are orthonormal, they are all nonzero. Since they are nonzero and orthogonal, they are linearly independent (by Theorem 4 on page 284). Therefore $U$ is invertible (by Theorem 8 on page 114, (e) $\iff$ (a)).

28. Let $U$ be an $n \times n$ orthogonal matrix. Show that the rows of $U$ form an orthonormal basis of $\mathbb{R}^n$.

Since $U$ is an orthogonal matrix, we have $U^{-1} = U^T$. Then $U^T$ is also orthogonal, since $(U^T)^{-1} = (U^{-1})^T = (U^T)^T$. By Theorem 6, the rows of $U$ are therefore orthonormal, since they are the columns of the orthogonal matrix $U^T$. These vectors are linearly independent (by Theorem 4) and there are $n$ of them, so they form a basis for $\mathbb{R}^n$.

Section 6.3 (Page 298)

24. Let $W$ be a subspace of $\mathbb{R}^n$ with an orthogonal basis $\{\vec w_1, \ldots, \vec w_p\}$, and let $\{\vec v_1, \ldots, \vec v_q\}$ be an orthogonal basis for $W^\perp$.

(a). Explain why $\{\vec w_1, \ldots, \vec w_p, \vec v_1, \ldots, \vec v_q\}$ is an orthogonal set.

(b). Explain why the set in part (a) spans $\mathbb{R}^n$.

(c). Show that $\dim W + \dim W^\perp = n$.

a. The set $\{\vec w_1, \ldots, \vec w_p, \vec v_1, \ldots, \vec v_q\}$ is an orthogonal set because any two distinct $\vec w_i$ are orthogonal, any two distinct $\vec v_j$ are orthogonal, and any $\vec w_i$ is orthogonal to any $\vec v_j$, since $\vec v_j$ is in $W^\perp$.

b. The set spans $\mathbb{R}^n$ because any $\vec y \in \mathbb{R}^n$ can be written as $\hat y + \vec z$ with $\hat y \in W$ and $\vec z \in W^\perp$ (by the Orthogonal Decomposition Theorem), and these in turn can be written as linear combinations of $\vec w_1, \ldots, \vec w_p$ and $\vec v_1, \ldots, \vec v_q$, respectively.

c. The set in part (a) is linearly independent because it is an orthogonal set of nonzero vectors (the vectors are nonzero because they are elements of bases). Therefore the set is a basis for $\mathbb{R}^n$. This shows that $\dim W + \dim W^\perp = p + q = \dim \mathbb{R}^n = n$.

Section 6.4 (Page 304)

3. The set
$$\begin{bmatrix} 2 \\ -5 \\ 1 \end{bmatrix}, \quad \begin{bmatrix} 4 \\ -1 \\ 2 \end{bmatrix}$$
is a basis for a subspace $W$. Use the Gram-Schmidt process to produce an orthogonal basis for $W$.

Let $\vec x_1$ and $\vec x_2$ be the given basis elements. The Gram-Schmidt process is
$$\vec v_1 = \vec x_1 = \begin{bmatrix} 2 \\ -5 \\ 1 \end{bmatrix};$$
$$\vec v_2 = \vec x_2 - \frac{\vec x_2 \cdot \vec v_1}{\vec v_1 \cdot \vec v_1}\,\vec v_1 = \begin{bmatrix} 4 \\ -1 \\ 2 \end{bmatrix} - \frac{15}{30}\begin{bmatrix} 2 \\ -5 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ 3/2 \\ 3/2 \end{bmatrix}.$$
These two elements form an orthogonal basis for $W$.

13. Let
$$A = \begin{bmatrix} 5 & 9 \\ 1 & 7 \\ -3 & -5 \\ 1 & 5 \end{bmatrix}, \qquad Q = \begin{bmatrix} 5/6 & -1/6 \\ 1/6 & 5/6 \\ -3/6 & 1/6 \\ 1/6 & 3/6 \end{bmatrix}.$$
The columns of $Q$ were obtained by applying the Gram-Schmidt process to the columns of $A$. Find an upper triangular matrix $R$ such that $A = QR$. Check your work.

One could determine the entries of $R$ by carrying out the Gram-Schmidt process to express the columns of $Q$ as linear combinations of the columns of $A$; then combine those coefficients to give $R^{-1}$; and finally invert to get $R$. However, there's an easier way: As in Example 4,
$$R = Q^T A = \begin{bmatrix} 5/6 & 1/6 & -3/6 & 1/6 \\ -1/6 & 5/6 & 1/6 & 3/6 \end{bmatrix} \begin{bmatrix} 5 & 9 \\ 1 & 7 \\ -3 & -5 \\ 1 & 5 \end{bmatrix} = \begin{bmatrix} 6 & 12 \\ 0 & 6 \end{bmatrix}.$$
To check:
$$QR = \begin{bmatrix} 5/6 & -1/6 \\ 1/6 & 5/6 \\ -3/6 & 1/6 \\ 1/6 & 3/6 \end{bmatrix} \begin{bmatrix} 6 & 12 \\ 0 & 6 \end{bmatrix} = \begin{bmatrix} 5 & 9 \\ 1 & 7 \\ -3 & -5 \\ 1 & 5 \end{bmatrix} = A.$$

20. Suppose $A = QR$, where $R$ is an invertible matrix. Show that $A$ and $Q$ have the same column space.
[Hint: Given $\vec y$ in $\operatorname{Col} A$, show that $\vec y = Q\vec x$ for some $\vec x$. Also, given $\vec y$ in $\operatorname{Col} Q$, show that $\vec y = A\vec x$ for some $\vec x$.]

Following the hint, suppose $\vec y \in \operatorname{Col} A$. Then $\vec y = A\vec v$, where the coordinates of $\vec v$ are the weights used to represent $\vec y$ as a linear combination of the columns of $A$. But then $\vec y = QR\vec v$, so $\vec y$ is in the column space of $Q$, since the weights for writing $\vec y$ as a linear combination of the columns of $Q$ are the coordinates of $R\vec v$.

Conversely, suppose that $\vec y$ is in $\operatorname{Col} Q$. Then $\vec y = Q\vec u$ for some $\vec u$ whose coordinates are the weights used to express $\vec y$ as a linear combination of the columns of $Q$. But $Q = AR^{-1}$ (multiply $A = QR$ on the right by $R^{-1}$), so $\vec y = AR^{-1}\vec u$, and hence $\vec y$ lies in $\operatorname{Col} A$, since it can be expressed as a linear combination of the columns of $A$ using the weights given by the coordinates of $R^{-1}\vec u$.
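The arithmetic in #13 (and the invertibility of $R$ used in #20) can be double-checked numerically. Below is a sketch assuming numpy; the matrices are the ones given in #13.

```python
import numpy as np

# A and Q as given in #13 (the columns of Q come from Gram-Schmidt on A).
A = np.array([[5.0, 9.0], [1.0, 7.0], [-3.0, -5.0], [1.0, 5.0]])
Q = np.array([[5, -1], [1, 5], [-3, 1], [1, 3]]) / 6.0

# R = Q^T A, as in Example 4 of the text.
R = Q.T @ A
assert np.allclose(R, [[6.0, 12.0], [0.0, 6.0]])

# The factorization A = QR checks out.
assert np.allclose(Q @ R, A)

# Relevant to #20: R is upper triangular with nonzero diagonal, hence
# invertible, and Q = A R^{-1}, so Col Q is contained in Col A.
assert np.allclose(Q, A @ np.linalg.inv(R))
```

As always, a numerical check on one example confirms the computation but does not replace the general argument given above for #20.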