MATH 110: LINEAR ALGEBRA, FALL 2007/08
PROBLEM SET 9 SOLUTIONS

Date: December 9, 2007 (Version 1.0).

In the following, $V$ will denote a finite-dimensional vector space over $\mathbb{R}$ that is also an inner product space with inner product denoted by $\langle \cdot, \cdot \rangle$. The norm induced by $\langle \cdot, \cdot \rangle$ will be denoted $\|\cdot\|$.

1. Let $S$ be a subset (not necessarily a subspace) of $V$. We define the orthogonal annihilator of $S$, denoted $S^\perp$, to be the set
$$S^\perp = \{ v \in V \mid \langle v, w \rangle = 0 \text{ for all } w \in S \}.$$

(a) Show that $S^\perp$ is always a subspace of $V$.

Solution. Let $v_1, v_2 \in S^\perp$. Then $\langle v_1, w \rangle = 0$ and $\langle v_2, w \rangle = 0$ for all $w \in S$. So for any $\alpha, \beta \in \mathbb{R}$,
$$\langle \alpha v_1 + \beta v_2, w \rangle = \alpha \langle v_1, w \rangle + \beta \langle v_2, w \rangle = 0$$
for all $w \in S$. Hence $\alpha v_1 + \beta v_2 \in S^\perp$.

(b) Show that $S \subseteq (S^\perp)^\perp$.

Solution. Let $w \in S$. For any $v \in S^\perp$, we have $\langle v, w \rangle = 0$ by definition of $S^\perp$. Since this is true for all $v \in S^\perp$ and $\langle w, v \rangle = \langle v, w \rangle$, we see that $\langle w, v \rangle = 0$ for all $v \in S^\perp$, i.e. $w \in (S^\perp)^\perp$.

(c) Show that $\operatorname{span}(S) \subseteq (S^\perp)^\perp$.

Solution. Since $(S^\perp)^\perp$ is a subspace by (a) (it is the orthogonal annihilator of $S^\perp$) and $S \subseteq (S^\perp)^\perp$ by (b), we have $\operatorname{span}(S) \subseteq (S^\perp)^\perp$, as $\operatorname{span}(S)$ is the smallest subspace containing $S$.

(d) Show that if $S_1$ and $S_2$ are subsets of $V$ and $S_1 \subseteq S_2$, then $S_2^\perp \subseteq S_1^\perp$.

Solution. Let $v \in S_2^\perp$. Then $\langle v, w \rangle = 0$ for all $w \in S_2$, and so for all $w \in S_1$ (since $S_1 \subseteq S_2$). Hence $v \in S_1^\perp$.

(e) Show that $((S^\perp)^\perp)^\perp = S^\perp$.

Solution. Applying (d) to (b) with $S_1 = S$ and $S_2 = (S^\perp)^\perp$, we get
$$((S^\perp)^\perp)^\perp \subseteq S^\perp.$$
Applying (c) to $S^\perp$, we get $\operatorname{span}(S^\perp) \subseteq ((S^\perp)^\perp)^\perp$, but by (a), $\operatorname{span}(S^\perp) = S^\perp$. Hence we get equality.

(f) Show that $S \cap S^\perp$ must be either the empty set $\varnothing$ or the zero subspace $\{0_V\}$.

Solution. If $S \cap S^\perp \neq \varnothing$, then let $v \in S \cap S^\perp$. Since $v \in S^\perp$, we have $\langle v, w \rangle = 0$ for all $w \in S$. Since $v \in S$, in particular $\|v\|^2 = \langle v, v \rangle = 0$, implying that $v = 0_V$. In other words, the only vector in $S \cap S^\perp$ is $0_V$, so $S \cap S^\perp = \{0_V\}$. On the other hand, if $0_V \notin S$, then $0_V \notin S \cap S^\perp$ and so $S \cap S^\perp = \varnothing$ (if not, the previous argument gives a contradiction).

2. Let $W$ be a subspace of $V$. The subspace $W^\perp$ is called the orthogonal complement of $W$.

(a) Show that $V = W \oplus W^\perp$.

Solution. Since $W$ and $W^\perp$ are both subspaces (the latter by Problem 1(a)), $0_V \in W$ and $0_V \in W^\perp$. So by Problem 1(f), we have $W \cap W^\perp = \{0_V\}$. It remains to show that $W + W^\perp = V$. Clearly $W + W^\perp \subseteq V$. For the converse, let $v \in V$ and let $\{w_1, \ldots, w_r\}$ be an orthonormal basis of $W$. Consider
$$x := \langle v, w_1 \rangle w_1 + \langle v, w_2 \rangle w_2 + \cdots + \langle v, w_r \rangle w_r \quad \text{and} \quad y := v - x.$$
Clearly $x \in W$. We claim that $y \in W^\perp$. For any $w \in W$, we may write
$$w = \langle w, w_1 \rangle w_1 + \langle w, w_2 \rangle w_2 + \cdots + \langle w, w_r \rangle w_r$$
since $\{w_1, \ldots, w_r\}$ is an orthonormal basis of $W$. So
$$\langle y, w \rangle = \langle v - x, w \rangle = \langle v, w \rangle - \langle x, w \rangle
= \Bigl\langle v, \sum_{i=1}^r \langle w, w_i \rangle w_i \Bigr\rangle - \Bigl\langle \sum_{i=1}^r \langle v, w_i \rangle w_i, w \Bigr\rangle
= \sum_{i=1}^r \langle v, w_i \rangle \langle w, w_i \rangle - \sum_{i=1}^r \langle v, w_i \rangle \langle w_i, w \rangle = 0$$
since $\langle w, w_i \rangle = \langle w_i, w \rangle$. Hence $v = x + y \in W + W^\perp$.

(b) Show that $W = (W^\perp)^\perp$.

Solution. By Problem 1(b), we have $W \subseteq (W^\perp)^\perp$. But by (a), we have
$$W \oplus W^\perp = V = W^\perp \oplus (W^\perp)^\perp$$
and so
$$\dim(W) + \dim(W^\perp) = \dim(W^\perp) + \dim((W^\perp)^\perp),$$
i.e. $\dim(W) = \dim((W^\perp)^\perp)$. Since $W$ is a subspace of $(W^\perp)^\perp$ of the same dimension, $W = (W^\perp)^\perp$.
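The decomposition in Problem 2(a) is easy to check numerically. The following is a minimal NumPy sketch, not part of the original solutions: it orthonormalizes a few random vectors via a QR factorization to obtain an orthonormal basis $\{w_1, \ldots, w_r\}$ of a subspace $W \subseteq \mathbb{R}^5$, computes $x = \sum_i \langle v, w_i \rangle w_i$ and $y = v - x$, and verifies that $y \perp W$. The dimensions, the seed, and the helper name `project_onto` are illustrative choices.

```python
import numpy as np

def project_onto(v, W_basis):
    """Project v onto the span of the columns of W_basis, which are
    assumed to form an orthonormal basis {w_1, ..., w_r} of W."""
    # x = <v, w_1> w_1 + ... + <v, w_r> w_r, as in Problem 2(a).
    return W_basis @ (W_basis.T @ v)

rng = np.random.default_rng(0)

# Orthonormal basis of a 3-dimensional subspace W of R^5, obtained
# by orthonormalizing three random vectors via QR.
W_basis, _ = np.linalg.qr(rng.standard_normal((5, 3)))

v = rng.standard_normal(5)
x = project_onto(v, W_basis)  # x lies in W
y = v - x                     # the claim: y lies in W-perp

# y is orthogonal to every basis vector of W, and v = x + y,
# exhibiting the decomposition V = W (+) W-perp of Problem 2(a).
print(np.allclose(W_basis.T @ y, 0))  # True
print(np.allclose(x + y, v))          # True
```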
3. Let $A \in \mathbb{R}^{m \times n}$ and regard $A : \mathbb{R}^n \to \mathbb{R}^m$ as a linear transformation in the usual way. Show that
$$\operatorname{nullsp}(A^\top) = (\operatorname{colsp}(A))^\perp \quad \text{and} \quad \operatorname{rowsp}(A) = (\operatorname{nullsp}(A))^\perp.$$
Hence deduce that
$$\mathbb{R}^m = \operatorname{nullsp}(A^\top) \oplus \operatorname{colsp}(A) \quad \text{and} \quad \mathbb{R}^n = \operatorname{rowsp}(A) \oplus \operatorname{nullsp}(A).$$

Solution. Clearly $\operatorname{colsp}(A) = \{ Ax \in \mathbb{R}^m \mid x \in \mathbb{R}^n \}$. So if $v \in (\operatorname{colsp}(A))^\perp$, then $\langle v, Ax \rangle = 0$ for all $x \in \mathbb{R}^n$. So
$$\langle A^\top v, x \rangle = (A^\top v)^\top x = v^\top A x = \langle v, Ax \rangle = 0$$
for all $x \in \mathbb{R}^n$. In particular, we may pick $x = A^\top v$ to get
$$\|A^\top v\|^2 = \langle A^\top v, A^\top v \rangle = 0,$$
giving us $A^\top v = 0$. Hence $v \in \operatorname{nullsp}(A^\top)$.

Conversely, if $v \in \operatorname{nullsp}(A^\top)$, then
$$\langle v, Ax \rangle = v^\top A x = (A^\top v)^\top x = 0^\top x = 0$$
for all $x \in \mathbb{R}^n$. So $v \in (\operatorname{colsp}(A))^\perp$. This proves the first equality. For the second equality, just note that $\operatorname{rowsp}(A) = \operatorname{colsp}(A^\top)$ and apply the first equality (with $A^\top$ in place of $A$) to get
$$(\operatorname{colsp}(A^\top))^\perp = \operatorname{nullsp}(A),$$
and then apply Problem 2(b) to get
$$\operatorname{rowsp}(A) = ((\operatorname{rowsp}(A))^\perp)^\perp = (\operatorname{nullsp}(A))^\perp.$$
The two direct sum decompositions are immediate consequences of Problem 2(a), applied to $W = \operatorname{colsp}(A)$ and $W = \operatorname{rowsp}(A)$ respectively.

4. Let $A, B \in \mathbb{R}^{n \times n}$. Let $\langle \cdot, \cdot \rangle$ denote the standard inner product on $\mathbb{R}^n$.

(a) Show that if $\langle Ax, y \rangle = 0$ for all $x, y \in \mathbb{R}^n$, then $A = O$, the zero matrix.

Solution. Let $\{e_1, \ldots, e_n\}$ be the standard basis of $\mathbb{R}^n$. Observe that if $A = [a_{ij}]_{i,j=1}^n$, then
$$a_{ij} = e_i^\top A e_j = \langle e_i, A e_j \rangle = \langle A e_j, e_i \rangle = 0$$
by letting $x$ and $y$ be $e_j$ and $e_i$ respectively. Since this is true for all $i$ and $j$, we have that $A = O$.

(b) Is it true that if $\langle Ax, x \rangle = 0$ for all $x \in \mathbb{R}^n$, then $A = O$, the zero matrix?

Solution. No. Recall the notation from Homework 4, Problem 1(d), and let $A \in \wedge^2(\mathbb{R}^n)$, i.e. $A^\top = -A$. Then
$$\langle Ax, x \rangle = x^\top A^\top x = -x^\top A x = -\langle x, Ax \rangle = -\langle Ax, x \rangle,$$
implying that $\langle Ax, x \rangle = 0$. So any non-zero skew-symmetric matrix is a counterexample. An explicit $2 \times 2$ example is given by
$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.$$

(c) Is it true that if $\langle Ax, x \rangle = \langle Bx, x \rangle$ for all $x \in \mathbb{R}^n$, then $A = B$?

Solution. No. Any two distinct skew-symmetric matrices provide a counterexample, since both sides vanish for all $x$ by (b). An explicit $2 \times 2$ example is given by
$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$

5. Let $\mathcal{B}_1 = \{x_1, \ldots, x_n\}$ and $\mathcal{B}_2 = \{y_1, \ldots, y_n\}$ be two bases of $\mathbb{R}^n$. $\mathcal{B}_1$ and $\mathcal{B}_2$ are not necessarily orthonormal bases.

(a) Show that there exists an orthogonal matrix $Q \in \mathbb{R}^{n \times n}$ such that
$$Q x_i = y_i, \quad i = 1, \ldots, n,$$
if and only if
$$\langle x_i, x_j \rangle = \langle y_i, y_j \rangle, \quad i, j = 1, \ldots, n.$$

Solution. By a theorem we proved in the lectures, an orthogonal matrix $Q$ preserves inner products and so satisfies
$$\langle y_i, y_j \rangle = \langle Q x_i, Q x_j \rangle = \langle x_i, x_j \rangle.$$
It remains to show that if the latter condition is satisfied, then such an orthogonal matrix may be defined. First, observe that since $\mathcal{B}_1$ and $\mathcal{B}_2$ are bases, the equations
$$Q x_i = y_i, \quad i = 1, \ldots, n,$$
define the matrix $Q$ uniquely.¹ We only need to verify that $Q$ is orthogonal. Let $x \in \mathbb{R}^n$ and let
$$x = \sum_{i=1}^n \alpha_i x_i$$
be the representation of $x$ in terms of $\mathcal{B}_1$. Then
$$\|Qx\|^2 = \langle Qx, Qx \rangle
= \Bigl\langle \sum_{i=1}^n \alpha_i Q x_i, \sum_{j=1}^n \alpha_j Q x_j \Bigr\rangle
= \Bigl\langle \sum_{i=1}^n \alpha_i y_i, \sum_{j=1}^n \alpha_j y_j \Bigr\rangle
= \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j \langle y_i, y_j \rangle
= \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j \langle x_i, x_j \rangle
= \Bigl\langle \sum_{i=1}^n \alpha_i x_i, \sum_{j=1}^n \alpha_j x_j \Bigr\rangle
= \langle x, x \rangle = \|x\|^2.$$
Since this is true for all $x \in \mathbb{R}^n$, i.e. $Q$ preserves norms, the aforementioned theorem from the lectures implies that $Q$ is orthogonal.

(b) Show that if $\mathcal{B}_1$ is an orthonormal basis and $Q \in \mathbb{R}^{n \times n}$ is an orthogonal matrix, then $\{Q x_1, \ldots, Q x_n\}$ is also an orthonormal basis.

Solution. Clearly $Q x_1, \ldots, Q x_n$ are linearly independent, since if
$$\alpha_1 Q x_1 + \cdots + \alpha_n Q x_n = 0,$$
then
$$Q(\alpha_1 x_1 + \cdots + \alpha_n x_n) = 0,$$
and so
$$\alpha_1 x_1 + \cdots + \alpha_n x_n = Q^\top 0 = 0,$$
whence $\alpha_1 = \cdots = \alpha_n = 0$ by the linear independence of $x_1, \ldots, x_n$. Hence $\{Q x_1, \ldots, Q x_n\}$ is a basis (since the set has size $n = \dim(\mathbb{R}^n)$). The orthonormality follows from the fact that $Q$ preserves inner products and that $x_1, \ldots, x_n$ form an orthonormal basis:
$$\langle Q x_i, Q x_j \rangle = \langle x_i, x_j \rangle = \begin{cases} 0 & \text{if } i \neq j, \\ 1 & \text{if } i = j. \end{cases}$$

¹ In fact, $Q$ is just the matrix whose coordinate representation with respect to $\mathcal{B}_1$ and $\mathcal{B}_2$ is the identity matrix, i.e. $[Q]_{\mathcal{B}_1, \mathcal{B}_2} = I$. Explicitly, if we let $X$ and $Y$ be the matrices whose columns are $x_1, \ldots, x_n$ and $y_1, \ldots, y_n$ respectively, then $Q = Y X^{-1}$.
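The footnote's formula $Q = YX^{-1}$ lends itself to a quick numerical check. Below is a small NumPy sketch, again not part of the original solutions: it builds a (generically non-orthonormal) basis as the columns of $X$, manufactures a second basis $Y$ with the same Gram matrix by rotating with a random orthogonal matrix, and confirms that $Q = YX^{-1}$ is orthogonal and maps each $x_i$ to $y_i$. The dimension and the seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Columns of X form a basis of R^n (a random matrix is invertible
# with probability 1); it need not be orthonormal.
X = rng.standard_normal((n, n))

# Build Y with the same Gram matrix as X by rotating with a random
# orthogonal Q0, so <y_i, y_j> = <Q0 x_i, Q0 x_j> = <x_i, x_j>.
Q0, _ = np.linalg.qr(rng.standard_normal((n, n)))
Y = Q0 @ X

print(np.allclose(X.T @ X, Y.T @ Y))    # True: Gram matrices agree

# The construction from the footnote to Problem 5(a): Q = Y X^{-1}.
Q = Y @ np.linalg.inv(X)

print(np.allclose(Q.T @ Q, np.eye(n)))  # True: Q is orthogonal
print(np.allclose(Q @ X, Y))            # True: Q x_i = y_i for all i
```

If the Gram matrices did not agree, $Q = YX^{-1}$ would still satisfy $QX = Y$, but by Problem 5(a) the orthogonality check $Q^\top Q = I$ would necessarily fail.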
