Vector Spaces
Appendix A. Vector Spaces

This appendix reviews some of the basic definitions and properties of vector spaces. It is presumed that, with the possible exception of Theorem A.14, all of the material presented here is familiar to the reader.

Definition A.1. A set $\mathcal{M} \subset \mathbf{R}^n$ is a vector space if, for any $x, y, z \in \mathcal{M}$ and scalars $\alpha, \beta$, operations of vector addition and scalar multiplication are defined such that:
(1) $(x + y) + z = x + (y + z)$.
(2) $x + y = y + x$.
(3) There exists a vector $0 \in \mathcal{M}$ such that $x + 0 = x = 0 + x$.
(4) For any $x \in \mathcal{M}$ there exists $y = -x$ such that $x + y = 0 = y + x$.
(5) $\alpha(x + y) = \alpha x + \alpha y$.
(6) $(\alpha + \beta)x = \alpha x + \beta x$.
(7) $(\alpha\beta)x = \alpha(\beta x)$.
(8) $1 \cdot x = x$.

Definition A.2. Let $\mathcal{M}$ be a vector space and let $\mathcal{N}$ be a set with $\mathcal{N} \subset \mathcal{M}$. $\mathcal{N}$ is a subspace of $\mathcal{M}$ if and only if $\mathcal{N}$ is a vector space.

Theorem A.3. Let $\mathcal{M}$ be a vector space and let $\mathcal{N}$ be a nonempty subset of $\mathcal{M}$. If $\mathcal{N}$ is closed under addition and scalar multiplication, then $\mathcal{N}$ is a subspace of $\mathcal{M}$.

Theorem A.4. Let $\mathcal{M}$ be a vector space and let $x_1, \ldots, x_r$ be in $\mathcal{M}$. The set of all linear combinations of $x_1, \ldots, x_r$ is a subspace of $\mathcal{M}$.

Definition A.5. The set of all linear combinations of $x_1, \ldots, x_r$ is called the space spanned by $x_1, \ldots, x_r$. If $\mathcal{N}$ is a subspace of $\mathcal{M}$ and $\mathcal{N}$ equals the space spanned by $x_1, \ldots, x_r$, then $\{x_1, \ldots, x_r\}$ is called a spanning set for $\mathcal{N}$.

Vectors in $\mathbf{R}^n$ will be considered as $n \times 1$ matrices. The $0$ vector referred to in Definition A.1 is just an $n \times 1$ matrix of $0$s. Let $A$ be an $n \times p$ matrix. Each column of $A$ is a vector in $\mathbf{R}^n$. The space spanned by the columns of $A$ is called the column space of $A$ and written $C(A)$. If $B$ is an $n \times r$ matrix, then $C(A, B)$ is the space spanned by the columns of $A$ and $B$.

Definition A.6. Let $x_1, \ldots, x_r$ be vectors in $\mathcal{M}$. If there exist scalars $\alpha_1, \ldots, \alpha_r$, not all zero, so that $\sum \alpha_i x_i = 0$, then $x_1, \ldots, x_r$ are linearly dependent. If no such $\alpha_i$'s exist, $x_1, \ldots, x_r$ are linearly independent.

Definition A.7.
If $\mathcal{N}$ is a subspace of $\mathcal{M}$ and if $\{x_1, \ldots, x_r\}$ is a linearly independent spanning set for $\mathcal{N}$, then $\{x_1, \ldots, x_r\}$ is called a basis for $\mathcal{N}$.

Theorem A.8. If $\mathcal{N}$ is a subspace of $\mathcal{M}$, all bases for $\mathcal{N}$ have the same number of vectors.

Theorem A.9. If $v_1, \ldots, v_r$ is a basis for $\mathcal{N}$ and $x \in \mathcal{N}$, then the characterization $x = \sum_{i=1}^r \alpha_i v_i$ is unique.

Proof. Suppose $x = \sum_{i=1}^r \alpha_i v_i$ and $x = \sum_{i=1}^r \beta_i v_i$. Then $0 = \sum_{i=1}^r (\alpha_i - \beta_i) v_i$. Since the vectors $v_i$ are linearly independent, $\alpha_i - \beta_i = 0$ for all $i$. □

Definition A.10. The rank of a subspace $\mathcal{N}$ is the number of elements in a basis of $\mathcal{N}$. The rank is written $r(\mathcal{N})$. If $A$ is a matrix, the rank of $C(A)$ is called the rank of $A$ and is written $r(A)$.

Definition A.11. Two vectors $x$ and $y$ are orthogonal (written $x \perp y$) if $x'y = 0$. ($x'$ is the transpose of $x$.) Two subspaces $\mathcal{N}_1$ and $\mathcal{N}_2$ are orthogonal if $x \in \mathcal{N}_1$ and $y \in \mathcal{N}_2$ implies $x'y = 0$. $\{x_1, \ldots, x_r\}$ is an orthogonal basis for a space $\mathcal{N}$ if $\{x_1, \ldots, x_r\}$ is a basis for $\mathcal{N}$ and for $i \neq j$, $x_i' x_j = 0$. $\{x_1, \ldots, x_r\}$ is an orthonormal basis for $\mathcal{N}$ if $\{x_1, \ldots, x_r\}$ is an orthogonal basis and $x_i' x_i = 1$ for $i = 1, \ldots, r$. The terms orthogonal and perpendicular are used interchangeably.

Our emphasis on orthogonality and our need to find orthogonal projection matrices make both the following theorem and its proof fundamental tools in linear model theory.

Theorem A.12 (Gram-Schmidt Theorem). Let $\mathcal{N}$ be a space with basis $\{x_1, \ldots, x_r\}$. There exists an orthonormal basis $\{y_1, \ldots, y_r\}$ for $\mathcal{N}$ with $y_s$ in the space spanned by $x_1, \ldots, x_s$, $s = 1, \ldots, r$.

Proof. Define the $y_s$'s inductively:
$$y_1 = (x_1' x_1)^{-1/2} x_1,$$
$$w_s = x_s - \sum_{i=1}^{s-1} (x_s' y_i) y_i,$$
$$y_s = (w_s' w_s)^{-1/2} w_s, \quad s = 2, \ldots, r.$$
See Exercise A.1. □

Definition A.13. Let $\mathcal{N}^\perp = \{y \in \mathcal{M} \mid y \perp \mathcal{N}\}$. $\mathcal{N}^\perp$ is called the orthogonal complement of $\mathcal{N}$ with respect to $\mathcal{M}$. If $\mathcal{M}$ is taken as $\mathbf{R}^n$, then $\mathcal{N}^\perp$ is referred to as simply the orthogonal complement of $\mathcal{N}$.

Theorem A.14. Let $\mathcal{M}$ be a vector space and let $\mathcal{N}$ be a subspace of $\mathcal{M}$.
The orthogonal complement of $\mathcal{N}$ with respect to $\mathcal{M}$ is a subspace of $\mathcal{M}$, and if $x \in \mathcal{M}$, $x$ can be written uniquely as $x = x_0 + x_1$ with $x_0 \in \mathcal{N}$ and $x_1 \in \mathcal{N}^\perp$. The ranks of these spaces satisfy the relation $r(\mathcal{M}) = r(\mathcal{N}) + r(\mathcal{N}^\perp)$.

Proof. It is easily seen that $\mathcal{N}^\perp$ is a subspace by checking Theorem A.3. Let $r(\mathcal{M}) = n$ and $r(\mathcal{N}) = r$. Let $v_1, \ldots, v_r$ be a basis for $\mathcal{N}$ and extend this with $w_1, \ldots, w_{n-r}$ to a basis for $\mathcal{M}$. Apply Gram-Schmidt to get $v_1^*, \ldots, v_r^*, w_1^*, \ldots, w_{n-r}^*$, an orthonormal basis for $\mathcal{M}$ with $v_1^*, \ldots, v_r^*$ an orthonormal basis for $\mathcal{N}$. If $x \in \mathcal{M}$, then
$$x = \sum_{i=1}^{r} \alpha_i v_i^* + \sum_{j=1}^{n-r} \beta_j w_j^*.$$
Let $x_0 = \sum_{i=1}^{r} \alpha_i v_i^*$ and $x_1 = \sum_{j=1}^{n-r} \beta_j w_j^*$; then $x_0 \in \mathcal{N}$, $x_1 \in \mathcal{N}^\perp$, and $x = x_0 + x_1$.

To establish the uniqueness of the representation and the rank relationship, we need to establish that $\{w_1^*, \ldots, w_{n-r}^*\}$ is a basis for $\mathcal{N}^\perp$. Since, by construction, the $w_j^*$'s are linearly independent and $w_j^* \in \mathcal{N}^\perp$, $j = 1, \ldots, n - r$, it suffices to show that $\{w_1^*, \ldots, w_{n-r}^*\}$ is a spanning set for $\mathcal{N}^\perp$. If $x \in \mathcal{N}^\perp$, write
$$x = \sum_{i=1}^{r} \alpha_i v_i^* + \sum_{j=1}^{n-r} \beta_j w_j^*.$$
However, since $x' v_i^* = 0$ we have $\alpha_i = 0$ for $i = 1, \ldots, r$ and $x = \sum_{j=1}^{n-r} \beta_j w_j^*$; thus $\{w_1^*, \ldots, w_{n-r}^*\}$ is a spanning set and a basis for $\mathcal{N}^\perp$.

To establish uniqueness, let $x = y_0 + y_1$ with $y_0 \in \mathcal{N}$ and $y_1 \in \mathcal{N}^\perp$. Then $y_0 = \sum_{i=1}^{r} \gamma_i v_i^*$ and $y_1 = \sum_{j=1}^{n-r} \delta_j w_j^*$, so $x = \sum_{i=1}^{r} \gamma_i v_i^* + \sum_{j=1}^{n-r} \delta_j w_j^*$. By the uniqueness of the representation of $x$ under any basis, $\gamma_i = \alpha_i$ and $\delta_j = \beta_j$ for all $i$ and $j$; thus $x_0 = y_0$ and $x_1 = y_1$.

Since a basis has been found for each of $\mathcal{M}$, $\mathcal{N}$, and $\mathcal{N}^\perp$, we have $r(\mathcal{M}) = n$, $r(\mathcal{N}) = r$, and $r(\mathcal{N}^\perp) = n - r$. Thus, $r(\mathcal{M}) = r(\mathcal{N}) + r(\mathcal{N}^\perp)$. □

Definition A.15. Let $\mathcal{N}_1$ and $\mathcal{N}_2$ be vector spaces. Then the sum of $\mathcal{N}_1$ and $\mathcal{N}_2$ is $\mathcal{N}_1 + \mathcal{N}_2 = \{x \mid x = x_1 + x_2,\; x_1 \in \mathcal{N}_1,\; x_2 \in \mathcal{N}_2\}$.

Exercises

Exercise A.1. Give a detailed proof of the Gram-Schmidt theorem.
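The constructions in Theorems A.12 and A.14 are easy to carry out numerically. The following sketch in plain Python (the function names are our own) orthonormalizes a basis by the inductive formulas in the proof of Theorem A.12 and then splits a vector $x$ into $x_0 + x_1$ as in Theorem A.14, using $x_0 = \sum_i (x' v_i^*) v_i^*$:

```python
from math import sqrt

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors by the proof of Theorem A.12:
    y1 = (x1'x1)^(-1/2) x1,  ws = xs - sum_{i<s} (xs'yi) yi,
    ys = (ws'ws)^(-1/2) ws."""
    ys = []
    for x in vectors:
        w = list(x)
        for y in ys:
            c = dot(x, y)                     # the coefficient xs'yi
            w = [wi - c * yi for wi, yi in zip(w, y)]
        norm = sqrt(dot(w, w))                # (ws'ws)^(1/2)
        ys.append([wi / norm for wi in w])
    return ys

def decompose(x, orthonormal_basis):
    """Write x = x0 + x1 with x0 in N and x1 in N-perp (Theorem A.14),
    where orthonormal_basis is an orthonormal basis v1*, ..., vr* of N."""
    x0 = [0.0] * len(x)
    for v in orthonormal_basis:
        c = dot(x, v)                         # alpha_i = x'vi*
        x0 = [a + c * b for a, b in zip(x0, v)]
    x1 = [a - b for a, b in zip(x, x0)]
    return x0, x1

# N = span{(3,4,0)', (1,0,2)'} in R^3
basis = gram_schmidt([[3.0, 4.0, 0.0], [1.0, 0.0, 2.0]])
x0, x1 = decompose([1.0, 2.0, 3.0], basis)
# x1 is orthogonal to N, and x0 + x1 recovers x
print(round(abs(dot(x1, basis[0])), 10), round(abs(dot(x1, basis[1])), 10))  # 0.0 0.0
print([round(a + b, 10) for a, b in zip(x0, x1)])                            # [1.0, 2.0, 3.0]
```

Because the $y_i$'s produced so far are orthonormal, subtracting $(x_s' y_i) y_i$ at each step is exactly the projection removal used in the proof; no matrix inversion is ever required.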
Questions A.2 through A.13 involve the following matrices:

$$A = \begin{bmatrix} 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 1 \end{bmatrix};\quad B = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix};\quad D = \begin{bmatrix} 1 & 0 \\ 2 & 5 \\ 0 & 0 \end{bmatrix};\quad E = \begin{bmatrix} 1 & 2 \\ 2 & 7 \\ 0 & 1 \end{bmatrix};$$

$$F = \begin{bmatrix} 1 & 5 & 6 \\ 0 & 7 & 2 \\ 0 & 0 & 9 \end{bmatrix};\quad G = \begin{bmatrix} 1 & 0 & 5 & 2 \\ 2 & 5 & 7 & 9 \\ 0 & 0 & 0 & 3 \end{bmatrix};\quad H = \begin{bmatrix} 1 & 0 & 2 & 2 & 6 \\ 7 & 9 & 3 & 9 & -1 \\ 0 & 0 & 0 & 3 & -7 \end{bmatrix};$$

$$K = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix};\quad L = \begin{bmatrix} 2 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix};\quad N = \begin{bmatrix} 2 \\ 3 \\ 4 \end{bmatrix}.$$

Exercise A.2. Is the space spanned by the columns of A the same as the space spanned by the columns of B? How about the spaces spanned by the columns of K, L, F, D, and G?

Exercise A.3. Give a matrix whose column space contains C(A).

Exercise A.4. Give two matrices whose column spaces contain C(B).

Exercise A.5. Which of the following equalities are valid: C(A) = C(A, D), C(D) = C(A, B), C(A, N) = C(A), C(N) = C(A), C(A) = C(F), C(A) = C(G), C(A) = C(H), C(A) = C(D)?

Exercise A.6. Which of the following matrices have linearly independent columns: A, B, D, N, F, H, G?

Exercise A.7. Give a basis for the space spanned by the columns of each of the matrices listed: A, B, D, N, F, H, G.

Exercise A.8. Give the ranks of A, B, D, E, F, G, H, K, L, and N.

Exercise A.9. Which of the following matrices have columns that are mutually orthogonal: B, A, D?

Exercise A.10. Give an orthogonal basis for the space spanned by the columns of each of the matrices listed: A, D, N, K, H, G.

Exercise A.11. Find $C(A)^\perp$ and $C(B)^\perp$ (with respect to $\mathbf{R}^n$).

Exercise A.12. Find two linearly independent vectors in the orthogonal complement of C(D) (with respect to $\mathbf{R}^n$).

Exercise A.13. Find a vector in the orthogonal complement of C(D) with respect to C(A).

Exercise A.14. Find an orthogonal basis for the space spanned by the columns of
$$X = \begin{bmatrix} 1 & 4 & 2 \\ 3 & 0 & 1 \\ 1 & 4 & 0 \\ 1 & 5 & 6 \end{bmatrix}.$$

Exercise A.15. Find two linearly independent vectors in the orthogonal complement of C(X) (with respect to $\mathbf{R}^n$).
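Several of the exercises ask for orthogonal complements of column spaces. One mechanical way to check such an answer: $C(A)^\perp$ is exactly the null space of $A'$, so a basis can be read off from the reduced row echelon form of $A'$, and the number of pivots found along the way is $r(A)$. A minimal sketch in plain Python with exact rational arithmetic (the function names and the small example matrix are our own, not taken from the exercises):

```python
from fractions import Fraction

def null_space_basis(rows):
    """Return a basis (list of vectors) for {y : M y = 0}, computed from the
    reduced row echelon form of M using exact Fraction arithmetic."""
    m = [[Fraction(v) for v in row] for row in rows]
    ncols = len(m[0])
    pivots = {}                       # pivot column -> pivot row
    r = 0                             # current pivot row; final r = rank(M)
    for col in range(ncols):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue                  # free column: contributes a basis vector
        m[r], m[piv] = m[piv], m[r]
        m[r] = [v / m[r][col] for v in m[r]]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots[col] = r
        r += 1
    basis = []
    for free in range(ncols):         # one basis vector per free column
        if free in pivots:
            continue
        y = [Fraction(0)] * ncols
        y[free] = Fraction(1)
        for col, prow in pivots.items():
            y[col] = -m[prow][free]
        basis.append(y)
    return basis

def transpose(rows):
    return [list(col) for col in zip(*rows)]

# C(A)-perp is the null space of A', shown here for a small 3 x 2 example:
A = [[1, 2],
     [0, 1],
     [0, 0]]
perp = null_space_basis(transpose(A))
print([[int(v) for v in y] for y in perp])  # [[0, 0, 1]]
```

Exact fractions are used so that pivoting decisions ("is this entry zero?") are never spoiled by floating-point roundoff, which matters when deciding rank questions like those in Exercise A.8.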