
University of Cambridge
Mathematics Tripos Part IB

Linear Algebra

Michaelmas, 2017

Lectures by A. M. Keating
Notes by Qiangru Kuang

Contents

1 Vector Space
  1.1 Definitions
  1.2 Vector Subspace
  1.3 Span, Linear Independence & Basis
  1.4 Dimension
  1.5 Direct Sum
2 Linear Map
  2.1 Definitions
  2.2 Isomorphism of Vector Spaces
  2.3 Linear Maps as Vector Space
    2.3.1 Matrices, an Interlude
    2.3.2 Representation of Linear Maps by Matrices
    2.3.3 Change of Bases
    2.3.4 Elementary Matrices and Operations
3 Dual Space & Dual Map
  3.1 Definitions
  3.2 Dual Map
  3.3 Double Dual
4 Bilinear Form I
5 Determinant & Trace
  5.1 Trace
  5.2 Determinant
  5.3 Determinant of Linear Maps
  5.4 Determinant of Block-triangular Matrices
  5.5 Volume Interpretation of Determinant
  5.6 Determinant of Elementary Operations
  5.7 Column Expansion & Adjugate Matrices
  5.8 Application: Systems of Linear Equations
6 Endomorphism
  6.1 Definitions
  6.2 Polynomial Ring, an Aside
  6.3 Characteristic Polynomial of Endomorphism
7 Bilinear Form II
  7.1 Symmetric Bilinear Forms
  7.2 Sesquilinear Form
  7.3 Hermitian Form
  7.4 Alternating Form
8 Inner Product Space
  8.1 Definitions
  8.2 Orthonormal Basis
  8.3 Orthogonal Complements & Projections
  8.4 Adjoints
  8.5 Self-adjoint Maps & Isometries
    8.5.1 Spectral Theory for Self-adjoint Maps
    8.5.2 Spectral Theory for Unitary Maps
    8.5.3 Application to Bilinear Forms
Index

1 Vector Space

Convention. Throughout this course, F denotes a general field. If you wish, think of it as R or C.

1.1 Definitions

Definition (Vector space). An F-vector space (or a vector space over F) is an abelian group (V, +) equipped with a function, called scalar multiplication,

  F × V → V
  (λ, v) ↦ λ · v

satisfying the axioms
• distributivity over vectors: λ(v1 + v2) = λv1 + λv2,
• distributivity over scalars: (λ1 + λ2)v = λ1v + λ2v,
• compatibility: λ(µv) = (λµ)v,
• unit: 1 · v = v.
The additive unit of V is denoted by 0.

Example.
1. For all n ∈ N, F^n, the space of column vectors of length n with entries in F, is a vector space under entry-wise addition and entry-wise scalar multiplication.
2. Mm,n(F), the set of m × n matrices with entries in F, with entry-wise addition and entry-wise scalar multiplication.
3. For any set X, R^X = {f : X → R}, the set of R-valued functions on X, with addition and scalar multiplication defined pointwise. For instance, (f1 + f2)(x) = f1(x) + f2(x).

Exercise.
1. Check that the above examples satisfy the axioms.
2. Show that 0 · v = 0 and (−1) · v = −v for all v ∈ V.
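As an optional supplement (not part of the notes), the axioms and the exercise above can be spot-checked numerically for the example F^n = R^3. The following Python/NumPy sketch only verifies the identities on random sample vectors up to floating-point tolerance; it is an illustration, not a proof.

```python
import numpy as np

# Numerical spot-check (not a proof) of the vector space axioms for R^3,
# viewed as F^n with entry-wise addition and scalar multiplication.
rng = np.random.default_rng(0)
v1, v2 = rng.random(3), rng.random(3)
lam, mu = 2.5, -1.0

assert np.allclose(lam * (v1 + v2), lam * v1 + lam * v2)  # distributivity over vectors
assert np.allclose((lam + mu) * v1, lam * v1 + mu * v1)   # distributivity over scalars
assert np.allclose(lam * (mu * v1), (lam * mu) * v1)      # compatibility of scalar multiplication
assert np.allclose(1.0 * v1, v1)                          # unit axiom
assert np.allclose(0.0 * v1, np.zeros(3))                 # exercise: 0 · v = 0
assert np.allclose((-1.0) * v1, -v1)                      # exercise: (-1) · v = -v
print("all axioms hold on this sample")
```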
1.2 Vector Subspace

Definition (Vector subspace). Let V be an F-vector space. A subset U ⊆ V is a subspace, denoted U ≤ V, if
• 0 ∈ U,
• U is closed under addition: for all u1, u2 ∈ U, u1 + u2 ∈ U,
• U is closed under scalar multiplication: for all u ∈ U and λ ∈ F, λu ∈ U.

Exercise. If U is a subspace of V, then U is itself an F-vector space.

Example.
1. V = R^R, the set of all functions from R to itself, has a (proper) subspace C(R), the space of continuous functions on R, since continuous functions are closed under addition and scalar multiplication. C(R) in turn has a proper subspace P(R), the set of all polynomial functions on R.
2. {(x1, x2, x3) ∈ R^3 : x1 + x2 + x3 = t}, where t is some fixed constant, is a subspace of R^3 if and only if t = 0.

Proposition 1.1. Let V be an F-vector space and U, W ≤ V. Then U ∩ W ≤ V.

Proof.
• 0 ∈ U and 0 ∈ W, so 0 ∈ U ∩ W.
• Suppose u, w ∈ U ∩ W and fix λ, µ ∈ F. As U ≤ V, λu + µw ∈ U. As W ≤ V, λu + µw ∈ W, so λu + µw ∈ U ∩ W. Taking λ = µ = 1 gives closure under addition; taking µ = 0 gives closure under scalar multiplication.

Example. V = R^3, U = {(x, y, z) : x = 0}, W = {(x, y, z) : y = 0}; then U ∩ W = {(x, y, z) : x = y = 0}.

Note. The union of a family of subspaces is almost never a subspace. For example, take V = R^2 and let U, W be the x-axis and the y-axis: (1, 0) + (0, 1) = (1, 1) lies in neither axis, so U ∪ W is not closed under addition.

Definition (Sum of vector spaces). Let V be an F-vector space and U, W ≤ V. The sum of U and W is the set

  U + W = {u + w : u ∈ U, w ∈ W}.

Example. With U and W as in the previous example, U + W = V.

Proposition 1.2. U + W ≤ V.

Proof.
• 0 = 0 + 0 ∈ U + W,
• for u1, u2 ∈ U and w1, w2 ∈ W, (u1 + w1) + (u2 + w2) = (u1 + u2) + (w1 + w2) ∈ U + W,
• closure under scalar multiplication is similar and left as an exercise.

Note. U + W is the smallest subspace containing both U and W: any subspace containing both U and W must contain every element of the form u + w, by closure under addition.

Definition (Quotient vector space). Let V be an F-vector space and U ≤ V. The quotient space V/U is the abelian group V/U equipped with the scalar multiplication

  F × V/U → V/U
  (λ, v + U) ↦ λv + U

Proposition 1.3. This is well-defined and V/U is an F-vector space.

Proof. First check that it is well-defined. Suppose v1 + U = v2 + U ∈ V/U. Then v1 − v2 ∈ U. Using closure under scalar multiplication and distributivity,

  λv1 − λv2 = λ(v1 − v2) ∈ U,

so λv1 + U = λv2 + U ∈ V/U.
Now check the vector space axioms for V/U, which follow from the axioms for V:
• λ(µ(v + U)) = λ(µv + U) = λ(µv) + U = (λµ)v + U = (λµ)(v + U),
• the other axioms are left as an exercise.

1.3 Span, Linear Independence & Basis

Definition (Span). Let V be an F-vector space and S ⊆ V a subset. The span of S is

  ⟨S⟩ = { Σ_{s∈S} λs s : λs ∈ F, all but finitely many λs are zero },

the set of all finite linear combinations of elements of S.

Remark. ⟨S⟩ is the smallest subspace of V containing all elements of S.

Convention. ⟨∅⟩ = {0}.

Example.
1. V = R^3, S = {(1, 0, 0), (0, 1, 2), (3, −2, −4)}; then ⟨S⟩ = {(a, b, 2b) : a, b ∈ R}.
2. For any set X, R^X is a vector space. For x ∈ X, define δx : X → R by δx(x) = 1 and δx(y) = 0 for all y ≠ x. Then

  ⟨δx : x ∈ X⟩ = {f ∈ R^X : f has finite support}.

Definition (Span). S spans V if ⟨S⟩ = V.

Definition (Finite-dimensional). V is finite-dimensional over F if it is spanned by a finite set.

Definition (Linear independence). The vectors v1, . . . , vn are linearly independent over F if

  λ1v1 + · · · + λnvn = 0 ⇒ λi = 0 for all i.

A subset S ⊆ V is linearly independent if every finite subset of S is linearly independent. A subset is linearly dependent if it is not linearly independent.

Example. In the first example above, the three vectors are not linearly independent.
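For finitely many concrete vectors over R, the condition λ1v1 + · · · + λnvn = 0 ⇒ λi = 0 is equivalent to the matrix with the vi as columns having full column rank, a standard fact used here without proof. The following Python/NumPy sketch (the helper name is an illustrative choice, not from the notes) applies that criterion to the example above.

```python
import numpy as np

# Illustrative helper (not from the notes): real vectors are linearly
# independent iff the matrix having them as columns has full column rank.
def linearly_independent(vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

S = [np.array([1, 0, 0]), np.array([0, 1, 2]), np.array([3, -2, -4])]
print(linearly_independent(S))      # False: (3, -2, -4) = 3(1, 0, 0) - 2(0, 1, 2)
print(linearly_independent(S[:2]))  # True: the first two vectors are independent
```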
Exercise. The set {δx : x ∈ X} is linearly independent.

Definition (Basis). S is a basis of V if it is linearly independent and spans V.

Example.
1. F^n has the standard basis {e1, e2, . . . , en}, where ei is the column vector with 1 in the ith entry and 0 elsewhere.
2. V = C over C has natural basis {1}, but over R it has natural basis {1, i}.
3. V = P(R), the space of real polynomials, has natural basis {1, x, x^2, . . .}. It is an exercise to check this carefully.

Lemma 1.4. Let V be an F-vector space. The vectors v1, . . . , vn form a basis of V if and only if each vector v ∈ V has a unique expression

  v = λ1v1 + · · · + λnvn,  λi ∈ F.

Proof.
• ⇒: Fix v ∈ V. The vi span V, so there exist λi ∈ F such that v = Σ λi vi. Suppose also v = Σ µi vi for some µi ∈ F. Then the difference

  Σ (µi − λi) vi = 0.

Since the vi are linearly independent, µi − λi = 0 for all i.
• ⇐: The vi span V by assumption. Suppose λ1v1 + · · · + λnvn = 0. Note that 0 = 0 · v1 + · · · + 0 · vn. Applying uniqueness to 0, λi = 0 for all i.

Lemma 1.5. If v1, . . . , vn span V over F, then some subset of v1, . . . , vn is a basis of V over F.

Proof. If v1, . . . , vn are linearly independent then we are done. Otherwise, for some ℓ, there exist α1, . . . , α_{ℓ−1} ∈ F such that

  vℓ = α1v1 + · · · + α_{ℓ−1}v_{ℓ−1}.

(If Σ λi vi = 0 with not all λi zero, take ℓ maximal with λℓ ≠ 0; then αi = −λi/λℓ.)
Now v1, . . . , v_{ℓ−1}, v_{ℓ+1}, . . . , vn still span V. Continue iteratively until the remaining vectors are linearly independent.

Theorem 1.6 (Steinitz Exchange Lemma). Let V be a finite-dimensional vector space over F. Take v1, . . . , vm to be linearly independent and w1, . . . , wn to span V. Then m ≤ n and, after reordering the wi if necessary, the vectors v1, . . . , vm, w_{m+1}, . . . , wn span V.
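The proof of Lemma 1.5 above is effectively an algorithm: discard vectors that lie in the span of the others until the remaining vectors are linearly independent. A small Python/NumPy sketch of an equivalent greedy procedure over R (the function name and the rank test are illustrative choices, not from the notes):

```python
import numpy as np

# Greedy variant of the procedure in Lemma 1.5: keep a vector only if it is
# not in the span of the vectors kept so far, detected via a rank increase.
def extract_basis(vectors):
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            kept = candidate
    return kept

spanning_set = [np.array([1, 0, 0]), np.array([0, 1, 2]),
                np.array([3, -2, -4]), np.array([0, 0, 1])]
basis = extract_basis(spanning_set)
print(len(basis))  # 3: a basis of R^3 extracted from the spanning set
```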