Short notes on Linear Algebra by Sanand D

1 Vector spaces, independence, dimension, basis, linear operators

1.1 Introduction

Definition 1.1. Let V be a set whose elements are called vectors, with an operation of vector addition (+) that is commutative and associative and satisfies the following properties:
  • There exists 0 ∈ V such that 0 + v = v for all v ∈ V.
  • For all v ∈ V, there exists −v ∈ V such that v + (−v) = 0.
  • For all v1, v2 ∈ V, v1 + v2 ∈ V.
Let F be a field such that there is an operation of scalar multiplication (·) between elements of F and V satisfying the following properties:
  • 1·v = v for all v ∈ V, where 1 ∈ F.
  • (a1 a2)v = a1(a2 v) for all a1, a2 ∈ F.
  • a(v + w) = av + aw for all a ∈ F and all v, w ∈ V.
  • (a1 + a2)v = a1 v + a2 v for all a1, a2 ∈ F.
A set V satisfying the properties above is said to be a vector space over the field F.

Example 1.2. C^n, R^n, C^{n×m}, R^{n×m}, the solutions of homogeneous linear equations, the solutions of homogeneous ODEs, and the sets of real/complex valued continuous/differentiable/analytic functions are examples of vector spaces. The set of polynomials with real/complex coefficients forms a vector space over the real/complex numbers.

Definition 1.3. A subset W of V satisfying the properties above is called a subspace of V.

Example 1.4. R and R^2 form subspaces of R^3. The set of differentiable functions forms a subspace of the vector space of continuous functions. The set of complex polynomials of degree at most n forms a subspace of C[x]. (Geometrically, one can think of vector spaces as Euclidean spaces and of subspaces as planes passing through the origin.)

Suppose v1, ..., vk ∈ V. Then ⟨v1, ..., vk⟩ denotes the span of {v1, ..., vk}, which is the collection of all linear combinations of v1, ..., vk. The span of any set forms a subspace.

Definition 1.5. A set of non-zero vectors v1, ..., vk is said to be (linearly) independent if α1 v1 + ... + αk vk = 0 implies that all αi (1 ≤ i ≤ k) are equal to zero. A set of vectors which is not independent is said to be dependent. (Geometrically, if v depends on v1, ..., vk, then v lies in the space spanned by v1, ..., vk; whereas if v is independent of v1, ..., vk, then v lies outside the space spanned by v1, ..., vk.)

Definition 1.6. A maximal linearly independent set is called a basis.

A basis is not unique. By the definition of basis and linear independence, each v ∈ V can be represented as a unique linear combination of basis vectors. Any linearly independent set can be extended to a basis by adjoining further linearly independent vectors to it.

Example 1.7. e1 = [1 0 ... 0]^T, e2 = [0 1 ... 0]^T, ..., en = [0 0 ... 1]^T form a basis for C^n. This is called the standard basis.

Definition 1.8. The dimension of a vector space is the cardinality of its basis.

Example 1.9. The dimension of C^n over C is n. The dimension of C[x] over C is infinite, with basis 1, x, ..., x^n, .... The dimension of the subspace of polynomials of degree at most n is n + 1, with basis 1, x, ..., x^n.

If V1 and V2 are two subspaces of V, then V1 + V2 and V1 ∩ V2 are subspaces of V. By choosing a basis for V1 and extending it to a basis of V1 + V2 by adjoining linearly independent vectors of V2 which are not in V1, it follows that

    dim(V1 + V2) = dim(V1) + dim(V2) − dim(V1 ∩ V2).

Definition 1.10. If V1 + V2 = V and V1 ∩ V2 = {0}, then we say that V is the direct sum of V1 and V2. It is denoted by V = V1 ⊕ V2.

Example 1.11. C^n = C ⊕ C ⊕ ... ⊕ C, i.e. the direct sum of n copies of C.
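As a computational aside (not part of the notes), independence and the dimension of a span can be checked numerically: stacking the vectors as columns of a matrix, the matrix rank equals the dimension of their span, and the set is independent exactly when the rank equals the number of vectors. The vectors below are illustrative; the sketch assumes NumPy.

```python
import numpy as np

# Illustrative vectors in R^3, stacked as columns of a matrix.
# Here v3 = v1 + v2, so the set {v1, v2, v3} is dependent.
v1, v2, v3 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]
M = np.column_stack([v1, v2, v3])

r = np.linalg.matrix_rank(M)
print("independent:", r == M.shape[1])   # False
print("dim <v1, v2, v3> =", r)           # 2 (the span is a plane in R^3)
```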
1.2 Co-ordinates and linear maps on vector spaces

Co-ordinates: Let {v1, ..., vn} be a basis of V and let v ∈ V. Then v can be expressed uniquely as a linear combination of the vi, v = α1 v1 + ... + αn vn. Thus, w.r.t. this basis, v has co-ordinates [α1 α2 ... αn]^T.

Matrix representation of linear operators: Let A : V → V. A is said to be linear if for all v, w ∈ V and c1, c2 ∈ F, A(c1 v + c2 w) = c1 Av + c2 Aw. Let {v1, ..., vn} be a basis of V. Then for any v ∈ V, Av = α1 Av1 + ... + αn Avn. Thus, it is enough to define the action of A on the basis vectors:

\[
Av = \begin{bmatrix} Av_1 & Av_2 & \cdots & Av_n \end{bmatrix}
\begin{bmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{bmatrix}. \tag{1}
\]

Let Avi = a1i v1 + ... + ani vn for 1 ≤ i ≤ n. Then the matrix representation of A w.r.t. the basis above is

\[
A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} \tag{2}
\]

and the co-ordinates of Av w.r.t. the given basis are

\[
Av = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}
\begin{bmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{bmatrix}. \tag{3}
\]

Definition 1.12. The kernel of an operator A : V → W is the set of vectors v ∈ V such that Av = 0.

The kernel of A is denoted by ker(A) and it is a subspace of V. The image of A is the set of all w ∈ W such that w = Av for some v ∈ V. Im(A) is also a subspace. When A is represented by a matrix w.r.t. some basis, any vector in the image of A is a linear combination of the columns of A. Thus, Im(A) is spanned by the columns of A.

Definition 1.13. The rank of a matrix is the number of its linearly independent columns. Consequently, it is equal to the dimension of the image space of A.

Thus, Ax = b has a solution iff b lies in the column span of A, iff the augmented matrix [A b] has the same rank as A. Note that A is onto ⇔ the rank of A equals the dimension of the codomain, and A is 1−1 ⇔ dim(ker(A)) = 0. A is 1−1 and onto ⇔ dim(ker(A)) = 0 and the rank of A equals the dimension of the codomain. (Note that Im(A) is a subspace of the codomain. If rank(A) = dim(Im(A)) equals the dimension of the codomain, then Im(A) is the whole codomain.)

Theorem 1.14 (Rank-Nullity). Let A : V → V where V is an n-dimensional vector space. Then rank(A) + dim(ker(A)) = n.

Proof. Let v1, ..., vk form a basis of ker(A). Extend this set to a full basis of V, say v1, ..., vn. Claim: {Av_{k+1}, ..., Av_n} are linearly independent. Suppose not; then α_{k+1} Av_{k+1} + ... + α_n Av_n = 0 with not all α_i equal to zero. This implies that A(α_{k+1} v_{k+1} + ... + α_n v_n) = 0, hence α_{k+1} v_{k+1} + ... + α_n v_n ∈ ker(A). Therefore α_{k+1} v_{k+1} + ... + α_n v_n = β1 v1 + ... + βk vk, which contradicts the linear independence of {v1, ..., vn}. Therefore {Av_{k+1}, ..., Av_n} are linearly independent; since Avi = 0 for 1 ≤ i ≤ k, they also span Im(A). As rank(A) = dim(Im(A)), rank(A) = n − k and dim(ker(A)) = k.

Suppose A : V → W. Let {v1, ..., vn} be a basis of V and {w1, ..., wm} be a basis of W. Then, to find a matrix representation of A, it is enough to define Avi (1 ≤ i ≤ n): Avi = a1i w1 + ... + ami wm (1 ≤ i ≤ n). Thus, A is represented by an m × n matrix A = [aij]. The rank-nullity theorem holds for these linear maps as well, where n is the dimension of the domain.
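As an illustration (not from the notes) of the matrix representation above together with Theorem 1.14, take the differentiation operator D = d/dx on the subspace of polynomials of degree at most 3 (cf. Example 1.9), with basis 1, x, x^2, x^3. Its matrix is obtained by writing D of each basis vector in the same basis, and its rank and nullity add up to the dimension 4. The sketch assumes NumPy and SciPy.

```python
import numpy as np
from scipy.linalg import null_space

# Matrix of D = d/dx w.r.t. the basis {1, x, x^2, x^3}: column i holds the
# coordinates of D(x^i), e.g. D(x^3) = 3x^2 has coordinates (0, 0, 3, 0).
D = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0, 0.0]])

rank = np.linalg.matrix_rank(D)       # dim Im(D)  = 3
nullity = null_space(D).shape[1]      # dim ker(D) = 1 (the constant polynomials)
assert rank + nullity == D.shape[1]   # rank-nullity: 3 + 1 = 4
```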
Dual spaces: For a vector space V, its dual V* consists of all linear maps from V to F (the underlying field). If V is an n-dimensional vector space whose elements are represented by column vectors w.r.t. some basis, then elements of V* are represented by 1 × n matrices, which can be thought of as row vectors. Thus, the dual space of column vectors is row vectors and vice versa.

For any basis {v1, ..., vn} of V, there exists a dual basis {v1*, ..., vn*} of V* such that vi*(vj) = 1 if i = j and vi*(vj) = 0 if i ≠ j. A linear map A : V → W induces a map A* : W* → V*. The action of A* is defined as A*w*(v) := w*(Av). Let A = [aij] be a matrix representation of A w.r.t. bases v1, ..., vn and w1, ..., wm of V and W respectively, and let [bij] be a representation of A* w.r.t. the dual bases. Consider the action of A* on the basis vectors w1*, ..., wm* of W*. Since A* = [bij] is a matrix representation of A*,

\[
A^* w_i^* = b_{1i} v_1^* + \dots + b_{ni} v_n^* \tag{4}
\]
\[
\Rightarrow \quad A^* w_i^*(v_j) = (b_{1i} v_1^* + \dots + b_{ni} v_n^*)(v_j) = b_{ji}. \tag{5}
\]

Observe that, since A*w*(v) = w*(Av) from the definition of A*,

\[
A^* w_i^*(v_j) = w_i^*(A v_j) = w_i^*(a_{1j} w_1 + \dots + a_{mj} w_m) = a_{ij}. \tag{6}
\]

Thus, aij = bji and the matrix representation of the induced map A* is the transpose of A. Observe that (V*)* = V, i.e. the double dual of V is V itself for finite dimensional vector spaces.

2 Change of basis

Let e1, ..., en be the standard basis of a vector space V. Let this be the old basis and let A_old be a matrix representation of a linear operator A : V → V.
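A minimal NumPy sketch (not from the notes) of how the representation changes when passing from the old basis to a new one, assuming the standard relation A_new = T^{-1} A_old T, where the columns of T are the new basis vectors written in the old (standard) basis. The matrices below are illustrative.

```python
import numpy as np

# A linear operator on R^3, represented in the old (standard) basis.
A_old = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [0.0, 0.0, 1.0]])

# Columns of T: a new basis of R^3, written in the old (standard) basis.
T = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Standard change-of-basis relation (assumed here, not derived above).
A_new = np.linalg.inv(T) @ A_old @ T

# Check: for a vector with new co-ordinates c, applying A in old co-ordinates
# (to T @ c) matches applying A_new in new co-ordinates and converting back.
c = np.array([1.0, -2.0, 0.5])
assert np.allclose(A_old @ (T @ c), T @ (A_new @ c))
print("A_new =\n", A_new)
```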
