Linear Algebra (VII)
Yijia Chen
1. Review
Basis and Dimension. We fix a vector space V.

Lemma 1.1. Let A, B ⊆ V be two finite sets of vectors in V, possibly empty. If A is linearly independent and can be represented by B, then |A| ≤ |B|.

Theorem 1.2. Let S ⊆ V and let A, B ⊆ S both be maximally linearly independent in S. Then |A| = |B|.
Definition 1.3. Let e1,..., en ∈ V. Assume that
– e1,..., en are linearly independent,
– and every v ∈ V can be represented by e1,..., en.
Equivalently, {e1,..., en} is maximally linearly independent in V. Then {e1,..., en} is a basis of V. Note that n = 0 is allowed, and in that case, it is easy to see that V = {0}. By Theorem 1.2:
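The two conditions in Definition 1.3 can be checked numerically. A minimal sketch, assuming numpy is available (the helper name is_basis is ours, not from the notes): n vectors form a basis of R^n exactly when the matrix with those vectors as columns has rank n.

```python
# Hypothetical helper (not from the notes): check whether a finite
# set of vectors is a basis of R^n via the rank of the matrix they form.
import numpy as np

def is_basis(vectors, n):
    """A set of vectors is a basis of R^n iff there are exactly n of
    them and they are linearly independent, i.e. the stacked matrix
    has rank n (independence then also forces them to span R^n)."""
    M = np.column_stack(vectors)
    return M.shape[1] == n and np.linalg.matrix_rank(M) == n

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(is_basis([e1, e2], 2))        # standard basis of R^2 -> True
print(is_basis([e1, 2 * e1], 2))    # dependent set -> False
```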
Lemma 1.4. If {e1,..., en} and {e1′,..., em′} are both bases of V with pairwise distinct ei's and with pairwise distinct ei′'s, then n = m.
Definition 1.5. Let {e1,..., en} be a basis of V with pairwise distinct ei’s. Then the dimension of V, denoted by dim(V), is n. Equivalently, if rank(V) is defined, then dim(V) := rank(V).
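Definition 1.5 identifies dim(V) with the size of any basis. For a subspace of R^n given as the span of finitely many vectors, this number is the rank of the matrix formed by those vectors; a small illustration, assuming numpy:

```python
# Illustration (assumption: numpy available): the dimension of
# span(S) equals the rank of the matrix whose columns are the
# vectors of S.
import numpy as np

S = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 2.0]])  # third column = first + second
print(np.linalg.matrix_rank(S))          # dim(span S) = 2
```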
Theorem 1.6. Assume dim(V) = n and let u1,..., un ∈ V. [1]
(1) If u1,..., un are linearly independent, then {u1,..., un} is a basis.
(2) If every v ∈ V can be represented by u1,..., un, then {u1,..., un} is a basis.
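Theorem 1.6 says that in an n-dimensional space, n independent vectors automatically span, and n spanning vectors are automatically independent. For R^n both conditions reduce to a single determinant test; a sketch assuming numpy:

```python
# Sketch of Theorem 1.6 in R^3 (assumption: numpy available):
# for exactly n vectors in an n-dimensional space, "independent"
# and "spanning" coincide, and both are equivalent to a nonzero
# determinant of the matrix they form.
import numpy as np

u = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # rows u1, u2, u3

# nonzero determinant <=> u1, u2, u3 independent <=> basis of R^3
print(abs(np.linalg.det(u)) > 1e-12)   # True
```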
Steinitz exchange lemma.
Theorem 1.7. Assume that dim(V) = n and that v1,..., vm ∈ V with 1 ≤ m ≤ n are linearly independent. Furthermore, let {e1,..., en} be a basis of V. Then for some 1 ≤ i1 < i2 < ··· < in−m ≤ n
v1,..., vm, ei1,..., ein−m is a basis of V.
[1] We do not assume beforehand that u1,..., un are pairwise distinct, although under the conditions in the theorem they have to be, i.e., ui ≠ uj for every 1 ≤ i < j ≤ n.
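The exchange of Theorem 1.7 can be carried out greedily: swap the v's in one at a time, each time discarding one e whose removal keeps a basis. A sketch in Python, assuming numpy is available (the function name steinitz_exchange is ours):

```python
# A possible implementation of the exchange in Theorem 1.7 for R^n
# (assumption: numpy; steinitz_exchange is an illustrative name).
import numpy as np

def steinitz_exchange(vs, es):
    """Given linearly independent vs and a basis es of R^n, return
    the sorted indices i1 < ... < i_{n-m} of the e's that, together
    with the v's, still form a basis."""
    n = len(es)
    kept = list(range(n))   # indices of e's still in the basis
    used = []               # v's already swapped in
    for v in vs:
        for i in kept:
            candidate = used + [v] + [es[j] for j in kept if j != i]
            if np.linalg.matrix_rank(np.column_stack(candidate)) == n:
                kept.remove(i)   # drop e_i; the rest stays a basis
                break
        used.append(v)
    return kept

es = [np.array([1.0, 0.0, 0.0]),
      np.array([0.0, 1.0, 0.0]),
      np.array([0.0, 0.0, 1.0])]
vs = [np.array([1.0, 1.0, 0.0])]
print(steinitz_exchange(vs, es))   # [1, 2]: v1, e2, e3 form a basis
```

The inner loop mirrors the proof: it looks for an e with a nonzero coefficient in the representation of v, which is exactly an e whose removal keeps the collection full rank.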
Proof: We prove the theorem by induction on m, starting with m = 1. Since {e1,..., en} is a basis, v1 can be represented by e1,..., en. Thus, there exist a1,..., an ∈ R such that
v1 = Σi∈[n] ai ei.

As v1 ≠ 0 (otherwise, v1 would be linearly dependent), there is an i ∈ [n] with ai ≠ 0. It follows that
ei = (1/ai) · v1 + Σj∈[n]\{i} (−aj/ai) · ej.

Now assume that m > 1 and v1,..., vm ∈ V are linearly independent. Of course v1,..., vm−1 are linearly independent too. By the induction hypothesis applied to m − 1, there exist 1 ≤ i1 < i2 < ··· < in−m+1 ≤ n such that

v1,..., vm−1, ei1,..., ein−m+1

is a basis of V. In particular, vm can be represented by this basis, i.e., there exist a1,..., am−1, c1,..., cn−m+1 ∈ R such that

vm = Σi∈[m−1] ai vi + Σj∈[n−m+1] cj eij.

Observe that Σj∈[n−m+1] cj eij ≠ 0; otherwise, v1,..., vm would be linearly dependent. Thus, for some j ∈ [n − m + 1] we have cj ≠ 0, and thereby

eij = Σi∈[m−1] (−ai/cj) · vi + (1/cj) · vm + Σℓ∈[n−m+1]\{j} (−cℓ/cj) · eiℓ.

Then it is easy to see that

v1,..., vm, ei1,..., eij−1, eij+1,..., ein−m+1

is a basis of V. □

Remark 1.8. (i) The above proof is in fact essentially the same as the proof of Lemma 1.1. (ii) Again, we can drop the requirement 1 ≤ m; in particular, the case m = 0 holds trivially.

2. Back to the Textbook

Matrix and matrix operations. Recall that an m × n matrix has the form

A = (aij)m×n =
[ a11 a12 ··· a1n ]
[ a21 a22 ··· a2n ]
[  ·    ·  ···  · ]
[ am1 am2 ··· amn ],

where each aij ∈ R. The transpose of A, denoted by A^T, is the n × m matrix

[ a11 a21 ··· am1 ]
[ a12 a22 ··· am2 ]
[  ·    ·  ···  · ]
[ a1n a2n ··· amn ].

Definition 2.1 (Matrix Addition). Let A = (aij)m×n and B = (bij)m×n be two m × n matrices. Then

A + B := (aij + bij)m×n =
[ a11 + b11 ··· a1n + b1n ]
[     ·     ···     ·     ]
[ am1 + bm1 ··· amn + bmn ].

Definition 2.2. The zero m × n matrix 0m×n has every entry equal to 0. When m and n are clear from the context, we write 0 instead of 0m×n.

Lemma 2.3.
(i) A + B = B + A.
(ii) (A + B) + C = A + (B + C).
(iii) (A + B)^T = A^T + B^T.

Definition 2.4 (Scalar Multiplication). Let A = (aij)m×n be an m × n matrix and k ∈ R. Then

k · A := (k · aij)m×n =
[ k · a11 ··· k · a1n ]
[    ·    ···    ·    ]
[ k · am1 ··· k · amn ].

Lemma 2.5. Let A and B be two m × n matrices and k, ℓ ∈ R.
(i) k · (ℓ · A) = (k · ℓ) · A.
(ii) (k + ℓ) · A = k · A + ℓ · A.
(iii) k · (A + B) = k · A + k · B.
(iv) (k · A)^T = k · A^T.
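The identities in Lemmas 2.3 and 2.5 are all entrywise statements, so they are easy to spot-check numerically. A small sketch, assuming numpy is available:

```python
# Numerical spot-check (assumption: numpy) of the matrix identities
# in Lemmas 2.3 and 2.5, here for random 2 x 3 matrices.
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.random((2, 3)), rng.random((2, 3))
k, l = 2.0, 3.0

assert np.allclose(A + B, B + A)                # Lemma 2.3 (i)
assert np.allclose((A + B).T, A.T + B.T)        # Lemma 2.3 (iii)
assert np.allclose(k * (l * A), (k * l) * A)    # Lemma 2.5 (i)
assert np.allclose((k + l) * A, k * A + l * A)  # Lemma 2.5 (ii)
assert np.allclose(k * (A + B), k * A + k * B)  # Lemma 2.5 (iii)
assert np.allclose((k * A).T, k * A.T)          # Lemma 2.5 (iv)
print("all identities hold")
```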