Part 1: Linear Algebra 1 Chapter 1
• Vectors in R^n. Addition and scalar multiplication. Inner product, norm (length or magnitude) of a vector, and the angle between two vectors:

    cos θ = (u · v) / (‖u‖ ‖v‖).

  Orthogonality and the Pythagorean theorem.
• Cauchy–Schwarz inequality:

    |u · v| ≤ ‖u‖ ‖v‖.

  The textbook provides one proof on page 17 (1.14). Here is another proof.
  1. Step 1: If v = 0, the inequality holds with equality (both sides are zero).
  2. Step 2: Introduce

       w = u − ((u · v)/‖v‖²) v.

     One can verify that w is orthogonal to v, and

       u = w + ((u · v)/‖v‖²) v.

     The right-hand side is a sum of two orthogonal vectors, so by the Pythagorean theorem

       ‖u‖² = ‖w‖² + |u · v|²/‖v‖² ≥ |u · v|²/‖v‖².

  3. Step 3: Multiply the inequality by ‖v‖² and take the square root to obtain the Cauchy–Schwarz inequality ‖u‖ ‖v‖ ≥ |u · v|.
• Cross product (vector product) u × v; parallel vectors (n = 2, 3).
• Matrices, row and column vectors, transpose, addition, scalar multiplication.
• Matrix multiplication: for A ∈ R^(l×m) and B ∈ R^(m×n), the resulting matrix is C = AB ∈ R^(l×n). Arithmetic rule of matrix multiplication: c_ij = Σ_k a_ik b_kj.
• Square matrix A ∈ R^(n×n), identity matrix, inverse of a matrix: AA⁻¹ = I.
• Symmetric matrix a_ij = a_ji; anti-symmetric (skew-symmetric) matrix a_ij = −a_ji; orthogonal matrix AAᵀ = I; normal matrix AAᵀ = AᵀA. Symmetric, skew-symmetric, and orthogonal matrices are all normal, but other matrices can also be normal.
• For complex matrices: the conjugate transpose Aᴴ, with entries a_ij → a*_ji. Hermitian: Aᴴ = A; skew-Hermitian: Aᴴ = −A; unitary: Aᴴ = A⁻¹, i.e. AᴴA = I; normal: AAᴴ = AᴴA.
• Other special matrices: diagonal, upper and lower triangular matrices.
• A vector is a special matrix: v ∈ R^n → v ∈ R^(n×1). Matrix–vector multiplication: for A ∈ R^(m×n) and v ∈ R^n, Av = w ∈ R^m.
• Example: A + Aᴴ is Hermitian and A − Aᴴ is skew-Hermitian, so any A can be written as a sum B + C, where B = (A + Aᴴ)/2 is Hermitian and C = (A − Aᴴ)/2 is skew-Hermitian.

Chapter 3

• Homogeneous and non-homogeneous equations.
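Two of the Chapter 1 facts above, the Cauchy–Schwarz inequality and the split A = B + C into Hermitian and skew-Hermitian parts, can be checked numerically. A sketch with numpy; the vectors u, v and the matrix A below are hypothetical examples, not from the notes:

```python
import numpy as np

# Cauchy-Schwarz: |u . v| <= ||u|| ||v|| (hypothetical example vectors).
u = np.array([1.0, -3.0, 2.0])
v = np.array([4.0, 0.0, -1.0])
lhs = abs(u @ v)
rhs = np.linalg.norm(u) * np.linalg.norm(v)
assert lhs <= rhs

# Hermitian / skew-Hermitian split of a hypothetical complex matrix:
# B = (A + A^H)/2 is Hermitian, C = (A - A^H)/2 is skew-Hermitian, A = B + C.
A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])
AH = A.conj().T                       # conjugate transpose A^H
B = (A + AH) / 2
C = (A - AH) / 2
assert np.allclose(B, B.conj().T)     # B is Hermitian
assert np.allclose(C, -C.conj().T)    # C is skew-Hermitian
assert np.allclose(A, B + C)          # A = B + C
```

The factor 1/2 is the standard normalization: the notes only state that A + Aᴴ is Hermitian and A − Aᴴ is skew-Hermitian, and halving each makes the two parts sum back to A.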
• Degenerate equations; consistency and uniqueness.
• Echelon form, pivot variables and free variables.
• Elementary operations:
  1. E1: Interchange two equations.
  2. E2: Multiply an equation by a nonzero constant.
  3. E3: Add (or subtract) a multiple of another equation.
• Gauss elimination. Example:

     x1 − 3x2 − 2x3 =  6
    2x1 − 4x2 − 3x3 =  8
   −3x1 + 6x2 + 8x3 = −5        (1)

• An echelon matrix satisfies the following properties:
  (1) All zero rows are at the bottom of the matrix.
  (2) Each leading nonzero entry in a row is to the right of the leading nonzero entry in the preceding row.
  The row canonical form of the matrix has two additional properties:
  (3) Each pivot is equal to 1.
  (4) Each pivot is the only nonzero entry in its column.
• Row equivalence: two matrices A and B are row equivalent if one can be transformed into the other by a sequence of elementary operations.
• Rank is the number of pivots in the echelon form.
• Gauss elimination in matrix form. Example:

  [A, b] = [ 1  1 −2  4  5 ]     [ 1  1  0 −10 −9 ]
           [ 2  2 −3  1  3 ]  ∼  [ 0  0  1  −7 −7 ]
           [ 3  3 −4 −2  1 ]     [ 0  0  0   0  0 ]

• Existence: for M = [A, b], a solution exists if and only if rank(M) = rank(A).
• Uniqueness: the solution is unique if and only if rank(A) = n, where n is the number of unknowns.
• Square systems of linear equations.
• Inverse of the coefficient matrix A. A square system of equations Ax = b has a unique solution iff the coefficient matrix A is invertible. In this case, A⁻¹b is the unique solution of the equation Ax = b. Proof:
  1. Sufficiency: if A⁻¹ exists, let x = A⁻¹b; then

       A(A⁻¹b) = (AA⁻¹)b = Ib = b,

     so x satisfies the equation.
  2. Necessity: if v is a solution, i.e. Av = b, then

       v = Iv = (A⁻¹A)v = A⁻¹(Av) = A⁻¹b.

• The concept of linear combination. Interpretation of Ax = b: b is a linear combination of the column vectors of the matrix A.
• Homogeneous systems of equations. A homogeneous equation always has the trivial solution. When does a homogeneous equation have a non-trivial solution?
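Gauss elimination on the worked example (1) can be sketched in Python with numpy. This is a minimal sketch assuming a square system with nonzero pivots (no row exchanges), which holds for example (1):

```python
import numpy as np

# Augmented matrix [A, b] for the worked example (1):
#    x1 - 3x2 - 2x3 =  6
#   2x1 - 4x2 - 3x3 =  8
#  -3x1 + 6x2 + 8x3 = -5
M = np.array([[ 1., -3., -2.,  6.],
              [ 2., -4., -3.,  8.],
              [-3.,  6.,  8., -5.]])

def gauss_solve(M):
    """Forward elimination to echelon form, then back substitution.
    Assumes a square system with a unique solution and nonzero pivots."""
    M = M.astype(float).copy()
    n = M.shape[0]
    for k in range(n):                      # eliminate below pivot k
        for i in range(k + 1, n):
            M[i] -= (M[i, k] / M[k, k]) * M[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):          # back substitution, bottom up
        x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:]) / M[i, i]
    return x

x = gauss_solve(M)
# Unique solution: x1 = 1, x2 = -3, x3 = 2.
```

Substituting back into the third equation confirms the result: −3(1) + 6(−3) + 8(2) = −5.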
• If u and v are solutions of a homogeneous equation, any linear combination of them is also a solution.
• Dimension and basis of the solution space of a homogeneous equation.
• Non-zero solutions of a homogeneous equation. Example:

    x1 + 2x2 − 3x3 + 2x4 − 4x5 = 0
               x3 − 3x4 + 2x5 = 0
              2x3 − 6x4 + 4x5 = 0

  reduces to

    x1 + 2x2 − 3x3 + 2x4 − 4x5 = 0
               x3 − 3x4 + 2x5 = 0

  The free variables are x2, x4, x5. Set one of them equal to one and the others to zero to find the three basis vectors.
• Corresponding non-homogeneous equation: if w is a particular solution of the non-homogeneous equation, then u + w is also a solution, where u is any solution of the corresponding homogeneous equation.
• Elementary matrices: elementary operations can be represented by multiplication by an elementary matrix. A non-singular matrix is a product of elementary matrices.
• Examples of 3 × 3 elementary matrices:
  1. Exchange two rows (second and third):

       E = [ 1 0 0 ]
           [ 0 0 1 ]
           [ 0 1 0 ]

  2. Multiply a (second) row by a constant k:

       E = [ 1 0 0 ]
           [ 0 k 0 ]
           [ 0 0 1 ]

  3. Add one row to another (add the second to the third):

       E = [ 1 0 0 ]
           [ 0 1 0 ]
           [ 0 1 1 ]

• Finding the inverse of a square matrix A: if

    E1 E2 ··· En A = I,  i.e.  (E1 E2 ··· En I) A = I,

  then E1 E2 ··· En I is the inverse of A. Example:

    A = [ 1  0  2 ]
        [ 2 −1  3 ]
        [ 4  1  8 ]

• Counting the number of operations in the Gauss elimination method.
• Lower and upper triangular matrices. The inverse of a lower triangular matrix is another lower triangular matrix, and the inverse of an upper triangular matrix is another upper triangular matrix.
• Gauss elimination can be expressed as multiplication by a series of "atomic" lower triangular matrices. Their inverses give the LU decomposition.
• Doolittle algorithm:
  1. Define A^(n) = L^(n) A^(n−1).
  2. Let L^(n) be the identity matrix, except that column n below the diagonal contains the entries −l_{n+1,n}, …, −l_{N,n}, where

       l_{i,n} = a^(n−1)_{i,n} / a^(n−1)_{n,n},   i = n+1, …, N.

  3. Then

       U = A^(N−1) = L^(N−1) ··· L^(1) A.

Chapter 4

• Vector space. Definition: V is a vector space if it satisfies two conditions:
  1. If u, v ∈ V, then u + v ∈ V.
  2.
If u ∈ V, then ku ∈ V, where k is a scalar.
• Axioms of a vector space:
  1. (u + v) + w = u + (v + w).
  2. There exists 0 such that u + 0 = u; for each u there exists −u such that u + (−u) = 0.
  3. 1u = u.
• Examples of vector spaces: vectors, matrices, polynomials, real functions.
• Subspace of a space: a subset that is itself a space.
  1. Polynomials of degree at most m form a subspace of polynomials of degree at most n if m < n.
  2. Polynomial functions form a subspace of the space of real functions.
  3. Vector subspaces.
• Linear combinations and spanning sets. If V is a vector space, a set of vectors u1, u2, …, un forms a spanning set of V if ALL vectors in V can be written as a linear combination of the set.
• Linear dependence and independence. Given a set of vectors u1, u2, …, um, if a vector v can be written as a linear combination of the set, v is linearly dependent on the set; otherwise v is linearly independent of the set.
• Basis and dimension. A minimal set of vectors that spans a space V is called a basis of the space. The minimum number of vectors needed to span V is called the dimension of the space.
• Row space and rank of a matrix. Elementary operations do not change the row space, so row equivalent matrices have the same row space. In echelon and row canonical forms, the nonzero row vectors are linearly independent; they form a basis of the row space. The number of nonzero rows in these forms is called the rank of the matrix.
• An n × n square matrix with rank n.
• Example: find a basis for W = span(u1, u2, …, ur).
  1. Form a matrix using the vectors as rows.
  2. Reduce it to echelon form.
  3. Output the nonzero rows as the basis.
• The column rank is the same as the row rank. The column vectors form the column space. Interpretation of Ax = b: x1 a1 + x2 a2 + ··· + xn an = b is a linear combination of the column vectors of A; b must be in the column space, or there is no solution.
• Example: determine whether a given vector w is a linear combination of n vectors U = (u1, u2, …, un), and find the coefficients of the linear combination.
  This amounts to solving

    x1 u1 + x2 u2 + ··· + xn un = w,  or  Ux = w,

  where U is the matrix with u1, u2, …, un as column vectors.
• Polynomials as vectors. A polynomial is a vector whose components are its coefficients.
• Homogeneous equation Ax = 0. The solutions of the equation form a vector space, called the null space of A.
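The two computations above, testing whether w is a linear combination of the columns of U and finding a basis of the null space of A, can be sketched with numpy. The vectors u1, u2 and w below are hypothetical examples; A is the coefficient matrix of the reduced homogeneous example from earlier in these notes. The null-space routine uses the SVD rather than row reduction, so it returns an orthonormal basis instead of the free-variable basis described above:

```python
import numpy as np

# Hypothetical columns u1, u2 and target w; here w = u1 + u2 by construction.
U = np.array([[1., 2.],
              [0., 1.],
              [1., 3.]])
w = np.array([3., 1., 4.])

def in_span(U, w, tol=1e-10):
    """Least-squares solve of Ux = w; w is a linear combination of the
    columns of U exactly when the residual vanishes."""
    x, *_ = np.linalg.lstsq(U, w, rcond=None)
    return np.linalg.norm(U @ x - w) < tol, x

def null_space(A, tol=1e-12):
    """Orthonormal basis of the null space via SVD: the right singular
    vectors whose singular values are numerically zero span {x : Ax = 0}."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                # columns are the basis vectors

ok, x = in_span(U, w)                 # ok is True, coefficients x = (1, 1)

# Reduced homogeneous example from the notes:
#   x1 + 2x2 - 3x3 + 2x4 - 4x5 = 0
#             x3  - 3x4 + 2x5 = 0
A = np.array([[1., 2., -3., 2., -4.],
              [0., 0., 1., -3., 2.]])
N = null_space(A)                     # 5 unknowns - rank 2 = 3 basis vectors
```

Setting each free variable to one in turn, as the notes describe, gives a different (non-orthonormal) basis of the same three-dimensional null space.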