Appendix E

Matrix Algebra: Summary

Contents
E.1 Vectors and Matrices
    E.1.1 Notation
    E.1.2 Special Types of Vectors
    E.1.3 Special Types of Matrices
    E.1.4 Vector Products
    E.1.5 Useful Matrix Theorems
E.2 Eigenvalue Problem
    E.2.1 Properties and Formulas for Eigenvalues and Eigenvectors
    E.2.2 Useful Eigenvalue Theorems

The following is a brief summary of notation and facts from matrix algebra that are relevant to the contents of this textbook (Introduction to Scientific Computing and Data Analysis by M. Holmes, Springer, 2016). Some of this material is in the text, and is included here for completeness.

E.1 Vectors and Matrices

E.1.1 Notation

Vectors: bold lower case letters v, x, etc. All vectors are assumed to be written in column format (see below).

Matrices: bold upper case letters A, R, etc. (see below).

$$
v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix},
\qquad
A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots &        &        & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix}
$$

Transpose: designated using a superscript T:

$$
v^T = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix},
\qquad
A^T = \begin{pmatrix}
a_{11} & a_{21} & \cdots & a_{m1} \\
a_{12} & a_{22} & \cdots & a_{m2} \\
\vdots &        &        & \vdots \\
a_{1n} & a_{2n} & \cdots & a_{mn}
\end{pmatrix}
$$

Important properties: $(AB)^T = B^T A^T$, $(A + B)^T = A^T + B^T$, $(A^T)^T = A$, and $(Av)^T = v^T A^T$.

E.1.2 Special Types of Vectors

Independent Vectors: $v_1, v_2, \ldots, v_k$ are independent if it is not possible to write any one of them as a linear combination of the others. More mathematically, they are independent if the only numbers $c_1, c_2, \ldots, c_k$ that satisfy $c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0$ are $c_1 = c_2 = \cdots = c_k = 0$.

Orthogonal Vectors: two vectors v and w are orthogonal if and only if $v \cdot w = 0$.

Orthonormal Vectors: $v_1, v_2, \ldots, v_k$ are orthonormal if they are mutually orthogonal (so $v_i \cdot v_j = 0$ for all $i \neq j$) and $v_i \cdot v_i = 1$ for all $i$.

Unit Vector: a vector with unit length, which means that $\|v\| = 1$.

Zero Vector: the vector containing only zero entries. This vector is denoted as 0.

E.1.3 Special Types of Matrices

Diagonal Matrix: a square matrix whose only nonzero entries are on the diagonal:

$$
D = \begin{pmatrix}
d_1    & 0      & \cdots & 0 \\
0      & d_2    & \cdots & 0 \\
\vdots &        & \ddots & \vdots \\
0      & 0      & \cdots & d_n
\end{pmatrix}
$$

Worth mentioning: one or more of the $d_i$'s can be zero.

Identity Matrix: a diagonal matrix with $d_i = 1$ for all $i$. This matrix is denoted as I:

$$
I = \begin{pmatrix}
1      & 0      & \cdots & 0 \\
0      & 1      & \cdots & 0 \\
\vdots &        & \ddots & \vdots \\
0      & 0      & \cdots & 1
\end{pmatrix}
$$

Inverse Matrix: if A is square (and invertible), then the inverse is denoted as $A^{-1}$, and it has the property that $A^{-1} A = I$ (note that it is also true that $A A^{-1} = I$).
Important properties: $(AB)^{-1} = B^{-1} A^{-1}$ and $(A^{-1})^{-1} = A$.

Lower Triangular Matrix: one in which all entries above the diagonal are zero.
Worth mentioning: if A is lower triangular then $A^T$ is upper triangular.

Orthogonal Matrix: a square matrix whose columns are orthonormal vectors.

Positive Definite Matrix: one that satisfies $x^T A x > 0$ for all $x \neq 0$.
Comment: it is always assumed that A is symmetric.

Square Matrix: an $n \times n$ matrix (so the number of rows equals the number of columns).

Strictly Diagonally Dominant Matrix: one that satisfies

$$
|a_{ii}| > \sum_{\substack{j=1 \\ j \neq i}}^{n} |a_{ij}| \quad \forall\, i
$$

Symmetric Matrix: one that satisfies $A^T = A$.
Worth mentioning: a symmetric matrix must be square.

Tridiagonal Matrix: a square matrix whose only nonzero entries appear on the diagonal, the sub-diagonal, and the super-diagonal.

Upper Triangular Matrix: one in which all entries below the diagonal are zero.
Worth mentioning: if A is upper triangular then $A^T$ is lower triangular.
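Several of the matrix properties above reduce to one-line numerical checks. The following is a minimal sketch, not from the text, using NumPy; the function names and the sample matrix are illustrative only.

```python
import numpy as np

def is_symmetric(A, tol=1e-12):
    # Symmetric: A^T = A
    return np.allclose(A, A.T, atol=tol)

def is_strictly_diagonally_dominant(A):
    # |a_ii| > sum over j != i of |a_ij|, for every row i
    off_diag = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    return bool(np.all(np.abs(np.diag(A)) > off_diag))

def is_positive_definite(A, tol=1e-12):
    # For a symmetric matrix: positive definite iff all eigenvalues are positive
    return is_symmetric(A) and bool(np.all(np.linalg.eigvalsh(A) > tol))

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])  # symmetric and tridiagonal

print(is_symmetric(A))                     # True
print(is_strictly_diagonally_dominant(A))  # True: 4 > 1, 5 > 3, 6 > 2
print(is_positive_definite(A))             # True
```

The last result is consistent with the Positive Definite Matrix Theorem stated below in E.1.5: positive diagonal entries together with strict diagonal dominance imply positive definiteness.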
E.1.4 Vector Products

Inner (or Dot) Product: assuming $v, w \in \mathbb{R}^n$, then

$$
v \cdot w \equiv \sum_{i=1}^{n} v_i w_i .
$$

This can be written in vector form as $v \cdot w = v^T w$ or as $v \cdot w = w^T v$.

Outer Product: assuming $v \in \mathbb{R}^n$ and $w \in \mathbb{R}^m$, then

$$
v w^T \equiv \begin{pmatrix} w_1 v & w_2 v & \cdots & w_m v \end{pmatrix},
$$

where each $w_i v$ is a column of the matrix.
Worth mentioning: a) $v w^T$ is an $n \times m$ matrix; b) if v and w are nonzero, then $v w^T$ has rank one; c) $(v w^T)^T = w v^T$; and d) $v v^T$ is a symmetric matrix.

E.1.5 Useful Matrix Theorems

Orthogonal Matrix Theorem. Suppose Q is an $n \times n$ matrix.
1. Q is an orthogonal matrix if and only if $Q^{-1} = Q^T$.
2. Q is an orthogonal matrix if and only if its rows are orthonormal vectors.
3. If Q is an orthogonal matrix, then $\|Qx\|_2 = \|x\|_2$.
4. If $Q_1$ and $Q_2$ are orthogonal matrices, then $Q_1 Q_2$ is an orthogonal matrix.
5. If Q is an orthogonal matrix, then $Q^T$ is an orthogonal matrix. Note that $(Q^T)^{-1}$ is the same matrix as $(Q^{-1})^T$, and both can be designated as $Q^{-T}$.
6. If Q is an orthogonal matrix, then $\det(Q) = \pm 1$.

Positive Definite Matrix Theorem. Assume A is a symmetric $n \times n$ matrix.
1. A is not positive definite if any diagonal entry is negative or zero, or if the largest entry, in absolute value, is off the diagonal.
2. A is positive definite if its diagonal entries are all positive and it is strictly diagonally dominant.
3. A is positive definite if and only if all of its eigenvalues are positive.
4. If A is positive definite, then it is invertible and $A^{-1}$ is positive definite.

$A^T A$ and $A A^T$ Theorem. Assume A is an $m \times n$ matrix, and let $\kappa_2$ denote the condition number in the 2-norm.
1. $A^T A$ is an $n \times n$ symmetric matrix with non-negative eigenvalues, and $A A^T$ is an $m \times m$ symmetric matrix with non-negative eigenvalues.
2. If the columns of A are linearly independent, then $A^T A$ is positive definite. Moreover, $\kappa_2(A^T A) = [\kappa_2(A)]^2$.
3. If the rows of A are linearly independent, then $A A^T$ is positive definite. Moreover, $\kappa_2(A A^T) = [\kappa_2(A^T)]^2$.
4. If $\lambda$ is a nonzero eigenvalue of $A^T A$ with eigenvector x, then $\lambda$ is an eigenvalue of $A A^T$ with eigenvector Ax. If $\lambda = 0$ is an eigenvalue of $A^T A$ with eigenvector x, and if $Ax \neq 0$, then $\lambda = 0$ is an eigenvalue of $A A^T$ with eigenvector Ax.
5. If $\lambda$ is a nonzero eigenvalue of $A A^T$ with eigenvector y, then $\lambda$ is an eigenvalue of $A^T A$ with eigenvector $A^T y$. If $\lambda = 0$ is an eigenvalue of $A A^T$ with eigenvector y, and if $A^T y \neq 0$, then $\lambda = 0$ is an eigenvalue of $A^T A$ with eigenvector $A^T y$.

E.2 Eigenvalue Problem

Given a square matrix A, find the number(s) $\lambda$ and nonzero vectors x that satisfy

$$
A x = \lambda x .
$$

Analytical procedure to solve this problem:
1. Solve for $\lambda$:
$$
\det(A - \lambda I) = 0 .
$$
This is known as the characteristic equation.
2. For each eigenvalue $\lambda$, solve for x:
$$
(A - \lambda I) x = 0 .
$$
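In practice the characteristic equation is rarely solved directly; numerical libraries compute the eigenpairs. A minimal sketch, not from the text (the sample matrix is illustrative), using NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose columns are the
# corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify Ax = lambda*x for each eigenpair
for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)  # [3. 1.] for this A (ordering is not guaranteed)
```

For this A the characteristic equation is $(2 - \lambda)^2 - 1 = 0$, so $\lambda = 3$ and $\lambda = 1$, matching the numerical result.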
E.2.1 Properties and Formulas for Eigenvalues and Eigenvectors

Defective matrix: an $n \times n$ matrix is said to be defective if it does not have n linearly independent eigenvectors.

Geometric multiplicity: the number of linearly independent eigenvectors for an eigenvalue is the eigenvalue's geometric multiplicity.

Inverse Matrix: a matrix with a zero eigenvalue is not invertible.

Nonuniqueness: if x is an eigenvector, then $\alpha x$ is an eigenvector for the same eigenvalue for any nonzero scalar $\alpha$.

Rayleigh quotient: given an eigenvector x, the associated eigenvalue can be computed using the formula

$$
\lambda = \frac{x \cdot A x}{x \cdot x} .
$$

Trace formula: if A is an $n \times n$ matrix with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ (each eigenvalue listed as many times as its respective algebraic multiplicity), then

$$
\operatorname{tr}(A) = \sum_{i=1}^{n} \lambda_i .
$$

E.2.2 Useful Eigenvalue Theorems

Symmetric Eigenvalue Theorem. If A is a symmetric $n \times n$ matrix, then the following hold:
1. Its eigenvalues are real numbers.
2. If $x_i$ and $x_j$ are eigenvectors for different eigenvalues, then $x_i \cdot x_j = 0$.
3. It is possible to find a set of orthonormal basis vectors $u_1, u_2, \ldots, u_n$, where each $u_i$ is an eigenvector for A.

Modified Eigenvalue Theorem. Suppose the eigenvalues of an $n \times n$ matrix A are $\lambda_1, \lambda_2, \ldots, \lambda_n$.
1. Setting $B = A - \omega I$, where $\omega$ is a constant, the eigenvalues of B are $\lambda_1 - \omega, \lambda_2 - \omega, \ldots, \lambda_n - \omega$. Also, if $x_i$ is an eigenvector for A that corresponds to $\lambda_i$, then $x_i$ is an eigenvector for B that corresponds to $\lambda_i - \omega$.
2. If A is invertible, then the eigenvalues of $A^{-1}$ are $1/\lambda_1, 1/\lambda_2, \ldots, 1/\lambda_n$. Also, if $x_i$ is an eigenvector for A that corresponds to $\lambda_i$, then $x_i$ is an eigenvector for $A^{-1}$ that corresponds to $1/\lambda_i$.

Spectral Decomposition Theorem. If A is a symmetric matrix, then it is possible to factor A as

$$
A = Q D Q^T ,
$$

where D is a diagonal matrix and Q is an orthogonal matrix.
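The spectral decomposition and the trace formula can both be checked numerically. A minimal sketch, not from the text (the matrix is illustrative), using NumPy's symmetric eigensolver:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])  # symmetric

# eigh is the solver for symmetric (Hermitian) matrices: it returns real
# eigenvalues and eigenvectors whose columns form an orthogonal matrix Q
lam, Q = np.linalg.eigh(A)
D = np.diag(lam)

assert np.allclose(Q @ D @ Q.T, A)         # A = Q D Q^T
assert np.allclose(Q.T @ Q, np.eye(3))     # Q^T Q = I, so Q is orthogonal
assert np.isclose(np.trace(A), lam.sum())  # tr(A) equals the sum of the eigenvalues
```

Using `eigh` rather than `eig` exploits the symmetry of A, which guarantees real eigenvalues and orthonormal eigenvectors, in line with the Symmetric Eigenvalue Theorem.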