
Appendix: Concepts of Linear Algebra

In this appendix, some essential topics in linear algebra are reviewed. For each topic, we present definitions, basic properties, and numerical examples.

Notations

An m × n matrix A consists of m rows and n columns, hence mn elements (real or complex numbers), and is denoted by

  A = [a_{ij}]_{i,j=1}^{m,n} = [a_{ij}]_{mn} = [a_{ij}] =
      \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}.

The element a_{ii} is called the ith diagonal element of A, and a_{ij} for i ≠ j is called the (i,j)th element of A. We say the size of A is m × n, or the order of A is m when m = n. An m × 1 matrix is called an m-vector or a column m-vector, and a 1 × n matrix is called an n-vector or a row n-vector. To avoid any confusion, in this appendix an n-vector always means a column vector; a row vector is represented by the transpose (defined shortly) of a column vector. Commonly, R^n and C^n denote the sets of real and complex column n-vectors, respectively, and R^{m×n} and C^{m×n} denote the sets of all m × n real and complex matrices, respectively. If we do not specify the type of a matrix A, then A can be either real or complex. The following are examples of a 2 × 3 matrix, a column 2-vector, and a row 3-vector:

  A = \begin{pmatrix} 1 & 0 & 2-i \\ -2.5 & 3i & -4 \end{pmatrix}, \quad v = \begin{pmatrix} 1 \\ -2 \end{pmatrix}, \quad w = \begin{pmatrix} a & b & c \end{pmatrix}.

A is said to be a square matrix if m = n, and a rectangular matrix otherwise. Z is said to be a zero matrix, denoted Z = [0]_{mn} = 0, if all elements of Z are zero. A matrix D is said to be an n × n diagonal matrix if all elements of D are zero except its diagonal elements; it is commonly written D = diag(d_1, ..., d_n). The n × n diagonal matrix with all diagonal elements equal to 1 is called the n × n identity matrix, denoted I_n or I. A matrix T is said to be an upper (lower) triangular matrix if all its elements below (above) its diagonal are zero. A matrix S is said to be a submatrix of A if the rows and columns of S are consecutive rows and columns of A.
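The notations above can be mirrored in code. The following is a brief NumPy sketch (NumPy is our choice here, not something the appendix prescribes; numeric values 1, 2, 3 stand in for the symbolic row-vector entries a, b, c):

```python
import numpy as np

# The example 2x3 complex matrix, column 2-vector, and row 3-vector
A = np.array([[1, 0, 2 - 1j],
              [-2.5, 3j, -4]])
v = np.array([[1], [-2]])        # column 2-vector: shape (2, 1)
w = np.array([[1, 2, 3]])        # row 3-vector: shape (1, 3)

Z = np.zeros((2, 3))             # 2x3 zero matrix
D = np.diag([1, 2, 3])           # diagonal matrix diag(1, 2, 3)
I = np.eye(3)                    # 3x3 identity matrix

print(A.shape)                   # (2, 3)
```

Representing column and row vectors as 2-D arrays of shapes (n, 1) and (1, n) keeps the column/row distinction explicit, matching the appendix's convention.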
If these rows and columns start from the first ones, S is also called a leading submatrix of A. For example,

  S = \begin{pmatrix} 1 & 2 \end{pmatrix} is a submatrix of A = \begin{pmatrix} 3 & -1 & 4 \\ 5 & 1 & 2 \end{pmatrix}, and S = \begin{pmatrix} 1 & -2 \\ -4 & 5 \end{pmatrix} is a leading submatrix of A = \begin{pmatrix} 1 & -2 & 3 \\ -4 & 5 & -6 \\ 7 & -8 & 9 \end{pmatrix}.

Basic Operations

Transpose and Hermitian

Given A = [a_{ij}] in R^{m×n}, the transpose of A, denoted A^T, is the n × m matrix whose rows are the columns of A and whose columns are the rows of A. When A is in C^{m×n}, the Hermitian of A, denoted A^*, is in C^{n×m} and its (i,j)th element is \bar{a}_{ji}. For example,

  A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}, \quad A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}, \quad B = \begin{pmatrix} 1+i & -2i \\ 3 & 4-i \end{pmatrix}, \quad B^* = \begin{pmatrix} 1-i & 3 \\ 2i & 4+i \end{pmatrix}.

Trace of a Square Matrix

The trace of an n × n square matrix A = [a_{ij}] is defined as the sum of the diagonal elements of A, that is, tr(A) = \sum_{k=1}^{n} a_{kk}.

Example. Let A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}. Then tr(A) = 1 + 4 = 5.

It is not difficult to show that tr(AB) = tr(BA) provided that AB and BA both exist.

Dot Product (or Inner Product) and Orthogonality

Given two vectors u = (u_1, ..., u_n)^T and v = (v_1, ..., v_n)^T in C^n, the dot product or inner product of u and v is the scalar

  \alpha = u^* v = \begin{pmatrix} \bar{u}_1 & \cdots & \bar{u}_n \end{pmatrix} \begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix} = \sum_{k=1}^{n} \bar{u}_k v_k.

Example. Let u = \begin{pmatrix} 1 \\ 2-3i \end{pmatrix}, v = \begin{pmatrix} -4+i \\ 5-6i \end{pmatrix} and w = \begin{pmatrix} -3 \\ 2 \end{pmatrix}. Then

  u^* v = (1)(-4+i) + (2+3i)(5-6i) = (-4+i) + (28+3i) = 24+4i, \quad and \quad w^T w = (-3)^2 + 2^2 = 13.

Vectors u and v are said to be orthogonal if u^* v = 0. A set of vectors {v_1, ..., v_m} is said to be orthogonal if v_i^* v_j = 0 for all i ≠ j, and orthonormal if in addition v_i^* v_i = 1 for all i = 1, ..., m. Consider the vectors

  u_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \quad u_2 = \begin{pmatrix} 2 \\ 2 \end{pmatrix}, \quad v_1 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad v_2 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \end{pmatrix}.

The set {u_1, u_2} is orthogonal and the set {v_1, v_2} is orthonormal. The dot product satisfies the Cauchy-Schwarz inequality: |x^* y|^2 ≤ (x^* x)(y^* y) for any vectors x and y in C^n.

Matrix Addition and Scalar Multiplication

Two matrices of the same size can be added or subtracted element-wise, and a matrix can be multiplied by a scalar (real or complex) element-wise. Let A = [a_{ij}]_{mn}, B = [b_{ij}]_{mn} and let α, β be scalars.
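The transpose, Hermitian, trace, and inner-product examples above can be checked numerically. A NumPy sketch follows (note that `np.vdot` conjugates its first argument, which matches the definition u^* v = Σ ū_k v_k):

```python
import numpy as np

# Hermitian (conjugate transpose) of the complex example matrix B
B = np.array([[1 + 1j, -2j],
              [3, 4 - 1j]])
B_h = B.conj().T                        # B* = (1-i  3; 2i  4+i)

# Trace of the 2x2 example
A = np.array([[1, 2], [3, 4]])
print(np.trace(A))                      # 5

# Inner product u*v with conjugation on u
u = np.array([1, 2 - 3j])
v = np.array([-4 + 1j, 5 - 6j])
inner = np.vdot(u, v)                   # (24+4j)

# The orthonormal pair v1, v2 from the text
v1 = np.array([1, 1]) / np.sqrt(2)
v2 = np.array([1, -1]) / np.sqrt(2)
print(np.isclose(np.vdot(v1, v2), 0))   # True: orthogonal
print(np.isclose(np.vdot(v1, v1), 1))   # True: unit length
```

The identity tr(AB) = tr(BA) can be spot-checked the same way with any conformable pair of matrices.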
Then A + B = [a_{ij} + b_{ij}]_{mn}, αA = [α a_{ij}]_{mn} and αA + βB = [α a_{ij} + β b_{ij}]_{mn}.

Example. Let A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}, B = \begin{pmatrix} 7 & 8 & 9 \\ 10 & 11 & 12 \end{pmatrix} and α = -2i. Then

  A + B = \begin{pmatrix} 8 & 10 & 12 \\ 14 & 16 & 18 \end{pmatrix}, \quad αA = \begin{pmatrix} -2i & -4i & -6i \\ -8i & -10i & -12i \end{pmatrix}, \quad 3A - 2B = \begin{pmatrix} -11 & -10 & -9 \\ -8 & -7 & -6 \end{pmatrix}.

Matrix addition and scalar multiplication have the following properties:

1. A + B = B + A;
2. A + (B + C) = (A + B) + C;
3. (αβ)A = α(βA) = β(αA);
4. (A + B)^T = A^T + B^T.

Matrix Multiplication

Given two matrices A = [a_{ij}] and B = [b_{kl}] with sizes m × r and r × n, the product C = AB = [c_{ij}] is an m × n matrix whose (i,j)th element is

  c_{ij} = \sum_{k=1}^{r} a_{ik} b_{kj} = \begin{pmatrix} a_{i1} & a_{i2} & \cdots & a_{ir} \end{pmatrix} \begin{pmatrix} b_{1j} \\ b_{2j} \\ \vdots \\ b_{rj} \end{pmatrix},

the dot product of the ith row of A and the jth column of B.

Example. Let A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} and B = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}. Then

  AB = \begin{pmatrix} -1 & 3 \\ -1 & 5 \\ -1 & 7 \end{pmatrix}, \quad BA^T = \begin{pmatrix} -1 & -1 & -1 \\ 3 & 5 & 7 \end{pmatrix}, \quad BB = \begin{pmatrix} 2 & -3 \\ -3 & 5 \end{pmatrix}, \quad AA^T = \begin{pmatrix} 5 & 11 & 17 \\ 11 & 25 & 39 \\ 17 & 39 & 61 \end{pmatrix}.

For a square matrix A, the notation A^n for a positive integer n stands for the product AA⋯A (n times), and A^0 := I. Matrix multiplication has the following properties:

1. ABC = A(BC) = (AB)C;
2. (A + B)C = AC + BC;
3. A(B + C) = AB + AC;
4. (AB)^T = B^T A^T if A and B are real, and (AB)^* = B^* A^* if A and B are complex.

In general, matrix multiplication is not commutative, i.e., AB ≠ BA even if both AB and BA are well-defined and have the same size.

When A is a matrix and B is a vector, we can write AB in terms of the columns of A and the elements of B, or the rows of A and the vector B. Let A be an m × n matrix with

  A = \begin{pmatrix} C_1 & \cdots & C_n \end{pmatrix} = \begin{pmatrix} R_1 \\ \vdots \\ R_m \end{pmatrix},

where the C_i's and R_i's are the columns and rows of A, respectively, and let B = (b_1, ..., b_n)^T. Then

  AB = b_1 C_1 + \cdots + b_n C_n = \begin{pmatrix} R_1 B \\ \vdots \\ R_m B \end{pmatrix}.

Partitioned Matrices

In many applications it is convenient to partition a matrix into blocks (submatrices). For example, the matrix A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} can be partitioned as A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} where

  A_{11} = \begin{pmatrix} 1 & 2 \\ 4 & 5 \end{pmatrix}, \quad A_{12} = \begin{pmatrix} 3 \\ 6 \end{pmatrix}, \quad A_{21} = \begin{pmatrix} 7 & 8 \end{pmatrix}, \quad A_{22} = \begin{pmatrix} 9 \end{pmatrix};

or

  A_{11} = \begin{pmatrix} 1 & 2 \end{pmatrix}, \quad A_{12} = \begin{pmatrix} 3 \end{pmatrix}, \quad A_{21} = \begin{pmatrix} 4 & 5 \\ 7 & 8 \end{pmatrix}, \quad A_{22} = \begin{pmatrix} 6 \\ 9 \end{pmatrix}.

Operations on partitioned matrices work as if the blocks were scalars.
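The products above, the column-combination view of AB, and the "blocks behave like scalars" rule can all be verified numerically. A NumPy sketch using the same example matrices (the 2×2 partition of the 3×3 matrix is our choice for the demonstration):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[1, -1], [-1, 2]])

print(A @ B)     # (-1 3; -1 5; -1 7)
print(B @ A.T)   # (-1 -1 -1; 3 5 7)
print(A @ A.T)   # (5 11 17; 11 25 39; 17 39 61)

# A times a vector equals the same combination of A's columns:
b = np.array([2, -1])
print(np.allclose(A @ b, 2 * A[:, 0] - A[:, 1]))      # True

# Block multiplication agrees with ordinary multiplication.
M = np.arange(1, 10).reshape(3, 3)
M11, M12 = M[:2, :2], M[:2, 2:]
M21, M22 = M[2:, :2], M[2:, 2:]
top = np.hstack([M11 @ M11 + M12 @ M21, M11 @ M12 + M12 @ M22])
bot = np.hstack([M21 @ M11 + M22 @ M21, M21 @ M12 + M22 @ M22])
print(np.allclose(np.vstack([top, bot]), M @ M))      # True
```

Keeping each block 2-D (note the slices `2:` rather than `2`) is what lets the block formulas be typed exactly as written in the text.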
For example,

  \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{pmatrix} + \begin{pmatrix} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{pmatrix} = \begin{pmatrix} A_{11}+B_{11} & A_{12}+B_{12} & A_{13}+B_{13} \\ A_{21}+B_{21} & A_{22}+B_{22} & A_{23}+B_{23} \end{pmatrix},

  \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \\ A_{31} & A_{32} \end{pmatrix} \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix} = \begin{pmatrix} A_{11}B_{11}+A_{12}B_{21} & A_{11}B_{12}+A_{12}B_{22} \\ A_{21}B_{11}+A_{22}B_{21} & A_{21}B_{12}+A_{22}B_{22} \\ A_{31}B_{11}+A_{32}B_{21} & A_{31}B_{12}+A_{32}B_{22} \end{pmatrix},

provided that all the block products are well-defined.

Determinant of a Square Matrix

Determinant

The determinant of a square matrix A, denoted det(A), is a scalar which provides some useful information about A. The determinants of 2 × 2 and 3 × 3 matrices are defined, respectively, as

  det \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} = a_{11}a_{22} - a_{12}a_{21},

  det \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix} = a_{11}a_{22}a_{33} + a_{21}a_{13}a_{32} + a_{31}a_{12}a_{23} - a_{11}a_{23}a_{32} - a_{21}a_{12}a_{33} - a_{31}a_{13}a_{22}.

For a general n × n matrix A = [a_{ij}], the determinant is defined as

  det(A) = \sum_{k=1}^{n} (-1)^{i+k} a_{ik} det(A_{ik}) = \sum_{k=1}^{n} (-1)^{k+j} a_{kj} det(A_{kj})

for any 1 ≤ i, j ≤ n, where A_{pq} is the (n-1) × (n-1) matrix resulting from the deletion of row p and column q of A. For example, with

  A = \begin{pmatrix} 1 & -2 & 3 \\ -4 & 5 & -6 \\ 7 & -8 & 9 \end{pmatrix},

expanding along the first row (i = 1):

  det(A) = (-1)^{1+1}(1) det \begin{pmatrix} 5 & -6 \\ -8 & 9 \end{pmatrix} + (-1)^{1+2}(-2) det \begin{pmatrix} -4 & -6 \\ 7 & 9 \end{pmatrix} + (-1)^{1+3}(3) det \begin{pmatrix} -4 & 5 \\ 7 & -8 \end{pmatrix}
         = (-3) - (-2)(6) + (3)(-3) = -3 + 12 - 9 = 0;

expanding along the second column (j = 2):

  det(A) = (-1)^{1+2}(-2) det \begin{pmatrix} -4 & -6 \\ 7 & 9 \end{pmatrix} + (-1)^{2+2}(5) det \begin{pmatrix} 1 & 3 \\ 7 & 9 \end{pmatrix} + (-1)^{3+2}(-8) det \begin{pmatrix} 1 & 3 \\ -4 & -6 \end{pmatrix}
         = -(-2)(6) + (5)(-12) - (-8)(6) = 12 - 60 + 48 = 0.

The determinant of A_{pq}, det(A_{pq}), is called the (p,q)th minor of A, and (-1)^{p+q} det(A_{pq}) is called the cofactor of a_{pq}. Directly from the definition, the determinant of a diagonal matrix is the product of its diagonal elements, and the determinant of an upper or lower triangular matrix is likewise the product of its diagonal elements. Determinants have the following properties:

1. det(AB) = det(A) det(B);
2. det(αA) = α^n det(A) for any scalar α and n × n matrix A;
3. det(A^T) = det(A);
4. det(A^k) = (det(A))^k;
5. det(A) = 0 if any row (or column) of A is a scalar multiple of another row (or column);
6.
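The cofactor expansion that defines det(A) translates directly into a short recursive routine. A NumPy sketch, expanding along the first row, with `np.linalg.det` included only as a cross-check (this recursion is O(n!) and is illustrative, not how determinants are computed in practice):

```python
import numpy as np

def det_cofactor(A):
    """Determinant by cofactor expansion along the first row."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for k in range(n):
        # A_{1,k+1}: delete row 1 and column k+1 (0-based indices 0 and k)
        minor = np.delete(np.delete(A, 0, axis=0), k, axis=1)
        total += (-1) ** k * A[0, k] * det_cofactor(minor)
    return total

A = np.array([[1, -2, 3], [-4, 5, -6], [7, -8, 9]])
print(det_cofactor(A))       # 0.0, matching the worked example
print(np.linalg.det(A))      # close to 0, up to floating-point error
```

Property 5 explains the result: the rows of this A satisfy R_1 + R_3 = 2R_2... actually (1,-2,3) + (7,-8,9) = (8,-10,12) = -2(-4,5,-6), so the rows are linearly dependent and the determinant must vanish.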