
화공수학 (Chemical Engineering Mathematics)

Part B. Linear Algebra, Vector Calculus

Chap. 6: Linear Algebra; Matrices, Vectors, Determinants
- Theory and application of linear systems of equations, linear
  transformations, and eigenvalue problems (vectors, matrices,
  determinants, ...)
- Reference: "Matrix Computations" by G.H. Golub, C.F. Van Loan (1996)

6.1. Basic Concepts
- Basic concepts and rules of matrix and vector algebra
- Matrix: a rectangular array of numbers (or functions), e.g.

      [ 4   1   2 ]
      [ 0  -3   5 ]

  The numbers in the array are called elements (or entries).

  Ex. 1) The coefficients of the linear system
      3x      + 4z = 0
      5x - 2y +  z = 0
  form the 2 x 3 coefficient matrix
      [ 3   0   4 ]
      [ 5  -2   1 ]
  where each row corresponds to an equation and each column to a variable.

General Notations and Concepts
- An m x n matrix A = [a_jk] has m rows and n columns:

      A = [ a_11  a_12  ...  a_1n ]
          [ a_21  a_22  ...  a_2n ]
          [  ...   ...  ...   ... ]
          [ a_m1  a_m2  ...  a_mn ]

  The row index j corresponds to an equation, the column index k to a
  variable.
- m = n: square matrix. The entries a_11, a_22, ..., a_nn form the
  principal (or main) diagonal. A matrix that is not square is
  rectangular. (Square matrices are the important case for simultaneous
  linear equations.)
- A linear system

      a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1
      a_21 x_1 + a_22 x_2 + ... + a_2n x_n = b_2
       ...
      a_n1 x_1 + a_n2 x_2 + ... + a_nn x_n = b_n

  is written compactly as A x = b.
- Vector: a matrix with a single row or a single column,
  e.g. x = [x_1, x_2, ..., x_n]^T and b = [b_1, b_2, ..., b_n]^T.
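The notation above can be made concrete with a tiny sketch (not from the lecture): a matrix stored as a list of rows, with m, n, and an entry a_jk read off directly (Python indexing is 0-based).

```python
# Illustrative sketch: the 2 x 3 coefficient matrix of Ex. 1
# (3x + 4z = 0, 5x - 2y + z = 0) stored as a list of rows.
A = [[3, 0, 4],
     [5, -2, 1]]

m = len(A)        # number of rows (equations)
n = len(A[0])     # number of columns (variables)
print(m, n)       # 2 3
print(A[1][1])    # the entry a_22 = -2 (0-based index [1][1])
```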

Transposition
- The transpose A^T of an m x n matrix A = [a_jk] is the n x m matrix
  A^T = [a_kj]: rows and columns are interchanged.
- Row vector: b = [b_1 ... b_m]; column vector: b^T = [b_1, ..., b_m]^T.
- Properties of the transpose:
      (A^T)^T = A,   (A + B)^T = A^T + B^T,
      (cA)^T = c A^T,   (AB)^T = B^T A^T
- Symmetric matrices (square): A^T = A, i.e. a_kj = a_jk.
- Skew-symmetric matrices (square): A^T = -A, i.e. a_kj = -a_jk.

6.2. Matrix Addition, Scalar Multiplication, Matrix Multiplication
- Equality of matrices: A = B iff they have the same size and
  a_jk = b_jk for all j, k.
- Matrix addition (same size): A + B = [a_jk + b_jk].
- Multiplication of a matrix by a scalar: cA = [c a_jk].
- Rules:
      A + B = B + A,   (U + V) + W = U + (V + W),
      A + 0 = A,   A + (-A) = 0,
      c(A + B) = cA + cB,   (c + k)A = cA + kA,
      c(kA) = (ck)A,   1A = A
- Matrix multiplication: C = AB with

      c_ik = sum_{j=1}^{n} a_ij b_jk

  defined only when (number of columns of A) = (number of rows of B):
  an (m x n) matrix times an (n x p) matrix gives an (m x p) matrix.
- Rules: (kA)B = k(AB), A(BC) = (AB)C, (A + B)C = AC + BC,
  C(A + B) = CA + CB.

Special Matrices
- Upper triangular matrix: all entries below the main diagonal are zero.
- Lower triangular matrix: all entries above the main diagonal are zero.
- Banded matrix (especially tridiagonal): results from finite difference
  solutions of ODEs or PDEs.
- LU decomposition: a matrix can be decomposed into the product of a
  lower and an upper triangular matrix.
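A minimal sketch of these rules (function names are illustrative, not from the lecture): transposition and the product c_jk = sum_l a_jl b_lk, used to verify the rule (AB)^T = B^T A^T on a small example.

```python
# Transpose: swap row and column indices.
def transpose(A):
    return [[A[j][k] for j in range(len(A))] for k in range(len(A[0]))]

# Matrix product: defined only when (columns of A) == (rows of B).
def matmul(A, B):
    assert len(A[0]) == len(B)
    return [[sum(A[j][l] * B[l][k] for l in range(len(B)))
             for k in range(len(B[0]))] for j in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

# Check the transpose-of-a-product rule: (AB)^T = B^T A^T
lhs = transpose(matmul(A, B))
rhs = matmul(transpose(B), transpose(A))
print(lhs == rhs)   # True
```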

Difference from Multiplication of Numbers
- Matrix multiplication is not commutative in general: AB ≠ BA.
- AB = 0 does not necessarily imply A = 0 or B = 0 (even when A ≠ 0 and
  B ≠ 0), and AC = AD does not necessarily imply C = D.

Special Matrices
- Identity (or unit) matrix I: a_ii = 1, a_ij = 0 (i ≠ j); AI = IA = A.
- Diagonal matrix: a_ij = 0 for i ≠ j.
- Symmetric matrix: a_ij = a_ji.

Inner Product of Vectors
- a . b = a^T b = sum_{k=1}^{n} a_k b_k
  (a row vector times a column vector).

Product in Terms of Row and Column Vectors
- c_jk = (j-th row of A) . (k-th column of B).

6.3. Linear Systems of Equations: Gauss Elimination
- The most important practical use of matrices: the solution of linear
  systems of equations.

Linear System, Coefficient Matrix, Augmented Matrix
- m equations in n unknowns:

      a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1
      a_21 x_1 + a_22 x_2 + ... + a_2n x_n = b_2
       ...
      a_m1 x_1 + a_m2 x_2 + ... + a_mn x_n = b_m

  i.e. A x = b.
- b ≠ 0: nonhomogeneous system;
  b = 0: homogeneous system (this has at least one solution, x = 0).
- Coefficient matrix A = [a_jk]; augmented matrix A~ = [A | b].
- Linear transformations: y = Ax and w = By give w = B(Ax) = (BA)x = Cx
  with C = BA.
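Two of the constructions above, sketched with illustrative names: the inner product of two vectors and the augmented matrix [A | b] of a small system.

```python
# Inner product a . b = sum_k a_k b_k
def inner(a, b):
    return sum(ak * bk for ak, bk in zip(a, b))

A = [[1, 2], [3, 4]]
b = [5, 6]

# Augmented matrix: append b_i to each row of A
augmented = [row + [bi] for row, bi in zip(A, b)]

print(inner([1, 2, 3], [4, 5, 6]))   # 32
print(augmented)                     # [[1, 2, 5], [3, 4, 6]]
```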
Geometric Interpretation: Existence of Solutions
Ex. 1) Two equations in two unknowns (x_1, x_2): each equation is a line
in the x_1-x_2 plane.
- Precisely one solution: the lines intersect in one point.
- No solution: same slope, different intercepts (parallel lines).
- Infinite solutions: both equations describe the same line.
- Singular: the system has no unique solution (parallel or identical
  lines).
- Ill-conditioned: very close to singular (nearly equal slopes); the
  solution is difficult to identify and extremely sensitive to round-off
  error.
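An illustrative demonstration (numbers chosen for this sketch, not from the lecture) of ill-conditioning: for a nearly singular 2 x 2 system, changing one coefficient by 0.001 moves the solution from (1, 1) to (1.5, 0.5).

```python
# Solve a 2 x 2 system directly via the second-order determinant.
def solve2(a11, a12, a21, a22, b1, b2):
    det = a11 * a22 - a12 * a21
    x1 = (b1 * a22 - a12 * b2) / det
    x2 = (a11 * b2 - b1 * a21) / det
    return (x1, x2)

# Nearly parallel lines: slopes 1 and ~1.001
x_base = solve2(1.0, 1.0, 1.0, 1.001, 2.0, 2.001)
x_pert = solve2(1.0, 1.0, 1.0, 1.002, 2.0, 2.001)  # a22 changed by 0.001

print(x_base)   # approximately (1.0, 1.0)
print(x_pert)   # approximately (1.5, 0.5): a large shift in the solution
```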

Gauss Elimination
- Standard method for solving linear systems: forward elimination of
  unknowns and solution through back substitution. The elimination of
  unknowns can be extended to large sets of equations.
- Forward elimination: during the first pass the first row is the pivot
  row and a_11 is the pivot element; each row i below it is replaced by

      a'_ij = a_ij - (a_i1 / a_11) a_1j,
      b'_i  = b_i  - (a_i1 / a_11) b_1

  which clears the first column below the pivot. Then the second row
  becomes the pivot row (pivot element a'_22), and so on, until the
  system reaches upper triangular form, e.g. for n = 4:

      [ a_11  a_12   a_13    a_14   ] [x_1]   [ b_1    ]
      [  0    a'_22  a'_23   a'_24  ] [x_2] = [ b'_2   ]
      [  0     0     a''_33  a''_34 ] [x_3]   [ b''_3  ]
      [  0     0      0      a'''_44] [x_4]   [ b'''_4 ]

- Back substitution: solve the last equation for
  x_4 = b'''_4 / a'''_44, then repeat back substitution moving upward:

      x_i = ( b_i - sum_{j=i+1}^{n} a_ij x_j ) / a_ii
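A minimal sketch of the two phases just described (no pivot-row interchange; all pivots are assumed nonzero; names are illustrative):

```python
# Forward elimination + back substitution for A x = b.
def gauss_solve(A, b):
    A = [row[:] for row in A]   # work on copies
    b = b[:]
    n = len(A)
    # Forward elimination: reduce A to upper triangular form.
    for k in range(n - 1):              # row k is the pivot row
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]       # A[k][k]: pivot element
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    # Back substitution, moving upward.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# 2x + y = 3, x + 3y = 4  ->  x = 1, y = 1
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]))
```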

Classification of Linear Systems
- Overdetermined: equations > unknowns.
- Underdetermined: equations < unknowns.
- Determined: equations = unknowns.
- Consistent: the system has at least one solution.
- Inconsistent: the system has no solutions at all.
- Homogeneous systems: always consistent (why? the trivial solution
  x = 0 exists).
- Theorem: a homogeneous system possesses nontrivial solutions if the
  number m of equations is less than the number n of unknowns (m < n).
- Homogeneous solution x_h: A x_h = 0. Particular solution x_p of the
  nonhomogeneous system: A x_p = b. Then x_h + x_p is also a solution of
  the nonhomogeneous system:

      A (x_h + x_p) = A x_h + A x_p = 0 + b = b

Elementary Row Operations: Row-Equivalent Systems
- Interchange of two rows.
- Addition of a constant multiple of one row to another row.
- Multiplication of a row by a nonzero constant c.

Echelon Form: Information Resulting from It
- At the end of Gauss elimination the reduced system has r nonzero rows
  (r ≤ m):

      a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1
                 c_22 x_2 + ... + c_2n x_n = b~_2
                  ...
                       k_rr x_r + ... + k_rn x_n = b~_r
                                              0 = b~_{r+1}
                                             ...
                                              0 = b~_m

  Existence and uniqueness of the solutions:
  (a) No solution: if r < m and one of the numbers b~_{r+1}, ..., b~_m
      is not zero.
  (b) Precisely one solution: if r = n and b~_{r+1}, ..., b~_m, if
      present, are zero.
  (c) Infinitely many solutions: if r < n and b~_{r+1}, ..., b~_m, if
      present, are zero.

Gauss-Jordan Elimination
- Each pivot row is normalized (divided by its pivot element) so that the
  pivot position on the main diagonal holds a 1, and all elements both
  above and below the pivot are eliminated, clearing the entire column
  containing the pivot element except for that 1. For example, after
  multiplying the second row by 1/a'_22 and subtracting a suitable
  multiple of the second row from the first, the second column is
  cleared.
- The coefficient matrix is reduced to the identity, so the solution is
  read off directly, with no back substitution:

      [ 1 0 0 0 ] [x_1]   [b'_1]
      [ 0 1 0 0 ] [x_2] = [b'_2]
      [ 0 0 1 0 ] [x_3]   [b'_3]
      [ 0 0 0 1 ] [x_4]   [b'_4]

- Particularly well suited to digital computation.

Overview of LU Decomposition
- Gauss elimination = forward elimination + back substitution.
- Gauss elimination is inefficient when solving equations with the same
  coefficient matrix A but different rhs constants B.
- LU decomposition separates the time-consuming elimination of the
  matrix of coefficients A from the manipulations of the rhs B: once A
  has been "decomposed", multiple rhs vectors can be evaluated
  effectively (computation effort ↓).
- Well suited to situations where many rhs vectors B must be evaluated
  for a single value of A; it also provides an efficient means to
  compute the matrix inverse.

(1) LU Decomposition
- A x = B. Elimination brings A to upper triangular form U. Assume a
  lower triangular matrix L with 1's on the diagonal:

      U = [ u_11 u_12 u_13 ]        L = [  1    0    0 ]
          [  0   u_22 u_23 ]            [ l_21  1    0 ]
          [  0    0   u_33 ]            [ l_31 l_32  1 ]

  such that A = L U. Then A x = B becomes L (U x) = B: first solve
  L D = B for D, then U x = D for x.
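A sketch of Gauss-Jordan elimination as described above (illustrative names; pivots assumed nonzero): each pivot is normalized to 1 and its whole column cleared, so the solution is read off directly.

```python
# Gauss-Jordan: reduce the augmented matrix [A | b] until A becomes I.
def gauss_jordan_solve(A, b):
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]   # augmented matrix [A | b]
    for k in range(n):
        p = M[k][k]                   # pivot (assumed nonzero)
        M[k] = [v / p for v in M[k]]  # normalize the pivot row to get a 1
        for i in range(n):
            if i != k:                # clear the column above AND below
                f = M[i][k]
                M[i] = [vi - f * vk for vi, vk in zip(M[i], M[k])]
    return [M[i][n] for i in range(n)]   # last column is the solution

# 2x + y = 3, x + 3y = 4  ->  x = 1, y = 1, no back substitution needed
print(gauss_jordan_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]))
```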
LU Decomposition Version of Gauss Elimination
- Gauss elimination itself produces the factorization: forward
  elimination brings A to upper triangular form,

      U = [ a_11  a_12   a_13  ]
          [  0    a'_22  a'_23 ]
          [  0     0     a''_33]

  using the factors
      f_21 = a_21 / a_11   (to eliminate a_21),
      f_31 = a_31 / a_11   (to eliminate a_31),
      f_32 = a'_32 / a'_22 (to eliminate a'_32).
- Store the f's below the diagonal:

      L = [  1    0    0 ]
          [ f_21  1    0 ]
          [ f_31 f_32  1 ]

  so that A = L U.
- Two-step strategy for obtaining solutions:
  a. LU decomposition step: decompose A into L and U.
  b. Substitution step: solve L D = B for D by forward substitution,

         d_1 = b_1,
         d_i = b_i - sum_{j=1}^{i-1} f_ij d_j,   i = 2, 3, ..., n

     then solve U x = D for x by back substitution,

         x_n = d_n / u_nn,
         x_i = ( d_i - sum_{j=i+1}^{n} u_ij x_j ) / u_ii,
         i = n-1, ..., 1

     (identical to the back-substitution phase of conventional Gauss
     elimination).
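The two-step strategy can be sketched as follows (illustrative names; no pivoting, nonzero pivots assumed). Note how L and U are computed once and then reused for a second right-hand side:

```python
# Step a: decompose A = L U by storing the elimination factors f_ij.
def lu_decompose(A):
    n = len(A)
    U = [row[:] for row in A]
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        for i in range(k + 1, n):
            f = U[i][k] / U[k][k]     # elimination factor f_ik
            L[i][k] = f               # store it below the diagonal of L
            for j in range(k, n):
                U[i][j] -= f * U[k][j]
    return L, U

# Step b: forward substitution (L d = b), then back substitution (U x = d).
def lu_solve(L, U, b):
    n = len(b)
    d = [0.0] * n
    for i in range(n):
        d[i] = b[i] - sum(L[i][j] * d[j] for j in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (d[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

L, U = lu_decompose([[2.0, 1.0], [1.0, 3.0]])
print(lu_solve(L, U, [3.0, 4.0]))   # first rhs:  [1.0, 1.0]
print(lu_solve(L, U, [2.0, 6.0]))   # second rhs reuses L, U: [0.0, 2.0]
```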
6.4. Linear Dependence and Independence of Vectors; Rank of a Matrix
- Key concepts for the existence and uniqueness of solutions.

Linear Independence and Dependence of Vectors
- Given a set of m vectors a_1, ..., a_m (same size), consider the
  relation

      c_1 a_1 + c_2 a_2 + ... + c_m a_m = 0    (c_j: any scalars)

  (1) If the relation holds only with all c_j = 0, the vectors are
      linearly independent.
  (2) If it holds with scalars not all zero, the vectors are linearly
      dependent: at least one of them can be expressed as a linear
      combination of the others, and the set can be reduced to a
      linearly independent subset.

  Ex. 1)  a_1 = [ 3   0   2   2 ],
          a_2 = [-6  42  24  54 ],
          a_3 = [21 -21   0 -15 ]
  satisfy 6 a_1 - (1/2) a_2 - a_3 = 0, so they are linearly dependent,
  and a_3 = 6 a_1 - (1/2) a_2. Any two of the three vectors are linearly
  independent.

Rank of a Matrix
- rank A: the maximum number of linearly independent row vectors of a
  matrix A. rank A = 0 iff A = 0.

  Ex. 3)  A = [  3   0   2   2 ]
              [ -6  42  24  54 ]     has rank 2.
              [ 21 -21   0 -15 ]

- Theorem 1 (rank in terms of column vectors): the rank of a matrix A
  equals the maximum number of linearly independent column vectors of A.
  Hence A and A^T have the same rank.
  Ex. 4) The column vectors of the matrix in Ex. 3 likewise contain a
  maximum of two linearly independent vectors.
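A sketch of the practical test for dependence (illustrative names): compute the rank by row reduction and compare it with the number of vectors, here using the three vectors of Ex. 1. A small tolerance guards against round-off.

```python
# Rank by Gaussian row reduction with partial search for a nonzero pivot.
def rank(A, tol=1e-12):
    M = [row[:] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # find a usable pivot in column c at or below row r
        piv = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [vi - f * vr for vi, vr in zip(M[i], M[r])]
        r += 1
    return r

A = [[3, 0, 2, 2], [-6, 42, 24, 54], [21, -21, 0, -15]]
print(rank(A))   # 2 < 3 rows -> the three vectors are linearly dependent
```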
Vector Space, Dimension, Basis
- Vector space: let V be a set of elements on which two operations,
  vector addition and scalar multiplication, are defined. V is said to
  be a vector space if the following ten properties are satisfied.

  Axioms for vector addition:
  (i)   If x and y are in V, then x + y is in V.
  (ii)  For all x, y in V: x + y = y + x.
  (iii) For all x, y, z in V: x + (y + z) = (x + y) + z.
  (iv)  There is a unique vector 0 in V such that x + 0 = 0 + x = x.
  (v)   For each x in V, there exists a vector -x such that
        x + (-x) = (-x) + x = 0.

  Axioms for scalar multiplication:
  (vi)   If k is any scalar and x is in V, then kx is in V.
  (vii)  k(x + y) = kx + ky.
  (viii) (k_1 + k_2) x = k_1 x + k_2 x.
  (ix)   k_1 (k_2 x) = (k_1 k_2) x.
  (x)    1x = x.

- Linear combination: α a + β b, where a and b are elements of V and
  α, β are any scalars.
- Dimension (dim V): the maximum number of linearly independent vectors
  in V.
- Basis for V: a linearly independent set in V consisting of a maximum
  possible number of vectors (the number of vectors in a basis equals
  dim V).
- Span: the set of all linear combinations of given vectors
  a_1, ..., a_p with the same number of components (a span is a vector
  space).
- Subspace: if a subset W of a vector space V is itself a vector space
  under the operations of vector addition and scalar multiplication
  defined on V, then W is called a subspace of V:
  (i) if x and y are in W, then x + y is in W;
  (ii) if x is in W and k is any scalar, then kx is in W.
  Ex. 5) The vectors of Ex. 1 span a vector space of dim 2, with basis
  a_1, a_2 (or a_1, a_3, or a_2, a_3).
- Row space of a matrix A: span of the row vectors of A.
  Column space of A: span of the column vectors of A.
- Theorem 2: the row space and the column space of a matrix A have the
  same dimension, equal to rank A.

Invariance of Rank under Elementary Row Operations
- Theorem 3: row-equivalent matrices have the same rank.
  Ex. 6) (Echelon form of A: no change of rank)

      A = [  3   0   2   2 ]       [ 3   0   2   2 ]
          [ -6  42  24  54 ]  ->   [ 0  42  28  58 ]     rank A = 2
          [ 21 -21   0 -15 ]       [ 0   0   0   0 ]

- Practical application of rank in connection with linear independence
  and dependence of vectors:
  Theorem 4: p vectors x_1, ..., x_p (with n components each) are
  linearly independent if the matrix with row vectors x_1, ..., x_p has
  rank p; they are linearly dependent if that rank is less than p.
  Theorem 5: p vectors with n < p components are always linearly
  dependent.
  Theorem 6: the vector space R^n consisting of all vectors with n
  components has dimension n.

6.5. Solutions of Linear Systems: Existence, Uniqueness, General Form
- Fundamental theorem for linear systems (m equations in n unknowns,
  coefficient matrix A, augmented matrix A~):
  Theorem 1:
  (a) Existence: the system has solutions iff the coefficient matrix A
      and the augmented matrix A~ have the same rank.
  (b) Uniqueness: the system has precisely one solution iff this common
      rank r of A and A~ equals n.
  (c) Infinitely many solutions: if this common rank r < n, the system
      has infinitely many solutions.
  (d) Gauss elimination: if solutions exist, they can all be obtained by
      Gauss elimination.
- Homogeneous systems A x = 0: always have the trivial solution
  x_1 = 0, ..., x_n = 0. Nontrivial solutions exist iff rank A < n.
  If x_1 and x_2 are solution vectors, then c_1 x_1 + c_2 x_2 is also a
  solution vector.
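The fundamental theorem can be turned into a procedure (a sketch with illustrative names): compare rank A with the rank of the augmented matrix [A | b], both computed by row reduction.

```python
# Rank by row reduction (same idea as Gauss elimination).
def rank(M, tol=1e-12):
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > tol), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [vi - f * vr for vi, vr in zip(M[i], M[r])]
        r += 1
    return r

# Apply Theorem 1: compare rank A with rank [A | b].
def classify(A, b):
    n = len(A[0])
    rA = rank(A)
    rAug = rank([row + [bi] for row, bi in zip(A, b)])
    if rA < rAug:
        return "no solution"
    return "unique solution" if rA == n else "infinitely many solutions"

print(classify([[1, 1], [1, 1]], [2, 3]))   # no solution
print(classify([[1, 1], [1, 2]], [2, 3]))   # unique solution
print(classify([[1, 1], [2, 2]], [2, 4]))   # infinitely many solutions
```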

6.6. Determinants. Cramer's Rule
- Impractical in computations, but important in engineering applications
  (eigenvalues, DEs, vector algebra, etc.).
- The determinant D = det A is a number associated with an n x n square
  matrix A.

Second-Order Determinants
      D = | a_11  a_12 | = a_11 a_22 - a_12 a_21
          | a_21  a_22 |

  Ex. 2) Cramer's rule for the system
      a_11 x_1 + a_12 x_2 = b_1
      a_21 x_1 + a_22 x_2 = b_2
  gives
      x_1 = D_1 / D = (b_1 a_22 - a_12 b_2) / D,
      x_2 = D_2 / D = (a_11 b_2 - b_1 a_21) / D.

Third-Order Determinants
      D = | a_11  a_12  a_13 |
          | a_21  a_22  a_23 |
          | a_31  a_32  a_33 |

        = a_11 | a_22 a_23 | - a_21 | a_12 a_13 | + a_31 | a_12 a_13 |
               | a_32 a_33 |        | a_32 a_33 |        | a_22 a_23 |

  (expansion along the first column).
  Ex. 3) Cramer's rule: x_1 = D_1/D, x_2 = D_2/D, x_3 = D_3/D, where D_k
  is obtained from D by replacing the k-th column with b.

Determinant of Any Order n
- Expansion along any row j or any column k:

      D = det A = sum_{k=1}^{n} a_jk C_jk    (j = 1, 2, ..., n)
                = sum_{j=1}^{n} a_jk C_jk    (k = 1, 2, ..., n)

- Cofactor: C_jk = (-1)^{j+k} M_jk, where the minor M_jk is the
  determinant of the submatrix obtained from A by deleting row j and
  column k.
  Ex. 4) A third-order determinant expanded in cofactors along any row
  or column gives the same value.
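A sketch of the cofactor expansion and of Cramer's rule x_k = D_k / D (illustrative names; fine for small n, impractical for large n, as noted above):

```python
# Determinant by cofactor expansion along the first row.
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for k in range(n):
        minor = [row[:k] + row[k + 1:] for row in A[1:]]  # delete row 1, col k
        total += (-1) ** k * A[0][k] * det(minor)         # C_1k = (-1)^(1+k) M_1k
    return total

# Cramer's rule: D_k is D with the k-th column replaced by b.
def cramer(A, b):
    D = det(A)
    xs = []
    for k in range(len(A)):
        Ak = [row[:k] + [bk] + row[k + 1:] for row, bk in zip(A, b)]
        xs.append(det(Ak) / D)
    return xs

print(det([[1, 2], [3, 4]]))                          # -2
print(cramer([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]))   # [1.0, 1.0]
```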
General Properties of Determinants
- Theorem 1 (behavior under elementary row operations):
  (a) Interchange of two rows multiplies the value of the determinant
      by -1.
  (b) Addition of a multiple of a row to another row does not alter the
      value of the determinant.
  (c) Multiplication of a row by c multiplies the value of the
      determinant by c.
- Theorem 2:
  (d) Transposition leaves the value of a determinant unaltered.
  (e) A zero row or column renders the value of a determinant zero.
  (f) Proportional rows or columns render the value of a determinant
      zero. In particular, a determinant with two identical rows or
      columns has the value zero.

  Ex. 7) Determinant by reduction to triangular form:

      D = |  2  0  -4   6 |   |  2  0  -4     6    |
          |  4  5   1   0 | = |  0  5   9   -12    |
          |  0  2   6  -1 |   |  0  0   2.4   3.8  |
          | -3  8   9   1 |   |  0  0   0    47.25 |

        = 2 * 5 * 2.4 * 47.25 = 1134
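Theorem 1 in use, sketched with illustrative names: reduce to triangular form with row operations, tracking sign changes from interchanges; the determinant is then the product of the diagonal entries. Checked against Ex. 7.

```python
# Determinant by reduction to triangular form.
def det_triangular(A, tol=1e-12):
    M = [[float(v) for v in row] for row in A]
    n, sign = len(A), 1.0
    for k in range(n):
        piv = next((i for i in range(k, n) if abs(M[i][k]) > tol), None)
        if piv is None:
            return 0.0                    # zero column -> determinant is zero
        if piv != k:
            M[k], M[piv] = M[piv], M[k]   # interchange multiplies D by -1
            sign = -sign
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]         # row addition leaves D unchanged
            M[i] = [vi - f * vk for vi, vk in zip(M[i], M[k])]
    prod = sign
    for k in range(n):
        prod *= M[k][k]                   # product of diagonal entries
    return prod

D = det_triangular([[2, 0, -4, 6], [4, 5, 1, 0], [0, 2, 6, -1], [-3, 8, 9, 1]])
print(round(D))   # 1134
```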

- For an n x n matrix A and scalar k: det(kA) = k^n det A.

Rank in Terms of Determinants
- Theorem 3: an m x n matrix A = [a_jk] has rank r ≥ 1 iff A has an
  r x r submatrix with nonzero determinant, whereas the determinant of
  every square submatrix of A with r + 1 or more rows is zero.
- If A is square, n x n: A has rank n iff det A ≠ 0.

Cramer's Rule
- Not practical in computations, but theoretically interesting in DEs
  and others.
- Theorem 4:
  (a) A linear system of n equations in the same number of unknowns
      x_1, ..., x_n with nonzero coefficient determinant D = det A has
      precisely one solution, given by
          x_1 = D_1/D,  x_2 = D_2/D,  ...,  x_n = D_n/D,
      where D_k is the determinant obtained from D by replacing its k-th
      column by the column vector b.
  (b) If the system is homogeneous and D ≠ 0, it has only the trivial
      solution x = 0. If D = 0, the homogeneous system also has
      nontrivial solutions.

6.7. Inverse of a Matrix: Gauss-Jordan Elimination
- The inverse A^{-1} of an n x n matrix A satisfies

      A A^{-1} = A^{-1} A = I

  where I is the n x n identity matrix.
- If A has an inverse, then A is a nonsingular matrix (unique
  inverse!); if A has no inverse, A is a singular matrix.
- Theorem 1 (existence of the inverse): A^{-1} exists iff rank A = n,
  hence iff det A ≠ 0. A is nonsingular if rank A = n, and is singular
  if rank A < n.

Determination of the Inverse
a. Calculating the inverse column by column (evaluate multiple rhs
   vectors): solve A x = b with the unit vectors
       b_1 = [1 0 0]^T,  b_2 = [0 1 0]^T,  b_3 = [0 0 1]^T
   as the rhs constants. The resulting solution of A x = b_1 is the
   first column of the matrix inverse, that of A x = b_2 the second
   column, and so on. Best way: use the LU decomposition algorithm.
b. Matrix inversion by Gauss-Jordan elimination: the identity matrix I
   assumes the role of the rhs column vector b, while the square matrix
   A^{-1} assumes the role of the unknown x. Reduce the augmented matrix
   [A | I] to [I | A^{-1}]:

       A = [ -1  1  2 ]                 [ -0.7   0.2   0.3 ]
           [  3 -1  1 ]   ->  A^{-1} =  [ -1.3  -0.2   0.7 ]
           [ -1  3  4 ]                 [  0.8   0.2  -0.2 ]
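Method b sketched in code (illustrative names; nonzero pivots assumed): augment A with the identity and run Gauss-Jordan until the left block is I; the right block is then A^{-1}. Checked against the 3 x 3 example above.

```python
# Gauss-Jordan inversion: reduce [A | I] to [I | A^{-1}].
def invert(A):
    n = len(A)
    M = [A[i][:] + [1.0 if j == i else 0.0 for j in range(n)]
         for i in range(n)]
    for k in range(n):
        p = M[k][k]                   # pivot (assumed nonzero)
        M[k] = [v / p for v in M[k]]
        for i in range(n):
            if i != k:                # clear the column above and below
                f = M[i][k]
                M[i] = [vi - f * vk for vi, vk in zip(M[i], M[k])]
    return [row[n:] for row in M]     # right block is the inverse

Ainv = invert([[-1.0, 1.0, 2.0], [3.0, -1.0, 1.0], [-1.0, 3.0, 4.0]])
for row in Ainv:
    print([round(v, 6) for v in row])
# rows: [-0.7, 0.2, 0.3], [-1.3, -0.2, 0.7], [0.8, 0.2, -0.2]
```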
Some Useful Formulas for Inverses
- Theorem 2: the inverse of a nonsingular n x n matrix A = [a_jk] is

      A^{-1} = (1/det A) [A_jk]^T

  where A_jk is the cofactor of a_jk in det A. In particular, for a
  2 x 2 matrix:

      A^{-1} = (1/det A) [  a_22  -a_12 ]
                         [ -a_21   a_11 ]

  Ex. 3) Inverse of a 3 x 3 matrix by cofactors.
- Diagonal matrices A = diag(a_11, ..., a_nn) have an inverse iff all
  a_jj ≠ 0; then A^{-1} is diagonal with entries 1/a_11, ..., 1/a_nn.
  Ex. 4) Inverse of a diagonal matrix.
- (AC)^{-1} = C^{-1} A^{-1}.

Vanishing of Matrix Products. Cancellation Law
- Matrix multiplication is not commutative in general: AB ≠ BA.
- AB = 0 does not necessarily imply A = 0 or B = 0 or BA = 0, and
  AC = AD does not necessarily imply C = D (even when A ≠ 0).
- Theorem 3 (cancellation law): let A, B, C be n x n matrices. Then:
  (a) If rank A = n and AB = AC, then B = C.
  (b) If rank A = n, then AB = 0 implies B = 0. Hence if AB = 0 but
      A ≠ 0 as well as B ≠ 0, then rank A < n and rank B < n.
  (c) If A is singular, so are AB and BA.

Determinants of Matrix Products
- Theorem 4: for any n x n matrices A and B,
      det(AB) = det(BA) = det A det B.
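A sketch of Theorem 2 in the 2 x 2 case together with a numeric check of det(AB) = det A det B (matrices chosen for this illustration, not from the lecture):

```python
# Second-order determinant and the 2 x 2 cofactor-inverse formula.
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inv2(A):
    d = det2(A)
    return [[A[1][1] / d, -A[0][1] / d],
            [-A[1][0] / d, A[0][0] / d]]

def matmul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [1.0, 3.0]]
B = [[0.0, 1.0], [4.0, 2.0]]

print(matmul2(A, inv2(A)))                      # approximately the identity
print(det2(matmul2(A, B)), det2(A) * det2(B))   # -20.0 -20.0 (equal)
```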