Week 7. More Gaussian Elimination and Matrix Inversion

7.1 Opening Remarks

7.1.1 Introduction

7.1.2 Outline

7.1. Opening Remarks
    7.1.1. Introduction
    7.1.2. Outline
    7.1.3. What You Will Learn
7.2. When Gaussian Elimination Breaks Down
    7.2.1. When Gaussian Elimination Works
    7.2.2. The Problem
    7.2.3. Permutations
    7.2.4. Gaussian Elimination with Row Swapping (LU Factorization with Partial Pivoting)
    7.2.5. When Gaussian Elimination Fails Altogether
7.3. The Inverse Matrix
    7.3.1. Inverse Functions in 1D
    7.3.2. Back to Linear Transformations
    7.3.3. Simple Examples
    7.3.4. More Advanced (but Still Simple) Examples
    7.3.5. Properties
7.4. Enrichment
    7.4.1. Library Routines for LU with Partial Pivoting
7.5. Wrap Up
    7.5.1. Homework
    7.5.2. Summary

7.1.3 What You Will Learn

Upon completion of this unit, you should be able to

• Determine, recognize, and apply permutation matrices.
• Apply permutation matrices to vectors and matrices.
• Identify and interpret permutation matrices and fluently compute the multiplication of a matrix on the left and right by a permutation matrix.
• Reason, make conjectures, and develop arguments about properties of permutation matrices.
• Recognize when Gaussian elimination breaks down and apply row exchanges to solve the problem when appropriate.
• Recognize when LU factorization fails and apply row pivoting to solve the problem when appropriate.
• Recognize that when executing Gaussian elimination (LU factorization) with Ax = b, where A is a square matrix, one of three things can happen (a small code sketch after this list makes the three cases concrete):
  1. The process completes with no zeroes on the diagonal of the resulting matrix U. Then A = LU and Ax = b has a unique solution, which can be found by solving Lz = b followed by Ux = z.
  2. The process requires row exchanges, completing with no zeroes on the diagonal of the resulting matrix U. Then PA = LU and Ax = b has a unique solution, which can be found by solving Lz = Pb followed by Ux = z.
  3. The process requires row exchanges, but at some point no row can be found that puts a nonzero on the diagonal, at which point the process fails (unless the zero appears as the last element on the diagonal, in which case it completes, but leaves a zero on the diagonal of the upper triangular matrix). In Week 8 we will see that this means Ax = b does not have a unique solution.
• Reason, make conjectures, and develop arguments about properties of inverses.
• Find the inverse of a simple matrix by understanding how the corresponding linear transformation is related to matrix-vector multiplication with the matrix.
• Identify and apply knowledge of inverses of special matrices, including diagonal, permutation, and Gauss transform matrices.
• Determine whether a given matrix is an inverse of another given matrix.
• Recognize that a 2 × 2 matrix $A = \begin{pmatrix} a_{0,0} & a_{0,1} \\ a_{1,0} & a_{1,1} \end{pmatrix}$ has an inverse if and only if its determinant is not zero: $\det(A) = a_{0,0} a_{1,1} - a_{0,1} a_{1,0} \neq 0$.
• Compute the inverse of a 2 × 2 matrix A if that inverse exists.
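To make the three cases above concrete, here is a minimal sketch of LU factorization with partial pivoting, written in Python with NumPy. It is an illustration under assumptions made here, not the course's own code: the function name lu_with_pivoting, the return convention, and the 1e-12 pivot tolerance are all choices of this sketch.

    import numpy as np

    def lu_with_pivoting(A, tol=1e-12):
        # Returns (LU, p, case). LU holds the multipliers of the unit lower
        # triangular L strictly below the diagonal and U on and above it;
        # p records the row exchanges; case is 1, 2, or 3 as in the list above.
        LU = np.array(A, dtype=float)
        n = LU.shape[0]
        p = np.arange(n)
        swapped = False
        for k in range(n):
            # Choose the row at or below k whose k-th entry is largest in magnitude.
            i = k + int(np.argmax(np.abs(LU[k:, k])))
            if abs(LU[i, k]) < tol:
                # No row exchange puts a nonzero on the diagonal: case 3.
                # (If k == n - 1, elimination completes but leaves a zero as
                # the last element on the diagonal of U.)
                return LU, p, 3
            if i != k:
                LU[[k, i]] = LU[[i, k]]  # exchange rows k and i
                p[[k, i]] = p[[i, k]]
                swapped = True
            LU[k+1:, k] /= LU[k, k]                               # multipliers (column of L)
            LU[k+1:, k+1:] -= np.outer(LU[k+1:, k], LU[k, k+1:])  # update remaining rows
        return LU, p, (2 if swapped else 1)

For example, A = np.array([[0., 1.], [1., 2.]]) requires a row exchange (case 2), after which PA = LU and Ax = b can be solved via Lz = Pb followed by Ux = z.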
7.2 When Gaussian Elimination Breaks Down

7.2.1 When Gaussian Elimination Works

We know that if Gaussian elimination completes (the LU factorization of a given matrix can be computed) and the upper triangular factor U has no zeroes on the diagonal, then Ax = b can be solved for all right-hand side vectors b. Why?

• If Gaussian elimination completes (the LU factorization can be computed), then A = LU for some unit lower triangular matrix L and upper triangular matrix U. We know this because of the equivalence of Gaussian elimination and LU factorization.
• If you look at the algorithm for forward substitution (solving Lz = b) in Figure 7.1, you notice that the only computations that are encountered are multiplies and adds. Thus, the algorithm will complete.
• Similarly, the backward substitution algorithm (for solving Ux = z) in Figure 7.2 can only break down if a division causes an error. And that can only happen if U has a zero on its diagonal.

Algorithm: [b] := LTRSV_UNB_VAR1(L, b)

Partition
$L \rightarrow \begin{pmatrix} L_{TL} & 0 \\ L_{BL} & L_{BR} \end{pmatrix}$, $b \rightarrow \begin{pmatrix} b_T \\ b_B \end{pmatrix}$, where $L_{TL}$ is $0 \times 0$ and $b_T$ has 0 rows
while $m(L_{TL}) < m(L)$ do
  Repartition
  $\begin{pmatrix} L_{TL} & 0 \\ L_{BL} & L_{BR} \end{pmatrix} \rightarrow \begin{pmatrix} L_{00} & 0 & 0 \\ l_{10}^T & l_{11} & 0 \\ L_{20} & l_{21} & L_{22} \end{pmatrix}$, $\begin{pmatrix} b_T \\ b_B \end{pmatrix} \rightarrow \begin{pmatrix} b_0 \\ b_1 \\ b_2 \end{pmatrix}$, where $l_{11}$ is $1 \times 1$ and $b_1$ has 1 row
  $b_2 := b_2 - b_1 l_{21}$
  Continue with
  $\begin{pmatrix} L_{TL} & 0 \\ L_{BL} & L_{BR} \end{pmatrix} \leftarrow \begin{pmatrix} L_{00} & 0 & 0 \\ l_{10}^T & l_{11} & 0 \\ L_{20} & l_{21} & L_{22} \end{pmatrix}$, $\begin{pmatrix} b_T \\ b_B \end{pmatrix} \leftarrow \begin{pmatrix} b_0 \\ b_1 \\ b_2 \end{pmatrix}$
endwhile

Figure 7.1: Algorithm for solving Lz = b when L is a unit lower triangular matrix. The right-hand side vector b is overwritten with the solution vector z.
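The update $b_2 := b_2 - b_1 l_{21}$ in Figure 7.1 involves only multiplies and adds, which is why forward substitution cannot break down. Here is a minimal sketch of the same variant in Python with NumPy (the function name is an invention of this sketch, not the course's FLAME code):

    import numpy as np

    def forward_substitution(L, b):
        # Solve Lz = b for unit lower triangular L, returning z.
        # Mirrors Figure 7.1: since the diagonal of L is 1, z_j equals the
        # current b_j with no division needed, and b_2 := b_2 - b_1 * l_21
        # removes its contribution from the equations below it.
        z = np.array(b, dtype=float)
        n = z.shape[0]
        for j in range(n):
            z[j+1:] -= z[j] * L[j+1:, j]
        return z

Note that calling it with b = 0 returns z = 0, matching the argument below that Lz = 0 forces z = 0.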
Algorithm: [b] := UTRSV_UNB_VAR1(U, b)

Partition
$U \rightarrow \begin{pmatrix} U_{TL} & U_{TR} \\ 0 & U_{BR} \end{pmatrix}$, $b \rightarrow \begin{pmatrix} b_T \\ b_B \end{pmatrix}$, where $U_{BR}$ is $0 \times 0$ and $b_B$ has 0 rows
while $m(U_{BR}) < m(U)$ do
  Repartition
  $\begin{pmatrix} U_{TL} & U_{TR} \\ 0 & U_{BR} \end{pmatrix} \rightarrow \begin{pmatrix} U_{00} & u_{01} & U_{02} \\ 0 & u_{11} & u_{12}^T \\ 0 & 0 & U_{22} \end{pmatrix}$, $\begin{pmatrix} b_T \\ b_B \end{pmatrix} \rightarrow \begin{pmatrix} b_0 \\ b_1 \\ b_2 \end{pmatrix}$, where $u_{11}$ is $1 \times 1$ and $b_1$ has 1 row
  $b_1 := b_1 - u_{12}^T b_2$
  $b_1 := b_1 / u_{11}$
  Continue with
  $\begin{pmatrix} U_{TL} & U_{TR} \\ 0 & U_{BR} \end{pmatrix} \leftarrow \begin{pmatrix} U_{00} & u_{01} & U_{02} \\ 0 & u_{11} & u_{12}^T \\ 0 & 0 & U_{22} \end{pmatrix}$, $\begin{pmatrix} b_T \\ b_B \end{pmatrix} \leftarrow \begin{pmatrix} b_0 \\ b_1 \\ b_2 \end{pmatrix}$
endwhile

Figure 7.2: Algorithm for solving Ux = b when U is an upper triangular matrix. The right-hand side vector b is overwritten with the solution vector x.

So, under the mentioned circumstances, we can compute a solution to Ax = b via Gaussian elimination, forward substitution, and back substitution. Last week we saw how to compute this solution.

Is this the only solution? We first give an intuitive explanation and then walk you through a rigorous proof. The reason is as follows. Assume that Ax = b has two solutions, u and v. Then

• Au = b and Av = b.
• This means that the vector w = u − v satisfies Aw = A(u − v) = Au − Av = b − b = 0.
• Since Gaussian elimination completed, we know that (LU)w = 0 or, equivalently, Lz = 0 and Uw = z.
• It is not hard to see that if Lz = 0 then z = 0:
$$
\begin{pmatrix}
1 & 0 & 0 & \cdots & 0 \\
l_{1,0} & 1 & 0 & \cdots & 0 \\
l_{2,0} & l_{2,1} & 1 & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
l_{n-1,0} & l_{n-1,1} & l_{n-1,2} & \cdots & 1
\end{pmatrix}
\begin{pmatrix} z_0 \\ z_1 \\ z_2 \\ \vdots \\ z_{n-1} \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}
$$
The first equation means z_0 = 0. But then l_{1,0} z_0 + z_1 = 0 means z_1 = 0. In turn, l_{2,0} z_0 + l_{2,1} z_1 + z_2 = 0 means z_2 = 0. And so forth.
• Thus, z = 0 and hence Uw = 0.
• It is not hard to see that if Uw = 0 then w = 0:
$$
\begin{pmatrix}
u_{0,0} & \cdots & u_{0,n-3} & u_{0,n-2} & u_{0,n-1} \\
\vdots & \ddots & \vdots & \vdots & \vdots \\
0 & \cdots & u_{n-3,n-3} & u_{n-3,n-2} & u_{n-3,n-1} \\
0 & \cdots & 0 & u_{n-2,n-2} & u_{n-2,n-1} \\
0 & \cdots & 0 & 0 & u_{n-1,n-1}
\end{pmatrix}
\begin{pmatrix} w_0 \\ \vdots \\ w_{n-3} \\ w_{n-2} \\ w_{n-1} \end{pmatrix}
=
\begin{pmatrix} 0 \\ \vdots \\ 0 \\ 0 \\ 0 \end{pmatrix}
$$
The last equation means u_{n-1,n-1} w_{n-1} = 0 and hence w_{n-1} = 0 (since u_{n-1,n-1} ≠ 0). But then u_{n-2,n-2} w_{n-2} + u_{n-2,n-1} w_{n-1} = 0 means w_{n-2} = 0. And so forth.

We conclude that w = u − v = 0, so the two solutions coincide:

If Gaussian elimination completes with an upper triangular system that has no zero diagonal coefficients (LU factorization computes L and U where U has no zero diagonal elements), then for all right-hand side vectors b, the linear system Ax = b has a unique solution x.

A rigorous proof

Let $A \in \mathbb{R}^{n \times n}$. If Gaussian elimination completes and the resulting upper triangular system has no zero coefficients on the diagonal (U has no zeroes on its diagonal), then there is a unique solution x to Ax = b for all $b \in \mathbb{R}^n$.
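Back substitution, by contrast, divides by the diagonal elements of U, which is exactly where it can break down. As a numerical sanity check of the uniqueness claim, here is a minimal sketch in Python with NumPy of the variant in Figure 7.2, followed by an end-to-end solve. The function names and the small 2 × 2 example are inventions of this sketch (forward_substitution is the earlier sketch), and the result is compared against numpy.linalg.solve.

    import numpy as np

    def back_substitution(U, b):
        # Solve Ux = b for upper triangular U, returning x.
        # Mirrors Figure 7.2: b_1 := (b_1 - u_12^T b_2) / u_11, working from
        # the last row upward. The only divisions are by diagonal elements
        # of U, so the loop breaks down exactly when U has a zero diagonal.
        x = np.array(b, dtype=float)
        n = x.shape[0]
        for j in range(n - 1, -1, -1):
            x[j] = (x[j] - U[j, j+1:] @ x[j+1:]) / U[j, j]
        return x

    # End-to-end: with A = LU (no row exchanges needed for this example),
    # solving Lz = b and then Ux = z reproduces NumPy's own solution.
    L = np.array([[1., 0.], [3., 1.]])
    U = np.array([[2., 1.], [0., 4.]])
    b = np.array([5., 6.])
    x = back_substitution(U, forward_substitution(L, b))
    assert np.allclose((L @ U) @ x, b)
    assert np.allclose(x, np.linalg.solve(L @ U, b))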