
AML702 Applied Computational Methods

Lecture 08 – Gauss Elimination, Pivoting, Tridiagonal Systems

Solving Linear Systems of Equations

We want to solve the linear system

a11 x1 + a12 x2 + … + a1n xn = b1
a21 x1 + a22 x2 + … + a2n xn = b2
   ⋮
an1 x1 + an2 x2 + … + ann xn = bn

This will be done by successively eliminating unknowns from the equations until eventually we have only one equation in one unknown. This process is known as Gaussian elimination. To solve such systems, there are direct methods and iterative methods.
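For orientation, here is a minimal sketch of handing such a system to a standard library direct solver. The 3 x 3 coefficients are taken from the worked example later in this lecture; the use of NumPy is an assumption, not something prescribed by the slides.

```python
# Minimal sketch: solving A x = b with a library direct solver (NumPy assumed).
# The 3x3 system is the example used later in this lecture.
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [2.0, 3.0, 4.0],
              [3.0, 4.0, 2.0]])
b = np.array([6.0, 20.0, 17.0])

x = np.linalg.solve(A, b)   # direct method (LU factorization internally)
print(x)                    # expected: [1. 2. 3.]
```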

Revision of Algebra

Consider the following 2 x 2 matrices, with r indexing rows and c indexing columns:

A = [a_rc] = ⎡ a11  a12 ⎤        B = [b_rc] = ⎡ b11  b12 ⎤
             ⎣ a21  a22 ⎦                     ⎣ b21  b22 ⎦

Matrix Addition

A + B = ⎡ a11 + b11   a12 + b12 ⎤
        ⎣ a21 + b21   a22 + b22 ⎦

Matrix Operations

Multiplication by a scalar k

kA = ⎡ k a11  k a12 ⎤
     ⎣ k a21  k a22 ⎦

Matrix Multiplication

C = AB = ⎡ a11  a12 ⎤ ⎡ b11  b12 ⎤ = ⎡ a11 b11 + a12 b21   a11 b12 + a12 b22 ⎤
         ⎣ a21  a22 ⎦ ⎣ b21  b22 ⎦   ⎣ a21 b11 + a22 b21   a21 b12 + a22 b22 ⎦

The product is defined for matrices of dimensions r x c and c x r (the inner dimensions must agree).
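A small sketch of these operations with NumPy (assumed available); the numeric entries are arbitrary illustrative values, not from the slides.

```python
# Minimal sketch of the 2x2 matrix operations above (NumPy assumed).
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])
k = 2.0

print(A + B)    # element-wise matrix addition
print(k * A)    # multiplication by a scalar
print(A @ B)    # matrix product: C[i, j] = sum over m of A[i, m] * B[m, j]
```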

Matrix Operations

Transpose of a matrix – reflection along the leading diagonal

A = ⎡ a11  a12 ⎤   ⇒   Aᵀ = ⎡ a11  a21 ⎤
    ⎣ a21  a22 ⎦            ⎣ a12  a22 ⎦

The identity matrix:

I = ⎡ 1  0 ⎤
    ⎣ 0  1 ⎦

The inverse A⁻¹ of A satisfies A A⁻¹ = A⁻¹ A = I.

Every matrix has a transpose. If Aᵀ = A⁻¹, such a matrix is called orthogonal (orthonormal).
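A quick numerical check of these identities (NumPy assumed); the 2 x 2 rotation matrix used as the orthogonal example is my own illustrative choice, not from the slides.

```python
# Minimal check of the transpose/inverse identities above (NumPy assumed).
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # A A^-1 = I  -> True

# Illustrative orthogonal (orthonormal) matrix: a 2D rotation by 30 degrees.
t = np.radians(30.0)
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(Q.T, np.linalg.inv(Q)))  # Q^T = Q^-1 -> True
```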

Some Special Matrices

• Symmetric Matrix
• Diagonal Matrix
• Upper Triangular Matrix
• Lower Triangular Matrix
• Tridiagonal Matrix

Gaussian Elimination

Consider a simple linear system:

x1 + x2 + x3 = 6
2x1 + 3x2 + 4x3 = 20
3x1 + 4x2 + 2x3 = 17

The augmented matrix:

A|b = ⎡ 1  1  1 |  6 ⎤
      ⎢ 2  3  4 | 20 ⎥
      ⎣ 3  4  2 | 17 ⎦

Row operations:

R2 → R2 - (2/1) R1        ⎡ 1  1  1 |  6 ⎤
R3 → R3 - (3/1) R1        ⎢ 0  1  2 |  8 ⎥
                          ⎣ 0  1 -1 | -1 ⎦

R3 → R3 - (1/1) R2        ⎡ 1  1  1 |  6 ⎤
                          ⎢ 0  1  2 |  8 ⎥
                          ⎣ 0  0 -3 | -9 ⎦

Back substitution:

-3 x3 = -9  ⇒  x3 = 3
x2 = 8 - 2 x3 = 2
x1 = 6 - x2 - x3 = 6 - 2 - 3 = 1

x = (1, 2, 3)ᵀ

A similar procedure, applied starting from the last row, can reduce the matrix A to lower triangular form.
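A compact sketch of this forward-elimination and back-substitution procedure, without any pivoting. The function and variable names are my own and NumPy is assumed; the driver reuses the system from the worked example above.

```python
# Naive Gaussian elimination (no pivoting) followed by back substitution.
# A minimal sketch; names are illustrative, NumPy is assumed.
import numpy as np

def gauss_solve(A, b):
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)

    # Forward elimination: zero out entries below each pivot A[k, k].
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]          # multiplier, e.g. 2/1, 3/1 above
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]

    # Back substitution on the resulting upper triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[1, 1, 1], [2, 3, 4], [3, 4, 2]])
b = np.array([6, 20, 17])
print(gauss_solve(A, b))                   # expected: [1. 2. 3.]
```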

Gaussian Elimination - Pivoting

• A serious problem with the Gauss elimination process is the division by the diagonal (pivot) term while converting the augmented matrix into upper triangular form.
• If a diagonal element is zero or vanishingly small, the multipliers and the elements of the rows below it become very large in magnitude and difficult to handle because of the finite precision of computer arithmetic, as illustrated by the sketch below.
• To overcome this problem, we rearrange the system so that the element of largest magnitude in that column is moved to the pivotal position, i.e. the diagonal position.
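A small illustration of the issue in standard double-precision arithmetic. The 2 x 2 system and its coefficients are my own illustrative example, not from the slides.

```python
# Illustration of why a tiny pivot hurts naive elimination (double precision).
# The 2x2 system is illustrative, not from the slides.
import numpy as np

eps = 1.0e-17
A = np.array([[eps, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0])

# Naive elimination: dividing by the tiny pivot eps gives a huge multiplier 1/eps.
m = A[1, 0] / A[0, 0]
a22 = A[1, 1] - m * A[0, 1]       # 1 - 1/eps: the original 1 is swamped by roundoff
b2  = b[1] - m * b[0]
x2 = b2 / a22
x1 = (b[0] - A[0, 1] * x2) / A[0, 0]
print(x1, x2)                     # x1 comes out 0 instead of about 1

# With a row interchange (as in pivoting) the same system is solved accurately.
print(np.linalg.solve(A, b))      # approximately [1. 1.]
```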

Partial Pivoting

• Partial Pivoting: if only row interchanges are used to bring the element of largest magnitude in the pivotal column to the pivotal position at each step of the elimination, the process is called partial pivoting.
• Example:

3x1 + 3x2 + 4x3 = 20
x1 + x2 + x3 = 6
2x1 + x2 + 3x3 = 13

• Augmented matrix:

A|b = ⎡ 3  3  4 | 20 ⎤
      ⎢ 1  1  1 |  6 ⎥
      ⎣ 2  1  3 | 13 ⎦

• R2 → R2 - (1/3) R1      ⎡ 3   3    4  |  20  ⎤
• R3 → R3 - (2/3) R1      ⎢ 0   0  -1/3 | -2/3 ⎥
                          ⎣ 0  -1   1/3 | -1/3 ⎦

• The new pivot a22 = 0, so interchange rows two and three: R2 ↔ R3 (see the sketch below).
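A sketch of elimination with partial pivoting, applied to the example above. Function and variable names are my own and NumPy is assumed.

```python
# Gaussian elimination with partial pivoting: before eliminating in column k,
# swap in the row whose entry in column k has the largest magnitude.
# A minimal sketch; names are illustrative, NumPy is assumed.
import numpy as np

def gauss_partial_pivot(A, b):
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)

    for k in range(n - 1):
        # Partial pivoting: row interchange only, chosen within column k.
        p = k + np.argmax(np.abs(A[k:, k]))
        if p != k:
            A[[k, p], :] = A[[p, k], :]
            b[[k, p]] = b[[p, k]]

        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]

    # Back substitution on the upper triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[3, 3, 4], [1, 1, 1], [2, 1, 3]])
b = np.array([20, 6, 13])
print(gauss_partial_pivot(A, b))   # expected: [3. 1. 2.]
```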

Complete Pivoting

• Complete Pivoting: in this process the largest element (in magnitude) of the whole coefficient matrix A is first brought to the (1,1) position; then, leaving out the first row and first column, the largest of the remaining elements is brought to the (2,2) pivotal position, and so on, using both row and column interchanges. Such a process is called complete pivoting (see the sketch below).
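A sketch of elimination with complete pivoting. Column interchanges reorder the unknowns, so a permutation must be tracked and undone at the end. Names are my own, NumPy is assumed, and the driver reuses the example from the partial pivoting slide.

```python
# Gaussian elimination with complete pivoting: at each step the largest-magnitude
# entry of the remaining submatrix is moved to the pivotal position by a row AND
# a column interchange. A minimal sketch; names are illustrative, NumPy assumed.
import numpy as np

def gauss_complete_pivot(A, b):
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    perm = np.arange(n)                      # tracks column (unknown) reordering

    for k in range(n - 1):
        # Location of the largest |entry| in the remaining submatrix A[k:, k:].
        i_rel, j_rel = np.unravel_index(np.argmax(np.abs(A[k:, k:])), (n - k, n - k))
        p, q = k + i_rel, k + j_rel
        A[[k, p], :] = A[[p, k], :]          # row interchange
        b[[k, p]] = b[[p, k]]
        A[:, [k, q]] = A[:, [q, k]]          # column interchange
        perm[[k, q]] = perm[[q, k]]

        for i in range(k + 1, n):            # eliminate below the pivot
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]

    # Back substitution, then undo the column permutation of the unknowns.
    y = np.zeros(n)
    for i in range(n - 1, -1, -1):
        y[i] = (b[i] - A[i, i + 1:] @ y[i + 1:]) / A[i, i]
    x = np.zeros(n)
    x[perm] = y
    return x

A = np.array([[3, 3, 4], [1, 1, 1], [2, 1, 3]])
b = np.array([20, 6, 13])
print(gauss_complete_pivot(A, b))            # expected: [3. 1. 2.]
```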
