ME 525 (Sullivan) - Point Iterative Techniques Continued - Lecture 5

Last Time - Point Iterative Methods

We solve $[A]U = v$, the algebraic system arising from $\nabla^2 U = g$, using a five-point computational molecule with $B_0$ at the center, $B_1$ and $B_2$ to the east and west, $B_3$ to the north, and $B_4$ to the south:

$$B_0 U_{i,j} + B_1 U_{i+1,j} + B_2 U_{i-1,j} + B_3 U_{i,j+1} + B_4 U_{i,j-1} = h^2 g + \text{B.C.'s}$$

$[A]$ is partitioned into $[R] + [D] + [S]$: the below-diagonal, diagonal, and above-diagonal parts.

Point form of the update (sweeping with increasing $i,j$, so $U_{i-1,j}$ and $U_{i,j-1}$ are already new):

$$U_{i,j}^{n+1} = \omega \left[ B_1 U_{i+1,j}^{n} + B_2 U_{i-1,j}^{n+1} + B_3 U_{i,j+1}^{n} + B_4 U_{i,j-1}^{n+1} - \text{Rhs} \right] B_0^{-1} + (1-\omega)\, U_{i,j}^{n}$$

Jacobi ($G_J$):

$$U^{n+1} = -D^{-1}(R+S)\, U^{n} + D^{-1} v$$

Gauss-Seidel ($G_{GS}$):

$$U^{n+1} = -(R+D)^{-1} S\, U^{n} + (R+D)^{-1} v$$

S.O.R. ($G_\omega$):

$$U^{n+1} = (D + \omega R)^{-1} \left[ (1-\omega) D - \omega S \right] U^{n} + \omega\, (D + \omega R)^{-1} v$$

Basic rule: on a Type I boundary, do not use the PDE on the boundary. On Type II or III boundaries, use the PDE plus the B.C. together.

The spectral radius $\rho$ of the iteration matrix $[G]$ is the largest-magnitude eigenvalue of $[G]$; $\rho < 1$ is required for convergence.

Bare Essentials of Iterative Methods

Computational estimate for $\rho$: define $\delta^n = U^n - U^{n-1}$. Then

$$\rho \cong \frac{\lVert \delta^n \rVert}{\lVert \delta^{n-1} \rVert} = \frac{\left[ \sum_{i=1}^{M} \left( U_i^n - U_i^{n-1} \right)^2 \right]^{1/2}}{\left[ \sum_{i=1}^{M} \left( U_i^{n-1} - U_i^{n-2} \right)^2 \right]^{1/2}}$$

Now to prove that an iteration scheme can converge, consider the following worst-case situations. Recall the definition of strict diagonal dominance:

$$|a_{ii}| > \sum_{j \ne i} |a_{ij}|$$

Expanding on the iteration handout: define $\varepsilon^n = U^n - U$, where $U = A^{-1} v$ is the exact algebraic solution (unknown). Since

$$U^n = G\, U^{n-1} + r \qquad \text{and} \qquad U = G\, U + r,$$

$$\varepsilon^n = G\, U^{n-1} + r - G\, U - r = G \left( U^{n-1} - U \right)$$

or

$$\varepsilon^n = G\, \varepsilon^{n-1} = G\, G\, \varepsilon^{n-2} = \cdots = G^n \varepsilon^0$$

However, we still don't know $\varepsilon^0$. But define $\delta^n = U^n - U^{n-1}$. This incremental error can be determined for all $n$:

$$\delta^n = G\, U^{n-1} + r - G\, U^{n-2} - r = G \left( U^{n-1} - U^{n-2} \right)$$

or

$$\delta^n = G\, \delta^{n-1} = \cdots = G^n \delta^0$$

Finally, we can examine residuals. Normally $AU = v$, so $0 = AU - v \ne AU^n - v$. Define

$$R^n = A U^n - v = A U^n - A A^{-1} v = A \left( U^n - A^{-1} v \right) = A\, \varepsilon^n$$

(remember $\varepsilon^n = U^n - U$). Then

$$R^n = A\, \varepsilon^n = A\, G\, \varepsilon^{n-1} = A\, G\, A^{-1} A\, \varepsilon^{n-1} = A\, G\, A^{-1} R^{n-1}$$

$$R^n = \left( A\, G\, A^{-1} \right) R^{n-1} = A\, G^n A^{-1} R^0$$

Therefore, we have the following error measures:

$$\varepsilon^n = G\, \varepsilon^{n-1} = G^n \varepsilon^0 \qquad \text{numerical vs. algebraic}$$

$$\delta^n = G\, \delta^{n-1} = G^n \delta^0 \qquad \text{incremental errors}$$

$$R^n = \left( A G A^{-1} \right) R^{n-1} = A\, G^n A^{-1} R^0 \qquad \text{residual error}$$

Each of these error indicators converges to zero if and only if the spectral radius $\rho$ (the largest-absolute-value eigenvalue) of the iteration matrix is less than 1.
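To make the splitting and the $\delta$-ratio estimate concrete, here is a minimal NumPy sketch (not from the notes; the test matrix, its size N, and the right-hand side are illustrative assumptions). It builds a 1-D Poisson-type matrix, forms $G_J = -D^{-1}(R+S)$, and compares $\rho(G_J)$ from the eigenvalues against the running $\lVert \delta^n \rVert / \lVert \delta^{n-1} \rVert$ estimate:

```python
import numpy as np

# Illustrative system: 1-D Poisson-type tridiagonal matrix, split A = R + D + S
N = 20
A = np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), -1) + np.diag(np.ones(N - 1), 1)
v = np.ones(N)  # arbitrary right-hand side, for illustration only

D = np.diag(np.diag(A))   # diagonal part
R = np.tril(A, -1)        # below-diagonal part
S = np.triu(A, 1)         # above-diagonal part

# Jacobi iteration matrix and its exact spectral radius
G_J = -np.linalg.solve(D, R + S)
rho_exact = max(abs(np.linalg.eigvals(G_J)))

# Iterate U^{n+1} = G_J U^n + D^{-1} v and estimate rho from ||delta^n|| / ||delta^{n-1}||
r = np.linalg.solve(D, v)
U_old = np.zeros(N)
U = G_J @ U_old + r
delta_old = np.linalg.norm(U - U_old)
for n in range(200):
    U_new = G_J @ U + r
    delta = np.linalg.norm(U_new - U)
    rho_est = delta / delta_old
    U_old, U, delta_old = U, U_new, delta

print(f"rho(G_J) from eigenvalues:     {rho_exact:.6f}")
print(f"rho estimate from delta ratio: {rho_est:.6f}")
```

The ratio settles onto $\rho(G_J)$ once the slowest-decaying eigenvector dominates the error, which is why the computational estimate is only trustworthy after many iterations.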
Therefore,

$$\lVert \varepsilon^n \rVert \cong \rho\, \lVert \varepsilon^{n-1} \rVert, \qquad \lVert \delta^n \rVert \cong \rho\, \lVert \delta^{n-1} \rVert, \qquad \lVert R^n \rVert \cong \rho\, \lVert R^{n-1} \rVert,$$

and one can estimate the spectral radius of the system via

$$\rho \cong \frac{\lVert \delta^n \rVert}{\lVert \delta^{n-1} \rVert} = \frac{\left[ \sum_{i=1}^{M} \left( U_i^n - U_i^{n-1} \right)^2 \right]^{1/2}}{\left[ \sum_{i=1}^{M} \left( U_i^{n-1} - U_i^{n-2} \right)^2 \right]^{1/2}}$$

If one measures $\rho$, expect the following:

[Figure: the estimated $\rho$ plotted against iterations rises and levels off at an asymptote below 1.]

Given $AU = v$ where $[A]$ has strict diagonal dominance, prove convergence. Define

$$\mu_i = \frac{\sum_{j \ne i} |a_{ij}|}{|a_{ii}|}$$

Recall $\varepsilon^{n+1} = G\, \varepsilon^n$, with

$$G_J = -D^{-1}(R + S) = \left\{ -\frac{a_{ij}}{a_{ii}},\ j \ne i \right\} \qquad \text{Jacobi}$$

Component by component:

$$\varepsilon_i^{n+1} = -\frac{1}{a_{ii}} \sum_{j \ne i} a_{ij}\, \varepsilon_j^{n}$$

$$|\varepsilon_i^{n+1}| \le \sum_{j \ne i} \left| \frac{a_{ij}}{a_{ii}} \right| |\varepsilon_j^{n}| \le \mu_i\, \lVert \varepsilon^n \rVert$$

where $\lVert \varepsilon^n \rVert = \max_j |\varepsilon_j^n|$. Worst case:

$$\lVert \varepsilon^{n+1} \rVert \le \mu_{max}\, \lVert \varepsilon^n \rVert$$

so $\mu_{max} < 1$ is sufficient for convergence.

Note: for an elliptic equation $\mu_{max} = 1$, so Jacobi will not diverge.

Examine Gauss-Seidel:

$$\varepsilon_1^{n+1} = -\frac{1}{a_{11}} \sum_{j=2}^{N} a_{1j}\, \varepsilon_j^{n} \quad\Rightarrow\quad |\varepsilon_1^{n+1}| \le \mu_1\, \lVert \varepsilon^n \rVert \le \lVert \varepsilon^n \rVert \quad \text{since } \mu_1 < 1$$

$$\varepsilon_2^{n+1} = -\frac{1}{a_{22}} \left[ a_{21}\, \varepsilon_1^{n+1} + \sum_{j=3}^{N} a_{2j}\, \varepsilon_j^{n} \right]$$

$$|\varepsilon_2^{n+1}| \le \left| \frac{a_{21}}{a_{22}} \right| |\varepsilon_1^{n+1}| + \sum_{j=3}^{N} \left| \frac{a_{2j}}{a_{22}} \right| \lVert \varepsilon^n \rVert \le \mu_2\, \lVert \varepsilon^n \rVert$$

etc. In general, since

$$\left| \frac{a_{21}}{a_{22}} \right| |\varepsilon_1^{n+1}| < \left| \frac{a_{21}}{a_{22}} \right| \lVert \varepsilon^n \rVert,$$

Gauss-Seidel will converge faster (or diverge faster) than Jacobi, and $\rho_{GS} = \rho_J^2$.

From S.O.R. theory, for $[A]$ symmetric and consistently ordered ("Property A"):

$$\rho_{GS} = \rho_J^2$$

$$\omega_{opt} = \frac{2}{1 + \sqrt{1 - \rho_J^2}} = \frac{2}{1 + \sqrt{1 - \rho_{GS}}}$$

Recall: self-adjoint implies symmetry.

Rates of Convergence

In the limit of large $n$, recall that

$$\lVert \varepsilon^{n+M} \rVert = \rho^M \lVert \varepsilon^n \rVert$$

and therefore

$$\frac{\lVert \varepsilon^{n+M} \rVert}{\lVert \varepsilon^n \rVert} = \rho^M.$$

If you wish to reduce the existing error by a factor of $K$, write

$$\frac{\lVert \varepsilon^{n+M} \rVert}{\lVert \varepsilon^n \rVert} = K, \qquad \rho^M = K,$$

and solve for $M$, the number of iterations required to reach the desired accuracy:

$$M = \ln(K) / \ln(\rho)$$

When solving $\nabla^2 U = 0$ on a square, for Jacobi

$$\rho = \lambda_{max} = \cos \pi h \sim 1 - \frac{\pi^2 h^2}{2} \quad \text{as } h \to 0$$

The rate of convergence of a linear iteration $U^{n+1} = G\, U^n + r$, characterized by the matrix $[G]$, is

$$R(G_J) = -\log \rho(G_J) = -\log \rho$$

(the minus sign, since $\rho < 1$, gives a positive convergence rate).

Ref: Young, D. M., Trans. Amer. Math. Soc., 76, p. 92, 1954; Ames (on reserve); Westlake (listed in the class 1 handout, Appendix B, Eigenvalue Bounds).

$$-\log \rho \sim -\log \left( 1 - \frac{\pi^2 h^2}{2} \right) = \frac{\pi^2 h^2}{2} + O(h^4)$$

using the expansion

$$\log(1+x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots, \qquad -1 < x \le 1.$$

Thus the convergence rate for Jacobi iterations is approximately $\pi^2 h^2 / 2$, which is slow for small values of $h$.

The max $\lambda$ in $G_{GS}$ is $\cos^2 \pi h \sim 1 - \pi^2 h^2$ as $h \to 0$, so

$$R(G_{GS}) = -\log \left( \cos^2 \pi h \right) \sim \pi^2 h^2 + O(h^4),$$

i.e., the Gauss-Seidel iterations converge twice as fast as Jacobi.

Finally, the max $\lambda$ in $G_\omega$ is $\sim 1 - 2\pi h$ as $h \to 0$, so

$$R(G_\omega) \sim 2\pi h + O(h^2) \quad \text{for optimal S.O.R.}$$

Comparing $2\pi h$ with $\pi^2 h^2$, optimal S.O.R. is roughly $2/(\pi h)$ times faster than Gauss-Seidel, which for small $h$ is significant.
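As a worked illustration of $M = \ln(K)/\ln(\rho)$ and the three spectral radii above, here is a short Python sketch. The values of h and K are arbitrary assumptions, and the spectral radius of optimal S.O.R. is taken as $\omega_{opt} - 1$, the standard result for consistently ordered systems (consistent with the $1 - 2\pi h$ behavior quoted above):

```python
import numpy as np

# Illustrative inputs (assumptions, not from the notes)
h = 0.01   # grid spacing on the unit square
K = 1e-6   # reduce the existing error by a factor of 10^6

rho_J   = np.cos(np.pi * h)                       # Jacobi: rho = cos(pi h)
rho_GS  = rho_J**2                                # Gauss-Seidel: rho_GS = rho_J^2
omega   = 2.0 / (1.0 + np.sqrt(1.0 - rho_J**2))   # optimal relaxation factor
rho_SOR = omega - 1.0                             # optimal S.O.R. spectral radius

for name, rho in [("Jacobi", rho_J), ("Gauss-Seidel", rho_GS), ("optimal S.O.R.", rho_SOR)]:
    M = np.log(K) / np.log(rho)                   # M = ln(K) / ln(rho)
    print(f"{name:15s} rho = {rho:.6f}, iterations M ~ {M:,.0f}")
```

For h = 0.01 this gives tens of thousands of Jacobi sweeps, half that for Gauss-Seidel, but only a few hundred S.O.R. sweeps, consistent with the $2/(\pi h)$ speedup derived above.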