Lecture 8: Fast Linear Solvers (Part 6)

Nonsymmetric Systems of Linear Equations

• The CG method requires A to be an n × n symmetric positive definite matrix in order to solve Ax = b.
• If A is nonsymmetric, there are two options:
  – convert the system to a symmetric positive definite one, or
  – modify CG to handle general matrices.

Normal Equation Approach

The normal equations corresponding to Ax = b are

    A^T A x = A^T b.

• If A is nonsingular, then A^T A is symmetric positive definite, and the CG method can be applied to A^T A x = A^T b (CG on the normal residual, CGNR).
• Alternatively, we can first solve A A^T y = b for y and then set x = A^T y.
• Disadvantages:
  – each iteration requires a product with A^T A or A A^T, i.e. two matrix-vector products;
  – the condition number of A^T A or A A^T is the square of that of A. CG still works well, however, if the condition number is small.

Arnoldi Iteration

• The Arnoldi method is an orthogonal projection onto a Krylov subspace K_m(A, r_0) for an n × n nonsymmetric matrix A. Here m ≪ n.
• Arnoldi reduces A to Hessenberg form.

An upper Hessenberg matrix has zero entries below the first subdiagonal, for example

    \begin{bmatrix} 2 & 3 & 4 & 1 \\ 2 & 5 & 1 & 9 \\ 0 & 2 & 1 & 2 \\ 0 & 0 & 3 & 2 \end{bmatrix}

A lower Hessenberg matrix has zero entries above the first superdiagonal, for example

    \begin{bmatrix} 3 & 2 & 0 & 0 \\ 2 & 5 & 1 & 0 \\ 1 & 2 & 1 & 2 \\ 3 & 4 & 3 & 2 \end{bmatrix}

Let H_m be an m × m upper Hessenberg matrix

    H_m = \begin{bmatrix}
        h_{11} & h_{12} & \cdots & h_{1m} \\
        h_{21} & h_{22} & \cdots & h_{2m} \\
        0      & \ddots & \ddots & \vdots \\
        0      & \cdots & h_{m,m-1} & h_{mm}
    \end{bmatrix}

and let the (m+1) × m matrix \bar{H}_m be the extended matrix of H_m:

    \bar{H}_m = \begin{bmatrix}
        h_{11} & h_{12} & \cdots & h_{1m} \\
        h_{21} & h_{22} & \cdots & h_{2m} \\
        0      & \ddots & \ddots & \vdots \\
        0      & \cdots & h_{m,m-1} & h_{mm} \\
        0      & \cdots & 0         & h_{m+1,m}
    \end{bmatrix}

The Arnoldi iteration produces matrices V_m, V_{m+1} and \bar{H}_m for the matrix A satisfying

    A V_m = V_{m+1} \bar{H}_m = V_m H_m + w_m e_m^T.

Here V_m and V_{m+1} have orthonormal columns,

    V_m = [v_1 | v_2 | ... | v_m],    V_{m+1} = [v_1 | v_2 | ... | v_m | v_{m+1}].

Arnoldi Algorithm

    Choose r_0 and let v_1 = r_0 / ||r_0||_2
    for j = 1, ..., m-1
        w = A v_j − \sum_{i=1}^{j} ((A v_j)^T v_i) v_i
        v_{j+1} = w / ||w||_2
    endfor

• V_m^T V_m = I_{m×m}.
• If the Arnoldi process breaks down at the m-th step, w_m = 0 is still well defined, but v_{m+1} is not, and the algorithm stops.
• In this case the last row of \bar{H}_m is set to zero, i.e. h_{m+1,m} = 0.

Theorem. The Arnoldi procedure generates a reduced QR factorization of the Krylov matrix

    K_m = [r_0 | A r_0 | A^2 r_0 | ... | A^{m-1} r_0]

in the form K_m = V_m R_m, with R_m ∈ R^{m×m} upper triangular. Furthermore,

    V_m^T A V_m = H_m.

Remark:

    A v_j = \sum_{i=1}^{j+1} h_{ij} v_i,   j = 1, ..., m-1,
    A V_m = V_{m+1} \bar{H}_m = V_m H_m + w_m e_m^T.

Stable Arnoldi Algorithm

The orthogonalization can also be carried out one vector at a time (modified Gram-Schmidt), which is more robust to rounding errors:

    Choose x_0 and let v_1 = x_0 / ||x_0||_2
    for j = 1, ..., m
        w = A v_j
        for i = 1, ..., j
            h_{ij} = <w, v_i>
            w = w − h_{ij} v_i
        endfor
        h_{j+1,j} = ||w||_2
        v_{j+1} = w / h_{j+1,j}
    endfor
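Below is a minimal NumPy sketch of the stable Arnoldi algorithm above, kept close to the pseudocode. The function name arnoldi, its argument names, and the breakdown handling are illustrative choices, not part of the lecture; the routine returns V_{m+1} and \bar{H}_m so that the relation A V_m = V_{m+1} \bar{H}_m can be checked directly.

```python
import numpy as np


def arnoldi(A, r0, m):
    """Modified Gram-Schmidt Arnoldi iteration (sketch).

    Normally returns V of shape (n, m+1) with orthonormal columns and the
    extended Hessenberg matrix Hbar of shape (m+1, m) such that
    A @ V[:, :m] equals V @ Hbar up to rounding.  On breakdown (w = 0 at
    some step) it stops early and returns the square blocks built so far.
    """
    n = r0.shape[0]
    V = np.zeros((n, m + 1))
    Hbar = np.zeros((m + 1, m))
    V[:, 0] = r0 / np.linalg.norm(r0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            # h_{i,j} = <w, v_i>, then remove the component of w along v_i
            Hbar[i, j] = w @ V[:, i]
            w = w - Hbar[i, j] * V[:, i]
        Hbar[j + 1, j] = np.linalg.norm(w)
        if Hbar[j + 1, j] == 0.0:
            # Breakdown: w = 0 is well defined but the next basis vector is not.
            # Stop early; the returned square block satisfies A @ V = V @ H exactly.
            return V[:, : j + 1], Hbar[: j + 1, : j + 1]
        V[:, j + 1] = w / Hbar[j + 1, j]
    return V, Hbar


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 50))
    r0 = rng.standard_normal(50)
    V, Hbar = arnoldi(A, r0, 10)
    # Both checks should print values near machine precision
    print(np.linalg.norm(A @ V[:, :10] - V @ Hbar))
    print(np.linalg.norm(V.T @ V - np.eye(V.shape[1])))
```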
Generalized Minimum Residual (GMRES) Method

Let the Krylov space associated with Ax = b be

    K_k(A, r_0) = span{r_0, A r_0, A^2 r_0, ..., A^{k-1} r_0},

where r_0 = b − A x_0 for some initial guess x_0.

The k-th (k ≥ 1) iteration of GMRES is the solution of the least squares problem

    minimize_{x ∈ x_0 + K_k} ||b − A x||_2,

i.e. find x_k ∈ x_0 + K_k such that ||b − A x_k||_2 = min_{x ∈ x_0 + K_k} ||b − A x||_2.

If x ∈ x_0 + K_k, then x = x_0 + \sum_{j=0}^{k-1} γ_j A^j r_0, so

    b − A x = b − A x_0 − \sum_{j=0}^{k-1} γ_j A^{j+1} r_0 = r_0 − \sum_{j=1}^{k} γ_{j-1} A^j r_0.

Definition. Let p_k be a polynomial of degree k such that p_k(0) = 1; p_k is called a residual polynomial. The set of degree-k residual polynomials is

    P_k = { p_k : p_k is a polynomial of degree k and p_k(0) = 1 }.

With this notation, the residual of any x ∈ x_0 + K_k can be written as

    r = r_0 − \sum_{j=1}^{k} γ_{j-1} A^j r_0 = p_k(A) r_0.

Theorem. Let x_k be the k-th GMRES iterate. Then for all p_k ∈ P_k,

    ||r_k||_2 ≤ ||p_k(A) r_0||_2.

GMRES Implementation

• The k-th (k ≥ 1) iteration of GMRES is the solution of the least squares problem minimize_{x ∈ x_0 + K_k} ||b − A x||_2.
• Suppose the Arnoldi process has been used to construct an orthonormal basis V_k of K_k(A, r_0). Then
  – r_0 = β V_k e_1, where e_1 = (1, 0, 0, ...)^T and β = ||r_0||_2;
  – any vector z ∈ K_k(A, r_0) can be written as z = \sum_{l=1}^{k} y_l v_l, where v_l is the l-th column of V_k. Denoting y = (y_1, y_2, ..., y_k)^T ∈ R^k, this reads z = V_k y.

Since x − x_0 = V_k y for some coefficient vector y ∈ R^k, we must have x_k = x_0 + V_k y, where y minimizes

    ||b − A(x_0 + V_k y)||_2 = ||r_0 − A V_k y||_2.

• The k-th (k ≥ 1) iteration of GMRES is therefore equivalent to a least squares problem in R^k:

    minimize_{x ∈ x_0 + K_k} ||b − A x||_2 = minimize_{y ∈ R^k} ||r_0 − A V_k y||_2.

  – Remark: this is a linear least squares problem.
  – The associated normal equation is (A V_k)^T A V_k y = (A V_k)^T r_0, but we will solve it differently.

• Let x_k be the k-th iterate of GMRES and define r_k = b − A x_k. Then

    r_k = r_0 − A(x_k − x_0)
        = β V_{k+1} e_1 − A V_k y
        = β V_{k+1} e_1 − V_{k+1} \bar{H}_k y
        = V_{k+1} (β e_1 − \bar{H}_k y).

  Using the orthogonality of the columns of V_{k+1},

    minimize_{x ∈ x_0 + K_k} ||b − A x||_2 = minimize_{y ∈ R^k} ||β e_1 − \bar{H}_k y||_2.

  (A NumPy sketch of the resulting GMRES loop is given at the end of these notes.)

The remaining task at each step is to solve minimize_{y ∈ R^k} ||β e_1 − \bar{H}_k y||_2.

Theorem. Let the n × k (k ≤ n) matrix B have linearly independent columns (full column rank), and let B = QR be a QR factorization of B. Then for each b ∈ R^n the equation B u = b has a unique least-squares solution, given by u = R^{-1} Q^T b.

Using Householder reflections to compute the QR factorization gives \bar{H}_m = Q_{m+1} \bar{R}_m, where Q_{m+1} ∈ R^{(m+1)×(m+1)} is orthogonal and \bar{R}_m ∈ R^{(m+1)×m} has the form

    \bar{R}_m = \begin{bmatrix} R_m \\ 0 \end{bmatrix},

where R_m ∈ R^{m×m} is upper triangular.

[Algorithm listing from C.T. Kelley, Iterative Methods for Linear and Nonlinear Equations.]

• The vectors v_j may become nonorthogonal as a result of round-off error.
  – In that case the equivalence with minimize_{y ∈ R^k} ||β e_1 − \bar{H}_k y||_2, which depends on orthogonality, no longer holds, and the computed residual can be inaccurate.
  – The fix is to replace the loop in Step 2c of Algorithm gmresa with a more stable orthogonalization loop; see the listings from Kelley referenced below.

[Further algorithm listings from C.T. Kelley, Iterative Methods for Linear and Nonlinear Equations.]
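To complement the implementation discussion above, here is a minimal, self-contained NumPy sketch of one GMRES cycle, written under the following assumptions: the function name gmres_sketch and its parameters are illustrative, the small Hessenberg least squares problem min_y ||β e_1 − \bar{H}_k y||_2 is solved with numpy.linalg.lstsq at every step rather than with an incrementally updated QR factorization, and there is no restarting or preconditioning.

```python
import numpy as np


def gmres_sketch(A, b, x0=None, m=60, tol=1e-10):
    """One GMRES cycle of at most m Arnoldi steps (sketch; no restarts).

    Builds the Arnoldi basis V and extended Hessenberg matrix Hbar, at each
    step minimizes ||beta*e1 - Hbar @ y||_2, and returns x = x0 + V @ y once
    the residual drops below tol * ||r0||_2 or m steps have been taken.
    """
    n = b.shape[0]
    x0 = np.zeros(n) if x0 is None else x0
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta == 0.0:
        return x0                                  # x0 already solves the system
    V = np.zeros((n, m + 1))
    Hbar = np.zeros((m + 1, m))
    V[:, 0] = r0 / beta
    for k in range(m):
        # Modified Gram-Schmidt Arnoldi step: column k of Hbar, next basis vector
        w = A @ V[:, k]
        for i in range(k + 1):
            Hbar[i, k] = w @ V[:, i]
            w = w - Hbar[i, k] * V[:, i]
        Hbar[k + 1, k] = np.linalg.norm(w)
        if Hbar[k + 1, k] > 0.0:
            V[:, k + 1] = w / Hbar[k + 1, k]
        # Small least squares problem: minimize ||beta*e1 - Hbar_k @ y||_2
        rhs = np.zeros(k + 2)
        rhs[0] = beta
        y, *_ = np.linalg.lstsq(Hbar[: k + 2, : k + 1], rhs, rcond=None)
        res = np.linalg.norm(rhs - Hbar[: k + 2, : k + 1] @ y)
        if res <= tol * beta or Hbar[k + 1, k] == 0.0:
            break                                  # converged, or (lucky) breakdown
    return x0 + V[:, : k + 1] @ y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 200
    A = 3.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))   # eigenvalues away from 0
    b = rng.standard_normal(n)
    x = gmres_sketch(A, b)
    print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))       # relative residual
```

A practical implementation updates the QR factorization of \bar{H}_k incrementally (for example, one Givens rotation per Arnoldi step), which also makes the current residual norm available without forming it explicitly; the lstsq call above simply keeps the sketch short.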
