The Lanczos and conjugate gradient algorithms
Gérard Meurant
October 2008

1 The Lanczos algorithm
2 The Lanczos algorithm in finite precision
3 The nonsymmetric Lanczos algorithm
4 The Golub–Kahan bidiagonalization algorithm
5 The block Lanczos algorithm
6 The conjugate gradient algorithm

The Lanczos algorithm

Let A be a real symmetric matrix of order n. The Lanczos algorithm constructs an orthogonal basis of the Krylov subspace spanned by the columns of
$$K_k = \begin{pmatrix} v & Av & \cdots & A^{k-1} v \end{pmatrix}.$$

Gram–Schmidt orthogonalization (Arnoldi): set $v^1 = v$ and, for $j = 1, 2, \ldots$,
$$h_{i,j} = (Av^j, v^i), \quad i = 1, \ldots, j,$$
$$\bar v^j = Av^j - \sum_{i=1}^{j} h_{i,j}\, v^i,$$
$$h_{j+1,j} = \|\bar v^j\| \quad (\text{if } h_{j+1,j} = 0 \text{ then stop}),$$
$$v^{j+1} = \frac{\bar v^j}{h_{j+1,j}}.$$
In matrix form,
$$A V_k = V_k H_k + h_{k+1,k}\, v^{k+1} (e^k)^T,$$
where $H_k$ is an upper Hessenberg matrix with elements $h_{i,j}$. Note that $h_{i,j} = 0$ for $j = 1, \ldots, i-2$, $i > 2$, and
$$H_k = V_k^T A V_k.$$
If A is symmetric, $H_k$ is symmetric and therefore tridiagonal, $H_k = J_k$.

We also have $A V_n = V_n J_n$ if no $\bar v^j$ is zero before step n, since $v^{n+1} = 0$: it would be a vector orthogonal to a set of n orthogonal vectors in a space of dimension n. Otherwise there exists an m < n for which $A V_m = V_m J_m$ and the algorithm has found an invariant subspace of A, the eigenvalues of $J_m$ being eigenvalues of A.

Starting from a vector $v^1 = v/\|v\|$, compute
$$\alpha_1 = (A v^1, v^1), \qquad \tilde v^2 = A v^1 - \alpha_1 v^1,$$
and then, for $k = 2, 3, \ldots$,
$$\eta_{k-1} = \|\tilde v^k\|, \qquad v^k = \frac{\tilde v^k}{\eta_{k-1}},$$
$$\alpha_k = (v^k, A v^k) = (v^k)^T A v^k,$$
$$\tilde v^{k+1} = A v^k - \alpha_k v^k - \eta_{k-1} v^{k-1}.$$

A variant of the Lanczos algorithm has been proposed by Chris Paige to improve the local orthogonality in finite precision computations:
$$\alpha_k = (v^k)^T (A v^k - \eta_{k-1} v^{k-1}),$$
$$\tilde v^{k+1} = (A v^k - \eta_{k-1} v^{k-1}) - \alpha_k v^k.$$

Since we can suppose that $\eta_i \neq 0$, the tridiagonal Jacobi matrix $J_k$ has real and simple eigenvalues, which we denote by $\theta_j^{(k)}$. They are known as the Ritz values and are the approximations of the eigenvalues of A given by the Lanczos algorithm.

Theorem. Let $\chi_k(\lambda)$ be the determinant of $J_k - \lambda I$ (a monic polynomial); then
$$v^k = p_k(A)\, v^1, \qquad p_k(\lambda) = (-1)^{k-1} \frac{\chi_{k-1}(\lambda)}{\eta_1 \cdots \eta_{k-1}}.$$
The polynomials $p_k$ of degree k − 1 are called the normalized Lanczos polynomials. They satisfy a scalar three-term recurrence
$$\eta_k\, p_{k+1}(\lambda) = (\lambda - \alpha_k)\, p_k(\lambda) - \eta_{k-1}\, p_{k-1}(\lambda), \quad k = 1, 2, \ldots$$
with initial conditions $p_0 \equiv 0$, $p_1 \equiv 1$.
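To make the recurrence concrete, here is a minimal NumPy sketch of the Lanczos iteration above, using Paige's ordering of the operations. The function name, the absence of reorthogonalization and the exact-zero stopping test are illustrative choices, not part of the original slides.

```python
import numpy as np

def lanczos(A, v, k):
    """Minimal sketch of the symmetric Lanczos recurrence (Paige's variant).

    Returns V (columns v^1..v^k), alpha (diagonal of J_k) and
    eta (off-diagonal of J_k). No reorthogonalization is performed.
    """
    n = A.shape[0]
    V = np.zeros((n, k))
    alpha = np.zeros(k)
    eta = np.zeros(k - 1)

    V[:, 0] = v / np.linalg.norm(v)            # v^1 = v / ||v||
    w = A @ V[:, 0]
    alpha[0] = V[:, 0] @ w                     # alpha_1 = (A v^1, v^1)
    w = w - alpha[0] * V[:, 0]                 # \tilde v^2

    for j in range(1, k):
        eta[j - 1] = np.linalg.norm(w)         # eta_{k-1} = ||\tilde v^k||
        if eta[j - 1] == 0.0:                  # invariant subspace found
            return V[:, :j], alpha[:j], eta[:j - 1]
        V[:, j] = w / eta[j - 1]
        # Paige's variant: subtract the eta term before computing alpha_k
        w = A @ V[:, j] - eta[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]             # \tilde v^{k+1}
    return V, alpha, eta
```

Checking `np.abs(V.T @ V - np.eye(V.shape[1])).max()` after a run quickly exposes the loss of orthogonality discussed in the finite precision section below.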
Theorem. Consider the Lanczos vectors $v^k$. There exists a measure $\alpha$ such that
$$(v^k, v^l) = \langle p_k, p_l \rangle = \int_a^b p_k(\lambda)\, p_l(\lambda)\, d\alpha(\lambda),$$
where $a \le \lambda_1 = \lambda_{\min}$ and $b \ge \lambda_n = \lambda_{\max}$, $\lambda_{\min}$ and $\lambda_{\max}$ being the smallest and largest eigenvalues of A.

Proof. Let $A = Q \Lambda Q^T$ be the spectral decomposition of A. Since the vectors $v^j$ are orthonormal and $p_k(A) = Q\, p_k(\Lambda)\, Q^T$, we have
$$(v^k, v^l) = (v^1)^T p_k(A)^T p_l(A)\, v^1 = (v^1)^T Q\, p_k(\Lambda)\, Q^T Q\, p_l(\Lambda)\, Q^T v^1 = (v^1)^T Q\, p_k(\Lambda)\, p_l(\Lambda)\, Q^T v^1 = \sum_{j=1}^n p_k(\lambda_j)\, p_l(\lambda_j)\, [\hat v_j]^2,$$
where $\hat v = Q^T v^1$. The last sum can be written as an integral for a measure $\alpha$ which is piecewise constant,
$$\alpha(\lambda) = \begin{cases} 0 & \text{if } \lambda < \lambda_1, \\ \sum_{j=1}^{i} [\hat v_j]^2 & \text{if } \lambda_i \le \lambda < \lambda_{i+1}, \\ \sum_{j=1}^{n} [\hat v_j]^2 & \text{if } \lambda_n \le \lambda. \end{cases}$$
The measure $\alpha$ has a finite number of points of increase at the (unknown) eigenvalues of A.

The Lanczos algorithm can be used to solve linear systems $Ax = c$ when A is symmetric and c is a given vector. Let $x^0$ be a given starting vector and $r^0 = c - A x^0$ the corresponding residual. Let $v = v^1 = r^0 / \|r^0\|$ and
$$x^k = x^0 + V_k y^k.$$
We require the residual $r^k = c - A x^k$ to be orthogonal to the Krylov subspace of dimension k,
$$V_k^T r^k = V_k^T c - V_k^T A x^0 - V_k^T A V_k y^k = V_k^T r^0 - J_k y^k = 0.$$
But $r^0 = \|r^0\| v^1$ and $V_k^T r^0 = \|r^0\| e^1$, so
$$J_k y^k = \|r^0\| e^1.$$

The Lanczos algorithm in finite precision arithmetic

It has been well known since Lanczos that the basis vectors $v^k$ may lose their orthogonality. Moreover, multiple copies of the already converged Ritz values appear again and again.

Consider an example devised by Z. Strakoš: a diagonal matrix with elements
$$\lambda_i = \lambda_1 + \frac{i-1}{n-1}\, (\lambda_n - \lambda_1)\, \rho^{\,n-i}, \quad i = 1, \ldots, n.$$
We choose n = 30, $\lambda_1 = 0.1$, $\lambda_n = 100$, $\rho = 0.9$.

[Figure: $\log_{10}(|\tilde V_{30}^T \tilde V_{30}|)$ for the Strakos30 matrix]

In this example the first Ritz value to converge is the largest one, $\lambda_n$. Then $v^k_n = p_k(\lambda_n)\, v^1_n$ must converge to zero (in exact arithmetic). What happens?

[Figure: Strakos30, $\log_{10}(|v^k_{30}|)$ with (dashed), without (solid) reorthogonalization, and their difference (dotted)]

More iterations:

[Figure: Strakos30, $\log_{10}(|v^k_{30}|)$ with (dashed) and without (solid) reorthogonalization]

Distances to the largest eigenvalue of A:

[Figure: Strakos30, $\log_{10}(|v^k_{30}|)$ and the distances to the 10 largest Ritz values]

This behavior can be studied by looking at perturbed scalar three-term recurrences.

Theorem. Let j be given and let $\tilde p_{j,k}$ be the polynomial determined by $\tilde p_{j,j-1} = 0$, $\tilde p_{j,j} = 1$,
$$\tilde \eta_{k+1}\, \tilde p_{j,k+1}(\lambda) = (\lambda - \tilde \alpha_k)\, \tilde p_{j,k}(\lambda) - \tilde \eta_k\, \tilde p_{j,k-1}(\lambda), \quad k = j, \ldots$$
Then the computed Lanczos vector is
$$\tilde v^{k+1} = \tilde p_{1,k+1}(A)\, \tilde v^1 + \sum_{l=1}^{k} \frac{1}{\tilde \eta_{l+1}}\, \tilde p_{l+1,k+1}(A)\, f^l.$$
Note that the first term $\check v^{k+1} = \tilde p_{1,k+1}(A)\, \tilde v^1$ is different from what we have in exact arithmetic, since the coefficients of the polynomial are the ones computed in finite precision.

Proposition. The associated polynomial $p_{j,k}$, $k \ge j$, is given by
$$p_{j,k}(\lambda) = (-1)^{k-j} \frac{\chi_{j,k-1}(\lambda)}{\eta_{j+1} \cdots \eta_k},$$
where $\chi_{j,k}(\lambda)$ is the determinant of $J_{j,k} - \lambda I$, $J_{j,k}$ being the tridiagonal matrix obtained from the coefficients of the second-order recurrence from step j to step k, that is, by discarding the first j − 1 rows and columns of $J_k$.
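The Strakoš experiment summarized by the figures can be reproduced in spirit with a short script. This is a hedged sketch: it reuses the `lanczos()` function from the previous sketch, the all-ones starting vector and the number of iterations are illustrative choices, and the exact values printed will depend on the machine arithmetic.

```python
import numpy as np

# Strakos example: diagonal matrix with eigenvalues
#   lambda_i = lambda_1 + (i-1)/(n-1) * (lambda_n - lambda_1) * rho^(n-i)
n, lam1, lamn, rho = 30, 0.1, 100.0, 0.9
i = np.arange(1, n + 1)
lam = lam1 + (i - 1) / (n - 1) * (lamn - lam1) * rho ** (n - i)
A = np.diag(lam)

v0 = np.ones(n)                    # starting vector (illustrative choice)
k = 60                             # run past n, as in the "more iterations" plots
V, alpha, eta = lanczos(A, v0, k)  # lanczos() from the previous sketch

# Loss of orthogonality: in exact arithmetic V^T V = I
G = V.T @ V - np.eye(V.shape[1])
print("max |V^T V - I| =", np.abs(G).max())

# Copies of the converged largest Ritz value typically reappear in J_k
Jk = np.diag(alpha) + np.diag(eta, 1) + np.diag(eta, -1)
print("largest Ritz values:", np.linalg.eigvalsh(Jk)[-4:])
```

Repeating the run with full reorthogonalization of each new vector against the previous columns of V gives the dashed curves of the figures for comparison.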
The nonsymmetric Lanczos algorithm

When the matrix A is not symmetric, we cannot in general construct a vector $v^{k+1}$ orthogonal to all the previous basis vectors by using only the two previous vectors $v^k$ and $v^{k-1}$. Instead we construct bi-orthogonal sequences using $A^T$. Choose two starting vectors $v^1$ and $\tilde v^1$ with $(v^1, \tilde v^1) \neq 0$, normalized such that $(v^1, \tilde v^1) = 1$, and set $v^0 = \tilde v^0 = 0$. Then, for $k = 1, 2, \ldots$,
$$z^k = A v^k - \omega_k v^k - \eta_{k-1} v^{k-1},$$
$$w^k = A^T \tilde v^k - \omega_k \tilde v^k - \tilde \eta_{k-1} \tilde v^{k-1},$$
$$\omega_k = (\tilde v^k, A v^k), \qquad \eta_k \tilde \eta_k = (z^k, w^k),$$
$$v^{k+1} = \frac{z^k}{\tilde \eta_k}, \qquad \tilde v^{k+1} = \frac{w^k}{\eta_k}.$$
Let
$$J_k = \begin{pmatrix} \omega_1 & \eta_1 & & & \\ \tilde \eta_1 & \omega_2 & \eta_2 & & \\ & \ddots & \ddots & \ddots & \\ & & \tilde \eta_{k-2} & \omega_{k-1} & \eta_{k-1} \\ & & & \tilde \eta_{k-1} & \omega_k \end{pmatrix}$$
and
$$V_k = [v^1 \cdots v^k], \qquad \tilde V_k = [\tilde v^1 \cdots \tilde v^k].$$
Then, in matrix form,
$$A V_k = V_k J_k + \tilde \eta_k\, v^{k+1} (e^k)^T,$$
$$A^T \tilde V_k = \tilde V_k J_k^T + \eta_k\, \tilde v^{k+1} (e^k)^T.$$

Theorem. If the nonsymmetric Lanczos algorithm does not break down with $\eta_k \tilde \eta_k$ being zero, the algorithm yields biorthogonal vectors such that
$$(\tilde v^i, v^j) = 0, \quad i \neq j, \quad i, j = 1, 2, \ldots$$
The vectors $v^1, \ldots, v^k$ span $K_k(A, v^1)$ and $\tilde v^1, \ldots, \tilde v^k$ span $K_k(A^T, \tilde v^1)$.

The two sequences of vectors can be written as
$$v^k = p_k(A)\, v^1, \qquad \tilde v^k = \tilde p_k(A^T)\, \tilde v^1,$$
where $p_k$ and $\tilde p_k$ are polynomials of degree k − 1 satisfying
$$\tilde \eta_k\, p_{k+1} = (\lambda - \omega_k)\, p_k - \eta_{k-1}\, p_{k-1},$$
$$\eta_k\, \tilde p_{k+1} = (\lambda - \omega_k)\, \tilde p_k - \tilde \eta_{k-1}\, \tilde p_{k-1}.$$

The algorithm breaks down if at some step we have $(z^k, w^k) = 0$. Either
a) $z^k = 0$ and/or $w^k = 0$. If $z^k = 0$ we can compute the eigenvalues or the solution of the linear system $Ax = c$. If $z^k \neq 0$ and $w^k = 0$, the only way to deal with this situation is to restart the algorithm.
b) The more dramatic situation ("serious breakdown") is when $(z^k, w^k) = 0$ with $z^k \neq 0$ and $w^k \neq 0$. One then needs to use look-ahead strategies or to restart.

For our purposes we will use the nonsymmetric Lanczos algorithm with a symmetric matrix! We can choose
$$\eta_k = \pm \tilde \eta_k = \pm \sqrt{|(z^k, w^k)|},$$
with, for instance, $\eta_k \ge 0$ and $\tilde \eta_k = \mathrm{sgn}[(z^k, w^k)]\, \eta_k$.
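A minimal NumPy sketch of the two-sided recurrence with the symmetric choice of $\eta_k$ and $\tilde\eta_k$ just described; the function name and the plain error raised on breakdown (no look-ahead) are our illustrative choices.

```python
import numpy as np

def nonsymmetric_lanczos(A, v1, vt1, k):
    """Sketch of the two-sided (bi-orthogonal) Lanczos recurrence.

    Assumes (v1, vt1) != 0. Uses the symmetric choice eta_k = sqrt(|(z,w)|),
    etat_k = sign((z,w)) * eta_k. No look-ahead: a serious breakdown raises.
    """
    n = A.shape[0]
    s0 = v1 @ vt1
    v = v1 / np.sqrt(abs(s0))                  # normalize so that (v^1, vt^1) = 1
    vt = np.sign(s0) * vt1 / np.sqrt(abs(s0))
    v_old = np.zeros(n)
    vt_old = np.zeros(n)
    omega = np.zeros(k)
    eta = np.zeros(k)
    etat = np.zeros(k)
    eta_prev = etat_prev = 0.0

    for j in range(k):
        omega[j] = vt @ (A @ v)                              # omega_k = (vt^k, A v^k)
        z = A @ v - omega[j] * v - eta_prev * v_old          # z^k
        w = A.T @ vt - omega[j] * vt - etat_prev * vt_old    # w^k
        s = z @ w                                            # eta_k * etat_k
        if s == 0.0:
            raise RuntimeError(f"breakdown at step {j + 1}")
        eta[j] = np.sqrt(abs(s))
        etat[j] = np.sign(s) * eta[j]
        v_old, vt_old = v, vt
        v, vt = z / etat[j], w / eta[j]                      # v^{k+1}, vt^{k+1}
        eta_prev, etat_prev = eta[j], etat[j]

    # omega is the diagonal of J_k; eta[:k-1] and etat[:k-1] are its off-diagonals
    return omega, eta, etat
```

With a symmetric A and this sign convention, the sketch realizes the choice $\eta_k \ge 0$, $\tilde\eta_k = \mathrm{sgn}[(z^k, w^k)]\, \eta_k$ stated in the last paragraph.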