
L. Vandenberghe, ECE133B (Spring 2020)

9. QR algorithm

• tridiagonal symmetric matrices
• basic QR algorithm
• QR algorithm with shifts

9.1

QR algorithm

• the standard method for computing eigenvalues and eigenvectors
• we discuss the algorithm for the symmetric eigendecomposition

    A = QΛQ^T = ∑_{i=1}^n λ_i q_i q_i^T

there are two stages:

1. reduce A to tridiagonal form by an orthogonal similarity transformation

    Q_1^T A Q_1 = T,   T tridiagonal, Q_1 orthogonal

2. compute the eigendecomposition T = Q_2 Λ Q_2^T by a fast iterative algorithm

QR algorithm 9.2

Outline

• tridiagonal symmetric matrices
• basic QR algorithm
• QR algorithm with shifts

Reflector

    Q = I − vv^T

• v is a vector with norm ‖v‖ = √2
• Q is symmetric and orthogonal
• the product Qx = x − (v^T x)v requires 4m flops if x and v are m-vectors

Reflection to multiple of first unit vector (133A lecture 6)

• an easily constructed reflector maps a given y to a multiple of e_1 = (1, 0, …, 0)
• if y ≠ 0, choose the reflector defined by

    v = (√2 / ‖w‖) w,   w = y + sign(y_1) ‖y‖ e_1

this reflector maps y to Qy = −sign(y_1) ‖y‖ e_1

QR algorithm 9.3

Reduction to tridiagonal form

given an n × n symmetric matrix A, find an orthogonal Q such that

    Q^T A Q = \begin{bmatrix}
    a_1 & b_1 & 0 & \cdots & 0 & 0 & 0 \\
    b_1 & a_2 & b_2 & \cdots & 0 & 0 & 0 \\
    0 & b_2 & a_3 & \cdots & 0 & 0 & 0 \\
    \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots \\
    0 & 0 & 0 & \cdots & a_{n-2} & b_{n-2} & 0 \\
    0 & 0 & 0 & \cdots & b_{n-2} & a_{n-1} & b_{n-1} \\
    0 & 0 & 0 & \cdots & 0 & b_{n-1} & a_n
    \end{bmatrix}

• this can be achieved by a product of reflectors Q = Q_1 Q_2 ··· Q_{n−2}
• complexity is order n^3

QR algorithm 9.4

First step

partition A as

    A = \begin{bmatrix} a_1 & c_1^T \\ c_1 & B_1 \end{bmatrix},   c_1 is an (n−1)-vector, B_1 is (n−1) × (n−1)

• find a reflector I − v_1 v_1^T that maps c_1 to b_1 e_1 and define

    Q_1 = \begin{bmatrix} 1 & 0 \\ 0 & I - v_1 v_1^T \end{bmatrix}

• multiply A with Q_1 to introduce zeros in positions 3, …, n of the first column and row:

    Q_1 A Q_1 = \begin{bmatrix} a_1 & c_1^T (I - v_1 v_1^T) \\ (I - v_1 v_1^T) c_1 & (I - v_1 v_1^T) B_1 (I - v_1 v_1^T) \end{bmatrix}
              = \begin{bmatrix} a_1 & b_1 e_1^T \\ b_1 e_1 & B_1 - v_1 w_1^T - w_1 v_1^T \end{bmatrix},
      where w_1 = B_1 v_1 − (v_1^T B_1 v_1 / 2) v_1

• computation of the 2,2 block requires order 4n^2 flops

QR algorithm 9.5

General step

after k − 1 steps,

    Q_{k-1} ··· Q_1 A Q_1 ··· Q_{k-1} = \begin{bmatrix}
    a_1 & b_1 & 0 & \cdots & 0 & 0 & 0 & 0 \\
    b_1 & a_2 & b_2 & \cdots & 0 & 0 & 0 & 0 \\
    0 & b_2 & a_3 & \cdots & 0 & 0 & 0 & 0 \\
    \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & \vdots \\
    0 & 0 & 0 & \cdots & a_{k-2} & b_{k-2} & 0 & 0 \\
    0 & 0 & 0 & \cdots & b_{k-2} & a_{k-1} & b_{k-1} & 0 \\
    0 & 0 & 0 & \cdots & 0 & b_{k-1} & a_k & c_k^T \\
    0 & 0 & 0 & \cdots & 0 & 0 & c_k & B_k
    \end{bmatrix}

• find a reflector I − v_k v_k^T that maps the (n − k)-vector c_k to b_k e_1 and define

    Q_k = \begin{bmatrix} I_{k-1} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & I - v_k v_k^T \end{bmatrix},
      (I - v_k v_k^T) B_k (I - v_k v_k^T) = \begin{bmatrix} a_{k+1} & c_{k+1}^T \\ c_{k+1} & B_{k+1} \end{bmatrix}

• complexity of step k is 4(n − k)^2 plus lower order terms

QR algorithm 9.6

Complexity

• complexity of the complete algorithm is dominated by

    ∑_{k=1}^{n−2} 4(n − k)^2 ≈ (4/3) n^3

• Q is stored in factored form (the vectors v_k are stored)
• if needed, assembling the matrix Q adds another order n^3 term

QR algorithm 9.7

QR factorization of tridiagonal matrix

suppose A is n × n and tridiagonal, with QR factorization A = QR

Q and R have a special structure (dots indicate possible nonzero elements):

    [• •        ]   [• • • • • •] [• • •      ]
    [• • •      ]   [• • • • • •] [  • • •    ]
    [  • • •    ] = [  • • • • •] [    • • •  ]
    [    • • •  ]   [    • • • •] [      • • •]
    [      • • •]   [      • • •] [        • •]
    [        • •]   [        • •] [          •]

• Q is zero below the first subdiagonal (Q_ij = 0 if i > j + 1):
  column k of Q is column k of A orthogonalized with respect to the previous columns
• R is zero above the second superdiagonal (R_ij = 0 if j > i + 2):
  this follows from considering R = Q^T A and the property of Q

QR algorithm 9.8
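Before moving on to how the QR factorization itself is computed, here is a small NumPy sketch of the reduction to tridiagonal form described on pages 9.3–9.6; it follows the reflector formula v = (√2/‖w‖)w and the symmetric update B_k − v_k w_k^T − w_k v_k^T from the slides, but the function names (householder_vector, tridiagonalize) and the dense-matrix arithmetic are illustrative choices made here, not the lecture's code, and no attempt is made to reach the flop counts quoted above.

```python
import numpy as np

def householder_vector(y):
    """Reflector vector v with ||v|| = sqrt(2), so (I - v v^T) y = -sign(y1) ||y|| e1 (page 9.3)."""
    sigma = np.linalg.norm(y)
    if sigma == 0.0:
        return np.zeros_like(y, dtype=float)   # y = 0: nothing to reflect
    w = y.astype(float)
    w[0] += np.copysign(sigma, y[0])           # w = y + sign(y1) ||y|| e1
    return np.sqrt(2.0) * w / np.linalg.norm(w)

def tridiagonalize(A):
    """Reduce symmetric A to tridiagonal T = Q^T A Q; Q is kept in factored form (list of v_k)."""
    T = np.array(A, dtype=float)
    n = T.shape[0]
    vs = []
    for k in range(n - 2):
        v = householder_vector(T[k+1:, k])     # reflector for the vector c_k (pages 9.5-9.6)
        vs.append(v)
        B = T[k+1:, k+1:]
        w = B @ v - 0.5 * (v @ B @ v) * v      # w_k = B_k v_k - (v_k^T B_k v_k / 2) v_k
        T[k+1:, k+1:] = B - np.outer(v, w) - np.outer(w, v)
        T[k+1:, k] -= (v @ T[k+1:, k]) * v     # zeros below the first subdiagonal in column k
        T[k, k+1:] = T[k+1:, k]                # keep the matrix symmetric
    return T, vs

# quick check: eigenvalues are preserved and T is (numerically) tridiagonal
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)); A = A + A.T
T, vs = tridiagonalize(A)
assert np.allclose(np.sort(np.linalg.eigvalsh(T)), np.sort(np.linalg.eigvalsh(A)))
```

A production routine would exploit symmetry and blocking to reduce the constant; the sketch above only mirrors the slide formulas.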
Computing tridiagonal QR factorization

the QR factorization of an n × n tridiagonal A takes order n operations, Q^T A = R

for example, in the Householder algorithm (133A lecture 6):

• Q^T is a product of reflectors H_k = I − v_k v_k^T that make A upper triangular:

    H_{n-1} ··· H_1 \begin{bmatrix}
    A_{11} & A_{12} & 0 & \cdots & 0 & 0 \\
    A_{21} & A_{22} & A_{23} & \cdots & 0 & 0 \\
    0 & A_{32} & A_{33} & \cdots & 0 & 0 \\
    \vdots & \vdots & \vdots & & \vdots & \vdots \\
    0 & 0 & 0 & \cdots & A_{n-1,n-1} & A_{n-1,n} \\
    0 & 0 & 0 & \cdots & A_{n,n-1} & A_{nn}
    \end{bmatrix} = R

  if A is tridiagonal, each vector v_k has only two nonzero elements
• Q is stored in factored form (as the reflectors v_k)
• we can allow zeros on the diagonal of R, to extend the QR factorization to singular A

QR algorithm 9.9

Outline

• tridiagonal symmetric matrices
• basic QR algorithm
• QR algorithm with shifts

QR algorithm

suppose A is a symmetric n × n matrix

Basic QR iteration: start at A_1 = A and repeat for k = 1, 2, …:

• compute the QR factorization A_k = Q_k R_k
• define A_{k+1} = R_k Q_k

for most matrices,

• A_k converges to a diagonal matrix of eigenvalues of A
• U_k = Q_1 Q_2 ··· Q_k converges to a matrix of eigenvectors

QR algorithm 9.10

Some immediate properties

    A_1 = A,   A_k = Q_k R_k,   A_{k+1} = R_k Q_k   (k ≥ 1)

• the matrices A_k are symmetric: the first matrix A_1 = A is symmetric and

    A_{k+1} = R_k Q_k = Q_k^T A_k Q_k

• continuing recursively, we see that an orthogonal similarity relates A_k and A:

    A_{k+1} = (Q_1 Q_2 ··· Q_k)^T A (Q_1 Q_2 ··· Q_k) = U_k^T A U_k

  therefore the matrices A_k all have the same eigenvalues as A
• the orthogonal matrices U_k = Q_1 Q_2 ··· Q_k and R_k satisfy

    A U_{k−1} = U_{k−1} A_k = U_{k−1} Q_k R_k = U_k R_k

QR algorithm 9.11

Equivalent form

a related algorithm ("simultaneous iteration") uses the last property to generate U_k:

    A U_{k−1} = U_k R_k

we note that the right-hand side is a QR factorization

Simultaneous iteration: start at U_0 = I and repeat for k = 1, 2, …:

• multiply with A: compute V_k = A U_{k−1}
• compute the QR factorization V_k = U_k R_k

if the matrices U_k converge to U, then R_k converges to a diagonal matrix, since

    R_k = U_k^T V_k = U_k^T A U_{k−1}

so the limit of R_k is both symmetric (U^T A U) and triangular, hence diagonal

QR algorithm 9.12

Interpretation

simultaneous iteration is a matrix extension of the power iteration

Power iteration: start at an n-vector u_0 with ‖u_0‖ = 1, and repeat for k = 1, 2, …:

• multiply with A: compute v_k = A u_{k−1}
• normalize: u_k = v_k / ‖v_k‖

this is a simple iteration for computing an eigenvector with the largest eigenvalue

• suppose the eigenvalues of A satisfy |λ_1| > |λ_2| ≥ ··· ≥ |λ_n|
• suppose u_0 = α_1 q_1 + ··· + α_n q_n, where q_i is a normalized eigenvector for λ_i
• after k power iterations, u_k is the normalized vector

    A^k u_0 = λ_1^k (α_1 q_1 + α_2 (λ_2/λ_1)^k q_2 + ··· + α_n (λ_n/λ_1)^k q_n)

• if α_1 ≠ 0, the vector u_k converges to ±q_1, and u_k^T A u_k converges to λ_1

QR algorithm 9.13
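The basic QR iteration of page 9.10 is stated only in equations, so here is a minimal NumPy sketch; the function name, the fixed iteration count, and the use of numpy.linalg.qr are choices made here for illustration, and convergence is only expected when the eigenvalue magnitudes are distinct, as discussed on page 9.13.

```python
import numpy as np

def basic_qr_iteration(A, iters=200):
    """Unshifted QR iteration (page 9.10): A_k -> diagonal, U_k = Q_1 ... Q_k -> eigenvectors."""
    Ak = np.array(A, dtype=float)
    U = np.eye(Ak.shape[0])
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)   # A_k = Q_k R_k
        Ak = R @ Q                # A_{k+1} = R_k Q_k = Q_k^T A_k Q_k
        U = U @ Q                 # accumulate U_k = Q_1 Q_2 ... Q_k (page 9.11)
    return Ak, U

# example: the diagonal of A_k approaches the eigenvalues, U the eigenvectors
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)); A = A + A.T
Ak, U = basic_qr_iteration(A)
print(np.sort(np.diag(Ak)))                # compare with ...
print(np.sort(np.linalg.eigvalsh(A)))      # ... the reference eigenvalues
print(np.linalg.norm(U @ Ak @ U.T - A))    # orthogonal similarity: residual is small
```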
QR iteration with tridiagonal A

now suppose A in the basic QR iteration on page 9.10 is tridiagonal and symmetric

• we already noted that the matrices A_k are symmetric if A is symmetric (page 9.11):

    A_{k+1} = R_k Q_k = Q_k^T A_k Q_k

• the Q-factor of a tridiagonal matrix is zero below the first subdiagonal (page 9.8)
• this implies that the product A_{k+1} = R_k Q_k is zero below the first subdiagonal:

    [• • •      ]   [• • • • • •]   [• • • • • •]
    [  • • •    ]   [• • • • • •]   [• • • • • •]
    [    • • •  ]   [  • • • • •]   [  • • • • •]
    [      • • •]   [    • • • •] = [    • • • •]
    [        • •]   [      • • •]   [      • • •]
    [          •]   [        • •]   [        • •]

• since A_{k+1} is also symmetric, it is tridiagonal

hence, for tridiagonal symmetric A, the complexity of one QR iteration is order n

QR algorithm 9.14

Outline

• tridiagonal symmetric matrices
• basic QR algorithm
• QR algorithm with shifts

QR algorithm with shifts

in practice, a multiple of the identity is subtracted from A_k before factoring

QR iteration with shifts: start at A_1 = A and repeat for k = 1, 2, …:

• choose a shift μ_k
• compute the QR factorization A_k − μ_k I = Q_k R_k
• define A_{k+1} = R_k Q_k + μ_k I

• the iteration still preserves symmetry and tridiagonal structure in A_k
• with properly chosen shifts, the iteration always converges
• with properly chosen shifts, convergence is fast (usually cubic)

QR algorithm 9.15

Complexity

overall complexity of the QR method for the symmetric eigendecomposition A = QΛQ^T

Eigenvalues: if eigenvectors are not needed, we can leave Q in factored form

• reduction of A to tridiagonal form costs (4/3) n^3
• for a tridiagonal matrix, the complexity of one QR iteration is linear in n
• on average, the number of QR iterations is a small multiple of n

hence, the cost is dominated by (4/3) n^3 for the initial reduction to tridiagonal form

Eigenvalues and eigenvectors: if Q is needed, order n^3 terms are added

• reduction to tridiagonal form and accumulating the orthogonal matrix costs (8/3) n^3
• finding eigenvalues and eigenvectors of the tridiagonal matrix costs 6n^3

hence, the total cost is (26/3) n^3 plus lower order terms

QR algorithm 9.16
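To illustrate the shifted iteration of page 9.15, below is a dense NumPy sketch that combines shifts with deflation. The slides leave the choice of μ_k open, so the Wilkinson shift used here (computed from the trailing 2 × 2 block) is an assumption of this sketch; a practical code would also exploit the tridiagonal structure to obtain the order-n cost per iteration quoted on page 9.16, which this dense version does not.

```python
import numpy as np

def wilkinson_shift(T):
    # shift from the trailing 2x2 block; an assumed choice, since the lecture leaves mu_k open
    a, b, c = T[-2, -2], T[-1, -2], T[-1, -1]
    d = (a - c) / 2.0
    return c - np.copysign(b * b, d) / (abs(d) + np.hypot(d, b))

def shifted_qr_eigenvalues(T, tol=1e-12, max_iters=10000):
    """Eigenvalues of a symmetric tridiagonal T by shifted QR with deflation (page 9.15)."""
    T = np.array(T, dtype=float)
    eigs = []
    while T.shape[0] > 0 and max_iters > 0:
        max_iters -= 1
        n = T.shape[0]
        if n == 1:
            eigs.append(T[0, 0])
            break
        if abs(T[-1, -2]) <= tol * (abs(T[-1, -1]) + abs(T[-2, -2])):
            eigs.append(T[-1, -1])                  # last eigenvalue has converged: deflate
            T = T[:-1, :-1]
            continue
        mu = wilkinson_shift(T)
        Q, R = np.linalg.qr(T - mu * np.eye(n))     # A_k - mu_k I = Q_k R_k
        T = R @ Q + mu * np.eye(n)                  # A_{k+1} = R_k Q_k + mu_k I
    return np.array(eigs)

# example on a random symmetric tridiagonal matrix
rng = np.random.default_rng(2)
d, e = rng.standard_normal(8), rng.standard_normal(7)
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)
print(np.sort(shifted_qr_eigenvalues(T)))
print(np.sort(np.linalg.eigvalsh(T)))
```

With a shift of this kind the trailing subdiagonal entry typically becomes negligible after a few iterations per eigenvalue, consistent with the observation on page 9.16 that the total number of QR iterations is a small multiple of n.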