MATLAB PCG algorithm:

function x = pcg(Afun,b,tol,maxit,Binvfun,x0)
x = x0; r = b - feval(Afun,x); rho = 1;
for i = 1 : maxit
    y = feval(Binvfun,r);
    rho1 = rho; rho = r' * y;
    if (i == 1)
        p = y;
    else
        beta = rho / rho1; p = y + beta * p;
    end
    q = feval(Afun,p);
    alpha = rho / (p' * q);
    x = x + alpha * p;
    if (norm(b - feval(Afun,x)) <= tol*norm(b)), return; end
    r = r - alpha * q;
end

Dubious termination criterion! △

Summary — advantages of Krylov methods vs. direct elimination (IF they converge at all/sufficiently fast):
• They require the system matrix A only in procedural form y = evalA(x) ↔ y = Ax.
• They can fully exploit the sparsity of the system matrix.
• They can cash in on low accuracy requirements (IF a viable termination criterion is available).
• They can benefit from a good initial guess.

4.4 Essential Skills Learned in Chapter 4

You should know:
• the relation between a linear system of equations with an s.p.d. matrix and a quadratic minimization problem,
• how the steepest descent method works and its convergence properties,
• the idea behind the conjugate gradient method and its convergence properties,
• how to use the MATLAB function pcg,
• the importance of the preconditioner.

5 Eigenvalues

Example 5.0.1 (Analytic solution of homogeneous linear ordinary differential equations). → [47, Rem. 5.6.1]

Autonomous homogeneous linear ordinary differential equation (ODE):

    ẏ = Ay ,  A ∈ C^{n,n} .   (5.0.1)

If A = S diag(λ1,…,λn) S^{-1} =: S D S^{-1} with regular S ∈ C^{n,n}, the substitution z = S^{-1}y decouples the system:

    ẏ = Ay  ⟷  ż = Dz .

➣ Solution of the initial value problem:

    ẏ = Ay , y(0) = y0 ∈ C^n  ⇒  y(t) = S z(t) ,  where ż = Dz , z(0) = S^{-1} y0 .

The initial value problem for the decoupled homogeneous linear ODE ż = Dz has the simple analytic solution

    z_i(t) = exp(λ_i t) (z0)_i = exp(λ_i t) ((S^{-1})_{i,:} y0) .

In light of Rem. ??:

    A = S diag(λ1,…,λn) S^{-1}  ⇔  A (S)_{:,i} = λ_i (S)_{:,i} ,  i = 1,…,n .   (5.0.2)

In order to find the transformation matrix S, all non-zero solution vectors (= eigenvectors) x ∈ C^n of the linear eigenvalue problem

    Ax = λx

have to be found. △
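The diagonalization recipe (5.0.2) can be tried out numerically. The following sketch (in Python/numpy rather than MATLAB; the eigenvalues, eigenbasis S, and initial value y0 are made up for illustration) assembles A from a prescribed eigendecomposition, evaluates the analytic solution y(t) = S diag(exp(λ_i t)) S^{-1} y0, and checks ẏ = Ay by a central finite difference:

```python
import numpy as np

# Hypothetical 3x3 example: prescribe eigenvalues and a regular eigenbasis,
# then assemble A = S * diag(lam) * S^{-1}   (cf. (5.0.2))
lam = np.array([-1.0, -2.0, -3.0])          # eigenvalues lambda_i
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])             # columns = eigenvectors, det(S) = 2
A = S @ np.diag(lam) @ np.linalg.inv(S)

y0 = np.array([1.0, 2.0, 3.0])              # made-up initial value y(0)

def y(t):
    # y(t) = S * diag(exp(lambda_i * t)) * S^{-1} y0 (decoupled analytic solution)
    z0 = np.linalg.solve(S, y0)             # z(0) = S^{-1} y0
    return S @ (np.exp(lam * t) * z0)

# verify the ODE y' = A y via a central finite difference at t = 0.5
t, h = 0.5, 1e-6
dy = (y(t + h) - y(t - h)) / (2 * h)
print(np.max(np.abs(dy - A @ y(t))))        # residual should be tiny, O(h^2)
```

The finite-difference residual only probes consistency of the formula with the ODE; it is not a substitute for the analytic argument above.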
Example 5.0.2 (Normal mode analysis). → [?]

In the computation of the IR spectrum of a molecule one is interested in the vibrational frequencies of the molecule, which is described by n positional degrees of freedom x ∈ R^n and corresponding velocities ẋ = dx/dt ∈ R^n.

Suppose all masses are equal, with an effective mass m, so that the kinetic energy is K(x, ẋ) = (m/2)‖ẋ‖₂², and let U(x) be a model for the total potential energy.

1. Find a local minimum x* of the potential energy:

    DU(x*) = 0 ,  H = D²U(x*) symmetric positive semi-definite,

hence, by Taylor expansion, near the local minimum:

    U(x* + a) = U(x*) + ½ aᵀHa .

2. Newtonian mechanics near the local minimum:

    mẍ = mä = −DₐU(x* + a) .

As we are around the minimum:

    mä = −Dₐ(U(x*) + ½ aᵀHa) = −Ha .

3. As H is real and symmetric, its eigenvectors w^j, j = 1,…,n, are orthogonal and hence form a convenient basis for representing any vector:

    a = c_1(t)w^1 + … + c_n(t)w^n .

Inserting into the system of second-order ODEs, we get

    m(c̈_1 w^1 + … + c̈_n w^n) = −(c_1 Hw^1 + … + c_n Hw^n) = −(λ_1 c_1 w^1 + … + λ_n c_n w^n) ,

where we denoted the associated eigenvalues by λ_j: Hw^j = λ_j w^j.

4. Taking the scalar product with the eigenvector w^k we obtain an uncoupled set of equations for each c_k(t), k = 1,…,n:

    m c̈_k = −λ_k c_k .

5. Looking for solutions of the form c_k = α_k sin(ω_k t) and substituting into the differential equation, one gets the angular vibrational frequency of the normal mode:

    ω_k = √(λ_k / m) .

6. In case of different masses we end up with the system

    Mä = −Ha

with the mass matrix M, which is symmetric positive-definite but not necessarily diagonal. We thus must perform normal mode analysis on the matrix M^{-1}H:

    M^{-1}Hw^j = λ_j w^j  ⇔  Hw^j = λ_j Mw^j . △
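The normal mode recipe of Example 5.0.2 can be illustrated numerically. In the sketch below (Python/numpy; the Hessian H, modeled on a small chain of springs, and the effective mass m are invented for the example), the eigendecomposition of H yields orthonormal modes and frequencies ω_k = √(λ_k/m):

```python
import numpy as np

m = 1.5                                   # effective mass (made-up value)
# made-up symmetric positive semi-definite "Hessian" H = D^2 U(x*)
H = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

lam, W = np.linalg.eigh(H)                # H w^j = lambda_j w^j, columns of W orthonormal
omega = np.sqrt(lam / m)                  # angular frequencies omega_k = sqrt(lambda_k/m)

print(np.all(lam >= 0))                   # H positive semi-definite
print(np.allclose(W.T @ W, np.eye(3)))    # eigenvectors are orthonormal
# each mode c_k(t) = sin(omega_k * t) satisfies m c'' = -lambda_k c:
t = 0.7
ck = np.sin(omega * t)
print(np.allclose(m * (-(omega**2) * ck), -lam * ck))
```

numpy's eigh stands in for the eigensolver; for the non-diagonal mass matrix of step 6 one would instead solve the generalized problem Hw = λMw.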
5.1 Theory of eigenvalue problems

Definition 5.1.1 (Eigenvalues and eigenvectors).
• λ ∈ C eigenvalue (ger.: Eigenwert) of A ∈ K^{n,n} :⇔ det(λI − A) = 0
  (λ ↦ det(λI − A) =ˆ characteristic polynomial χ(λ))
• spectrum of A ∈ K^{n,n}: σ(A) := {λ ∈ C: λ eigenvalue of A}
• Eig_A(λ) =ˆ eigenspace (ger.: Eigenraum) associated with the eigenvalue λ

Two simple facts:

    λ ∈ σ(A) ⇒ dim Eig_A(λ) > 0 ,   (5.1.1)
    det(A) = det(Aᵀ) ∀A ∈ K^{n,n} ⇒ σ(A) = σ(Aᵀ) .   (5.1.2)

✎ notation: ρ(A) := max{|λ|: λ ∈ σ(A)} =ˆ spectral radius of A ∈ K^{n,n}

Theorem 5.1.2 (Bound for spectral radius). For any matrix norm ‖·‖ induced by a vector norm (→ Def. 2.4.2):

    ρ(A) ≤ ‖A‖ .

Lemma 5.1.3 (Gershgorin circle theorem). For any A ∈ K^{n,n} holds true

    σ(A) ⊂ ⋃_{j=1}^{n} { z ∈ C: |z − a_jj| ≤ Σ_{i≠j} |a_ji| } .

Lemma 5.1.4 (Similarity and spectrum). The spectrum of a matrix is invariant with respect to similarity transformations:

    ∀A ∈ K^{n,n}: σ(S^{-1}AS) = σ(A)  ∀ regular S ∈ K^{n,n} .

Lemma 5.1.5 (Existence of a one-dimensional invariant subspace).

    ∀C ∈ C^{n,n}: ∃u ∈ C^n: C(Span{u}) ⊂ Span{u} .

Theorem 5.1.6 (Schur normal form).

    ∀A ∈ K^{n,n}: ∃U ∈ C^{n,n} unitary: UᴴAU = T  with T ∈ C^{n,n} upper triangular.

Eigenvalue problems (EVPs):
➊ Given A ∈ K^{n,n}, find all eigenvalues (= spectrum of A).
➋ Given A ∈ K^{n,n}, find σ(A) plus all eigenvectors.
➌ Given A ∈ K^{n,n}, find a few eigenvalues and associated eigenvectors.

(Linear) generalized eigenvalue problem: given A ∈ C^{n,n} and regular B ∈ C^{n,n}, seek x ≠ 0, λ ∈ C with

    Ax = λBx ⇔ B^{-1}Ax = λx .   (5.1.3)

x =ˆ generalized eigenvector, λ =ˆ generalized eigenvalue

Obviously every generalized eigenvalue problem is equivalent to a standard eigenvalue problem by (5.1.3). However, it is usually not advisable to use this equivalence for numerical purposes!

Remark 5.1.1 (Generalized eigenvalue problems and Cholesky factorization). If B = Bᴴ is s.p.d. (→ Def. 2.6.2) with Cholesky factorization B = RᴴR, then

    Ax = λBx ⇔ Ãy = λy ,  where à := R^{−H}AR^{−1} , y := Rx .
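The equivalence of Remark 5.1.1 can be checked numerically. A minimal sketch (Python/numpy; the matrices A and B are made up, and numpy's lower-triangular Cholesky factor L with B = LLᴴ plays the role of Rᴴ):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M + M.T                      # Hermitian (real symmetric) A
N = rng.standard_normal((4, 4))
B = N @ N.T + 4 * np.eye(4)      # s.p.d. B, well-conditioned by construction

# Cholesky factorization B = R^H R; numpy returns lower L with B = L L^H, so R = L^H
L = np.linalg.cholesky(B)
# A~ := R^{-H} A R^{-1} = L^{-1} A L^{-T}, computed via two triangular solves
At = np.linalg.solve(L, np.linalg.solve(L, A.T).T)

# generalized eigenvalues of A x = lambda B x  ==  eigenvalues of the Hermitian A~
lam_tilde = np.sort(np.linalg.eigvalsh(At))
lam_gen = np.sort(np.linalg.eigvals(np.linalg.solve(B, A)).real)
print(np.allclose(lam_tilde, lam_gen))
```

Note the payoff: Ã is Hermitian, so the symmetric eigensolver applies, whereas B^{-1}A in (5.1.3) is in general not Hermitian even for Hermitian A, B.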
This transformation can be used for efficient computations. △

Corollary 5.1.7 (Principal axis transformation).

    A ∈ K^{n,n} with AAᴴ = AᴴA:  ∃U ∈ C^{n,n} unitary: UᴴAU = diag(λ1,…,λn) ,  λ_i ∈ C .

A matrix A ∈ K^{n,n} with AAᴴ = AᴴA is called normal.

Examples of normal matrices are
• Hermitian matrices: Aᴴ = A  ➤ σ(A) ⊂ R
• unitary matrices: Aᴴ = A^{−1}  ➤ |σ(A)| = 1
• skew-Hermitian matrices: A = −Aᴴ  ➤ σ(A) ⊂ iR

➤ Normal matrices can be diagonalized by unitary similarity transformations.
Symmetric real matrices can be diagonalized by orthogonal similarity transformations.

In Cor. 5.1.7:
– λ1,…,λn = eigenvalues of A,
– columns of U = orthonormal basis of eigenvectors of A.

5.2 “Direct” Eigensolvers

Purpose: solution of eigenvalue problems ➊, ➋ for dense matrices “up to machine precision”.

MATLAB-function: eig

    d = eig(A)       : computes spectrum σ(A) = {d1,…,dn} of A ∈ C^{n,n}
    [V,D] = eig(A)   : computes V ∈ C^{n,n}, diagonal D ∈ C^{n,n} such that AV = VD
    d = eig(A,B)     : computes all generalized eigenvalues
    [V,D] = eig(A,B) : computes V ∈ C^{n,n}, diagonal D ∈ C^{n,n} such that AV = BVD

Note: (generalized) eigenvectors can be recovered as columns of V:

    AV = VD ⇔ A(V)_{:,i} = (D)_{i,i}(V)_{:,i} ,  if D = diag(d1,…,dn).
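The relation AV = VD and the spectral properties of normal matrices listed above can be probed numerically. A hedged sketch, with numpy.linalg.eig standing in for MATLAB's eig and made-up test matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M + M.T                           # real symmetric, hence Hermitian and normal

# numpy's analogue of MATLAB's [V,D] = eig(A)
d, V = np.linalg.eig(A)
D = np.diag(d)

print(np.allclose(A @ V, V @ D))      # A V = V D
print(np.allclose(d.imag, 0.0))       # Hermitian matrix => real spectrum

# a real orthogonal (hence unitary) matrix has |sigma(Q)| = 1
Q, _ = np.linalg.qr(M)
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0))
```

All three checks print True; for Hermitian input the specialized routine numpy.linalg.eigh would be the better practical choice, mirroring the symmetric branch of eig.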
Remark 5.2.1 (QR-Algorithm). → [20, Sect. 7.5]

Note: All “direct” eigensolvers are iterative methods.

Idea: Iteration based on successive unitary similarity transformations

    A = A⁽⁰⁾ → A⁽¹⁾ → …  →  diagonal matrix, if A = Aᴴ ; upper triangular matrix, else (→ Thm. 5.1.6).

(superior stability of unitary transformations, see Rem. ??)

Code 5.2.2: QR-algorithm with shift

function d = eigqr(A,tol)
n = size(A,1);
while (norm(tril(A,-1)) > tol*norm(A))
    shift = A(n,n);
    [Q,R] = qr(A - shift*eye(n));
    A = Q'*A*Q;
end
d = diag(A);

Convergence: quadratic in general, cubic for normal matrices.

Computational cost: O(n³) operations per step of the QR-algorithm.

Library implementations of the QR-algorithm provide numerically stable eigensolvers (→ Def. ??). △

Remark 5.2.4 (Computational effort for eigenvalue computations).

Computational effort (#elementary operations) for eig() ([20, Sect. 7.5, 8.2]):

    eigenvalues and eigenvectors of A ∈ K^{n,n}        ~ 25n³ + O(n²)
    only eigenvalues of A ∈ K^{n,n}                    ~ 10n³ + O(n²)
    eigenvalues and eigenvectors of A = Aᴴ ∈ K^{n,n}   ~ 9n³ + O(n²)
    only eigenvalues of A = Aᴴ ∈ K^{n,n}               ~ 4/3·n³ + O(n²)
    only eigenvalues of tridiagonal A = Aᴴ ∈ K^{n,n}   ~ 30n² + O(n)

➣ O(n³) in general!

Note: eig is not available for sparse matrix arguments.
Exception: d = eig(A) for sparse Hermitian matrices. △

Example 5.2.5 (Runtimes of eig).

Code 5.2.6: measuring runtimes of eig

function eigtiming
A = rand(500,500); B = A'*A;
C = gallery('tridiag',500,1,3,1);
times = [];
for n=5:5:500
    An = A(1:n,1:n); Bn = B(1:n,1:n); Cn = C(1:n,1:n);
    t1 = 1000; for k=1:3, tic; d = eig(An); t1 = min(t1,toc); end
    t2 = 1000; for k=1:3, tic; [V,D] = eig(An); t2 = min(t2,toc); end
    t3 = 1000; for k=1:3, tic; d = eig(Bn); t3 = min(t3,toc); end
    t4 = 1000; for k=1:3, tic; [V,D] = eig(Bn); t4 = min(t4,toc); end
    t5 = 1000; for k=1:3, tic; d = eig(Cn); t5 = min(t5,toc); end
    times = [times; n t1 t2 t3 t4 t5];
end
figure;
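The shifted QR iteration of Code 5.2.2 can be sketched in Python/numpy as follows. This is an illustrative translation, not a library routine: the iteration cap is an added safeguard absent from the MATLAB listing, and the symmetric test matrix is made up:

```python
import numpy as np

def eigqr(A, tol=1e-10, maxit=500):
    """Shifted QR iteration after Code 5.2.2; returns approximate eigenvalues."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(maxit):
        # stop once the strict lower triangle is negligible
        if np.linalg.norm(np.tril(A, -1)) <= tol * np.linalg.norm(A):
            break
        shift = A[n - 1, n - 1]                   # corner entry as shift
        Q, R = np.linalg.qr(A - shift * np.eye(n))
        A = Q.T @ A @ Q                           # orthogonal similarity transform
    return np.diag(A)

# made-up symmetric test matrix with well-separated eigenvalues
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
d = np.sort(eigqr(A))
print(np.allclose(d, np.sort(np.linalg.eigvalsh(A))))
```

As the surrounding remarks stress, production eigensolvers add Hessenberg/tridiagonal preprocessing and deflation on top of this basic loop.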
Remark 5.2.3 (Unitary similarity transformation to tridiagonal form).

Successive Householder similarity transformations of A = Aᴴ annihilate, column by column, all entries below the first subdiagonal (➞ =ˆ affected rows/columns, =ˆ targeted vector in the fill-pattern sketches) ➣ transformation to tridiagonal form!

(For general matrices a similar strategy can achieve a similarity transformation to upper Hessenberg form.)

This transformation is used as a preprocessing step for the QR-algorithm ➣ eig. △
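A hedged sketch of this preprocessing step (a textbook-style Householder tridiagonalization in Python/numpy, not the optimized LAPACK routine; the symmetric test matrix is made up):

```python
import numpy as np

def tridiagonalize(A):
    """Reduce symmetric A to tridiagonal T by Householder similarity
    transformations, so that sigma(T) = sigma(A)."""
    T = A.astype(float).copy()
    n = T.shape[0]
    for k in range(n - 2):
        x = T[k + 1:, k].copy()              # targeted vector below the diagonal
        s = np.sign(x[0]) if x[0] != 0 else 1.0
        alpha = -s * np.linalg.norm(x)       # sign chosen to avoid cancellation
        v = x.copy()
        v[0] -= alpha                        # v ~ x - alpha*e1 defines the reflector
        nv = np.linalg.norm(v)
        if nv < 1e-15:
            continue                         # column already in tridiagonal shape
        v /= nv
        H = np.eye(n)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)   # Householder reflector
        T = H @ T @ H                        # orthogonal (unitary) similarity
    return T

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6)); A = M + M.T
T = tridiagonalize(A)
# T is tridiagonal and has the same spectrum as A
print(np.allclose(np.triu(np.tril(T, 1), -1), T, atol=1e-10))
print(np.allclose(np.sort(np.linalg.eigvalsh(T)), np.sort(np.linalg.eigvalsh(A))))
```

Applying the full reflector explicitly costs O(n³) per column; practical implementations exploit the structure of each step to reach the O(n³)-total reduction cited above.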
