10. Schur Decomposition
L. Vandenberghe, ECE133B (Spring 2020)

Outline
• eigenvalues of a nonsymmetric matrix
• Schur decomposition

Eigenvalues and eigenvectors

a nonzero vector x is an eigenvector of the n × n matrix A, with eigenvalue λ, if

    Ax = λx

• the eigenvalues are the roots of the characteristic polynomial det(λI − A) = 0
• the eigenvectors with eigenvalue λ are the nonzero vectors in the nullspace of λI − A

for most of the lecture, we assume that A is a complex n × n matrix

Linear independence of eigenvectors

suppose x1, ..., xk are eigenvectors for k different eigenvalues:

    Ax1 = λ1 x1,  ...,  Axk = λk xk

then x1, ..., xk are linearly independent

• the result holds for k = 1 because eigenvectors are nonzero
• suppose it holds for k − 1, and assume α1 x1 + ··· + αk xk = 0; then

    0 = A(α1 x1 + α2 x2 + ··· + αk xk) = α1 λ1 x1 + α2 λ2 x2 + ··· + αk λk xk

• subtracting λ1 (α1 x1 + ··· + αk xk) = 0 shows that

    α2 (λ2 − λ1) x2 + ··· + αk (λk − λ1) xk = 0

• since x2, ..., xk are linearly independent (by the induction hypothesis) and the eigenvalues are distinct, α2 = ··· = αk = 0
• hence α1 = 0 as well, so x1, ..., xk are linearly independent

Multiplicity of eigenvalues

Algebraic multiplicity
• the multiplicity of the eigenvalue as a root of the characteristic polynomial
• the sum of the algebraic multiplicities of the eigenvalues of an n × n matrix is n

Geometric multiplicity
• the geometric multiplicity is the dimension of null(λI − A)
• the maximum number of linearly independent eigenvectors with eigenvalue λ
• the sum over all eigenvalues is the maximum number of linearly independent eigenvectors of the matrix

Defective eigenvalue
• geometric multiplicity never exceeds algebraic multiplicity (proof under "Geometric and algebraic multiplicities" below)
• an eigenvalue is defective if its geometric multiplicity is less than its algebraic multiplicity
• a matrix is defective if some of its eigenvalues are defective

Example

    A = [ 1  1  0  0 ]
        [ 0  1  0  0 ],      det(λI − A) = (λ − 1)²(λ − 2)²
        [ 0  0  2  0 ]
        [ 0  0  0  2 ]

• eigenvalue λ = 1 has algebraic multiplicity two and
geometric multiplicity one
• the eigenvectors with eigenvalue 1 are the nonzero multiples of (1, 0, 0, 0)
• eigenvalue λ = 2 has algebraic and geometric multiplicity two
• the eigenvectors with eigenvalue 2 are the nonzero vectors in the subspace

    null(2I − A) = span{ (0, 0, 1, 0), (0, 0, 0, 1) }

• the maximum number of linearly independent eigenvectors is three; for example,

    (1, 0, 0, 0),  (0, 0, 1, 0),  (0, 0, 0, 1)

Similarity transformation

two matrices A and B are similar if B = X⁻¹AX for some nonsingular matrix X

• a similarity transformation preserves eigenvalues and algebraic multiplicities:

    det(λI − B) = det(λI − X⁻¹AX) = det(X⁻¹(λI − A)X) = det(λI − A)

• if x is an eigenvector of A with eigenvalue λ, then y = X⁻¹x is an eigenvector of B with the same eigenvalue:

    By = (X⁻¹AX)(X⁻¹x) = X⁻¹Ax = X⁻¹(λx) = λy

• a similarity transformation preserves geometric multiplicities:

    dim null(λI − B) = dim null(λI − A)

Geometric and algebraic multiplicities

suppose α is an eigenvalue of A with geometric multiplicity r:

    dim null(αI − A) = r

• define an n × r matrix U with orthonormal columns that span null(αI − A)
• complete U to a unitary matrix X = [U V] and define B = XᴴAX; since AU = αU,

    B = XᴴAX = [ UᴴAU  UᴴAV ] = [ αI  UᴴAV ]
               [ VᴴAU  VᴴAV ]   [ 0   VᴴAV ]

• the characteristic polynomial of B is

    det(λI − B) = (λ − α)ʳ det(λI − VᴴAV)

this shows that the algebraic multiplicity of the eigenvalue α of B (and hence of the similar matrix A) is at least r

Diagonalizable matrices

the following properties are equivalent

1. A is diagonalizable by a similarity: there exist a nonsingular X and a diagonal Λ such that

       X⁻¹AX = Λ

2. A has a set of n linearly independent eigenvectors (the columns of X): AX = XΛ

3.
all eigenvalues of A are nondefective: algebraic multiplicity = geometric multiplicity

not all matrices are diagonalizable (real symmetric matrices always are)

Schur decomposition

every A ∈ Cⁿˣⁿ can be factored as

    A = UTUᴴ                                                (1)

• U is unitary: UᴴU = UUᴴ = I
• T is upper triangular, with the eigenvalues of A on its diagonal
• the eigenvalues can be chosen to appear in any order on the diagonal of T

the complexity of computing the factorization is order n³

Proof by induction

• the decomposition (1) obviously exists if n = 1
• suppose it exists for n = m, and let A be an (m + 1) × (m + 1) matrix
• let λ be any eigenvalue and u a corresponding eigenvector, with ‖u‖ = 1
• let V be an (m + 1) × m matrix that makes the matrix [u V] unitary:

    [ uᴴ ] A [u V] = [ uᴴAu  uᴴAV ] = [ λuᴴu  uᴴAV ] = [ λ  uᴴAV ]
    [ Vᴴ ]           [ VᴴAu  VᴴAV ]   [ λVᴴu  VᴴAV ]   [ 0  VᴴAV ]

• VᴴAV is an m × m matrix, so by the induction hypothesis

    VᴴAV = ŨT̃Ũᴴ

  for some unitary Ũ and upper triangular T̃
• the matrix U = [u VŨ] is unitary and defines a similarity that triangularizes A:

    UᴴAU = [ uᴴ   ] A [u VŨ] = [ λ  uᴴAVŨ   ] = [ λ  uᴴAVŨ ]
           [ ŨᴴVᴴ ]            [ 0  ŨᴴVᴴAVŨ ]   [ 0  T̃     ]

Real Schur decomposition

if A is real, a factorization with real matrices exists:

    A = UTUᵀ

• U is orthogonal: UᵀU = UUᵀ = I
• T is quasi-triangular:

    T = [ T11  T12  ···  T1m ]
        [ 0    T22  ···  T2m ]
        [ .    .    .    .   ]
        [ 0    0    ···  Tmm ]

  where the diagonal blocks Tii are 1 × 1 or 2 × 2
• the scalar diagonal blocks are real eigenvalues of A
• the eigenvalues of the 2 × 2 diagonal blocks are the complex eigenvalues of A
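The inductive proof above is constructive, and can be turned directly into code. The sketch below (a hypothetical teaching implementation in NumPy, not the algorithm used by production libraries, which rely on the QR iteration) follows the proof step by step: pick an eigenpair, complete the unit eigenvector to a unitary matrix, and recurse on the trailing m × m block.

```python
import numpy as np

def schur_decomposition(A):
    """Return unitary U and upper triangular T with A = U T U^H,
    built by the inductive construction from the proof above."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    if n == 1:                          # base case: a 1 x 1 matrix is triangular
        return np.eye(1, dtype=complex), A.copy()
    # pick any eigenvalue and a corresponding unit eigenvector u
    lams, vecs = np.linalg.eig(A)
    u = vecs[:, [0]] / np.linalg.norm(vecs[:, [0]])
    # complete u to a unitary [u V]: QR of a matrix whose first column is u;
    # the first column of Q spans u, so the remaining columns are orthogonal to u
    Q, _ = np.linalg.qr(np.hstack([u, np.eye(n, dtype=complex)[:, : n - 1]]))
    V = Q[:, 1:]
    X = np.hstack([u, V])               # unitary, first column u
    # X^H A X = [[lam, u^H A V], [0, V^H A V]]; recurse on the trailing block
    A1 = V.conj().T @ A @ V
    U1, T1 = schur_decomposition(A1)    # induction hypothesis
    W = np.eye(n, dtype=complex)
    W[1:, 1:] = U1
    U = X @ W                           # U = [u  V U1] is unitary
    T = U.conj().T @ A @ U              # upper triangular up to roundoff
    return U, T
```

Running it on the 4 × 4 defective example above recovers a unitary U and triangular T with the eigenvalues 1, 1, 2, 2 on the diagonal of T; in practice one would call `scipy.linalg.schur`, which also offers the real quasi-triangular form.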