
Chapter 7  The Singular Value Decomposition (SVD)

1. The SVD produces orthonormal bases of $v$'s and $u$'s for the four fundamental subspaces.
2. Using those bases, $A$ becomes a diagonal matrix $\Sigma$ and $Av_i = \sigma_i u_i$ : $\sigma_i$ = singular value.
3. The two-bases diagonalization $A = U\Sigma V^T$ often has more information than $A = X\Lambda X^{-1}$.
4. $U\Sigma V^T$ separates $A$ into rank-1 matrices $\sigma_1 u_1 v_1^T + \cdots + \sigma_r u_r v_r^T$. $\sigma_1 u_1 v_1^T$ is the largest!

7.1  Bases and Matrices in the SVD

The Singular Value Decomposition is a highlight of linear algebra. $A$ is any $m$ by $n$ matrix, square or rectangular. Its rank is $r$. We will diagonalize this $A$, but not by $X^{-1}AX$. The eigenvectors in $X$ have three big problems: they are usually not orthogonal, there are not always enough eigenvectors, and $Ax = \lambda x$ requires $A$ to be a square matrix. The singular vectors of $A$ solve all those problems in a perfect way.

Let me describe what we want from the SVD: the right bases for the four subspaces. Then I will write about the steps to find those bases in order of importance.

The price we pay is to have two sets of singular vectors, $u$'s and $v$'s. The $u$'s are in $\mathbf{R}^m$ and the $v$'s are in $\mathbf{R}^n$. They will be the columns of an $m$ by $m$ matrix $U$ and an $n$ by $n$ matrix $V$. I will first describe the SVD in terms of those basis vectors. Then I can also describe the SVD in terms of the orthogonal matrices $U$ and $V$.

(using vectors)  The $u$'s and $v$'s give bases for the four fundamental subspaces:

  $u_1, \ldots, u_r$ is an orthonormal basis for the column space
  $u_{r+1}, \ldots, u_m$ is an orthonormal basis for the left nullspace $N(A^T)$
  $v_1, \ldots, v_r$ is an orthonormal basis for the row space
  $v_{r+1}, \ldots, v_n$ is an orthonormal basis for the nullspace $N(A)$.

More than just orthogonality, these basis vectors diagonalize the matrix $A$:

"$A$ is diagonalized"
\[ Av_1 = \sigma_1 u_1 \qquad Av_2 = \sigma_2 u_2 \qquad \ldots \qquad Av_r = \sigma_r u_r \tag{1} \]

Those singular values $\sigma_1$ to $\sigma_r$ will be positive numbers: $\sigma_i$ is the length of $Av_i$. The $\sigma$'s go into a diagonal matrix that is otherwise zero. That matrix is $\Sigma$.

(using matrices)  Since the $u$'s are orthonormal, the matrix $U_r$ with those $r$ columns has $U_r^T U_r = I$. Since the $v$'s are orthonormal, the matrix $V_r$ has $V_r^T V_r = I$. Then the equations $Av_i = \sigma_i u_i$ tell us column by column that $AV_r = U_r \Sigma_r$:

\[
AV_r = U_r \Sigma_r \qquad
A \begin{bmatrix} v_1 & \cdots & v_r \end{bmatrix}
= \begin{bmatrix} u_1 & \cdots & u_r \end{bmatrix}
\begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_r \end{bmatrix}
\tag{2}
\]
\[ (m \text{ by } n)(n \text{ by } r) = (m \text{ by } r)(r \text{ by } r) \]

This is the heart of the SVD, but there is more. Those $v$'s and $u$'s account for the row space and column space of $A$. We have $n - r$ more $v$'s and $m - r$ more $u$'s, from the nullspace $N(A)$ and the left nullspace $N(A^T)$. They are automatically orthogonal to the first $v$'s and $u$'s (because the whole nullspaces are orthogonal). We now include all the $v$'s and $u$'s in $V$ and $U$, so these matrices become square. We still have $AV = U\Sigma$.

\[
AV = U\Sigma \qquad
A \begin{bmatrix} v_1 & \cdots & v_r & \cdots & v_n \end{bmatrix}
= \begin{bmatrix} u_1 & \cdots & u_r & \cdots & u_m \end{bmatrix}
\begin{bmatrix} \sigma_1 & & & \\ & \ddots & & \\ & & \sigma_r & \\ & & & \end{bmatrix}
\tag{3}
\]
\[ (m \text{ by } n)(n \text{ by } n) = (m \text{ by } m)(m \text{ by } n) \]

The new $\Sigma$ is $m$ by $n$. It is just the $r$ by $r$ matrix in equation (2) with $m - r$ extra zero rows and $n - r$ new zero columns. The real change is in the shapes of $U$ and $V$. Those are square orthogonal matrices. So $AV = U\Sigma$ can become $A = U\Sigma V^T$. This is the Singular Value Decomposition. I can multiply columns $u_i \sigma_i$ from $U\Sigma$ by rows of $V^T$:

\[ \textbf{SVD} \qquad A = U\Sigma V^T = u_1 \sigma_1 v_1^T + \cdots + u_r \sigma_r v_r^T. \tag{4} \]

Equation (2) was a "reduced SVD" with bases for the row space and column space. Equation (3) is the full SVD with nullspaces included. They both split up $A$ into the same $r$ matrices $u_i \sigma_i v_i^T$ of rank one: column times row.

We will see that each $\sigma_i^2$ is an eigenvalue of $A^T A$ and also $AA^T$.
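Since equations (2), (3), and (4) carry the whole story, a short numerical check may help before we go on. The text itself uses no code; the sketch below is an added illustration using Python with NumPy (an assumed library choice), applied to a small rank-2 matrix so that both nullspaces are nontrivial.

```python
import numpy as np

# A hypothetical 5-by-4 matrix of rank 2, built from two column-times-row
# pieces so that r < m and r < n (both nullspaces are nontrivial).
np.random.seed(0)
A = np.outer(np.random.randn(5), np.random.randn(4)) \
  + np.outer(np.random.randn(5), np.random.randn(4))

U, s, Vt = np.linalg.svd(A)           # full SVD: U is 5x5, Vt is 4x4
r = int(np.sum(s > 1e-10))            # numerical rank (should be 2)

# Equation (2), the reduced SVD: A V_r = U_r Sigma_r.
Ur, Vr, Sr = U[:, :r], Vt[:r].T, np.diag(s[:r])
print(np.allclose(A @ Vr, Ur @ Sr))   # True

# Equation (4): A is the sum of r rank-one pieces sigma_i u_i v_i^T.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(r))
print(np.allclose(A, A_rebuilt))      # True

# The last m-r u's span N(A^T) and the last n-r v's span N(A):
print(np.allclose(A.T @ U[:, r:], 0), np.allclose(A @ Vt[r:].T, 0))
```

The same check works for any matrix: the library returns the singular values in descending order, which is exactly the ordering the next paragraph relies on.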
When we put the singular values in descending order, $\sigma_1 \ge \sigma_2 \ge \ldots \ge \sigma_r > 0$, the splitting in equation (4) gives the $r$ rank-one pieces of $A$ in order of importance. This is crucial.

Example 1  When is $A = U\Sigma V^T$ (singular values) the same as $X\Lambda X^{-1}$ (eigenvalues)?

Solution  $A$ needs orthonormal eigenvectors to allow $X = U = V$. $A$ also needs eigenvalues $\lambda \ge 0$ if $\Lambda = \Sigma$. So $A$ must be a positive semidefinite (or definite) symmetric matrix. Only then will $A = X\Lambda X^{-1}$, which is also $Q\Lambda Q^T$, coincide with $A = U\Sigma V^T$.

Example 2  If $A = xy^T$ (rank 1) with unit vectors $x$ and $y$, what is the SVD of $A$?

Solution  The reduced SVD in (2) is exactly $xy^T$, with rank $r = 1$. It has $u_1 = x$ and $v_1 = y$ and $\sigma_1 = 1$. For the full SVD, complete $u_1 = x$ to an orthonormal basis of $u$'s, and complete $v_1 = y$ to an orthonormal basis of $v$'s. No new $\sigma$'s, only $\sigma_1 = 1$.

Proof of the SVD

We need to show how those amazing $u$'s and $v$'s can be constructed. The $v$'s will be orthonormal eigenvectors of $A^T A$. This must be true because we are aiming for

\[ A^T A = (U\Sigma V^T)^T (U\Sigma V^T) = V\Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T. \tag{5} \]

On the right you see the eigenvector matrix $V$ for the symmetric positive (semi)definite matrix $A^T A$. And $(\Sigma^T \Sigma)$ must be the eigenvalue matrix of $(A^T A)$: each $\sigma^2$ is $\lambda(A^T A)$!

Now $Av_i = \sigma_i u_i$ tells us the unit vectors $u_1$ to $u_r$. This is the key equation (1). The essential point—the whole reason that the SVD succeeds—is that those unit vectors $u_1$ to $u_r$ are automatically orthogonal to each other (because the $v$'s are orthogonal):

Key step
\[ u_i^T u_j = \left( \frac{Av_i}{\sigma_i} \right)^{\!T} \left( \frac{Av_j}{\sigma_j} \right) = \frac{v_i^T A^T A v_j}{\sigma_i \sigma_j} = \frac{\sigma_j^2}{\sigma_i \sigma_j}\, v_i^T v_j = \text{zero}. \tag{6} \]

The $v$'s are eigenvectors of $A^T A$ (symmetric). They are orthogonal and now the $u$'s are also orthogonal. Actually those $u$'s will be eigenvectors of $AA^T$.

Finally we complete the $v$'s and $u$'s to $n$ $v$'s and $m$ $u$'s with any orthonormal bases for the nullspaces $N(A)$ and $N(A^T)$. We have found $V$ and $\Sigma$ and $U$ in $A = U\Sigma V^T$.

An Example of the SVD

Here is an example to show the computation of all three matrices in $A = U\Sigma V^T$.

Example 3  Find the matrices $U, \Sigma, V$ for $A = \begin{bmatrix} 3 & 0 \\ 4 & 5 \end{bmatrix}$. The rank is $r = 2$.

With rank 2, this $A$ has positive singular values $\sigma_1$ and $\sigma_2$. We will see that $\sigma_1$ is larger than $\lambda_{\max} = 5$, and $\sigma_2$ is smaller than $\lambda_{\min} = 3$. Begin with $A^T A$ and $AA^T$:

\[ A^T A = \begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix} \qquad AA^T = \begin{bmatrix} 9 & 12 \\ 12 & 41 \end{bmatrix} \]

Those have the same trace (50) and the same eigenvalues $\sigma_1^2 = 45$ and $\sigma_2^2 = 5$. The square roots are $\sigma_1 = \sqrt{45}$ and $\sigma_2 = \sqrt{5}$. Then $\sigma_1 \sigma_2 = 15$ and this is the determinant of $A$.

A key step is to find the eigenvectors of $A^T A$ (with eigenvalues 45 and 5):

\[ \begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = 45 \begin{bmatrix} 1 \\ 1 \end{bmatrix} \qquad \begin{bmatrix} 25 & 20 \\ 20 & 25 \end{bmatrix} \begin{bmatrix} -1 \\ 1 \end{bmatrix} = 5 \begin{bmatrix} -1 \\ 1 \end{bmatrix} \]

Then $v_1$ and $v_2$ are those (orthogonal!) eigenvectors rescaled to length 1.

Right singular vectors  $v_1 = \dfrac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \quad v_2 = \dfrac{1}{\sqrt{2}} \begin{bmatrix} -1 \\ 1 \end{bmatrix}$.   $u_i$ = left singular vectors.

Now compute $Av_1$ and $Av_2$, which will be $\sigma_1 u_1 = \sqrt{45}\, u_1$ and $\sigma_2 u_2 = \sqrt{5}\, u_2$:

\[ Av_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 3 \\ 9 \end{bmatrix} = \sqrt{45}\; \frac{1}{\sqrt{10}} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \sigma_1 u_1 \]
\[ Av_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} -3 \\ 1 \end{bmatrix} = \sqrt{5}\; \frac{1}{\sqrt{10}} \begin{bmatrix} -3 \\ 1 \end{bmatrix} = \sigma_2 u_2 \]

The division by $\sqrt{10}$ makes $u_1$ and $u_2$ orthonormal. Then $\sigma_1 = \sqrt{45}$ and $\sigma_2 = \sqrt{5}$ as expected. The Singular Value Decomposition is $A = U\Sigma V^T$:

\[ U = \frac{1}{\sqrt{10}} \begin{bmatrix} 1 & -3 \\ 3 & 1 \end{bmatrix} \qquad \Sigma = \begin{bmatrix} \sqrt{45} & \\ & \sqrt{5} \end{bmatrix} \qquad V = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}. \tag{7} \]

$U$ and $V$ contain orthonormal bases for the column space and the row space (both spaces are just $\mathbf{R}^2$). The real achievement is that those two bases diagonalize $A$: $AV$ equals $U\Sigma$. Then the matrix $U^T A V = \Sigma$ is diagonal.

The matrix $A$ splits into a combination of two rank-one matrices, columns times rows:

\[ \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T = \frac{\sqrt{45}}{\sqrt{20}} \begin{bmatrix} 1 & 1 \\ 3 & 3 \end{bmatrix} + \frac{\sqrt{5}}{\sqrt{20}} \begin{bmatrix} 3 & -3 \\ -1 & 1 \end{bmatrix} = \begin{bmatrix} 3 & 0 \\ 4 & 5 \end{bmatrix} = A. \]
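As a cross-check on Example 3 (an addition, not part of the original text), the sketch below asks NumPy for the same factorization. One caveat worth hedging: a library may return $u_i$ and $v_i$ with opposite signs to the hand computation, since flipping both leaves $\sigma_i u_i v_i^T$ unchanged.

```python
import numpy as np

A = np.array([[3., 0.],
              [4., 5.]])

# The library computes the same factorization A = U Sigma V^T.
U, s, Vt = np.linalg.svd(A)
print(np.isclose(s[0], np.sqrt(45)), np.isclose(s[1], np.sqrt(5)))  # True True

# sigma_1 * sigma_2 = |det A| = 15 and sigma_1^2 + sigma_2^2 = trace(A^T A) = 50.
print(np.isclose(s.prod(), abs(np.linalg.det(A))))
print(np.isclose((s**2).sum(), np.trace(A.T @ A)))

# The hand-computed factors from equation (7). The library's columns may
# differ from these by a simultaneous sign flip of u_i and v_i.
U_book = np.array([[1., -3.], [3., 1.]]) / np.sqrt(10)
V_book = np.array([[1., -1.], [1., 1.]]) / np.sqrt(2)
S_book = np.diag([np.sqrt(45), np.sqrt(5)])
print(np.allclose(U_book @ S_book @ V_book.T, A))   # True
```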
An Extreme Matrix

Here is a larger example, when the $u$'s and the $v$'s are just columns of the identity matrix. So the computations are easy, but keep your eye on the order of the columns. The matrix $A$ is badly lopsided (strictly triangular). All its eigenvalues are zero. $AA^T$ is not close to $A^T A$. The matrices $U$ and $V$ will be permutations that fix these problems properly.

\[ A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix}
\qquad
\begin{array}{l}
\text{eigenvalues } \lambda = 0, 0, 0, 0 \text{ all zero!} \\
\text{only one eigenvector } (1, 0, 0, 0) \\
\text{singular values } \sigma = 3, 2, 1 \\
\text{singular vectors are columns of } I
\end{array} \]

We always start with $A^T A$ and $AA^T$. They are diagonal (with easy $v$'s and $u$'s):

\[ A^T A = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 4 & 0 \\ 0 & 0 & 0 & 9 \end{bmatrix} \qquad AA^T = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 \\ 0 & 0 & 9 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \]

Their eigenvectors ($u$'s for $AA^T$ and $v$'s for $A^T A$) go in decreasing order $\sigma_1^2 > \sigma_2^2 > \sigma_3^2$ of the eigenvalues. These eigenvalues $\sigma^2 = 9, 4, 1$ are not zero!

\[ U = \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad
\Sigma = \begin{bmatrix} 3 & & & \\ & 2 & & \\ & & 1 & \\ & & & 0 \end{bmatrix} \qquad
V = \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix} \]

Those first columns $u_1$ and $v_1$ have 1's in positions 3 and 4. Then $u_1 \sigma_1 v_1^T$ picks out the biggest number $A_{34} = 3$ in the original matrix $A$. The three rank-one matrices in the SVD come exactly from the numbers 3, 2, 1 in $A$.
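The text does this example by inspection, and a numerical library agrees. A minimal sketch (again with NumPy, an assumed choice) should return the singular values 3, 2, 1, 0 and permutation matrices for $U$ and $V$, possibly with harmless sign flips:

```python
import numpy as np

# The 4x4 "extreme" matrix: strictly triangular, all eigenvalues zero.
A = np.array([[0., 1., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.],
              [0., 0., 0., 0.]])

U, s, Vt = np.linalg.svd(A)
print(s)              # [3. 2. 1. 0.] -- the entries of A, reordered

# U and V are permutations of the identity (up to signs):
print(np.round(U))    # columns e3, e2, e1, e4
print(np.round(Vt.T)) # columns e4, e3, e2, e1

# sigma_1 u_1 v_1^T picks out the largest entry, A[2,3] = 3
# (the book's A_34 in 1-based indexing):
print(s[0] * np.outer(U[:, 0], Vt[0]))
```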