Autumn 2019 Math 104 - HW 7 Solutions (If you find any errors, please let your instructor know ASAP)

Problem 1 (Chapter 9, Exercise 2). Suppose $A$ is skew-Hermitian, i.e., $A^H = -A$. Prove that all its eigenvalues are purely imaginary.

Solution. Let $B = iA$. Then $B^H = -iA^H = iA = B$, so $B$ is Hermitian. Since $B$ is Hermitian, all its eigenvalues $\{\lambda_1, \dots, \lambda_n\}$ are real, and the eigenvalues of $A = -iB$ are $\{-i\lambda_1, \dots, -i\lambda_n\}$, which are all purely imaginary.
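This can be checked numerically with NumPy (a quick sanity check, not part of the graded solution; the matrix here is a random example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random skew-Hermitian matrix: A^H = -A.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M - M.conj().T          # (M - M^H)^H = -(M - M^H)

eigvals = np.linalg.eigvals(A)

# All eigenvalues should be purely imaginary: real parts ~ 0.
assert np.allclose(eigvals.real, 0.0)
```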

Problem 2 (Chapter 9, Exercise 3). If $A$ is an $n \times n$ Hermitian matrix with eigenvalue $\lambda$ and corresponding right eigenvector $x$, show that $x$ is also a left eigenvector for $\lambda$. Prove the same result if $A$ is skew-Hermitian.

Solution. Let $x$ be a right eigenvector of $A$ with eigenvalue $\lambda$; by definition,
$$Ax = \lambda x.$$
Taking the Hermitian transpose of both sides,
$$x^H A^H = \bar{\lambda} x^H.$$

• If $A$ is Hermitian, then $A^H = A$ and all eigenvalues are real, so $\bar{\lambda} = \lambda$ and
$$x^H A = \lambda x^H,$$
i.e. $x$ is also a left eigenvector of $A$ with the same eigenvalue $\lambda$.

• If $A$ is skew-Hermitian, then $A^H = -A$, and by Problem 1 (Exercise 9.2) all eigenvalues are purely imaginary, so $\bar{\lambda} = -\lambda$ and $-x^H A = -\lambda x^H$, i.e. $x^H A = \lambda x^H$ and $x$ is again a left eigenvector of $A$ with the same eigenvalue $\lambda$.
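The Hermitian case can be illustrated numerically (an informal check on a random example; `numpy.linalg.eigh` returns real eigenvalues and orthonormal eigenvectors for Hermitian input):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random Hermitian matrix: A^H = A.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2

# Eigendecomposition; take one eigenpair (lambda, x).
lam, V = np.linalg.eigh(A)
x = V[:, 0]
l = lam[0]

# Right eigenvector: A x = lambda x.
assert np.allclose(A @ x, l * x)
# Left eigenvector: x^H A = lambda x^H (x.conj() @ A is the row vector x^H A).
assert np.allclose(x.conj() @ A, l * x.conj())
```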

Problem 3. Suppose $A$ is positive semidefinite. Can you find a square root of this matrix? In other words, can you find a matrix $B$ such that $B^2 = A$? If yes, explain how you would construct it. If no, explain why no such matrix exists.

Solution. Yes, we can. Notice that $A$ can be written as $A = V \Sigma V^T$, where $V$ is orthogonal and $\Sigma$ is diagonal. The diagonal entries of $\Sigma$ are the eigenvalues of $A$, and therefore are all non-negative. Consider $B = V D V^T$, where $D$ is diagonal with entries $D_{ii} = \sqrt{\Sigma_{ii}}$. Then we have
$$B^2 = V D V^T V D V^T = V D^2 V^T = V \Sigma V^T = A.$$

Problem 4. Suppose $A$ is positive semidefinite. Show that the maximum eigenvalue of $A$, denoted by $\lambda_{\max}$, is given by the so-called Rayleigh quotient
$$\sup_{w \neq 0} \frac{w^T A w}{w^T w}.$$
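The square-root construction from Problem 3 can be sketched with NumPy (a hedged sketch on a random PSD matrix; `numpy.linalg.eigh` plays the role of the decomposition $A = V \Sigma V^T$):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a random positive semidefinite matrix A = M M^T.
M = rng.standard_normal((4, 4))
A = M @ M.T

# Spectral decomposition A = V diag(sigma) V^T; sigma >= 0 since A is PSD.
sigma, V = np.linalg.eigh(A)
sigma = np.clip(sigma, 0.0, None)   # guard against tiny negative round-off

# B = V diag(sqrt(sigma)) V^T satisfies B^2 = A.
B = V @ np.diag(np.sqrt(sigma)) @ V.T

assert np.allclose(B @ B, A)
```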

Solution. Let $w_0$ be an eigenvector of $\lambda_{\max}$. Then we have
$$\frac{w_0^T A w_0}{w_0^T w_0} = \lambda_{\max}.$$
On the other hand, denote the spectral representation of $A$ by
$$A = \sum_{i=1}^n \lambda_i v_i v_i^T.$$
Then for any $w \neq 0$ we have
$$\frac{w^T A w}{w^T w} = \frac{\sum_{i=1}^n \lambda_i \langle w, v_i \rangle^2}{\|w\|^2} \leq \frac{\lambda_{\max} \sum_{i=1}^n \langle w, v_i \rangle^2}{\|w\|^2} = \lambda_{\max}.$$
Therefore, we have
$$\sup_{w \neq 0} \frac{w^T A w}{w^T w} = \lambda_{\max},$$
as desired.

Problem 5. (a) Consider an $n \times n$ positive definite matrix $A$ whose largest eigenvalue is strictly greater than the second largest. Consider the following algorithm: start with a random vector $x^{(0)} \in \mathbb{R}^n$ such that $\|x^{(0)}\|_2 = 1$, and for $t = 1, 2, 3, \dots$ recursively define
$$x^{(t)} = A x^{(t-1)} / \|A x^{(t-1)}\|_2.$$

What does this algorithm converge to?

Solution. We claim that the algorithm will 'almost' converge to the eigenvector corresponding to the largest eigenvalue.

Consider the spectral representation $A = \sum_{i=1}^n \lambda_i v_i v_i^T$, where $\{v_1, \dots, v_n\}$ is an orthonormal basis of eigenvectors of $A$ and $\lambda_1 > \lambda_2 \geq \cdots \geq \lambda_n$ are all the eigenvalues. Then we have
$$Ax = \sum_{i=1}^n \lambda_i \langle x, v_i \rangle v_i.$$
Therefore, by applying $A$ and normalizing $k$ times, we have
$$x^{(k)} = \frac{1}{C_k} \sum_{i=1}^n \lambda_i^k \langle x^{(0)}, v_i \rangle v_i,$$
where $C_k$ is the normalizing constant. Suppose $\langle x^{(0)}, v_1 \rangle \neq 0$ (which means the initial vector is not orthogonal to $v_1$); we claim that
$$\left| \frac{\lambda_1^k \langle x^{(0)}, v_1 \rangle}{C_k} \right| \to 1.$$
Let $c_{ik} = \lambda_i^k \langle x^{(0)}, v_i \rangle$, so that $C_k = \sqrt{\sum_{i=1}^n c_{ik}^2}$. Notice that
$$\left| \frac{c_{1k}}{c_{ik}} \right| = \left| \frac{\lambda_1}{\lambda_i} \right|^k \left| \frac{\langle x^{(0)}, v_1 \rangle}{\langle x^{(0)}, v_i \rangle} \right| \to \infty$$
for all $i > 1$, since $\lambda_1$ is the unique largest eigenvalue. Therefore, as $k \to \infty$: if $\langle x^{(0)}, v_1 \rangle > 0$ then $x^{(k)} \to v_1$, and if $\langle x^{(0)}, v_1 \rangle < 0$ then $x^{(k)} \to -v_1$. In either case, the algorithm converges to the eigenvector corresponding to the largest eigenvalue.

(The edge case is $\langle x^{(0)}, v_1 \rangle = 0$, which has probability 0 if we choose a direction uniformly at random. In that case, the algorithm will converge to $v_k$ or $-v_k$, where $k$ is the smallest index such that $x^{(0)}$ is not orthogonal to $v_k$.)
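The power iteration of Problem 5 can be sketched in a few lines of NumPy and checked against `numpy.linalg.eigh` (an illustrative sketch on a random positive definite matrix; the shift `+ 5 I` just keeps the matrix well-conditioned):

```python
import numpy as np

rng = np.random.default_rng(3)

# Random positive definite matrix (distinct eigenvalues almost surely).
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)

# Power iteration: x^(t) = A x^(t-1) / ||A x^(t-1)||_2.
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
for _ in range(500):
    x = A @ x
    x /= np.linalg.norm(x)

# Compare against the top eigenpair from eigh (eigenvalues in ascending order).
lam, V = np.linalg.eigh(A)
v1 = V[:, -1]                     # eigenvector of the largest eigenvalue

# x converges to v1 up to sign, as argued above.
assert min(np.linalg.norm(x - v1), np.linalg.norm(x + v1)) < 1e-6
# Its Rayleigh quotient (Problem 4) recovers lambda_max.
assert np.isclose(x @ A @ x, lam[-1])
```

Note that the iterate is only determined up to sign, matching the $v_1$ / $-v_1$ dichotomy in the solution.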