MATH 225 Summer 2005 Linear Algebra II Solutions to Assignment 3 Due: Wednesday July 27, 2005
Department of Mathematical and Statistical Sciences University of Alberta
Question 1. [p 326. #14]

Let

A = \begin{pmatrix} 4 & 0 & -2 \\ 2 & 5 & 4 \\ 0 & 0 & 5 \end{pmatrix}.

(a) Find the eigenvalues and corresponding eigenvectors of A.

(b) If possible, diagonalize the matrix A, that is, find an invertible matrix P and a diagonal matrix D such that A = PDP^{-1}.
Solution:
(a) The characteristic polynomial of A is
p(λ) = det(A − λI) = \begin{vmatrix} 4-λ & 0 & -2 \\ 2 & 5-λ & 4 \\ 0 & 0 & 5-λ \end{vmatrix} = (5 − λ) \begin{vmatrix} 4-λ & 0 \\ 2 & 5-λ \end{vmatrix} = (4 − λ)(5 − λ)^2,
and the eigenvalues of A are λ1 = 4, λ2 = λ3 = 5.
To find the corresponding eigenvectors we solve the homogeneous system (A − λI)x = 0 for λ = λ1, λ2, and λ3.
For λ1 = 4: We row reduce the matrix A − 4I:

\begin{pmatrix} 0 & 0 & -2 \\ 2 & 1 & 4 \\ 0 & 0 & 1 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 1/2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},

and the solution is x3 = 0, x1 = −(1/2)x2. An eigenvector corresponding to λ1 = 4 is x1 = (1, −2, 0).
For λ2 = λ3 = 5: We row reduce the matrix A − 5I:

\begin{pmatrix} -1 & 0 & -2 \\ 2 & 0 & 4 \\ 0 & 0 & 0 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},

and the solution is x1 = −2x3, with x2 and x3 free. The general solution is
x = (−2x3, x2, x3) = x2(0, 1, 0) + x3(−2, 0, 1),
and a basis for the eigenspace E5 = Null(A − 5I) is {(0, 1, 0), (−2, 0, 1)}. There are two linearly independent eigenvectors corresponding to the eigenvalue λ = 5, namely, x2 = (0, 1, 0) and x3 = (−2, 0, 1).

(b) Since A has a basis of eigenvectors, it is diagonalizable, and a diagonalizing matrix P and a diagonal matrix D such that A = PDP^{-1} are given by
P = \begin{pmatrix} 1 & 0 & -2 \\ -2 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} \quad \text{and} \quad D = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 5 \end{pmatrix}.
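As a quick numerical sanity check (not part of the pencil-and-paper solution, and assuming NumPy is available), the diagonalization found above can be verified directly:

```python
import numpy as np

# The matrix A from Question 1, the diagonalizing matrix P
# (columns are the eigenvectors found above), and D = diag(4, 5, 5).
A = np.array([[4., 0., -2.],
              [2., 5., 4.],
              [0., 0., 5.]])
P = np.array([[1., 0., -2.],
              [-2., 1., 0.],
              [0., 0., 1.]])
D = np.diag([4., 5., 5.])

# Check the diagonalization A = P D P^{-1}.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```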
Question 2. [p 326. #27]
Show that if A is both diagonalizable and invertible, then so is A−1.
Solution: Since A is diagonalizable, there exists an invertible matrix P and a diagonal matrix D such that
A = PDP^{-1}.
Since A is invertible, λ = 0 is not an eigenvalue of A, so all of the diagonal entries of D are nonzero, and hence D is also invertible. Therefore,
A^{-1} = (PDP^{-1})^{-1} = P D^{-1} P^{-1},

and A^{-1} is also diagonalizable with the same diagonalizing matrix P, and the diagonal matrix is made up of the inverses of the eigenvalues of A.
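A numerical sketch of this fact, reusing the matrix from Question 1 (which is both invertible and diagonalizable); NumPy is assumed:

```python
import numpy as np

# An invertible, diagonalizable matrix (the one from Question 1)
# together with its diagonalization A = P D P^{-1}.
A = np.array([[4., 0., -2.],
              [2., 5., 4.],
              [0., 0., 5.]])
P = np.array([[1., 0., -2.],
              [-2., 1., 0.],
              [0., 0., 1.]])
D = np.diag([4., 5., 5.])

# A^{-1} should be diagonalized by the same P, with diagonal D^{-1}.
D_inv = np.diag(1.0 / np.diag(D))
print(np.allclose(np.linalg.inv(A), P @ D_inv @ np.linalg.inv(P)))  # True
```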
Question 3. [p 326. #28]
Show that if an n × n matrix A has n linearly independent eigenvectors, then so does AT . [Hint: Use the Diagonalization Theorem.]
Solution: If A is an n × n matrix and has n linearly independent eigenvectors, then A is diagonalizable, so there exists an invertible matrix P and a diagonal matrix D such that
A = PDP^{-1},

and taking the transpose of both sides of this equation, we have
A^T = (PDP^{-1})^T = (P^{-1})^T D^T P^T = (P^T)^{-1} D P^T = QDQ^{-1},

where Q = (P^T)^{-1} is invertible. Therefore, A^T is diagonalizable, and so by the Diagonalization Theorem, A^T has n linearly independent eigenvectors.

Question 4. [p 334. #22]
Show that if A is an n×n matrix which is diagonalizable and B is similar to A, then B is also diagonalizable.
Solution: If A is diagonalizable, then there exists an invertible matrix P and a diagonal matrix D such
that A = PDP^{-1}.
If A is similar to a matrix B, then there exists an invertible matrix Q such that B = QAQ−1, and therefore
B = Q(PDP^{-1})Q^{-1} = (QP)D(P^{-1}Q^{-1}) = (QP)D(QP)^{-1},

where QP is invertible, so B is also diagonalizable.
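This can be illustrated numerically; A, P, and D below are taken from Question 1, while the invertible matrix Q is arbitrary and chosen purely for the example:

```python
import numpy as np

# A diagonalizable A = P D P^{-1} and an arbitrary invertible Q
# defining the similar matrix B = Q A Q^{-1}.
A = np.array([[4., 0., -2.],
              [2., 5., 4.],
              [0., 0., 5.]])
P = np.array([[1., 0., -2.],
              [-2., 1., 0.],
              [0., 0., 1.]])
D = np.diag([4., 5., 5.])
Q = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])  # invertible: det(Q) = 7

B = Q @ A @ np.linalg.inv(Q)

# B should be diagonalized by QP with the same diagonal matrix D.
QP = Q @ P
print(np.allclose(B, QP @ D @ np.linalg.inv(QP)))  # True
```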
Question 5. [p 334. #24]
Show that if A and B are square matrices which are similar, then they have the same rank.
Solution: First we note that if P is invertible, then we can reduce P to the identity I using elementary row operations, and the same elementary row operations will reduce P A to A. Since elementary row operations do not change the row space, then Row(A) = Row(P A), so that
rank(A) = dim(Row(A)) = dim(Row(P A)) = rank(P A).
A similar argument shows that for any invertible matrix P, we have
rank(AT ) = dim(Row(AT )) = dim(Row(P T AT )) = rank(P T AT ) = rank((AP )T ) = rank(AP).
Now, since A and AT have the same rank, we have shown that for any invertible matrix P, we have
rank(A) = rank(P A) = rank(AP ).
If A and B are similar, then there exists an invertible matrix S such that A = SBS−1, so that
rank(A) = rank(SBS^{-1}) = rank(BS^{-1}) = rank(B)

from the above.

Question 6. [p 334. #25]
The trace of a square matrix A is the sum of the diagonal entries in A and is denoted by tr(A).

(a) Show that if A and B are any n × n matrices, then tr(AB) = tr(BA).

(b) Show that if A and B are similar, then tr(A) = tr(B).

(c) If A is an n × n matrix with characteristic polynomial

p(λ) = (−1)^n (λ^n + a_1 λ^{n−1} + · · · + a_{n−1} λ + a_n),

what is the coefficient of λ^{n−1}?

(d) Show that the trace of an n × n matrix A equals the sum of the eigenvalues of A.
Solution:
(a) If A and B are n × n matrices, then
tr(AB) = \sum_{i=1}^{n} (AB)_{ii} = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} b_{ji} = \sum_{j=1}^{n} \left( \sum_{i=1}^{n} b_{ji} a_{ij} \right) = \sum_{j=1}^{n} (BA)_{jj} = tr(BA),
and tr(AB) = tr(BA).
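A quick numerical illustration of part (a); the random matrices here are purely for the example, and NumPy is assumed:

```python
import numpy as np

# Two arbitrary square matrices of the same size.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# tr(AB) = tr(BA), even though AB and BA are different matrices in general.
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))  # True
```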
(b) If A and B are similar, then there exists an invertible matrix S such that A = SBS−1, and therefore
tr(A) = tr((SB)S^{-1}) = tr(S^{-1}(SB)) = tr((S^{-1}S)B) = tr(IB) = tr(B),

using part (a).
(c) If the eigenvalues of A are λ1, λ2, . . . , λn, then the characteristic polynomial of A factors as
p(λ) = (λ1 − λ)(λ2 − λ) · · · (λn − λ)
and the coefficient of λn−1 is equal to
(−1)^{n−1} (λ1 + λ2 + · · · + λn),
and comparing this to the coefficient of λn−1 in the standard form of the characteristic polynomial, we have
a1 = −(λ1 + λ2 + · · · + λn).

(d) In the expansion of the determinant of A, there are exactly n! terms. Suppose a term T in det(A) contains a factor aij where i ≠ j, that is,

T = (±1) a1∗ a2∗ · · · aij · · · an∗,
then T cannot contain the factors aii or ajj , or it would have two factors from the same row or the same column. Therefore, except for the “diagonal” term
a11a22 · · · ann,
every term in det(A) has n − 2 or fewer diagonal entries as factors.
This means that in the expansion of the determinant det(A − λI), every term contains n − 2 or fewer factors with a λ, except for the diagonal term, which contains exactly n factors with a λ.
Therefore, the characteristic polynomial of A can be written as
p(λ) = det(A − λI) = (a11 − λ)(a22 − λ) · · · (ann − λ) + pn−2(λ)
where pn−2(λ) is a polynomial in λ of degree n − 2 or less.
The coefficient of λn−1 is therefore
(−1)^{n−1} (a11 + a22 + · · · + ann) = (−1)^{n−1} tr(A),
and therefore tr(A) = λ1 + λ2 + · · · + λn.
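A numerical sketch of part (d); the matrix below is an arbitrary example, not one from the assignment, and NumPy is assumed:

```python
import numpy as np

# Any square matrix works; this one is just an illustration.
A = np.array([[2., 1., 0.],
              [0., 3., 4.],
              [1., 0., -1.]])

# The trace should equal the sum of the (possibly complex) eigenvalues.
eig_sum = np.sum(np.linalg.eigvals(A))
print(np.isclose(np.trace(A), eig_sum))  # True
```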
Question 7. [p 341. #23]
Let A be an n × n real matrix with the property that A^T = A, let x be any vector in C^n, and let q = \bar{x}^T A x. Show that \bar{q} = q, that is, q is a real number. Hint: Justify each of the equalities below:

\bar{q} = \overline{\bar{x}^T A x} = x^T A \bar{x} = (x^T A \bar{x})^T = \bar{x}^T A^T x = \bar{x}^T A x = q.

Solution: The only equality that is not immediately evident (at least to me) is

x^T A \bar{x} = (x^T A \bar{x})^T,

however, note that the matrix on the left-hand side is a 1 × 1 matrix, and so its transpose is again the same 1 × 1 matrix.

Question 8. [p 341. #24]
Let A be an n × n real symmetric matrix, that is, A has real entries and A^T = A. Show that if Ax = λx for some nonzero vector x in C^n, then, in fact, λ is real and the real part of x is an eigenvector of A. Hint: Compute \bar{x}^T A x and use Question 7. Also, examine the real and imaginary parts of Ax.
Solution: Suppose that Ax = λx, where x ∈ C^n and x ≠ 0, then

q = \bar{x}^T A x = \bar{x}^T λ x = λ \bar{x}^T x,

and

\bar{q} = \overline{λ \bar{x}^T x} = \bar{λ} x^T \bar{x} = \bar{λ} (x^T \bar{x})^T = \bar{λ} \bar{x}^T x,

since x^T \bar{x} is a 1 × 1 matrix. From the previous problem, q is real, so that \bar{q} = q, and
(λ − \bar{λ}) \bar{x}^T x = 0,

and since x ≠ 0, then

\bar{x}^T x = |x1|^2 + |x2|^2 + · · · + |xn|^2 ≠ 0,

so we must have λ − \bar{λ} = 0, that is, λ is real.
Let x = u + iv, where u, v ∈ Rn, then Ax = λx implies that
Au + iAv = λu + iλv
and equating real and imaginary parts, we have
Au = λu and Av = λv,
so that the real part u of x is an eigenvector of A corresponding to the eigenvalue λ (as is the imaginary part v); note that since x ≠ 0, at least one of u and v is nonzero.
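The result can be illustrated numerically; the symmetric matrix below is an arbitrary example, and NumPy is assumed:

```python
import numpy as np

# A real symmetric matrix (chosen for illustration): A^T = A.
A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

# All eigenvalues of a real symmetric matrix should be real.
eigenvalues = np.linalg.eigvals(A)
print(np.allclose(eigenvalues.imag, 0.0))  # True
```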
Question 9. [p 371. #2]
Let A and B be n × n matrices. Show that if x is an eigenvector of the matrix product AB and Bx ≠ 0, then Bx is an eigenvector of BA.
Solution: We have ABx = λx for some eigenvalue λ, so that

BA(Bx) = B(ABx) = B(λx) = λ(Bx),

and if Bx ≠ 0, then Bx is an eigenvector of BA corresponding to the same eigenvalue λ.

Question 10. [p 372. #17]
Let A be a 2 × 2 real matrix. Show that the eigenvalues of A are both real if and only if

(tr(A))^2 − 4 · det(A) ≥ 0.
Hint: The characteristic polynomial of a 2 × 2 matrix A is given by
p(λ) = λ^2 − tr(A) · λ + det(A).
Solution: Completing the square, we have
p(λ) = λ^2 − tr(A) · λ + \frac{(tr(A))^2}{4} + det(A) − \frac{(tr(A))^2}{4} = \left( λ − \frac{tr(A)}{2} \right)^2 + det(A) − \frac{(tr(A))^2}{4},

and the eigenvalues of A are both real if and only if

det(A) − \frac{(tr(A))^2}{4} ≤ 0,

that is, if and only if

(tr(A))^2 − 4 · det(A) ≥ 0.
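A small sketch of this criterion; the helper `has_real_eigenvalues` and the two test matrices are illustrative, not from the assignment, and NumPy is assumed:

```python
import numpy as np

def has_real_eigenvalues(A):
    """Check the criterion tr(A)^2 - 4 det(A) >= 0 for a 2x2 real matrix."""
    return np.trace(A) ** 2 - 4.0 * np.linalg.det(A) >= 0.0

# A symmetric matrix: real eigenvalues, so the criterion holds.
print(has_real_eigenvalues(np.array([[1., 2.], [2., 1.]])))   # True
# A rotation by 90 degrees: eigenvalues +/- i, so the criterion fails.
print(has_real_eigenvalues(np.array([[0., -1.], [1., 0.]])))  # False
```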