
18.06 Recitation 10 Solutions May 15, 2018

For each of the following statements, decide if it is true or false. If false, give a counterexample. Assume that the dimensions of all of the matrices make sense.

1. Suppose that A is a 2 × 2 matrix with an eigenvalue of 1 and corresponding eigenvector (1, 2)^T. Then, for every vector v, A^k v → (1, 2)^T.

False. For example, the matrix A = [3 −1; 2 0] has eigenvalues 1 and 2 with eigenvectors (1, 2)^T and (1, 1)^T. For v = (1, 1)^T, A^k v will grow to infinity. A statement that is true is: "Suppose that A is a 2 × 2 matrix with an eigenvalue of 1 and corresponding eigenvector (1, 2)^T, and A has another eigenvalue λ with |λ| < 1. Then, for every vector v, A^k v → c (1, 2)^T for some constant c depending on v."
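The counterexample above can be checked numerically. A small pure-Python sketch (the helper `matvec` is ours, not part of the solutions); the matrix A = [3 −1; 2 0] is the one from the solution:

```python
# Check problem 1's counterexample: A fixes (1, 2) but doubles (1, 1),
# so iterating A on v = (1, 1) blows up instead of converging to (1, 2).

def matvec(A, v):
    # multiply a matrix (list of rows) by a vector
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[3, -1], [2, 0]]

assert matvec(A, [1, 2]) == [1, 2]   # eigenvalue 1: A v = v
assert matvec(A, [1, 1]) == [2, 2]   # eigenvalue 2: A v = 2 v

v = [1, 1]
for _ in range(20):
    v = matvec(A, v)
print(v)  # components have grown to 2^20
```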

2. rank(A) = rank(AA^T) = rank(A^T A).

True. If Av = 0, we also have that A^T Av = 0. Conversely, if A^T Av = 0, then ‖Av‖² = v^T A^T Av = 0, and so Av = 0. This tells us that N(A) = N(A^T A). By the rank–nullity theorem, we then have that rank(A) = rank(A^T A). Since rank(A) = rank(A^T) = rank(AA^T) by similar reasoning, the statement is true.
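The rank identity can also be checked numerically. The sketch below uses a hand-rolled Gaussian-elimination rank routine over exact fractions and a hypothetical rank-2 example matrix (none of these names come from the solutions):

```python
# Check problem 2 on an example: rank(A) = rank(A^T A) = rank(A A^T).
from fractions import Fraction

def rank(M):
    # row-reduce a copy of M over exact rationals and count pivots
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]   # second row is twice the first: rank 2
print(rank(A), rank(matmul(transpose(A), A)), rank(matmul(A, transpose(A))))
```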

3. If a matrix A has repeated eigenvalues, then it is not diagonalizable.

False. For example, the identity matrix [1 0; 0 1] has the repeated eigenvalue 1 but is already diagonal.

4. The eigenvalues of an orthogonal matrix are all 1 or −1.

False. For example, the matrix [0 1; −1 0] has eigenvalues i and −i. What is true is that all eigenvalues λ of an orthogonal matrix satisfy |λ| = 1.
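For a 2 × 2 matrix the eigenvalues come straight from the characteristic polynomial λ² − tr(A)λ + det(A) = 0, so the counterexample is easy to verify (a sketch; `eig2` is our helper):

```python
# Check problem 4: the rotation [0 1; -1 0] is orthogonal but has
# eigenvalues i and -i, both of absolute value 1.
import cmath

def eig2(A):
    # roots of the 2x2 characteristic polynomial via the quadratic formula
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

Q = [[0, 1], [-1, 0]]       # orthogonal: a 90-degree rotation
l1, l2 = eig2(Q)
print(l1, l2)               # neither is +1 or -1
print(abs(l1), abs(l2))     # but both satisfy |lambda| = 1
```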

5. If the vectors v1, . . . , vn are orthonormal to one another, then they are also linearly independent.

True. If c1 v1 + . . . + cn vn = 0, then taking the inner product of both sides with vi gives ci = 0. This is true for all i.

6. The sum of two positive definite matrices is positive definite.

True. If A and B are positive definite, then for any v ≠ 0, we have that

v^T (A + B)v = v^T Av + v^T Bv > 0,

since this is true for both A and B. Also, the sum of two symmetric matrices is still symmetric.

7. If P is a projection matrix, then it is symmetric.

True. If P = A(A^T A)^{-1} A^T is the projection onto C(A), then we can check explicitly that P^T = P.
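For a one-column matrix A = a, the projection formula collapses to P = a a^T / (a^T a), which makes the symmetry (and idempotence) easy to check directly. A sketch with the hypothetical column a = (1, 2)^T:

```python
# Check problem 7 in the one-column case: P = a a^T / (a^T a)
# satisfies P^T = P and P^2 = P.
from fractions import Fraction

a = [Fraction(1), Fraction(2)]
aTa = sum(x * x for x in a)
P = [[a[i] * a[j] / aTa for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

assert P == [list(row) for row in zip(*P)]   # symmetric: P^T = P
assert matmul(P, P) == P                     # idempotent: P^2 = P
print(P)
```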

8. The following are vector spaces:

(a) The set of triples (a, b, c) such that a + b + c = 1. False. The linearity condition (if v, w are in the space, is cv + dw in the space?) doesn't hold.

(b) The set of invertible 2 × 2 matrices. False. Linearity fails.

(c) The null space of a matrix A. True. Check that linear combinations remain in the space.

(d) The space of polynomials of degree at most 3. True. Check that linear combinations remain in the space.

(e) The set of functions f(x) such that 2f + df/dx = 0. True. Check that linear combinations remain in the space.

9. Let T be the linear transformation acting on the space of polynomials of degree ≤ 3 that takes each polynomial to its derivative (T(p(x)) = (d/dx) p(x)). This transformation is invertible.

False. The transformation is not injective. For example, both the functions p(x) = 1 and q(x) = 2 map to the zero function under T.

10. If A has distinct eigenvalues, then it has distinct singular values.

False. For example, [0 1; −1 0] has eigenvalues i, −i but singular values 1, 1.

11. If two matrices have the same eigenvalues, then they are similar.

False. For example, [1 0; 0 1] and [1 1; 0 1] both have eigenvalues 1, 1 but are not similar, because the first matrix is diagonalizable but the second one is not.

12. If two diagonalizable matrices have the same eigenvalues, then they are similar.

True. They are both similar to the same diagonal matrix and therefore are similar to each other.

13. If two matrices are similar, then they have the same eigenvalues.

True. If A = XBX^{-1}, then B having an eigenvalue–eigenvector pair λ, v means that A has an eigenvalue–eigenvector pair λ, Xv.

14. If two matrices are similar, then they have the same singular values.

False. For example, [0 c; 0 0] has singular values c and 0, but all matrices of this form (with c ≠ 0) are similar.
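The singular values of A are the square roots of the eigenvalues of A^T A, which for A = [0 c; 0 0] gives A^T A = [0 0; 0 c²] and hence singular values c and 0. A quick 2 × 2 sketch (our helper, shown for the hypothetical choice c = 3):

```python
# Check problem 14: singular values of [[0, c], [0, 0]] are c and 0.
import math

def singular_values_2x2(A):
    # eigenvalues of A^T A via the 2x2 characteristic polynomial
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    tr = AtA[0][0] + AtA[1][1]
    det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return sorted((math.sqrt((tr + disc) / 2),
                   math.sqrt(max((tr - disc) / 2, 0.0))), reverse=True)

sv = singular_values_2x2([[0, 3], [0, 0]])
print(sv)   # c = 3 and 0
```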

15. A matrix A cannot be similar to the matrix A + I.

True. If A has an eigenvalue–eigenvector pair λ, v, then (A + I)v = (λ + 1)v. Therefore, the eigenvalues of A + I are the eigenvalues of A, plus one. Since the sets of eigenvalues for the two matrices are different, the matrices are not similar.

16. If A is row reduced to reduced row echelon form R, then the pivot columns of R form a basis for the column space of A.

False. The matrix [1 1; 1 1] reduces to [1 1; 0 0]. The pivot column of R is (1, 0)^T, which is not a basis for the column space of the original matrix.

17. Let T be a linear transformation. If we know that, relative to the input basis {vi} and the output basis {wi}, T is represented by the matrix A, then T^2 is represented by the matrix A^2.

False. This is only true when vi = wi for all i.

18. A rotation matrix that is not the identity always has complex eigenvalues.

False. If the rotation matrix is [−1 0; 0 −1] (rotation by 180 degrees), then the eigenvalues are both −1. But other than the 0 and 180 degree rotations, every rotation matrix in 2 dimensions has non-real eigenvalues.
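A 2D rotation by θ has eigenvalues cos θ ± i sin θ, which are real exactly when sin θ = 0. Reusing the 2 × 2 characteristic-polynomial trick (a sketch; `eig2` is our helper):

```python
# Check problem 18: eigenvalues of a rotation are cos(theta) +/- i sin(theta).
import cmath, math

def eig2(A):
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def rotation(theta):
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

l1, l2 = eig2(rotation(math.pi / 3))   # 60 degrees: non-real eigenvalues
print(l1, l2)

l3, l4 = eig2(rotation(math.pi))       # 180 degrees: both eigenvalues are -1
print(l3, l4)
```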

19. If a matrix has all nonzero eigenvalues, then it is invertible.

True. The determinant is the product of the eigenvalues and is nonzero.

20. If a matrix does not have a basis of eigenvectors, then it is not invertible.

False. For example, [1 1; 0 1].
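The counterexample [1 1; 0 1] has only one independent eigenvector (every eigenvector is a multiple of (1, 0)^T, since (A − I)v = 0 forces the second component of v to be 0), yet it is invertible. A quick check against its explicit inverse [1 −1; 0 1]:

```python
# Check problem 20: [[1, 1], [0, 1]] is invertible despite having no
# basis of eigenvectors.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
Ainv = [[1, -1], [0, 1]]
print(matmul(A, Ainv))   # the identity, so A is invertible
```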

21. If A and B have the same four fundamental subspaces, then A is a multiple of B.

False. For example, A = [1 1; 0 1] and B = [1 1; 1 0].

22. If A is a square matrix, then C(A) = C(A^T).

False. For example, [1 1; 0 0].

23. If the columns of a matrix are linearly dependent, then so are the rows.

False. For example, [1 1 1; 0 1 1].

24. If all of the eigenvalues of a matrix are 0, then the matrix is the zero matrix.

False. For example, [0 1; 0 0].

25. If all of the singular values of a matrix are 0, then the matrix is the zero matrix. True. Such a matrix would have rank 0 and therefore be the zero matrix.

26. If a matrix A is positive definite, then any matrix similar to A is also positive definite.

False. The matrices [1 0; 0 2] and [1 1; 0 2] are similar. The first is positive definite, but the second is not, because it is not symmetric.

27. If the eigenvalues of A are real, then the eigenvalues of e^A are necessarily positive.

True. The eigenvalues of e^A are e^{λi}, where the λi are the eigenvalues of A. If λi is real, then e^{λi} is positive.

28. The matrix e^A is always invertible.

True. The matrix e^{−A} is its inverse. Also, the eigenvalues of e^A are e^{λi}, where the λi are the eigenvalues of A, and e^{λi} is never zero.
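The eigenvalue claim in problems 27–28 can be illustrated by approximating e^A with a truncated Taylor series Σ A^k / k! (a sketch; the example matrix A = [1 1; 0 2], with eigenvalues 1 and 2 and eigenvectors (1, 0)^T and (1, 1)^T, is our choice, not from the solutions):

```python
# Check problems 27-28: e^A acts on an eigenvector of A with eigenvalue
# lambda as multiplication by e^lambda.
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm(A, terms=30):
    # truncated Taylor series: I + A + A^2/2! + ... (fine for small A)
    E = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, A)]
        E = [[E[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return E

def matvec(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1], M[1][0]*v[0] + M[1][1]*v[1]]

A = [[1.0, 1.0], [0.0, 2.0]]
E = expm(A)
v1 = matvec(E, [1.0, 0.0])   # approximately e^1 * (1, 0)
v2 = matvec(E, [1.0, 1.0])   # approximately e^2 * (1, 1)
print(v1, v2)
```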

29. Suppose that you row reduce the matrix A to get the matrix B. Then,

(a) C(A) = C(B). False. For example, [1 1; 1 1] and [1 1; 0 0].

(b) N(A) = N(B). True. We are solving for vectors v such that Av = 0, which corresponds to solving a system of linear equations. Row reductions just add and scale these linear equations, which does not change the solutions.

30. A matrix with real eigenvalues and n real eigenvectors is always symmetric.

False. For example, [1 1; 0 2].

31. A matrix with real eigenvalues and n orthonormal eigenvectors is always symmetric.

True. Such a matrix A can be written as A = XΛX^{-1}, where X is orthogonal. So A = XΛX^T, and A^T = (XΛX^T)^T = XΛ^T X^T = XΛX^T = A.

32. The inverse of an invertible symmetric matrix is always symmetric.

True. If A^T = A, then (A^{-1})^T = (A^T)^{-1} = A^{-1}.

33. If A is positive definite, then A^{-1} is also positive definite.

True. If the eigenvalues of A are λi > 0, then the eigenvalues of A^{-1} are 1/λi > 0.

34. A positive definite matrix cannot have a 0 on its main diagonal.

True. If 0 is the ith entry down the diagonal, then let x be the vector of all 0's except for a 1 in the ith entry. Then x^T Ax = 0, and so A could not be positive definite.

35. Suppose that you are trying to solve Ax = b. If A has dimensions m × n (m equations, n variables), then:

(a) if m < n, the system is always solvable. False. Consider Ax = b where A is the 2 × 3 zero matrix and b is nonzero.

(b) if m > n, the system is never solvable. False. Consider Ax = b where A is the 3 × 2 zero matrix and b is zero.

(c) if m = n, the system has at most one solution. False. Consider Ax = b where A is the 2 × 2 zero matrix and b is zero.

36. The singular value decomposition of a matrix is unique.

False. For example,

[1 0; 0 1] [2 0; 0 1] [1 0; 0 1] = [−1 0; 0 1] [2 0; 0 1] [−1 0; 0 1],

so the matrix [2 0; 0 1] has two different singular value decompositions.
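The sign-flip identity in problem 36 is a one-line check: flipping the sign of the first column in both U and V leaves UΣV^T unchanged.

```python
# Check problem 36: two different SVDs of [[2, 0], [0, 1]].

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

S = [[2, 0], [0, 1]]             # Sigma (also the matrix being decomposed)
I = [[1, 0], [0, 1]]
F = [[-1, 0], [0, 1]]            # sign-flipped orthogonal factor (F^T = F)

svd1 = matmul(matmul(I, S), I)   # U = V = I
svd2 = matmul(matmul(F, S), F)   # U = V = F
print(svd1, svd2)                # both products give back [[2, 0], [0, 1]]
```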

37. If T is an invertible linear transformation, then there exists an input and output basis such that T with respect to these bases is represented by the identity matrix.

True. Pick any basis {vi} for your domain to be your input basis, and pick {T vi} to be your output basis.

38. If AB = I, then BA = I.

False. For example, take A = [1 0 0; 0 1 0] and B = [1 0; 0 1; 0 0]. Then AB = [1 0; 0 1], but BA = [1 0 0; 0 1 0; 0 0 0].

39. The space of polynomials has a finite basis.

False. Any finite basis would only span the subset of polynomials up to a certain degree.

40. The product of two Markov matrices is Markov.

True. Let e^T be the row vector of all ones of the appropriate length. If M and N are Markov matrices, then e^T M = e^T and e^T N = e^T, and so e^T MN = e^T N = e^T. This implies that the columns of MN still sum to 1. The entries are all nonnegative, since the product of two matrices with nonnegative entries still has nonnegative entries.
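The column-sum argument in problem 40 can be checked on an example. The two 2 × 2 Markov (column-stochastic) matrices below are hypothetical choices, not from the solutions:

```python
# Check problem 40: the product of two Markov matrices has columns
# summing to 1 and nonnegative entries.
from fractions import Fraction as F

M = [[F(1, 2), F(1, 3)], [F(1, 2), F(2, 3)]]   # columns sum to 1
N = [[F(1, 4), F(3, 5)], [F(3, 4), F(2, 5)]]   # columns sum to 1

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = matmul(M, N)
cols_sum = [P[0][j] + P[1][j] for j in range(2)]
print(cols_sum)                                   # each column sums to 1
assert all(P[i][j] >= 0 for i in range(2) for j in range(2))
```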