
(c)2015 UM Math Dept licensed under a Creative Commons By-NC-SA 4.0 International License.

Math 217: True False Practice Professor Karen Smith

1. For every 2 × 2 matrix A, there exists a 2 × 2 matrix B such that det(A + B) ≠ det A + det B.

Solution note: False! Take A to be the zero matrix for a counterexample: then det(A + B) = det B = det A + det B for every B.
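A quick numerical sanity check of the counterexample (a sketch of my own, not part of the original notes): with A the zero matrix, det(A + B) = det A + det B for every B tried.

```python
def det2(m):
    # determinant of a 2x2 matrix given as nested lists
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

A = [[0, 0], [0, 0]]   # the zero matrix from the solution note
for B in ([[1, 2], [3, 4]], [[5, 0], [0, 5]], [[0, 1], [1, 0]]):
    S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
    assert det2(S) == det2(A) + det2(B)
```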

2. If the change of basis matrix S_{A→B} = [~e4 ~e3 ~e2 ~e1], then the elements of A are the same as the elements of B, but in a different order.

Solution note: True. The matrix tells us that the first element of basis A is the fourth element of basis B, the second element of basis A is the third element of B, the third element of basis A is the second element of B, and the fourth element of basis A is the first element of B.

3. Every linear transformation R^6 → R^6 is an isomorphism.

Solution note: False. The zero map is a counterexample.

4. If matrix B is obtained from A by swapping two rows and det A < det B, then A is invertible.

Solution note: True: we only need to know det A ≠ 0. But if det A = 0, then det B = − det A = 0, so det A < det B does not hold.

5. If two n × n matrices have the same determinant, they are similar.

Solution note: False: the only matrix similar to the zero matrix is itself, but many matrices besides the zero matrix have determinant zero, such as [1 0; 0 0].

6. If all the entries of an invertible matrix A are integers, the same is true for A^{-1}.

Solution note: False: Take [1 0; 0 2].

7. There is no square matrix A such that det(7A) = 7 det(A).

Solution note: False: the zero matrix is a counterexample.

8. If ⟨x, y⟩ = ⟨x, z⟩ for vectors x, y, z in an inner product space, then y − z is orthogonal to x.

Solution note: True: 0 = ⟨x, y⟩ − ⟨x, z⟩ = ⟨x, y − z⟩, so x and y − z are orthogonal.

9. Suppose A is an m × n matrix whose columns span a subspace V of R^m. If ~b ∈ V^⊥, then A~x = ~b has a unique least squares solution.

Solution note: False. Let A be any matrix with a non-trivial kernel, such as [1 0 0; 0 1 0; 0 0 0]. Its columns span the xy-plane in R^3, and the vector ~e3 is in V^⊥. Since the kernel of A is non-trivial, the least squares solution is not unique.

10. If the columns of a matrix A are orthonormal, then A is orthogonal.

Solution note: False. Take the 3 × 2 matrix whose columns are ~e1, ~e2. Orthogonal matrices are SQUARE.

11. In any inner product space, ||f||^2 = ⟨f, f⟩ for all f.

Solution note: True. Definition.

12. In any inner product space, if ||f|| = ||g||, then f = g.

Solution note: False! Take ~e1 and ~e2 in R^2 with the standard inner (dot) product.

13. If f and g are elements in an inner product space satisfying ||f|| = 2, ||g|| = 4 and ||f + g|| = 5, then it is possible to find the exact value of ⟨f, g⟩.

Solution note: True: 5^2 = ||f + g||^2 = ⟨f + g, f + g⟩ = ||f||^2 + 2⟨f, g⟩ + ||g||^2 = 2^2 + 2⟨f, g⟩ + 4^2. This can be solved for ⟨f, g⟩, which is 5/2.

14. If for some vectors y, z in an inner product space, ⟨x, y⟩ = ⟨x, z⟩ for all x, then y = z.

Solution note: True. ⟨x, y⟩ = ⟨x, z⟩ implies ⟨x, y − z⟩ = 0 for all x. Now take x = y − z. So ⟨y − z, y − z⟩ = 0, which means y − z = 0.

15. There is no invertible 5 × 5 matrix A such that det(5A) = 5 det(A).

Solution note: TRUE: det(5A) = 5^5 det(A). This is never equal to 5 det(A) because det A ≠ 0.
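The identity det(cA) = c^n det A behind item 15 can be spot-checked numerically; the recursive determinant below is my own helper, not part of the notes.

```python
def det(m):
    # Laplace expansion along the first row (fine for small matrices)
    if len(m) == 1:
        return m[0][0]
    return sum((-1)**j * m[0][j] * det([row[:j] + row[j+1:] for row in m[1:]])
               for j in range(len(m)))

I5 = [[1 if i == j else 0 for j in range(5)] for i in range(5)]  # identity, det = 1
fiveI5 = [[5 * x for x in row] for row in I5]
assert det(fiveI5) == 5**5 * det(I5)   # 3125, far from 5 * det(I5)
```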

1  1  5 3 16. There is a linear transformation R → R sending ~e1 and ~e2 to 0 , and ~e3 to −1 . 2 2

Solution note: True. Take any 3×5 matrix whose first three columns are the specified vectors.

17. There exist vectors ~x, ~y in an inner product space such that ||~x|| = 1, ||~y|| = 2, ||~x + ~y|| = 3, and ⟨x, y⟩ = 0.

Solution note: False: if ⟨x, y⟩ = 0, then ||~x + ~y||^2 = ||~x||^2 + ||~y||^2 = 1 + 4 = 5, but ||~x + ~y||^2 = 3^2 = 9.

18. There exists a 3 × 3 matrix P such that the linear transformation T : R^{3×3} → R^{3×3} defined by T(M) = MP − PM is an isomorphism.

Solution note: False: the matrix I3 is in the kernel for any choice of P .

19. If A is an n × d matrix with factorization A = QR, then the columns of A and of Q span the same subspace of R^n.

Solution note: True: the columns of Q are an orthonormal basis for the space spanned by the columns of A.

20. If A has orthogonal columns, and A = QR is its QR-factorization, then R is diagonal.

Solution note: True: when doing Gram-Schmidt, we only wind up scaling each column of A by its length to make it unit length; the columns are already orthogonal. So if v_1, . . . , v_d are the columns of A, then the i-th column of Q is u_i = (1/||v_i||) v_i. The i-th column of the matrix R is v_i written in the orthonormal basis u_1, . . . , u_d. This is 0u_1 + ··· + ||v_i||u_i + 0u_{i+1} + ··· + 0u_d, so R is diagonal: the only non-zero entries are in the spots ii.
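To see item 20 concretely, here is a minimal classical Gram-Schmidt sketch (my own helper, not from the notes): feeding it a matrix with orthogonal columns yields a diagonal R with the column lengths on the diagonal.

```python
import math

def qr_columns(cols):
    """Classical Gram-Schmidt on a list of column vectors; returns (Q columns, R)."""
    q = []
    n = len(cols)
    r = [[0.0] * n for _ in range(n)]
    for i, v in enumerate(cols):
        w = list(v)
        for j in range(i):
            r[j][i] = sum(a * b for a, b in zip(q[j], v))   # component along u_j
            w = [a - r[j][i] * b for a, b in zip(w, q[j])]
        r[i][i] = math.sqrt(sum(a * a for a in w))          # remaining length
        q.append([a / r[i][i] for a in w])
    return q, r

# columns (2,0,0) and (0,3,4) are orthogonal; R should be diag(2, 5)
qcols, R = qr_columns([[2.0, 0.0, 0.0], [0.0, 3.0, 4.0]])
assert abs(R[0][1]) < 1e-12                            # off-diagonal entry vanishes
assert abs(R[0][0] - 2) < 1e-12 and abs(R[1][1] - 5) < 1e-12
```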

21. If (~v1, . . . , ~vd) is a basis for the subspace V of R^n and ~b ∈ V, then the least squares solutions of [~v1 ~v2 . . . ~vd]~x = ~b are exact solutions to [~v1 ~v2 . . . ~vd]~x = ~b.

Solution note: True: since ~b ∈ V, it is in the span of the columns of [~v1 ~v2 . . . ~vd]. So the system is consistent, and thus the least squares solutions are actual solutions.

22. If (~v1, . . . , ~vd) is a basis for the subspace V of R^n and ~b ∈ V^⊥, then the only least squares solution of [~v1 ~v2 . . . ~vd]~x = ~b is the zero vector.

Solution note: TRUE. Because ~b ∈ V^⊥, the projection of ~b to V is zero. This means that the least squares solutions are the actual solutions to A~x = ~0. But any such solution ~x is a linear relation on the columns of A. We assumed these columns are linearly independent, so the only such solution is the trivial solution ~x = ~0. If the rank of A were LESS than d, then the claim would have been FALSE because the kernel would have had dimension at least 1.

23. If all entries of a 7 × 7 matrix A are 7, then det A = 7^7.

Solution note: False! The columns are linearly dependent, so the determinant is 0.

24. If all the entries of an invertible matrix A are integers, the same is true for A^{-1}.

Solution note: False: Take [1 0; 0 2].

25. Every surjective linear transformation V → V is an isomorphism.

Solution note: False. The derivative map on R[x] is a counterexample. If V is finite dimensional, the statement is true by rank-nullity.

26. A matrix A is orthogonal if and only if A^T A = I_n.

Solution note: FALSE: A = [1 0; 0 1; 0 0] is a counterexample. The statement would be true if A were square.

27. If a matrix A is orthogonal, then A is invertible.

Solution note: True. Theorem in the book.

28. If A and B are orthogonal matrices of the same size, then AB = BA.

Solution note: False. Let A = [1 0; 0 −1] and B = [0 −1; 1 0].
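The counterexample for item 28 can be multiplied out directly; a tiny check of my own (the matrices are a reflection and a 90-degree rotation, both orthogonal):

```python
def matmul2(a, b):
    # product of two 2x2 matrices
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 0], [0, -1]]    # reflection across the x-axis (orthogonal)
B = [[0, -1], [1, 0]]    # rotation by 90 degrees (orthogonal)
assert matmul2(A, B) == [[0, -1], [-1, 0]]
assert matmul2(B, A) == [[0, 1], [1, 0]]
assert matmul2(A, B) != matmul2(B, A)
```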

29. There exists a subspace V of R^7 such that V and V^⊥ have the same dimension.

Solution note: False! V and V^⊥ have dimensions which add up to 7. So they can't be equal, or the sum would be even.

0 1 −b 30. The matrix 0 b 1  has rank 3 for every value of b. 1 0 0

Solution note: True. The determinant is b^2 + 1, which can never be 0.
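A spot check of the determinant claim over several values of b (pure-Python 3 × 3 determinant, my own helper):

```python
def det3(m):
    # cofactor expansion of a 3x3 determinant along the first row
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

for b in [-3, -1, 0, 1, 2, 10]:
    M = [[0, 1, -b], [0, b, 1], [1, 0, 0]]
    assert det3(M) == b*b + 1   # never zero, so the rank is always 3
```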

31. There is a linear transformation T : V → V satisfying T(x) ≠ x for every non-zero vector x and whose B-matrix [T]_B in some basis is [1 0 1 −7; 0 0 13 1; 0 1 0 0; 0 −3 5 6].

Solution note: False! The first vector in the basis B is taken to itself, as one look at the first column of [T]_B will tell you.

32. If A is a B-matrix for a linear transformation, then A is invertible.

Solution note: False. The zero matrix of size 2 × 2 is the matrix of the zero transformation in any basis, but not invertible.

33. If the image of a linear transformation is infinite dimensional, then the source is also infinite dimensional.

Solution note: True. We give a proof by contradiction. If the source were finite dimensional, then rank-nullity says that the dimension of the image is at most the dimension of the source, hence also finite dimensional.

34. The differentiation map from the space R[x] of all polynomials to itself is an isomorphism.

Solution note: False: The kernel is non-zero since it includes all constant functions.

35. The map of R[x] to itself defined by f ↦ xf is an injective linear transformation.

Solution note: True: Say f and g have the same image. This means xf = xg, so x(f − g) = 0. But for a non-zero polynomial, multiplying by x keeps it non-zero. So f − g = 0, which means f = g.

36. The map of R[x] to itself defined by f ↦ f^2 is an injective linear transformation.

Solution note: FALSE! Not linear! Note that (1 + x)^2 ≠ 1 + x^2, which means that squaring does not respect addition!

37. If the B-matrix of a linear transformation T in some basis B is [3 5; 0 4], then there exists a non-zero vector f such that T(f) = 5f.

Solution note: False! If this were true, then the B-coordinates [a b]^T of f would satisfy [3 5; 0 4][a; b] = [5a; 5b]. Trying to solve for a, b, we see that the only solution is a = b = 0, which means there is no such non-zero vector f.
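Item 37 can also be phrased as an eigenvalue test; a quick sanity check of my own (not from the notes): 5 is an eigenvalue of [3 5; 0 4] exactly when det(A − 5I) = 0, and here it is not.

```python
A = [[3, 5], [0, 4]]
# A - 5I = [[-2, 5], [0, -1]]
AmI = [[A[0][0] - 5, A[0][1]], [A[1][0], A[1][1] - 5]]
det = AmI[0][0]*AmI[1][1] - AmI[0][1]*AmI[1][0]
assert det == 2   # nonzero, so T(f) = 5f forces f = 0
```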

38. If A is an 8 × 11 matrix with im A of dimension six, then (ker A^T)^⊥ has dimension 2.

Solution note: False, because im A = (ker A^T)^⊥, so it has dimension six, not 2.

39. The least squares solutions to A~x = ~b are the projection of ~b onto the image of A.

Solution note: False! Here is a 1 × 1 counterexample: the linear system 2x = 4 has solution x = 2 (which is also the least squares solution since the system is consistent), but the projection of 4 onto the span of 2 (in R) is 4 (again, because the system is consistent).

40. The least squares solutions to A~x = ~b are the actual solutions to the system A~x = ~b^∗ where ~b^∗ is such that ||~b − ~b^∗|| ≤ ||~b − ~y|| for all ~y in the span of the columns of A.

Solution note: True. The least squares solutions to A~x = ~b are the actual solutions to the system A~x = ~b^∗ where ~b^∗ is the projection of ~b to the image of A (which can also be described as the vector in the image of A closest to ~b). The condition ||~b − ~b^∗|| ≤ ||~b − ~y|| exactly means that ~b^∗ is at least as close to ~b as any ~y in the image.

41. If the columns of A are orthogonal, then A^T A is diagonal.

Solution note: True: The ij-th entry of A^T A is v_i^T v_j = v_i · v_j (where v_1, . . . , v_d are the columns of A). Since v_i · v_j = 0 for i ≠ j (perpendicular), A^T A is diagonal.

42. If the columns of A are orthonormal, then AA^T = I_n for some n.

Solution note: False! Say that A has columns ~e1, ~e2 in R^3, so A is 3 × 2. Then A^T is 2 × 3, and AA^T is the 3 × 3 matrix [1 0 0; 0 1 0; 0 0 0], which is not the identity.
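The asymmetry between A^T A and AA^T in items 41–42 is easy to verify for the counterexample with columns ~e1, ~e2 in R^3 (a sketch of my own, not from the notes):

```python
A = [[1, 0], [0, 1], [0, 0]]              # columns e1, e2 in R^3 (orthonormal)
At = [list(col) for col in zip(*A)]       # transpose, 2 x 3

def matmul(a, b):
    # generic matrix product via nested lists
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

AtA = matmul(At, A)   # 2 x 2 identity: the columns are orthonormal
AAt = matmul(A, At)   # 3 x 3, but NOT the identity
assert AtA == [[1, 0], [0, 1]]
assert AAt == [[1, 0, 0], [0, 1, 0], [0, 0, 0]]
```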

43. Suppose Q is orthogonal and A is an arbitrary matrix of the same size. Then QAQ^T is similar to A.

Solution note: True. Since Q^T = Q^{-1}, we have QAQ^T = QAQ^{-1}, so this is similar to A by definition of similar.

44. If A is symmetric and invertible, so is A^{-1}.

Solution note: True! We need to check (A^{-1})^T = A^{-1}. We know A = A^T. Since AA^{-1} = I_n, transposing both sides gives (AA^{-1})^T = I_n, so (A^{-1})^T A^T = I_n. This means that the inverse of A^T is (A^{-1})^T. So (A^{-1})^T = (A^T)^{-1} = A^{-1}.

45. If A and B are symmetric invertible matrices, then ABA^{-1} is also symmetric and invertible.

Solution note: False. A product of invertible matrices is invertible, but if we check (ABA^{-1})^T = (A^{-1})^T B^T A^T = A^{-1}BA, we do not seem to get that this matrix is equal to its transpose. For a counterexample, we need a matrix A not equal to its inverse and not commuting with B. Say A = [2 0; 0 −1] and B = [2 3; 3 2]. Then A^{-1} = [1/2 0; 0 −1], and

ABA^{-1} = [2 0; 0 −1][2 3; 3 2][1/2 0; 0 −1] = [4 6; −3 −2][1/2 0; 0 −1] = [2 −6; −3/2 2],

which is not symmetric.
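The counterexample for item 45 can be recomputed exactly with rational arithmetic (my own check, not from the notes):

```python
from fractions import Fraction as F

def matmul2(a, b):
    # product of two 2x2 matrices
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A    = [[F(2), F(0)], [F(0), F(-1)]]
Ainv = [[F(1, 2), F(0)], [F(0), F(-1)]]   # inverse of A
B    = [[F(2), F(3)], [F(3), F(2)]]       # symmetric
C = matmul2(matmul2(A, B), Ainv)          # A B A^{-1}
assert C == [[F(2), F(-6)], [F(-3, 2), F(2)]]
assert C[0][1] != C[1][0]                 # not symmetric
```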

46. Let ~v1,~v2,~v3 be the columns of a matrix A whose determinant is D. Then the matrix whose columns are ~v1, 2~v2 + ~v1, 3~v1 + 4~v2 + 5~v3 has determinant 60D.

Solution note: False. By linearity in the second column, the determinant is det[~v1, 2~v2, 3~v1 + 4~v2 + 5~v3] + det[~v1, ~v1, 3~v1 + 4~v2 + 5~v3], and the second term is zero because of the repeated column. Expanding in the third column, the first term is det[~v1, 2~v2, 3~v1] + det[~v1, 2~v2, 4~v2] + det[~v1, 2~v2, 5~v3]; the first two of these vanish (repeated columns), so the determinant is det[~v1, 2~v2, 5~v3] = 10 det[~v1, ~v2, ~v3] = 10D, not 60D.
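A numerical check of the multilinearity computation in item 46 (the sample vectors are my own; only the 2~v2 and 5~v3 contributions survive, giving 10D rather than 60D):

```python
def det3_cols(c1, c2, c3):
    # 3x3 determinant from three column vectors
    m = list(zip(c1, c2, c3))
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

v1, v2, v3 = [1, 0, 2], [0, 1, 1], [3, 1, 0]
D = det3_cols(v1, v2, v3)                                  # here D = -7, nonzero
w2 = [2*b + a for a, b in zip(v1, v2)]                     # 2*v2 + v1
w3 = [3*a + 4*b + 5*c for a, b, c in zip(v1, v2, v3)]      # 3*v1 + 4*v2 + 5*v3
assert det3_cols(v1, w2, w3) == 10 * D                     # 10D, not 60D
```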

n 47. If the columns of A are an orthonormal basis for R , then A is orthogonal. Solution note: TRUE! This is the definition.

48. If multiplication by a square matrix A sends every pair of perpendicular vectors to another pair of perpendicular vectors, then A is orthogonal.

Solution note: False! Scaling by 3 preserves perpendicularity but not lengths of vectors. This map is linear but not orthogonal.

49. If A and B are orthogonal matrices, then so is A^2B.

Solution note: True. The product of two orthogonal matrices is orthogonal, so the same is true for the product of three orthogonal matrices (A(AB)).

50. Any four dimensional vector space has infinitely many three dimensional subspaces.

Solution note: True. Let ~v1, ~v2, ~v3, ~v4 be a basis. Then the subspaces V_a, spanned by (~v1, ~v2, ~v3 + a~v4) for a ∈ R, give infinitely many different three dimensional subspaces as a ranges through all the real numbers.

51. The set of orthogonal n × n matrices is a subspace of R^{n×n}.

Solution note: False. It does not contain the zero matrix.

52. The set of symmetric n × n matrices is a subspace of R^{n×n}.

Solution note: True: The sum of two symmetric matrices is symmetric, and scaling a symmetric matrix produces another symmetric matrix.

53. There is a two-dimensional subspace of R^{2×2} whose non-zero elements are all invertible.

Solution note: True. The span of A = [1 1; 0 1] and B = [1 0; 1 1] fits the bill. To see this, note that an arbitrary element in the space is of the form aA + bB = [a+b a; b a+b]. This is invertible except when a = b = 0. Indeed, the determinant is (a + b)^2 − ab = a^2 + ab + b^2, which is never zero unless a = b = 0.
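A brute-force spot check of item 53 over a grid of integer coefficients (my own sketch): for the general element [a+b a; b a+b], the determinant equals a^2 + ab + b^2, which is positive unless a = b = 0.

```python
def det2(m):
    # determinant of a 2x2 matrix
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

for a in range(-5, 6):
    for b in range(-5, 6):
        M = [[a + b, a], [b, a + b]]      # the general element a*A + b*B
        if (a, b) != (0, 0):
            assert det2(M) == a*a + a*b + b*b
            assert det2(M) > 0            # hence M is invertible
```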

54. For any m × n matrix A and any vector ~b ∈ R^m, the system A^T A~x = A^T~b is consistent.

Solution note: True. This is the normal equation for finding the least squares solutions, which always exist.
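A toy illustration of item 54 (my own example, not from the notes): the normal equations are consistent even when A~x = ~b is not. Here A is the 2 × 1 matrix [1; 1], so A~x = ~b asks for x = 0 and x = 2 simultaneously.

```python
A = [[1], [1]]
b = [0, 2]
AtA = sum(row[0] * row[0] for row in A)            # A^T A, a 1x1 "matrix": 2
Atb = sum(row[0] * bi for row, bi in zip(A, b))    # A^T b: 2
x = Atb / AtA
assert x == 1.0   # the least squares solution averages 0 and 2
```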

55. If A is orthogonal, then A~x = ~b is consistent.

Solution note: True: A is invertible, so there is a unique solution.

56. If A and B are bases for a finite dimensional vector space V, the change of basis matrix S_{B→A} is invertible.

Solution note: True: S_{B→A} has inverse S_{A→B}.

57. If the change of basis matrix S_{A→B} = [~e4 ~e3 ~e2 ~e1], then the elements of A are the same as the elements of B, but in a different order.

58. For any matrix A, the matrix AA^T is symmetric.

Solution note: True: Symmetric means equal to its transpose, and we compute (AA^T)^T = (A^T)^T A^T = AA^T.

59. If A and B are symmetric, then AB^2A is also symmetric.

Solution note: True: (AB^2A)^T = A^T B^T B^T A^T = AB^2A.

60. If A is orthogonal, then also A^T is orthogonal.

Solution note: True: Theorem in the book.

61. There is a 5 × 5 matrix of rank 4 and determinant 1.

Solution note: False. Determinant 1 means the rank must be 5.

62. If the determinant of [~v1 ~v2 . . . ~vd] is 4, then the determinant of [3~v2 + ~v1  6~v1 . . . ~vd] is 18.

Solution note: FALSE. Using the alternating property and linearity in the first column, we see the determinant is −(4 × 18) = −72: the only surviving term is det[3~v2, 6~v1, . . . , ~vd] = 18 det[~v2, ~v1, . . . , ~vd] = −18 · 4.

63. Any two vector spaces of dimension six are isomorphic.

Solution note: True. Theorem in the book.