True/False Questions from Chapters 4, 5, 6
(c) 2015 UM Math Dept, licensed under a Creative Commons BY-NC-SA 4.0 International License.
Math 217: True/False Practice. Professor Karen Smith.

1. For every 2 × 2 matrix A, there exists a 2 × 2 matrix B such that det(A + B) ≠ det A + det B.
Solution note: False! Take A to be the zero matrix for a counterexample: then det(A + B) = det B = det A + det B for every B.

2. If the change of basis matrix S_{A→B} = [e₄ e₃ e₂ e₁], then the elements of A are the same as the elements of B, but in a different order.
Solution note: True. The matrix tells us that the first element of A is the fourth element of B, the second element of A is the third element of B, the third element of A is the second element of B, and the fourth element of A is the first element of B.

3. Every linear transformation R⁶ → R⁶ is an isomorphism.
Solution note: False. The zero map is a counterexample.

4. If matrix B is obtained from A by swapping two rows and det A < det B, then A is invertible.
Solution note: True: we only need to know det A ≠ 0. But if det A = 0, then det B = −det A = 0, so det A < det B does not hold.

5. If two n × n matrices have the same determinant, they are similar.
Solution note: False: the only matrix similar to the zero matrix is the zero matrix itself, but many other matrices have determinant zero, such as [1 0; 0 0].

6. If all the entries of an invertible square matrix A are integers, the same is true for A⁻¹.
Solution note: False: take [1 0; 0 2], whose inverse is [1 0; 0 1/2].

7. There is no square matrix A such that det(7A) = 7 det(A).
Solution note: False: the zero matrix is a counterexample.

8. If ⟨x, y⟩ = ⟨x, z⟩ for vectors x, y, z in an inner product space, then y − z is orthogonal to x.
Solution note: True: 0 = ⟨x, y⟩ − ⟨x, z⟩ = ⟨x, y − z⟩, so x and y − z are orthogonal.

9. Suppose A is an m × n matrix whose columns span a subspace V of Rᵐ. If b ∈ V⊥, then Ax = b has a unique least squares solution.
Solution note: False. Let A be any such matrix with a non-trivial kernel, such as [1 0 0; 0 1 0]. Its columns span the xy-plane in R³, and the vector e₃ is in V⊥; adding any element of the kernel to a least squares solution gives another least squares solution, so uniqueness fails.

10. If the columns of a matrix A are orthonormal, then A is orthogonal.
Solution note: False. Take the 3 × 2 matrix whose two columns are e₁ and e₂. Orthogonal matrices are SQUARE.

11. In any inner product space, ‖f‖² = ⟨f, f⟩ for all f.
Solution note: True. Definition.

12. In any inner product space, if ‖f‖ = ‖g‖, then f = g.
Solution note: False! Take e₁ and e₂ in R² with the standard inner (dot) product.

13. If f and g are elements of an inner product space satisfying ‖f‖ = 2, ‖g‖ = 4, and ‖f + g‖ = 5, then it is possible to find the exact value of ⟨f, g⟩.
Solution note: True: 5² = ‖f + g‖² = ⟨f + g, f + g⟩ = ‖f‖² + 2⟨f, g⟩ + ‖g‖² = 2² + 2⟨f, g⟩ + 4². This can be solved for ⟨f, g⟩, which is 5/2.

14. If for some vectors y, z in an inner product space, ⟨x, y⟩ = ⟨x, z⟩ for all x, then y = z.
Solution note: True. ⟨x, y⟩ = ⟨x, z⟩ implies ⟨x, y − z⟩ = 0 for all x. Now take x = y − z. Then ⟨y − z, y − z⟩ = 0, which means y − z = 0.

15. There is no invertible 5 × 5 matrix A such that det(5A) = 5 det(A).
Solution note: TRUE: det(5A) = 5⁵ det(A). This is never equal to 5 det(A) because det A ≠ 0.

16. There is a linear transformation R⁵ → R³ sending e₁ and e₂ to [1; 0; 2] and e₃ to [1; −1; 2].
Solution note: True. Take any 3 × 5 matrix whose first three columns are the specified vectors.

17. There exist vectors x, y in an inner product space such that ‖x‖ = 1, ‖y‖ = 2, ‖x + y‖ = 3, and ⟨x, y⟩ = 0.
Solution note: False: if ⟨x, y⟩ = 0, then ‖x + y‖² = ‖x‖² + ‖y‖² = 1 + 4 = 5, which is not 3² = 9.

18. There exists a 3 × 3 matrix P such that the linear transformation T : R^{3×3} → R^{3×3} defined by T(M) = MP − PM is an isomorphism.
Solution note: False: the matrix I₃ is in the kernel for any choice of P.
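The computation in question 13 can be sanity-checked numerically. The sketch below (not part of the original worksheet; the helper `inner_from_norms` is a name introduced here) builds concrete vectors in R² with the stated norms and confirms ⟨f, g⟩ = 5/2:

```python
# Illustrative check of question 13 (assumed setup: vectors in R^2 with
# the standard dot product; inner_from_norms is a hypothetical helper).
import numpy as np

def inner_from_norms(nf, ng, nfg):
    """Recover <f, g> from ||f||, ||g||, ||f+g|| via
    ||f+g||^2 = ||f||^2 + 2<f,g> + ||g||^2."""
    return (nfg**2 - nf**2 - ng**2) / 2

# Construct explicit vectors realizing the norms: put f on the x-axis
# and solve for the components of g.
f = np.array([2.0, 0.0])
ip = inner_from_norms(2, 4, 5)   # should be 5/2
gx = ip / 2                      # <f, g> = 2 * gx since f = (2, 0)
gy = np.sqrt(16 - gx**2)         # forces ||g|| = 4
g = np.array([gx, gy])

assert np.isclose(np.linalg.norm(f), 2)
assert np.isclose(np.linalg.norm(g), 4)
assert np.isclose(np.linalg.norm(f + g), 5)
assert np.isclose(f @ g, 2.5)    # <f, g> = 5/2, matching the algebra
```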
19. If A is an n × d matrix with factorization A = QR, then the columns of A and of Q span the same subspace of Rⁿ.
Solution note: True: the columns of Q are an orthonormal basis for the space spanned by the columns of A.

20. If A has orthogonal columns, and A = QR is its QR-factorization, then R is diagonal.
Solution note: True: when doing Gram-Schmidt, we only wind up scaling each column of A by its length to make it unit length, since the columns are already perpendicular. So if v₁, ..., v_d are the columns of A, then the columns of Q are u_i = (1/‖v_i‖) v_i. The i-th column of the change of basis matrix R is v_i written in the orthonormal basis u₁, ..., u_d. Obviously, this is 0·u₁ + ··· + ‖v_i‖u_i + 0·u_{i+1} + ··· + 0·u_d, so R is diagonal: the only non-zero entries are in the spots ii.

21. If (v₁, ..., v_d) is a basis for the subspace V of Rⁿ and b ∈ V, then the least squares solutions of [v₁ v₂ ... v_d] x = b are exact solutions of [v₁ v₂ ... v_d] x = b.
Solution note: True: since b ∈ V, it is in the span of the columns of [v₁ v₂ ... v_d]. So the system is consistent, and the least squares solutions are actual solutions.

22. If (v₁, ..., v_d) is a basis for the subspace V of Rⁿ and b ∈ V⊥, then the only least squares solution of [v₁ v₂ ... v_d] x = b is the zero vector.
Solution note: TRUE. Because b ∈ V⊥, the projection of b to V is zero. This means that the least squares solutions are the actual solutions of Ax = 0. But any such solution x is a linear relation on the columns of A. We assumed these columns are linearly independent, so the only such solution is the trivial solution x = 0. If the rank of A were LESS than d, the claim would have been FALSE, because the kernel would have had dimension at least 1.

23. If all entries of a 7 × 7 matrix A are 7, then det A = 7⁷.
Solution note: False! The columns are linearly dependent, so the determinant is 0.
24. If all the entries of an invertible square matrix A are integers, the same is true for A⁻¹.
Solution note: False: take [1 0; 0 2].

25. Every surjective linear transformation V → V is an isomorphism.
Solution note: False. The derivative map on R[x] is a counterexample. If V is finite-dimensional, the statement is true by rank-nullity.

26. A matrix A is orthogonal if and only if AᵀA = I_n.
Solution note: FALSE: the 3 × 2 matrix A = [1 0; 0 1; 0 0] is a counterexample. The statement would be true if A were square.

27. If a matrix A is orthogonal, then A is invertible.
Solution note: True. Theorem in the book.

28. If A and B are orthogonal matrices of the same size, then AB = BA.
Solution note: False. Let A = [1 0; 0 −1] and B = [0 −1; 1 0].

29. There exists a subspace V of R⁷ such that V and V⊥ have the same dimension.
Solution note: False! V and V⊥ have dimensions which add up to 7. They can't be equal, or the sum would be even.

30. The matrix [0 1 −b; 0 b 1; 1 0 0] has rank 3 for every value of b.
Solution note: True. The determinant is b² + 1, which can never be 0.

31. There is a linear transformation T : V → V satisfying T(x) ≠ x for every non-zero vector x and whose B-matrix [T]_B in some basis is [1 0 1 −7; 0 0 13 1; 0 1 0 0; 0 −3 5 6].
Solution note: False! The first vector in the basis B is taken to itself, as one look at the first column of [T]_B will tell you.

32. If A is a B-matrix for a linear transformation, then A is invertible.
Solution note: False. The zero matrix of size 2 × 2 is the matrix of the zero transformation in any basis, but it is not invertible.

33. If the image of a linear transformation is infinite-dimensional, then the source is also infinite-dimensional.
Solution note: True. We give a proof by contradiction. If the source were finite-dimensional, then rank-nullity says that the dimension of the image is at most the dimension of the source, hence also finite.
34. The differentiation map from the space R[x] of all polynomials to itself is an isomorphism.
Solution note: False: the kernel is non-zero, since it includes all constant functions.

35. The map of R[x] to itself defined by f ↦ xf is an injective linear transformation.
Solution note: True: say f and g have the same image, so xf = xg. Then x(f − g) = 0, and since R[x] has no zero divisors, f = g.
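The contrast between questions 34 and 35 can be made concrete by representing a polynomial as its list of coefficients. The sketch below (not part of the original worksheet; the increasing-degree coefficient convention and the helper names `deriv` and `times_x` are choices made here) shows that differentiation kills constants while multiplication by x merely shifts coefficients and loses nothing:

```python
# Illustrative model of R[x]: a polynomial is a list of coefficients
# in increasing degree, e.g. [1, 2] represents 1 + 2x.

def deriv(p):
    """d/dx on a coefficient list: [a0, a1, a2, ...] -> [a1, 2*a2, ...]."""
    return [i * c for i, c in enumerate(p)][1:] or [0]

def times_x(p):
    """f -> x*f: prepend a zero constant coefficient."""
    return [0] + list(p)

# Question 34: a nonzero constant is in the kernel of deriv, so the
# differentiation map is not injective, hence not an isomorphism.
assert deriv([5]) == [0] == deriv([0])

# Question 35: times_x just shifts, so distinct inputs stay distinct.
assert times_x([1, 2]) == [0, 1, 2]
assert times_x([1, 2]) != times_x([2, 1])
```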