
More on Orthogonal Transformations and Orthonormal Bases

Math 217, Professor Karen Smith. (c) 2015 UM Math Dept, licensed under a Creative Commons By-NC-SA 4.0 International License.

Inquiry: What is the point of the change of basis matrix, and why is it called that?

A. Let V be the subspace of $\mathbb{R}^4$ spanned by
$$B = \left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 9 \\ 9 \\ 1 \end{bmatrix} \right\}.$$
In the book, they applied the Gram-Schmidt process to B to get the vectors
$$A = \left\{ \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}, \begin{bmatrix} -1/2 \\ 1/2 \\ 1/2 \\ -1/2 \end{bmatrix} \right\}.$$

1. Is A also a basis for V? What was the point of doing Gram-Schmidt?

2. Find the change of basis matrix S from B to A. Remember: this is the matrix whose columns are the elements of B written in the basis A.

3. Use Gram-Schmidt to orthogonalize the basis
$$C = \{\vec{v}_1, \vec{v}_2, \vec{v}_3\} = \left\{ \begin{bmatrix} 2 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} -3 \\ 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \\ 3 \\ 4 \end{bmatrix} \right\}$$
for a three-dimensional subspace W of $\mathbb{R}^4$. Call the new basis $D = \{\vec{u}_1, \vec{u}_2, \vec{u}_3\}$.

4. Find the change of basis matrix from C to D. How does the orthonormality of D make this computation much easier?

5. What do you notice about the change of basis matrices in (2) and (4)?

6. Conjecture a general theorem about the change of basis matrix when the new basis is obtained from the old by the Gram-Schmidt process. Can you prove your conjecture?
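The Gram-Schmidt computation quoted from the book can be reproduced numerically. Here is a minimal NumPy sketch; the helper name `gram_schmidt` is our own, not from the text:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        # subtract the projection onto each earlier orthonormal vector
        for u in basis:
            w = w - (w @ u) * u
        basis.append(w / np.linalg.norm(w))
    return basis

v1 = np.array([1, 1, 1, 1])
v2 = np.array([1, 9, 9, 1])
u1, u2 = gram_schmidt([v1, v2])
print(u1)  # [0.5 0.5 0.5 0.5]
print(u2)  # [-0.5  0.5  0.5 -0.5]
```

The loop is exactly the process from the book: each new vector has its components along the earlier orthonormal vectors removed, and what remains is normalized.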

B. Consider the two bases B and A for the subspace V from Problem A. Let S be the change of basis matrix from B to A.

1. Write the element $\vec{v}_1$ in B-coordinates, and also in A-coordinates. That is, find the column vectors $[\vec{v}_1]_B$ and $[\vec{v}_1]_A$.

2. Compute the product $S[\vec{v}_1]_B$. How does it compare to $[\vec{v}_1]_A$?

3. Write the element $\vec{v}_2$ in B-coordinates, and also in A-coordinates. That is, find the column vectors $[\vec{v}_2]_B$ and $[\vec{v}_2]_A$. What does this have to do with multiplication by the matrix S?

4. Write the vector $\vec{v} = a\vec{v}_1 + b\vec{v}_2$ in B-coordinates (so as a column $[\vec{v}]_B$). What happens when we multiply this column by the change of basis matrix S? Explain.

5. Explain the name “change of basis matrix from B to A.”
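Because A is orthonormal, the A-coordinates of any vector in V are simply its dot products with the A-vectors, and the identity $S[\vec{x}]_B = [\vec{x}]_A$ can be checked numerically. A NumPy sketch (note that it computes the matrix S of Problem A2, so try that problem by hand first):

```python
import numpy as np

# the basis B and its Gram-Schmidt output A, from Problem A
v1, v2 = np.array([1., 1., 1., 1.]), np.array([1., 9., 9., 1.])
U = np.column_stack([[0.5, 0.5, 0.5, 0.5], [-0.5, 0.5, 0.5, -0.5]])

# columns of S are the B-vectors written in A-coordinates;
# orthonormality of A reduces this to dot products: [x]_A = U^T x
S = U.T @ np.column_stack([v1, v2])
print(S)

# v1 is the first basis vector of B, so [v1]_B = (1, 0)
assert np.allclose(S @ [1, 0], U.T @ v1)
```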

C. Consider the vector space W in Problem A3, with its bases C and D. Find a matrix S such that for every vector $\vec{x}$ in W we have $S[\vec{x}]_C = [\vec{x}]_D$. Now find a matrix $\tilde{S}$ such that for all $\vec{x}$ in W we have $\tilde{S}[\vec{x}]_D = [\vec{x}]_C$. What is the relationship between S and $\tilde{S}$?

Inquiry: What does an orthogonal transformation look like?
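For Problem C, both matrices and their relationship can be checked numerically once D has been computed. A NumPy sketch (it reveals the answers to A3 and A4, so use it only to verify hand computations):

```python
import numpy as np

# basis C for W (Problem A3) and its Gram-Schmidt output D
C = np.column_stack([[2., 0, 0, 0], [-3., 1, 0, 0], [1., 1, 3, 4]])
D = np.column_stack([[1., 0, 0, 0], [0., 1, 0, 0], [0., 0, 3/5, 4/5]])

S = D.T @ C                  # S satisfies S[x]_C = [x]_D since D is orthonormal
S_tilde = np.linalg.inv(S)   # S~ undoes S, so S~ = S^{-1}

# round trip: C-coordinates -> D-coordinates -> back to C-coordinates
coords_C = np.array([1., 2., 3.])
assert np.allclose(S_tilde @ (S @ coords_C), coords_C)
print(S)
```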

D. State the definition of an orthogonal transformation of $\mathbb{R}^n$.

E. Prove that an orthogonal transformation is injective.

F. Which of the following are orthogonal matrices?
$$\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix},\quad
\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix},\quad
\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix},\quad
\begin{bmatrix} 3/5 & 1 \\ 4/5 & 0 \end{bmatrix},\quad
\begin{bmatrix} 3 & 0 \\ 0 & -2 \end{bmatrix},\quad
\begin{bmatrix} a & \sqrt{1-a^2} \\ \sqrt{1-a^2} & -a \end{bmatrix} \ (\text{where } 0 < a < 1),\quad
\begin{bmatrix} 3 & 1 \\ -2 & 5 \end{bmatrix}$$
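A square matrix A is orthogonal exactly when $A^TA = I$, so the numerical entries in F can be tested mechanically. A minimal NumPy sketch (the sample values of $\theta$ and $a$ are our own; the symbolic cases still need the algebraic argument):

```python
import numpy as np

def is_orthogonal(A, tol=1e-12):
    """A square matrix A is orthogonal iff A^T A = I."""
    A = np.asarray(A, dtype=float)
    return np.allclose(A.T @ A, np.eye(A.shape[1]), atol=tol)

theta, a = 0.7, 0.6          # sample parameter values
s = np.sqrt(1 - a**2)

print(is_orthogonal([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]]))  # True: a rotation
print(is_orthogonal([[3/5, 1], [4/5, 0]]))  # False: columns are unit but not orthogonal
print(is_orthogonal([[a, s], [s, -a]]))     # True
```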

G. TRUE OR FALSE. Justify. In all problems, T denotes a linear transformation from $\mathbb{R}^n$ to itself, and A is its matrix in the standard basis.

1. If T is orthogonal, then $\vec{x} \cdot \vec{y} = T\vec{x} \cdot T\vec{y}$ for all vectors $\vec{x}$ and $\vec{y}$ in $\mathbb{R}^n$.

2. If T sends every pair of orthogonal vectors to another pair of orthogonal vectors, then T is orthogonal.

3. If T is orthogonal, then T is invertible.

4. An orthogonal projection is orthogonal.

5. If A is the matrix of an orthogonal transformation T, then the columns of A are orthonormal.

6. The inverse of an orthogonal matrix is orthogonal.

7. The product of two orthogonal matrices (of the same size) is orthogonal.

8. If A is the matrix of an orthogonal transformation T, then $AA^T$ is the identity matrix.

9. If $A^{-1} = A^T$, then A is the matrix of an orthogonal transformation of $\mathbb{R}^n$.
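Statements like 7, 8, and 9 can be probed (though of course not proved) with a concrete orthogonal matrix. A NumPy sketch using a rotation by an arbitrary angle:

```python
import numpy as np

t = 1.2  # an arbitrary angle
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])  # a sample orthogonal matrix

assert np.allclose(A @ A.T, np.eye(2))     # item 8: A A^T = I
assert np.allclose(np.linalg.inv(A), A.T)  # item 9's hypothesis: A^{-1} = A^T
B = A @ A                                  # item 7: product of orthogonal matrices...
assert np.allclose(B.T @ B, np.eye(2))     # ...is again orthogonal
print("all checks passed")
```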

H. Theorem. Let $B = \{\vec{v}_1, \dots, \vec{v}_d\}$ be a basis for a subspace V of $\mathbb{R}^n$. Let $A = \{\vec{u}_1, \dots, \vec{u}_d\}$ be the orthonormal basis produced from B using the Gram-Schmidt process. Then we have a matrix equality (called the QR factorization of the matrix $[\vec{v}_1 \cdots \vec{v}_d]$)

$$[\vec{v}_1 \ \vec{v}_2 \ \cdots \ \vec{v}_d] = [\vec{u}_1 \ \vec{u}_2 \ \cdots \ \vec{u}_d]\,R,$$
where R is the change of basis matrix from the old basis B to the orthonormal basis A.

1. Verify this for each of the two vector spaces V and W from Problem A.

2. We can view this as a way to “factor” any $n \times d$ matrix M of rank d as a product of an $n \times d$ matrix Q with orthonormal columns and a $d \times d$ upper triangular matrix R. Explain.

3. Find the QR-factorization of the matrix $M = \begin{bmatrix} 2 & 2 \\ 1 & 7 \\ -2 & -8 \end{bmatrix}$.

2 1 0  4. Find the QR-factorization of the matrix A = 0 1 1  . 0 0 −8 5. Prove that if B is an n × n , then there exists an orthogonal matrix Q such that QB is upper triangular.