
Math 217: Orthogonal Transformations
Professor Karen Smith

Definition: A linear transformation $T: \mathbb{R}^n \to \mathbb{R}^n$ is orthogonal if it preserves dot products, that is, if $\vec{x} \cdot \vec{y} = T(\vec{x}) \cdot T(\vec{y})$ for all vectors $\vec{x}$ and $\vec{y}$ in $\mathbb{R}^n$.

Theorem: Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be a linear transformation. Then $T$ is orthogonal if and only if it preserves the length of every vector, that is, $\|T(\vec{x})\| = \|\vec{x}\|$ for all $\vec{x} \in \mathbb{R}^n$.

************************************************************

A. Discuss the definition and theorem with your table: What, intuitively and geometrically, really is an orthogonal transformation? What does an orthogonal transformation of $\mathbb{R}^3$ do to the unit cube? How does this differ from a general (non-orthogonal) linear transformation?

Solution note: The unit cube is taken to another unit cube, also with one vertex at the origin. In general, a linear transformation can stretch the cube into a parallelepiped (if rank $T = 3$), smash it to a 2-dimensional parallelogram (if rank $T = 2$), or even to a line segment (if rank $T = 1$).

B. Which of the following maps are orthogonal transformations? Short, geometric justifications are preferred.

1. The identity map $\mathbb{R}^3 \to \mathbb{R}^3$.
2. $\mathbb{R}^2 \to \mathbb{R}^2$ given by counterclockwise rotation through $\theta$.
3. The reflection $\mathbb{R}^3 \to \mathbb{R}^3$ over a plane (through the origin).
4. The projection $\mathbb{R}^3 \to \mathbb{R}^3$ onto a subspace $V$ of dimension 2.
5. Dilation $\mathbb{R}^3 \to \mathbb{R}^3$ by a factor of 3.
6. Multiplication by $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$.
7. Multiplication by $\begin{pmatrix} 3 & 1 \\ -2 & 5 \end{pmatrix}$.

Solution note:
1. Yes; the dot product is obviously the same before and after doing nothing.
2. Yes; rotation obviously preserves the LENGTH of every vector, so by the Theorem, rotation is an orthogonal transformation.
3. Yes; reflection obviously preserves the LENGTH of every vector, so by the Theorem, reflection is an orthogonal transformation.
4. No; projection will typically shorten the length of vectors.
5. No; dilation by a factor of 3 obviously takes each vector to a vector 3 times as long.
6. Yes; this is rotation again.
7. No. The vector $\vec{e}_1$ is sent to $\begin{pmatrix} 3 \\ -2 \end{pmatrix}$, which has length $\sqrt{13}$, not 1 like $\vec{e}_1$.

C. Prove that orthogonal transformations preserve lengths: if $T: \mathbb{R}^n \to \mathbb{R}^n$ is an orthogonal transformation, then $\|\vec{x}\| = \|T(\vec{x})\|$ for all $\vec{x} \in \mathbb{R}^n$. Is the converse true? What is the book's definition of an orthogonal transformation, and why is it equivalent to ours?

Solution note: Say that $T$ is orthogonal. Take arbitrary $\vec{x} \in \mathbb{R}^n$. We want to show $\|\vec{x}\| = \|T(\vec{x})\|$. By definition, $\|\vec{x}\| = (\vec{x} \cdot \vec{x})^{1/2}$ and $\|T(\vec{x})\| = (T(\vec{x}) \cdot T(\vec{x}))^{1/2}$. Since $T$ is orthogonal, we know that $\vec{x} \cdot \vec{x} = T(\vec{x}) \cdot T(\vec{x})$, so the result follows. The converse is also true: this is the Theorem at the top! The book DEFINES an orthogonal transformation as one which preserves lengths. This is equivalent to our definition by the Theorem above.

D. Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be an orthogonal transformation.

1. Prove that $T$ is injective. [Hint: consider the kernel.]
2. Prove that $T$ is an isomorphism.
3. Prove that the matrix of $T$ (in standard coordinates) has columns that are orthonormal.
4. Is the composition of orthogonal transformations also orthogonal? Prove it!

Solution note:
1. Let $\vec{v} \in \ker T$. We want to show that $\vec{v} = \vec{0}$. We know that $T(\vec{v}) = \vec{0}$, so $\|T(\vec{v})\| = 0$. Since $T$ is orthogonal, it preserves lengths (by the Theorem), so also $\|\vec{v}\| = 0$. This means $\vec{v} = \vec{0}$, since no other vector has length 0. So the kernel of $T$ is trivial and $T$ is injective.
2. We already know $T$ is injective, so we just need to check it is surjective. By rank-nullity, $n = \dim \ker T + \operatorname{rank} T$. So by (1), we have $\operatorname{rank} T = n$, so $T$ is surjective. Thus $T$ is a bijective linear transformation, that is, an isomorphism.
3. Let $A$ be the matrix of $T$. The columns of $A$ are $T(\vec{e}_1), \ldots, T(\vec{e}_n)$ by our old friend the "Crucial Theorem." We need to show these are orthonormal.
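As an aside, the verdicts in part B can be sanity-checked numerically. The sketch below is plain Python with no external libraries; the helper names `mat_vec` and `length` are chosen here for illustration and are not part of the assignment.

```python
import math

def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def length(x):
    """Euclidean length ||x|| = (x . x)^(1/2)."""
    return math.sqrt(sum(c * c for c in x))

theta = 0.7  # any angle works
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # rotation: orthogonal
M = [[3, 1],
     [-2, 5]]                               # item 7: not orthogonal

e1 = [1, 0]
# Rotation preserves the length of e1 (length 1, up to rounding).
print(length(mat_vec(R, e1)))
# The matrix in item 7 sends e1 to (3, -2), which has length sqrt(13).
print(mat_vec(M, e1), length(mat_vec(M, e1)))
```

Trying other vectors in place of $\vec{e}_1$ shows the same pattern: the rotation never changes a length, while $M$ almost always does.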
Compute for each $i$: $T(\vec{e}_i) \cdot T(\vec{e}_i) = \vec{e}_i \cdot \vec{e}_i = 1$ (where we have used the fact that $T$ preserves dot products). Also compute for each pair $i \neq j$: $T(\vec{e}_i) \cdot T(\vec{e}_j) = \vec{e}_i \cdot \vec{e}_j = 0$. This means that $T(\vec{e}_1), \ldots, T(\vec{e}_n)$ are orthonormal.
4. Yes! Assume $T$ and $S$ are orthogonal transformations with source and target $\mathbb{R}^n$. To show that $S \circ T$ is orthogonal, we can show it preserves LENGTHS. Take any vector $\vec{x} \in \mathbb{R}^n$. Then $\|\vec{x}\| = \|T(\vec{x})\|$ since $T$ is orthogonal (by the Theorem, or by the book's definition). So also $\|T(\vec{x})\| = \|S(T(\vec{x}))\|$ since $S$ is orthogonal. Putting these together, we have $\|\vec{x}\| = \|S(T(\vec{x}))\|$. Since $\vec{x}$ was arbitrary, we see that $S \circ T$ preserves every length. QED. Note: This is a different proof than the one I did in class, using the Theorem instead of the definition.

E. Recall that if $A$ is a $d \times n$ matrix with columns $\vec{v}_1, \ldots, \vec{v}_n$, then the transpose of $A$ is the $n \times d$ matrix $A^T$ whose rows are $\vec{v}_1^T, \ldots, \vec{v}_n^T$.

Definition: An $n \times n$ matrix $A$ is orthogonal if $A^T A = I_n$.

1. Let $A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$. Compute $A^T A$. Is $A$ orthogonal? Is $\begin{pmatrix} 0 & -1 \\ -1 & 0 \end{pmatrix}$ orthogonal?
2. Suppose that $A = [\vec{v}_1\ \vec{v}_2\ \vec{v}_3]$ is a $3 \times 3$ matrix with columns $\vec{v}_1, \vec{v}_2, \vec{v}_3 \in \mathbb{R}^3$. Use block multiplication to show that $A^T A = \begin{pmatrix} \vec{v}_1 \cdot \vec{v}_1 & \vec{v}_1 \cdot \vec{v}_2 & \vec{v}_1 \cdot \vec{v}_3 \\ \vec{v}_2 \cdot \vec{v}_1 & \vec{v}_2 \cdot \vec{v}_2 & \vec{v}_2 \cdot \vec{v}_3 \\ \vec{v}_3 \cdot \vec{v}_1 & \vec{v}_3 \cdot \vec{v}_2 & \vec{v}_3 \cdot \vec{v}_3 \end{pmatrix}$.
3. Prove that a square matrix $A$ is orthogonal if and only if its columns are orthonormal. (You can write out just the $3 \times 3$ case if you want, but be sure to think about what happens for $n \times n$ matrices.)
4. Is $B = \begin{pmatrix} 3/5 & 4/5 & 0 \\ -4/5 & 3/5 & 0 \\ 0 & 0 & 1 \end{pmatrix}$ orthogonal? Find $B^{-1}$. [Be clever! There is an easy way!]
5. Prove the Theorem: Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be a linear transformation with (standard) matrix $A$. Then $T$ is orthogonal if and only if $A$ is orthogonal.

Solution note:
1. Yes. It is easy to check that $A^T A$ is the identity matrix in both cases (we use $\sin^2\theta + \cos^2\theta = 1$ for the first).
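Both checks in (1), and the entry-by-entry description of $A^T A$ from (2), can be verified numerically. The sketch below is plain Python; `gram(A)` is a name chosen here for the function that builds $A^T A$ directly from dot products of columns, which should agree with the ordinary matrix product.

```python
import math

def transpose(A):
    """Transpose a matrix given as a list of rows."""
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    """Ordinary matrix product AB."""
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def gram(A):
    """Build A^T A entry by entry: the (i, j) entry is v_i . v_j,
    where v_1, ..., v_n are the columns of A (the claim in part 2)."""
    cols = transpose(A)
    return [[sum(a * b for a, b in zip(vi, vj)) for vj in cols] for vi in cols]

theta = 0.7
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
C = [[0, -1],
     [-1, 0]]

# Both products A^T A come out to the identity (up to rounding for A),
# so both matrices are orthogonal; and gram agrees with the matrix product.
print(mat_mul(transpose(A), A))
print(mat_mul(transpose(C), C))
print(gram(A) == mat_mul(transpose(A), A))
```

Replacing `A` with a matrix whose columns are not orthonormal (for example the part B item 7 matrix) makes the product visibly differ from the identity.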
2. Multiply $A^T A = \begin{pmatrix} \vec{v}_1^T \\ \vec{v}_2^T \\ \vec{v}_3^T \end{pmatrix} [\vec{v}_1\ \vec{v}_2\ \vec{v}_3] = \begin{pmatrix} \vec{v}_1^T \vec{v}_1 & \vec{v}_1^T \vec{v}_2 & \vec{v}_1^T \vec{v}_3 \\ \vec{v}_2^T \vec{v}_1 & \vec{v}_2^T \vec{v}_2 & \vec{v}_2^T \vec{v}_3 \\ \vec{v}_3^T \vec{v}_1 & \vec{v}_3^T \vec{v}_2 & \vec{v}_3^T \vec{v}_3 \end{pmatrix}$, which produces the desired matrix, since for any $n \times 1$ matrices (column vectors) $\vec{w}$ and $\vec{v}$, the matrix product $\vec{w}^T \vec{v}$ is the same as the dot product $\vec{w} \cdot \vec{v}$.
3. This follows immediately from the $n$-dimensional version of (2). Let $A$ have columns $\vec{v}_1, \ldots, \vec{v}_n$. Then the $ij$-th entry of $A^T A$ is $\vec{v}_i^T \vec{v}_j = \vec{v}_i \cdot \vec{v}_j$. So the columns are orthonormal if and only if $A^T A$ is the identity matrix.
4. Yes. The inverse is the transpose, which is $\begin{pmatrix} 3/5 & -4/5 & 0 \\ 4/5 & 3/5 & 0 \\ 0 & 0 & 1 \end{pmatrix}$.
5. We need to show two things.
1) If $T$ is orthogonal, then $A$ has orthonormal columns. Assume $T$ is orthogonal. The columns of $A$ are $T(\vec{e}_1), T(\vec{e}_2), \ldots, T(\vec{e}_n)$. To check they are orthonormal, we need two things: $T(\vec{e}_i) \cdot T(\vec{e}_i) = 1$ for all $i$, and $T(\vec{e}_i) \cdot T(\vec{e}_j) = 0$ if $i \neq j$. For the first, note $T(\vec{e}_i) \cdot T(\vec{e}_i) = \|T(\vec{e}_i)\|^2 = \|\vec{e}_i\|^2 = 1$, with the first equality coming from the definition of the length of a vector and the second coming from $T$ being orthogonal. For the second, $T(\vec{e}_i) \cdot T(\vec{e}_j) = \vec{e}_i \cdot \vec{e}_j$ by the Theorem above. This is zero since the standard basis is orthonormal.
2) Assume $A$ has orthonormal columns. We need to show that for any $\vec{x} \in \mathbb{R}^n$, $\|T(\vec{x})\| = \|\vec{x}\|$. Since length is always a non-negative number, it suffices to show $\|T(\vec{x})\|^2 = \|\vec{x}\|^2$, that is, it suffices to show $T(\vec{x}) \cdot T(\vec{x}) = \vec{x} \cdot \vec{x}$. For this, we take an arbitrary $\vec{x}$ and write it in the basis $\{\vec{e}_1, \ldots, \vec{e}_n\}$. Note that
$$\vec{x} \cdot \vec{x} = (x_1\vec{e}_1 + \cdots + x_n\vec{e}_n) \cdot (x_1\vec{e}_1 + \cdots + x_n\vec{e}_n) = \sum_{i,j} (x_i\vec{e}_i) \cdot (x_j\vec{e}_j) = x_1^2 + \cdots + x_n^2.$$
Here the second equality uses basic properties of the dot product (like "FOIL"), and the third equality uses the fact that the $\vec{e}_i$ are orthonormal, so that $(x_i\vec{e}_i) \cdot (x_j\vec{e}_j) = 0$ if $i \neq j$.
On the other hand, we also have
$$T(\vec{x}) \cdot T(\vec{x}) = T(x_1\vec{e}_1 + \cdots + x_n\vec{e}_n) \cdot T(x_1\vec{e}_1 + \cdots + x_n\vec{e}_n) = \sum_{i,j} (x_i T(\vec{e}_i)) \cdot (x_j T(\vec{e}_j)) = \sum_{i,j} x_i x_j \, T(\vec{e}_i) \cdot T(\vec{e}_j) = x_1^2 + \cdots + x_n^2,$$
with the last equality coming from the fact that the $T(\vec{e}_i)$'s are the columns of $A$ and hence orthonormal. QED.

F. Prove the very important Theorem on the top of the first page. [Hint: consider $\vec{x} + \vec{y}$.]

Solution note: We need to show two things: 1) If $T$ is orthogonal, then $T$ preserves lengths; and 2) If $T$ preserves lengths, then $T$ is orthogonal. We already did (1) in C. We need to do (2). Assume that for all $\vec{x}$, $\|T(\vec{x})\| = \|\vec{x}\|$. Take any $\vec{x}$ and $\vec{y}$; we must show $T(\vec{x}) \cdot T(\vec{y}) = \vec{x} \cdot \vec{y}$. Following the hint, apply the hypothesis to $\vec{x} + \vec{y}$: since $T$ is linear, $\|T(\vec{x}) + T(\vec{y})\|^2 = \|T(\vec{x} + \vec{y})\|^2 = \|\vec{x} + \vec{y}\|^2$. Expanding both outer expressions with the dot product gives $\|T(\vec{x})\|^2 + 2\, T(\vec{x}) \cdot T(\vec{y}) + \|T(\vec{y})\|^2 = \|\vec{x}\|^2 + 2\, \vec{x} \cdot \vec{y} + \|\vec{y}\|^2$. Since $\|T(\vec{x})\| = \|\vec{x}\|$ and $\|T(\vec{y})\| = \|\vec{y}\|$, those terms cancel, leaving $T(\vec{x}) \cdot T(\vec{y}) = \vec{x} \cdot \vec{y}$. QED.
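The expansion behind the hint, and the "inverse equals transpose" shortcut from E4, can both be illustrated numerically. The sketch below is plain Python; the sample vectors are arbitrary choices, not from the worksheet.

```python
def dot(x, y):
    """Dot product of two vectors."""
    return sum(a * b for a, b in zip(x, y))

def mat_vec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# The identity used in F: ||x + y||^2 = ||x||^2 + 2 x.y + ||y||^2,
# which is why a length-preserving map must also preserve dot products.
x = [1.0, 2.0, -3.0]
y = [0.5, -1.0, 4.0]
s = [a + b for a, b in zip(x, y)]
print(dot(s, s), dot(x, x) + 2 * dot(x, y) + dot(y, y))  # agree up to rounding

# The matrix B from E4 preserves dot products, and B^{-1} = B^T.
B = [[3/5, 4/5, 0],
     [-4/5, 3/5, 0],
     [0, 0, 1]]
Bt = [list(col) for col in zip(*B)]
print(dot(mat_vec(B, x), mat_vec(B, y)), dot(x, y))  # agree up to rounding
```

Multiplying `B` by `Bt` returns the identity matrix (up to rounding), confirming that for an orthogonal matrix the transpose really is the inverse.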