
Math 206 Solutions for HWK 24c, Section 6.5, p. 330

Notes: A square matrix is said to be orthogonal iff (it is invertible and) its transpose is its inverse. So a square matrix $A$ is orthogonal iff $A^TA = I$. It's quite easy to check that this condition is equivalent to the row vectors of $A$ forming an orthonormal set. It is also equivalent to the column vectors of $A$ forming an orthonormal set. (Try this proof on your own. If you get stuck, see the proof – for Thm 6.5.1 – that begins at the bottom of p. 322.)

According to Theorem 6.5.5, the transition matrix from one orthonormal basis to another is always an orthogonal matrix. Thus (for orthonormal bases) if one knows the transition matrix for translating in one direction, it is easy to write down the transition matrix for translating in the other direction: just transpose.
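(For a quick numerical sanity check of these facts, here is a minimal sketch in Python/NumPy — not part of the course materials — using a $2 \times 2$ rotation matrix as an arbitrary illustrative choice of orthogonal matrix.)

```python
import numpy as np

# an arbitrary rotation matrix, which is orthogonal
theta = 3 * np.pi / 4
Q = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# Q^T Q = I, so Q is orthogonal
print(np.allclose(Q.T @ Q, np.eye(2)))       # True
# the transpose really is the inverse
print(np.allclose(Q.T, np.linalg.inv(Q)))    # True
```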

Problem 1, §6.5, p. 330. Let
$$A = \begin{pmatrix} \frac{4}{5} & 0 & -\frac{3}{5} \\[4pt] -\frac{9}{25} & \frac{4}{5} & -\frac{12}{25} \\[4pt] \frac{12}{25} & \frac{3}{5} & \frac{16}{25} \end{pmatrix}$$
(a) Show that the matrix $A$ is orthogonal in three ways: (i) by calculating $A^TA$, (ii) by verifying that the row vectors form an orthonormal set in Euclidean $R^3$, and (iii) by verifying that the column vectors form an orthonormal set in Euclidean $R^3$.

(b) Find the inverse for the matrix A.

Solution. This is just an exercise to reinforce the equivalence of the three conditions spelled out in Theorem 6.5.1. What you’ll see is that you basically end up doing the same arithmetic three times. Pay attention to how/why this happens, and you’ll see how the general proof should go.

(i)
$$A^TA = \begin{pmatrix} \frac{4}{5} & -\frac{9}{25} & \frac{12}{25} \\[4pt] 0 & \frac{4}{5} & \frac{3}{5} \\[4pt] -\frac{3}{5} & -\frac{12}{25} & \frac{16}{25} \end{pmatrix} \begin{pmatrix} \frac{4}{5} & 0 & -\frac{3}{5} \\[4pt] -\frac{9}{25} & \frac{4}{5} & -\frac{12}{25} \\[4pt] \frac{12}{25} & \frac{3}{5} & \frac{16}{25} \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

(ii) Call the row vectors $r_1, r_2, r_3$. The following dot products should be zero (so check that they are): $r_1 \cdot r_2$, $r_1 \cdot r_3$, $r_2 \cdot r_3$.

Note that this also tells us that each of the following dot products is also zero:

$$r_2 \cdot r_1, \quad r_3 \cdot r_1, \quad r_3 \cdot r_2$$

The remaining three dot products $r_1 \cdot r_1$, $r_2 \cdot r_2$, and $r_3 \cdot r_3$ should all be 1. Check that they are. Note that the arithmetic that would be used to compute each of the 9 dot products mentioned above is exactly the arithmetic used to compute the 9 entries of $AA^T$.

(iii) Call the columns $c_1, c_2, c_3$ and follow the same outline as used for the rows. Note that the 9 possible dot products (9, when one pays attention to order) are exactly the dot products needed to compute the 9 entries of $A^TA$.

From an earlier theorem we know that if $A^TA = I$ then $AA^T = I$ as well, and vice versa. So if the row vectors form an orthonormal set, then so will the column vectors, and vice versa. This completes the reasoning that's outlined, for a general square matrix $A$, in the proof of Theorem 6.5.1.

(b) This is a piece of cake. The inverse of $A$ is the transpose of $A$. There's no need to compute anything. Just write down the transpose, and that's the inverse.
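(If you'd like to confirm the arithmetic by machine, here is a NumPy sketch — again not part of the assigned solution — that runs all three checks for this particular $A$ and confirms that the inverse is the transpose.)

```python
import numpy as np

# the matrix A from Problem 1
A = np.array([[ 4/5,    0,   -3/5  ],
              [-9/25,   4/5, -12/25],
              [12/25,   3/5,  16/25]])

# (i) A^T A = I
print(np.allclose(A.T @ A, np.eye(3)))                 # True

# (ii) the rows form an orthonormal set (r_i . r_j = delta_ij),
#      which is the same arithmetic as computing the entries of A A^T
print(np.allclose(A @ A.T, np.eye(3)))                 # True

# (iii) the columns form an orthonormal set, checked dot product by dot product
cols = [A[:, j] for j in range(3)]
print(all(np.isclose(cols[i] @ cols[j], float(i == j))
          for i in range(3) for j in range(3)))        # True

# (b) the inverse is the transpose
print(np.allclose(np.linalg.inv(A), A.T))              # True
```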

Problem 16, §6.5, p. 330. Let a rectangular $x'y'$-coordinate system be obtained by rotating a rectangular $xy$-coordinate system counterclockwise through the angle $\theta = \frac{3\pi}{4}$.

(a) Find the $x'y'$-coordinates of the point whose $xy$-coordinates are $(x, y) = (-2, 6)$.

(b) Find the $xy$-coordinates of the point whose $x'y'$-coordinates are $(x', y') = (5, 2)$.

Solution. Example 6 shows that the transition matrix for switching from $xy$-coordinates to $x'y'$-coordinates will be

$$\begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} = \begin{pmatrix} \cos\frac{3\pi}{4} & \sin\frac{3\pi}{4} \\[4pt] -\sin\frac{3\pi}{4} & \cos\frac{3\pi}{4} \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 & 1 \\ -1 & -1 \end{pmatrix}$$

Moreover, the transition matrix for switching in the opposite direction will be the transpose of this one. (This is because the transition matrix for switching from one orthonormal basis to another orthonormal basis is an orthogonal matrix; its transpose is its inverse. In other words, the two transition matrices are inverses of each other, and since inverses are transposes in this case, they are transposes of each other.)

(a) From the transition matrix already obtained, we know that

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 & 1 \\ -1 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix}$$

Therefore the point whose $xy$-coordinates are $x = -2$, $y = 6$ has $x'y'$-coordinates given by

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 & 1 \\ -1 & -1 \end{pmatrix}\begin{pmatrix} -2 \\ 6 \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} 8 \\ -4 \end{pmatrix} = \frac{\sqrt{2}}{2}\begin{pmatrix} 8 \\ -4 \end{pmatrix} = \sqrt{2}\begin{pmatrix} 4 \\ -2 \end{pmatrix}$$

or $(x', y') = (4\sqrt{2}, -2\sqrt{2})$.

(b) Given that the $x'y'$-coordinates are $x' = 5$ and $y' = 2$, the $xy$-coordinates are given by

$$\begin{pmatrix} x \\ y \end{pmatrix} = \left[\frac{1}{\sqrt{2}}\begin{pmatrix} -1 & 1 \\ -1 & -1 \end{pmatrix}\right]^T\begin{pmatrix} 5 \\ 2 \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 & -1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} 5 \\ 2 \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} -7 \\ 3 \end{pmatrix}$$
or
$$(x, y) = \frac{1}{\sqrt{2}}(-7, 3) = \left(-\frac{7\sqrt{2}}{2}, \frac{3\sqrt{2}}{2}\right)$$
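(A quick NumPy sketch — not part of the original solution — that reproduces both conversions numerically:)

```python
import numpy as np

theta = 3 * np.pi / 4
# transition matrix from xy-coordinates to x'y'-coordinates
P = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# (a) (x, y) = (-2, 6): expect (4*sqrt(2), -2*sqrt(2))
print(P @ np.array([-2, 6]))
print(np.sqrt(2) * np.array([4, -2]))

# (b) (x', y') = (5, 2): the opposite direction uses the transpose;
#     expect (-7*sqrt(2)/2, 3*sqrt(2)/2)
print(P.T @ np.array([5, 2]))
print(np.array([-7, 3]) * np.sqrt(2) / 2)
```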

Problem 23, §6.5, p. 330. What conditions must $a$ and $b$ satisfy for the matrix
$$\begin{pmatrix} a+b & b-a \\ a-b & b+a \end{pmatrix}$$
to be orthogonal?

Solution. For the given matrix to be orthogonal means that

$$\begin{pmatrix} a+b & b-a \\ a-b & b+a \end{pmatrix}^T \begin{pmatrix} a+b & b-a \\ a-b & b+a \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
which can be rewritten as

$$\begin{pmatrix} a+b & a-b \\ b-a & b+a \end{pmatrix} \begin{pmatrix} a+b & b-a \\ a-b & b+a \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

Carrying out the multiplication on the left-hand side, we can write this condition as

$$\begin{pmatrix} (a+b)^2 + (a-b)^2 & 0 \\ 0 & (b-a)^2 + (a+b)^2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
which is the same as
$$\begin{pmatrix} 2(a^2+b^2) & 0 \\ 0 & 2(a^2+b^2) \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
Thus the given matrix will be orthogonal iff $2(a^2+b^2) = 1$, or $a^2 + b^2 = \frac{1}{2}$.
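(As a spot check — not required for the solution — here is a small NumPy sketch verifying that the matrix is orthogonal for one choice of $(a, b)$ with $a^2 + b^2 = \frac{1}{2}$ and not orthogonal for a choice that violates the condition; the particular numbers are arbitrary.)

```python
import numpy as np

def M(a, b):
    # the matrix from Problem 23
    return np.array([[a + b, b - a],
                     [a - b, b + a]])

a, b = 0.5, 0.5                                       # a^2 + b^2 = 1/2
print(np.allclose(M(a, b).T @ M(a, b), np.eye(2)))    # True: orthogonal

a, b = 0.6, 0.1                                       # a^2 + b^2 = 0.37, not 1/2
print(np.allclose(M(a, b).T @ M(a, b), np.eye(2)))    # False: not orthogonal
```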

A. Sontag, May 7, 2002