In this handout, we discuss orthogonal maps and their significance from a geometric standpoint.

Preliminary results on the transpose

The definition of an orthogonal matrix involves the transpose, so we first prove some facts about it.

Proposition 1.
(a) If $A$ is an $\ell \times m$ matrix and $B$ is an $m \times n$ matrix, then $(AB)^\top = B^\top A^\top$.
(b) If $A$ is an invertible $n \times n$ matrix, then $A^\top$ is invertible and $(A^\top)^{-1} = (A^{-1})^\top$.

Proof. (a) We compute the $(i, j)$-entries of both sides for $1 \le i \le n$ and $1 \le j \le \ell$:
$$[(AB)^\top]_{ij} = [AB]_{ji} = \sum_{k=1}^m A_{jk} B_{ki} = \sum_{k=1}^m [A^\top]_{kj} [B^\top]_{ik} = \sum_{k=1}^m [B^\top]_{ik} [A^\top]_{kj} = [B^\top A^\top]_{ij}.$$

(b) It suffices to show that $A^\top (A^{-1})^\top = I$. By (a), $A^\top (A^{-1})^\top = (A^{-1} A)^\top = I^\top = I$.

Orthogonal matrices

Definition (Orthogonal matrix). An $n \times n$ matrix $A$ is orthogonal if $A^{-1} = A^\top$.

We will first show that "being orthogonal" is preserved by various matrix operations.

Proposition 2.
(a) If $A$ is orthogonal, then so is $A^{-1} = A^\top$.
(b) If $A$ and $B$ are orthogonal $n \times n$ matrices, then so is $AB$.

Proof. (a) We have $(A^{-1})^\top = (A^\top)^\top = A = (A^{-1})^{-1}$, so $A^{-1}$ is orthogonal.
(b) We have $(AB)^{-1} = B^{-1} A^{-1} = B^\top A^\top = (AB)^\top$, so $AB$ is orthogonal.

The collection $O(n)$ of $n \times n$ orthogonal matrices is the orthogonal group in dimension $n$.

The above definition is often not how we identify orthogonal matrices, as it requires us to compute an $n \times n$ inverse. Instead, let $A$ be an orthogonal matrix and suppose its columns are $v_1, \ldots, v_n$. Then we can compute
$$A^\top A = \begin{pmatrix} \text{---}\ v_1^\top\ \text{---} \\ \vdots \\ \text{---}\ v_n^\top\ \text{---} \end{pmatrix} \begin{pmatrix} | & & | \\ v_1 & \cdots & v_n \\ | & & | \end{pmatrix} = (v_i \cdot v_j)_{1 \le i, j \le n},$$
so comparing with the identity matrix $I$, we obtain

Proposition 3. An $n \times n$ matrix $A$ is orthogonal if and only if its columns form an orthonormal basis of $\mathbb{R}^n$.

By considering $A^\top$, we also have that $A$ is orthogonal if and only if its rows (or rather, their transposes) form an orthonormal basis of $\mathbb{R}^n$.

The importance of orthogonal matrices from a geometric perspective is that they preserve dot products, and hence lengths and angles.

Theorem 1. Let $A \in O(n)$ and $x, y \in \mathbb{R}^n$.
Then $(Ax) \cdot (Ay) = x \cdot y$. Conversely, if $A$ is an $n \times n$ matrix preserving dot products, then $A \in O(n)$.

Proof. For the forward direction, $(Ax) \cdot (Ay) = (Ax)^\top (Ay) = x^\top A^\top A y = x^\top y = x \cdot y$. For the reverse, if $A$ is an $n \times n$ matrix preserving dot products, then by considering $Ae_i$ for the standard basis vectors $e_i$, the columns of $A$ form an orthonormal basis of $\mathbb{R}^n$, hence $A \in O(n)$.

Corollary 1.
1. If $A \in O(n)$ and $x \in \mathbb{R}^n$, then $\|Ax\| = \|x\|$.
2. If $A \in O(n)$ and $x, y \in \mathbb{R}^n$, then $Ax \perp Ay$ if and only if $x \perp y$.

Example: The two-dimensional orthogonal group $O(2)$

Before continuing with general results, we describe the matrices in $O(2)$. Let $A = \begin{pmatrix} a & c \\ b & d \end{pmatrix}$ be an orthogonal $2 \times 2$ matrix. By Proposition 3, the equations $a, b, c, d$ must satisfy are
$$a^2 + b^2 = c^2 + d^2 = 1 \quad \text{and} \quad ac + bd = 0.$$
From the last equation, $(c, d) = (tb, -ta)$ for some $t \in \mathbb{R}$, so then
$$1 = c^2 + d^2 = t^2(b^2 + a^2) = t^2.$$
This means that $t = \pm 1$, which gives, for $a^2 + b^2 = 1$,
$$A = \begin{pmatrix} a & b \\ b & -a \end{pmatrix} \quad \text{or} \quad A = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}.$$
The solutions to $a^2 + b^2 = 1$ can be parametrised by a single real parameter $\theta$ via $a = \cos\theta$ and $b = \sin\theta$, so in conclusion,
$$O(2) = \left\{ \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \right\} \cup \left\{ \begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix} \right\}.$$
These are matrices that we have seen before: the first set consists of counterclockwise rotations about the origin by angle $\theta$, whereas the second set consists of reflections about lines through the origin making angle $\theta/2$ with the positive $x$-axis. Algebraically, another way to separate these two sets is that the first set consists of matrices with determinant $1$ and the second set consists of matrices with determinant $-1$.

The special orthogonal group

The fact that the only determinants were $\pm 1$ in the two-dimensional case is not a coincidence.

Proposition 4. If $A \in O(n)$, then $\det A = \pm 1$.

Proof. Since $\det A^\top = \det A$, we have $1 = \det I = \det(AA^\top) = (\det A)(\det A^\top) = (\det A)^2$.

Definition (Special orthogonal group). The subset $SO(n) = \{A \in O(n) \mid \det A = 1\}$ is the special orthogonal group in dimension $n$.
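As a quick numerical sanity check (not part of the handout's formal development), the following pure-Python sketch verifies, for the two families of $2 \times 2$ matrices above, the properties proved so far: orthonormal columns in the sense of Proposition 3, preservation of dot products as in Theorem 1, and determinant $\pm 1$ as in Proposition 4. The helper functions are our own names, introduced only for this illustration.

```python
import math

# 2x2 matrices represented as lists of rows; helpers are illustrative only.

def transpose(M):
    return [[M[j][i] for j in range(2)] for i in range(2)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(M, x):
    return [sum(M[i][k] * x[k] for k in range(2)) for i in range(2)]

def dot(x, y):
    return sum(p * q for p, q in zip(x, y))

theta = 0.7
c, s = math.cos(theta), math.sin(theta)
rotation   = [[c, -s], [s,  c]]   # first family: rotation by theta
reflection = [[c,  s], [s, -c]]   # second family: reflection

for A in (rotation, reflection):
    # Proposition 3: A^T A = I, i.e. the columns are orthonormal.
    AtA = matmul(transpose(A), A)
    assert all(abs(AtA[i][j] - (1.0 if i == j else 0.0)) < 1e-12
               for i in range(2) for j in range(2))
    # Theorem 1: (Ax) . (Ay) = x . y.
    x, y = [1.0, 2.0], [-3.0, 0.5]
    assert abs(dot(matvec(A, x), matvec(A, y)) - dot(x, y)) < 1e-12
    # Proposition 4: det A = ad - bc is +1 or -1.
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    assert abs(abs(det) - 1.0) < 1e-12
```

The checks use a small numerical tolerance because floating-point arithmetic only approximates the exact identities proved above.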
Example: The three-dimensional special orthogonal group $SO(3)$

In three dimensions, the elements of the whole orthogonal group $O(3)$ do not admit as simple a description as in two dimensions, but it turns out that there is a simple description of $SO(3)$. Let $A \in SO(3)$, so that $\det A = 1$. We analyse the complex eigenvalues of $A$. To do this, given a matrix $M$ with complex entries, define $M^\dagger$ to be its conjugate transpose, i.e. $M^\dagger$ is obtained from $M$ by taking the transpose and taking the complex conjugate of every entry. If $v \in \mathbb{C}^n$ is a vector, then $v^\dagger v$ is a non-negative real number, so we can define the magnitude of $v$ by $\|v\| = \sqrt{v^\dagger v}$. The magnitude satisfies $\|cv\| = |c| \|v\|$ for all $v \in \mathbb{C}^n$ and $c \in \mathbb{C}$, and if $v$ has all real entries, then the magnitude of $v$ as a complex vector is the same as the magnitude of $v$ as a real vector.

Proposition 5. Let $v \in \mathbb{C}^n$ and $A \in O(n, \mathbb{R})$. (Here we use the notation $O(n, \mathbb{R})$ to emphasise that $A$ has real entries.) Then $\|Av\| = \|v\|$.

Proof. It suffices to compare square magnitudes. We have
$$\|Av\|^2 = (Av)^\dagger (Av) = v^\dagger A^\dagger A v.$$
Since $A$ is real, $A^\dagger = A^\top$, so the middle product simplifies to $I$ and we are left with $v^\dagger v = \|v\|^2$.

Corollary 2. Let $\lambda \in \mathbb{C}$ be a complex eigenvalue of $A \in O(n, \mathbb{R})$. Then $|\lambda| = 1$.

Proof. Let $v$ be an eigenvector of $A$ with eigenvalue $\lambda$. Then
$$\|v\| = \|Av\| = \|\lambda v\| = |\lambda| \|v\|,$$
so $|\lambda| = 1$ since $\|v\| \ne 0$.

Returning to the specific case that $A \in SO(3, \mathbb{R})$, the eigenvalues satisfy $\det(A - \lambda I) = 0$, which upon expanding the left-hand side is a cubic with real coefficients. This shows that $A$ has at least one real eigenvalue, which by Corollary 2 must be $\pm 1$. Together with the fact that the product of the roots of the eigenvalue equation, counted with multiplicity, is $\det A = 1$, we can show that $A$ must have $1$ as an eigenvalue.

• If $A$ has only real roots, then each of the three roots is $\pm 1$ and their product is $1$. They cannot all be $-1$, as the product would be $-1$, so at least one of the roots is $1$.
• If $A$ has non-real complex roots, then since the coefficients are real, the two non-real roots must be a conjugate pair $\lambda, \bar{\lambda}$. Their product is $\lambda \bar{\lambda} = |\lambda|^2 = 1$ by Corollary 2, so the last root must be $1$.

Let $u$ be an eigenvector of $A$ with eigenvalue $1$, and if necessary, rescale so that $\|u\| = 1$. Since $u$ solves the linear system $(A - I)u = 0$, which has real coefficients, we can take $u$ to have real coordinates, so although we passed to complex numbers above, we can now return to the setting of $\mathbb{R}^3$. Extend the singleton list $(u)$ to an orthonormal basis $\mathcal{B} = (u, u_2, u_3)$ of $\mathbb{R}^3$. If $S$ is the matrix whose columns are the vectors of $\mathcal{B}$, then the matrix of $A$ with respect to $\mathcal{B}$ is $B = S^{-1} A S$. Since the columns of $S$ form an orthonormal basis, $S$ is itself an orthogonal matrix, so by Proposition 2(b), $B$ is orthogonal. Moreover, since $Au = u$,
$$B = \begin{pmatrix} 1 & * & * \\ 0 & * & * \\ 0 & * & * \end{pmatrix}.$$
The first column must be orthogonal to the second and third columns, so
$$B = \begin{pmatrix} 1 & 0 & 0 \\ 0 & * & * \\ 0 & * & * \end{pmatrix}.$$
The second and third columns are an orthonormal list, so this together with $\det B = \det A = 1$ shows that the bottom $2 \times 2$ matrix is a two-dimensional rotation matrix. Hence $A$ is a matrix of rotation about the $u$-axis, and the conclusion is that $SO(3)$ is the collection of all rotations about lines through the origin.

Reflections about hyperplanes

A linear map is orthogonal if it preserves dot products, or equivalently, if its matrix (with respect to the standard basis) is an orthogonal matrix. A useful class of orthogonal maps is that of reflections. In $2$ dimensions, the most important reflections are those about lines through the origin (dimension-$1$ subspaces), whereas in $3$ dimensions, the most important reflections are those about planes through the origin (dimension-$2$ subspaces). To generalise, we define

Definition. A (linear) hyperplane in $\mathbb{R}^n$ is a subspace of dimension $n - 1$. Equivalently, a linear hyperplane is a subspace of $\mathbb{R}^n$ defined by a single linear equation.

Proposition 6.
Let $a_1, \ldots, a_n$ be real numbers, not all zero. Then
$$V = \{x \in \mathbb{R}^n \mid a_1 x_1 + \cdots + a_n x_n = 0\}$$
is a linear hyperplane in $\mathbb{R}^n$. Conversely, every linear hyperplane is of this form.

Proof. Given such an equation, let $a = \begin{pmatrix} a_1 & \cdots & a_n \end{pmatrix}^\top$. Then $V$ is the orthogonal complement of $\operatorname{span}(a)$, hence has dimension $n - 1$.
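The handout has not yet written down the reflection map itself, but the standard formula for reflection about the hyperplane $a_1 x_1 + \cdots + a_n x_n = 0$ with normal vector $a$ is $x \mapsto x - 2 \frac{a \cdot x}{a \cdot a}\, a$. The sketch below (pure Python; the helper names are ours, and the formula is the standard one rather than anything quoted from this handout) checks this map against the properties established above: it fixes the hyperplane, negates the normal, and preserves lengths as any orthogonal map must (Corollary 1).

```python
# Reflection about the hyperplane a . x = 0 via the standard formula
# x -> x - 2 (a . x)/(a . a) * a.  Helper names are illustrative only.

def dot(x, y):
    return sum(p * q for p, q in zip(x, y))

def reflect(a, x):
    t = 2 * dot(a, x) / dot(a, a)
    return [xi - t * ai for ai, xi in zip(a, x)]

a = [1.0, -2.0, 2.0]          # normal vector of the hyperplane V
x = [3.0, 0.0, 1.0]           # a generic test vector

# A point on V (it satisfies a . v = 0) is fixed by the reflection.
v = [2.0, 1.0, 0.0]
assert dot(a, v) == 0.0
assert reflect(a, v) == v

# The normal vector itself is sent to its negative.
assert reflect(a, a) == [-ai for ai in a]

# Lengths are preserved (Corollary 1), up to floating-point error.
x_ref = reflect(a, x)
assert abs(dot(x_ref, x_ref) - dot(x, x)) < 1e-12
```

Since the reflection fixes an $(n-1)$-dimensional subspace and negates its orthogonal complement, its matrix has orthonormal columns, so it is indeed an element of $O(n)$ with determinant $-1$.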
