Chapter 6: Orthogonality
§6.2 Orthogonal Sets
§6.3 Orthogonal Projections
MTH 222
Linear Algebra
MTH 222 (Linear Algebra), Orthogonality, Fall 2020

It's good to have goals
Goals for today:
• Study orthogonal sets and orthogonal matrices.
• Use orthogonal projections to measure the distance from a point to a line.
• Define orthogonal matrices and discuss their properties.
• Use orthogonal projections to decompose vectors between a subspace and its complement.
Orthogonal sets
Definition 1
A set of vectors {u1, ..., up} in R^n is said to be orthogonal if ui · uj = 0 for all i ≠ j. If, in addition, each ui is a unit vector, then the set is said to be orthonormal.
Example 2
Consider the vectors in R^4

u = (3, −2, 1, 3), v = (−1, 3, −3, 4), w = (3, 8, 7, 0).

Then u · v = u · w = v · w = 0, so the set {u, v, w} is orthogonal. The set is not orthonormal because the vectors are not unit vectors. However, we could replace each one by its associated unit vector to obtain an orthonormal set with the same span.
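Orthogonality claims like the one in Example 2 are easy to sanity-check numerically. A minimal plain-Python sketch (not part of the original slides), reading the three vectors of Example 2 as listed above:

```python
def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# The three vectors of Example 2, in R^4.
u = [3, -2, 1, 3]
v = [-1, 3, -3, 4]
w = [3, 8, 7, 0]

# All pairwise dot products vanish, so {u, v, w} is an orthogonal set.
print(dot(u, v), dot(u, w), dot(v, w))  # 0 0 0
```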
Orthogonal sets
Theorem 3
If S = {u1, ..., up} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent.
Proof.
Write 0 = c1u1 + ··· + cpup. Then
0 = 0 · u1 = (c1u1 + ··· + cpup) · u1 = c1(u1 · u1) + ··· + cp(up · u1) = c1(u1 · u1).
Since u1 · u1 ≠ 0 (because u1 ≠ 0), we conclude c1 = 0. Repeating this argument with u2, ..., up gives c2 = ··· = cp = 0. Hence, S is linearly independent.
Orthogonal sets
The following strategy is suggested by the previous theorem:
Let {u1, ..., up} be an orthogonal basis for a subspace W of R^n. Let y ∈ W and write y = c1u1 + ··· + cpup. Then y · ui = ci(ui · ui), and so

ci = (y · ui)/(ui · ui),  i = 1, ..., p.
Example 4
Define the set S = {(1, 0, 1), (−1, 4, 1), (2, 1, −2)}. The vectors in S are orthogonal, and hence the set S is linearly independent. Since S consists of 3 linearly independent vectors in R^3, it is a basis for R^3.
Let x = (8, −4, −3). Then we can find the coordinates [x]_S = (c1, c2, c3) using the method above. Denote the vectors in S by u1, u2, u3, respectively. Then

c1 = (x · u1)/(u1 · u1) = 5/2,  c2 = (x · u2)/(u2 · u2) = −3/2,  c3 = (x · u3)/(u3 · u3) = 2.
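The coordinate formula ci = (x · ui)/(ui · ui) can be checked directly. A short sketch in plain Python (an illustration, not part of the original slides), using the basis vectors and x from Example 4:

```python
def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# Orthogonal basis of R^3 from Example 4 and the vector to expand.
u1, u2, u3 = [1, 0, 1], [-1, 4, 1], [2, 1, -2]
x = [8, -4, -3]

# Coordinates of x relative to the orthogonal basis: no row reduction needed.
coords = [dot(x, u) / dot(u, u) for u in (u1, u2, u3)]
print(coords)  # [2.5, -1.5, 2.0], i.e. c1 = 5/2, c2 = -3/2, c3 = 2
```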
Orthogonal projections
Let p be a point in R^2 represented by the vector y, and let L = Span{u} be a line through the origin in R^2. Suppose we decompose y as

y = ŷ + z,

where ŷ ∈ L and z ∈ L⊥.
Let ŷ = αu for some scalar α. Then z = y − ŷ is orthogonal to u if and only if
0 = z · u = (y − αu) · u = y · u − (αu) · u = y · u − α(u · u).
Hence, α = (y · u)/(u · u), and so ŷ = ((y · u)/(u · u)) u. Note that if we replace u by cu for any nonzero scalar c, this formula does not change, so the projection depends only on L and not on the choice of spanning vector.

Definition 5
Given vectors y, u ∈ R^n with u ≠ 0, and L = Span{u}, the orthogonal projection of y onto L is

ŷ = proj_L y = ((y · u)/(u · u)) u.
Orthogonal projections
Hence, the distance from p to L is the length of z = y − ŷ.

Example 6
Let y = (1, 7) and let L = Span{u}, where u = (−4, 2). Then

ŷ = proj_L y = ((y · u)/(u · u)) u = (1/2)u = (−2, 1).

Hence, the distance from y to L is

‖y − ŷ‖ = ‖(3, 6)‖ = √45 = 3√5.
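The projection-onto-a-line computation of Example 6 can be sketched in a few lines of plain Python (a sanity check, not part of the original slides):

```python
import math

def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# Vector y and spanning vector u of the line L from Example 6.
y = [1, 7]
u = [-4, 2]

alpha = dot(y, u) / dot(u, u)          # (y.u)/(u.u) = 10/20 = 0.5
y_hat = [alpha * c for c in u]         # proj_L y = [-2.0, 1.0]
z = [a - b for a, b in zip(y, y_hat)]  # y - y_hat = [3.0, 6.0]

# Distance from y to L is the length of z: sqrt(45) = 3*sqrt(5).
print(math.sqrt(dot(z, z)))
```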
Soon we will generalize this to larger subspaces.
Orthogonal matrices
Definition 7
If W is a subspace of R^n spanned by an orthonormal set S = {u1, ..., up}, then we say S is an orthonormal basis of W.
The standard basis is an orthonormal basis of R^n.

Theorem 8
An m × n matrix U has orthonormal columns if and only if U^T U = I.
Proof. Write U = [u1 ··· un]. Then
U^T U = [u1 ··· un]^T [u1 ··· un] =
[ u1^T u1  u1^T u2  ···  u1^T un ]
[ u2^T u1  u2^T u2  ···  u2^T un ]
[   ...      ...    ···    ...   ]
[ un^T u1  un^T u2  ···  un^T un ]
Hence, U^T U = I if and only if ui · ui = 1 for all i and ui · uj = 0 for all i ≠ j.
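Since the (i, j) entry of U^T U is just ui · uj, Theorem 8 can be illustrated without any matrix library. A plain-Python sketch with a hypothetical orthonormal pair in R^3 (the vectors are my own example, not from the slides):

```python
import math

def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# A hypothetical orthonormal pair of columns in R^3.
s = 1 / math.sqrt(2)
u1 = [s, 0.0, s]
u2 = [0.0, 1.0, 0.0]

# The (i, j) entry of U^T U is u_i . u_j; orthonormal columns therefore
# give the 2 x 2 identity matrix (up to floating-point rounding).
gram = [[dot(a, b) for b in (u1, u2)] for a in (u1, u2)]
print(gram)
```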
Orthogonal matrices
Theorem 9
Let U be an m × n matrix with orthonormal columns and let x, y ∈ R^n. Then
(1) ‖Ux‖ = ‖x‖;
(2) (Ux) · (Uy) = x · y;
(3) (Ux) · (Uy) = 0 if and only if x · y = 0.
Proof. We will prove (1). The rest are left as an exercise. Write U = [u1 ··· un]. Then
‖Ux‖² = (Ux) · (Ux) = (x1u1 + ··· + xnun) · (x1u1 + ··· + xnun)
= Σ_{i,j} xi xj (ui · uj) = Σ_i xi² (ui · ui) = Σ_i xi² = ‖x‖².
The first property says that such a matrix preserves length. (A square matrix with orthonormal columns is called an orthogonal matrix.)
Projections onto subspaces
The next definition generalizes projections onto lines.

Definition 10
Let W be a subspace of R^n with orthogonal basis {u1, ..., up}. For y ∈ R^n, the orthogonal projection of y onto W is given by

proj_W y = ((y · u1)/(u1 · u1)) u1 + ··· + ((y · up)/(up · up)) up.
This definition matches our previous one when W is 1-dimensional. Note that proj_W y ∈ W because it is a linear combination of basis elements.
Also note that the definition simplifies when the basis {u1, ..., up} is orthonormal. In this case, if we let U = [u1 ··· up], then proj_W y = UU^T y for all y ∈ R^n.
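For an orthonormal basis, UU^T y is just (y · u1)u1 + ··· + (y · up)up, since each denominator ui · ui equals 1. A plain-Python sketch of this formula with a hypothetical orthonormal basis of a plane in R^3 (the vectors are my own illustration, not from the slides):

```python
import math

def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# Hypothetical orthonormal basis of a plane W in R^3.
s = 1 / math.sqrt(2)
q1, q2 = [s, s, 0.0], [0.0, 0.0, 1.0]
y = [2, 4, 5]

# proj_W y = (y . q1) q1 + (y . q2) q2, which is exactly U U^T y
# when U has columns q1 and q2.
proj = [dot(y, q1) * a + dot(y, q2) * b for a, b in zip(q1, q2)]
print(proj)  # approximately [3, 3, 5]
```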
Orthogonal Decomposition Theorem
Theorem 11 (Orthogonal Decomposition Theorem)
Let W be a subspace of R^n with orthogonal basis {u1, ..., up}. Then each y ∈ R^n can be written uniquely in the form y = ŷ + z, where ŷ ∈ W and z ∈ W⊥. In fact,

ŷ = proj_W y and z = y − ŷ.
Proof.
Note that if W = {0}, then this theorem is trivial. As noted above, proj_W y ∈ W. We claim that z = y − ŷ ∈ W⊥. We have
z · u1 = (y − ŷ) · u1 = y · u1 − ŷ · u1 = y · u1 − ((y · u1)/(u1 · u1))(u1 · u1) = y · u1 − y · u1 = 0.
The same computation applies to u2, ..., up, so z is orthogonal to each basis vector of W. By linearity, z · w = 0 for every w ∈ W, and hence z ∈ W⊥.
To prove uniqueness, let y = w + x be another decomposition with w ∈ W and x ∈ W⊥. Then w + x = y = ŷ + z, so w − ŷ = z − x. But w − ŷ ∈ W and z − x ∈ W⊥. Since W ∩ W⊥ = {0}, we get w − ŷ = 0, so w = ŷ. Similarly, x = z.
Orthogonal bases
Corollary 12
Let W be a subspace of R^n with orthogonal basis {u1, ..., up}. Then y ∈ W if and only if proj_W y = y.
Example 13
Let u1 = (3, −1, 2), u2 = (1, −1, −2), and y = (−3, 5, 0). Let W = Span{u1, u2}. Then

ŷ = proj_W y = ((y · u1)/(u1 · u1)) u1 + ((y · u2)/(u2 · u2)) u2 = (−1)u1 − (4/3)u2 = (1/3)(−13, 7, 2).

We can decompose y as y = ŷ + z, where

z = y − ŷ = (1/3)(4, 8, −2).
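The decomposition in Example 13 can be verified numerically: z must come out orthogonal to both basis vectors of W. A plain-Python sketch (a check, not part of the original slides):

```python
def dot(x, y):
    """Dot product of two vectors given as lists."""
    return sum(a * b for a, b in zip(x, y))

# Orthogonal basis of W and the vector y from Example 13.
u1, u2 = [3, -1, 2], [1, -1, -2]
y = [-3, 5, 0]

c1 = dot(y, u1) / dot(u1, u1)   # -14/14 = -1
c2 = dot(y, u2) / dot(u2, u2)   # -8/6  = -4/3
y_hat = [c1 * a + c2 * b for a, b in zip(u1, u2)]   # (1/3)(-13, 7, 2)
z = [a - b for a, b in zip(y, y_hat)]               # (1/3)(4, 8, -2)

# z lies in W-perp, as the Orthogonal Decomposition Theorem guarantees.
print(dot(z, u1), dot(z, u2))
```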
Theorem 14 (Best Approximation Theorem)
Let W be a subspace of R^n and y ∈ R^n. Then ŷ = proj_W y is the closest point in W to y, in the sense that

‖y − ŷ‖ < ‖y − v‖ for all v ∈ W, v ≠ ŷ.
Next time
In the next lecture we will:
• Use the Gram-Schmidt process to construct an orthogonal basis for any subspace.
• Use the method of least squares to find best approximate solutions of inconsistent systems.