Jim Lambers MAT 773 Fall Semester 2018-19 Lecture 2 Notes
These notes correspond to Sections 0.4-0.5 in the text.
Schwarz and Triangle Inequalities
For the standard inner (dot) product on $\mathbb{R}^3$, the identity $\langle X, Y \rangle = \|X\| \|Y\| \cos\theta$ gives
$$|\langle X, Y \rangle| = \|X\| \|Y\| \, |\cos\theta| \le \|X\| \|Y\|.$$
The following theorem generalizes this bound to any inner product space.
Theorem 11 Suppose V is an inner product space. Then, for all $X, Y \in V$, we have the following:
• Cauchy-Schwarz Inequality: $|\langle X, Y \rangle| \le \|X\| \|Y\|$. Equality holds if and only if X and Y are linearly dependent. Moreover, $\langle X, Y \rangle = \|X\| \|Y\|$ if and only if X or Y is a nonnegative multiple of the other.
• Triangle Inequality: $\|X + Y\| \le \|X\| + \|Y\|$. Equality holds if and only if X or Y is a nonnegative multiple of the other.
Proof: Assume V is a real inner product space, and let $t \in \mathbb{R}$. Then
$$0 \le \|X - tY\|^2 = \langle X - tY, X - tY \rangle = \|X\|^2 - 2t\langle X, Y \rangle + t^2\|Y\|^2.$$
The right side is a nonnegative quadratic polynomial in t, so its discriminant $4\langle X, Y \rangle^2 - 4\|X\|^2\|Y\|^2$ must be nonpositive, from which the Cauchy-Schwarz Inequality follows. If V is a complex inner product space, then we can repeat the argument with $\|e^{-i\phi}X - tY\| \ge 0$, where $\phi$ is an argument of $\langle X, Y \rangle$; that is, $\langle X, Y \rangle = |\langle X, Y \rangle| e^{i\phi}$. We then obtain
$$0 \le \|e^{-i\phi}X - tY\|^2 = \|X\|^2 - 2t|\langle X, Y \rangle| + t^2\|Y\|^2,$$
and the rest of the argument proceeds as in the real case. For the Triangle Inequality, we have, by the Cauchy-Schwarz Inequality,
$$\|X + Y\|^2 = \langle X + Y, X + Y \rangle = \|X\|^2 + 2\,\mathrm{Re}\{\langle X, Y \rangle\} + \|Y\|^2 \le \|X\|^2 + 2\|X\|\|Y\| + \|Y\|^2 = (\|X\| + \|Y\|)^2,$$
from which the result follows by taking the square root of both sides. $\Box$
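Both inequalities are easy to sanity-check numerically. The following sketch (in Python with NumPy, which is my addition and not part of the text) verifies them for random complex vectors, and checks the equality case for linearly dependent vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(1000):
    # random complex vectors in C^5
    X = rng.standard_normal(5) + 1j * rng.standard_normal(5)
    Y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

    inner = np.vdot(Y, X)  # <X, Y> = sum_i x_i * conj(y_i)
    nX, nY = np.linalg.norm(X), np.linalg.norm(Y)

    # Cauchy-Schwarz: |<X, Y>| <= ||X|| ||Y||
    assert abs(inner) <= nX * nY + 1e-12

    # Triangle inequality: ||X + Y|| <= ||X|| + ||Y||
    assert np.linalg.norm(X + Y) <= nX + nY + 1e-12

# Equality in Cauchy-Schwarz when X and Y are linearly dependent
X = rng.standard_normal(5)
Y = 2.5 * X
assert np.isclose(abs(np.vdot(Y, X)), np.linalg.norm(X) * np.linalg.norm(Y))
```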
Orthogonality
Definitions and Examples
Using the standard inner (dot) product on $\mathbb{R}^3$, we have
$$\langle X, Y \rangle = \|X\| \|Y\| \cos\theta,$$
where $\theta$ is the angle between X and Y. It follows that X and Y are perpendicular (orthogonal) if and only if $\langle X, Y \rangle = 0$. This leads to the following definition.
Definition 12 Let V be an inner product space.
• $X, Y \in V$ are orthogonal if $\langle X, Y \rangle = 0$.
• The set of vectors $\{e_1, \dots, e_N\}$ is orthonormal if
$$\langle e_i, e_j \rangle = \begin{cases} 0 & i \ne j, \\ 1 & i = j. \end{cases}$$
That is, each vector has unit length ($\|e_i\| = 1$ for $i = 1, \dots, N$) and the vectors are pairwise orthogonal.
• Two subspaces $V_1$ and $V_2$ are orthogonal if every vector in $V_1$ is orthogonal to every vector in $V_2$.
Example 13 The line $y = x$, generated by $\langle 1, 1 \rangle$, is orthogonal to the line $y = -x$, generated by $\langle -1, 1 \rangle$. $\Box$
Example 14 The line $x/2 = -y = z/3$ in $\mathbb{R}^3$, which has the direction $\langle 2, -1, 3 \rangle$, is orthogonal to the plane $2x - y + 3z = 0$. $\Box$
Example 15 Any two functions $f(t), g(t) \in L^2([0,1])$ that are nonzero on disjoint subintervals of $[0,1]$ are orthogonal, because then $f(t)g(t) = 0$ on $[0,1]$, which yields $\langle f, g \rangle = \int_0^1 f(t)g(t)\,dt = 0$. $\Box$
Example 16 Let
$$\phi(t) = \begin{cases} 1 & 0 \le t < 1, \\ 0 & \text{otherwise}, \end{cases} \qquad \psi(t) = \begin{cases} 1 & 0 \le t < 1/2, \\ -1 & 1/2 \le t < 1, \\ 0 & \text{otherwise}. \end{cases}$$
Then
$$\langle \phi, \psi \rangle = \int_0^{1/2} 1\,dt - \int_{1/2}^1 1\,dt = 0.$$
That is, $\phi$ and $\psi$ are orthogonal in $L^2([0,1])$, even though both functions are nonzero on all of $[0,1)$. These functions will be of importance when we study Haar wavelets. $\Box$
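The orthogonality of $\phi$ and $\psi$ can be checked numerically by approximating the $L^2([0,1])$ inner product with a midpoint rule. A Python/NumPy sketch (my addition; the grid size is arbitrary):

```python
import numpy as np

# Haar functions phi and psi from Example 16
def phi(t):
    return np.where((0 <= t) & (t < 1), 1.0, 0.0)

def psi(t):
    return np.where((0 <= t) & (t < 0.5), 1.0,
                    np.where((0.5 <= t) & (t < 1), -1.0, 0.0))

# midpoint-rule approximation of the inner product on [0, 1]
n = 200_000
t = (np.arange(n) + 0.5) / n

def inner(f, g):
    return np.mean(f(t) * g(t))  # (1/n) * sum approximates the integral

print(inner(phi, psi))   # ~ 0: phi and psi are orthogonal
print(inner(phi, phi))   # ~ 1: phi has unit norm in L^2([0,1])
```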
Example 17 The functions $f(t) = \sin t$ and $g(t) = \cos t$ are orthogonal in $L^2([-\pi, \pi])$, because
$$\langle f, g \rangle = \int_{-\pi}^{\pi} \sin t \cos t \,dt = \frac{1}{2} \int_{-\pi}^{\pi} \sin 2t \,dt = -\frac{1}{4} \cos 2t \,\Big|_{-\pi}^{\pi} = 0.$$
Because $\|\sin t\|^2 = \|\cos t\|^2 = \pi$, the functions $\frac{1}{\sqrt{\pi}} \sin t$ and $\frac{1}{\sqrt{\pi}} \cos t$ are orthonormal in $L^2([-\pi, \pi])$. It can be shown that the functions $f_n(t) = \frac{1}{\sqrt{\pi}} \sin nt$, $g_n(t) = \frac{1}{\sqrt{\pi}} \cos nt$, $n = 1, 2, \dots$, are orthonormal, which will be useful in our study of Fourier series. $\Box$
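The same midpoint-rule idea confirms the orthonormality of the normalized sine and cosine on $[-\pi, \pi]$. A Python/NumPy sketch (my addition, not part of the notes):

```python
import numpy as np

# midpoint grid on [-pi, pi]
n = 1_000_000
t = -np.pi + (np.arange(n) + 0.5) * (2 * np.pi / n)
w = 2 * np.pi / n

def inner(f, g):
    # midpoint-rule approximation of the L^2([-pi, pi]) inner product
    return np.sum(f(t) * g(t)) * w

fn = lambda t: np.sin(t) / np.sqrt(np.pi)   # (1/sqrt(pi)) sin t
gn = lambda t: np.cos(t) / np.sqrt(np.pi)   # (1/sqrt(pi)) cos t

print(abs(inner(fn, gn)) < 1e-9)      # True: orthogonal
print(abs(inner(fn, fn) - 1) < 1e-6)  # True: unit norm
print(abs(inner(gn, gn) - 1) < 1e-6)  # True: unit norm
```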
Theorem 18 Suppose $V_0$ is a subspace of an inner product space V, and that $\{e_1, e_2, \dots, e_N\}$ is an orthonormal basis for $V_0$. If $v \in V_0$, then
$$v = \sum_{j=1}^{N} \langle v, e_j \rangle e_j.$$
Proof: Because $\{e_1, \dots, e_N\}$ is a basis for $V_0$, any vector $v \in V_0$ can be uniquely expressed as a linear combination of $e_1, \dots, e_N$. That is,
$$v = \sum_{j=1}^{N} \alpha_j e_j.$$
To obtain the coefficient $\alpha_k$, we take the inner product of both sides of the above equation with $e_k$, which yields
$$\langle v, e_k \rangle = \sum_{j=1}^{N} \alpha_j \langle e_j, e_k \rangle = \alpha_k$$
due to the orthonormality of $e_1, \dots, e_N$. $\Box$
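Theorem 18 can be illustrated numerically: the coefficients of a vector in an orthonormal basis are just inner products. A Python/NumPy sketch (my addition; the basis vectors are the ones that reappear in Example 26):

```python
import numpy as np

# an orthonormal basis of a 2-D subspace V0 of R^3
e1 = np.array([1.0, -4.0, -2.0]) / np.sqrt(21)
e2 = np.array([2.0,  1.0, -1.0]) / np.sqrt(6)

# any v in V0 = span{e1, e2} ...
v = 3.0 * e1 - 2.0 * e2

# ... has coefficients alpha_j = <v, e_j> in that basis (Theorem 18)
alpha1, alpha2 = np.dot(v, e1), np.dot(v, e2)
assert np.allclose(alpha1 * e1 + alpha2 * e2, v)
print(alpha1, alpha2)  # recovers 3 and -2, up to rounding
```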
Orthogonal Projections
Recall from Theorem 18 that any vector in a subspace with orthonormal basis $\{e_1, \dots, e_N\}$ can be written as $v = \sum_{j=1}^{N} \alpha_j e_j$ with $\alpha_j = \langle v, e_j \rangle$. We now consider vectors that need not lie in the subspace.
Definition 19 Suppose $V_0$ is a finite-dimensional subspace of an inner product space V. For any vector $v \in V$, the orthogonal projection of v onto $V_0$ is the unique vector $v_0 \in V_0$ that is closest to v; that is,
$$\|v - v_0\| = \min_{u \in V_0} \|v - u\|.$$
We also say that $v_0$ is the best approximation of v by a vector in $V_0$.
Theorem 20 Suppose $V_0$ is a finite-dimensional subspace of an inner product space V, and let $v \in V$. Then $v_0 \in V_0$ is the orthogonal projection of v onto $V_0$ if and only if $v - v_0$ is orthogonal to every vector in $V_0$.
Proof: Suppose that $v_0$ is the orthogonal projection of v onto $V_0$, and let $u \in V_0$. Then the function
$$f(t) = \|v_0 + tu - v\|^2 = \|v_0 - v\|^2 + 2t\langle u, v_0 - v \rangle + t^2\|u\|^2,$$
which is a quadratic polynomial in t, has a minimum at $t = 0$. It follows that $f'(0) = 0$, which yields $\langle u, v_0 - v \rangle = 0$. For the converse, suppose that $\langle v - v_0, u \rangle = 0$ for all $u \in V_0$. Then, for $u \in V_0$, we have
$$\|v - u\|^2 = \|v - v_0 + v_0 - u\|^2 = \|v - v_0\|^2 + 2\langle v - v_0, v_0 - u \rangle + \|v_0 - u\|^2,$$
but the second term on the right side vanishes because $v_0 - u \in V_0$, due to $V_0$ being a subspace. We conclude that $\|v - v_0\| \le \|v - u\|$, and therefore $v_0$ is the orthogonal projection. $\Box$
Theorem 21 Let V be an inner product space and $V_0$ be an N-dimensional subspace of V with orthonormal basis $\{e_1, \dots, e_N\}$. Then the orthogonal projection of v onto $V_0$, denoted by $v_0$, is given by
$$v_0 = \sum_{j=1}^{N} \alpha_j e_j,$$
where $\alpha_j = \langle v, e_j \rangle$.
Proof: Because $v_0$ is the orthogonal projection, by Theorem 20, $\langle v - v_0, e_k \rangle = 0$ for $k = 1, \dots, N$. That is, $\langle v, e_k \rangle = \langle v_0, e_k \rangle$. But by Theorem 18, $\alpha_k = \langle v_0, e_k \rangle$, which yields $\alpha_k = \langle v, e_k \rangle$. $\Box$
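Theorems 20 and 21 can be checked together numerically: compute $v_0 = \sum_j \langle v, e_j \rangle e_j$, verify that $v - v_0$ is orthogonal to the basis, and verify that $v_0$ beats other elements of $V_0$ as an approximation. A Python/NumPy sketch (my addition; the subspace is generated at random):

```python
import numpy as np

rng = np.random.default_rng(1)

# orthonormal basis of a random 3-D subspace of R^6 (columns of E, via QR)
E, _ = np.linalg.qr(rng.standard_normal((6, 3)))

v = rng.standard_normal(6)

# Theorem 21: v0 = sum_j <v, e_j> e_j, i.e. E (E^T v)
v0 = E @ (E.T @ v)

# Theorem 20: v - v0 is orthogonal to every vector in V0
assert np.allclose(E.T @ (v - v0), 0)

# best approximation: v0 is at least as close to v as any other u in V0
for _ in range(100):
    u = E @ rng.standard_normal(3)
    assert np.linalg.norm(v - v0) <= np.linalg.norm(v - u) + 1e-12
```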
Example 22 Let $V_0 = \mathrm{span}\{\cos x, \sin x\}$ be a subspace of $L^2([-\pi, \pi])$. Then an orthonormal basis for $V_0$ is $\{e_1, e_2\}$, where
$$e_1 = \frac{\cos x}{\sqrt{\pi}}, \qquad e_2 = \frac{\sin x}{\sqrt{\pi}}.$$
Let $f(x) = x$. Then the orthogonal projection of f onto $V_0$ is given by
$$f_0 = \langle f, e_1 \rangle e_1 + \langle f, e_2 \rangle e_2 = 2 \sin x,$$
as can be determined by computing the inner products directly through integration on $[-\pi, \pi]$. $\Box$
Example 23 Let $\phi(t)$ and $\psi(t)$ be defined as in Example 16, and let $f(t) = t$. Then the orthogonal projection of f onto $V_1 = \mathrm{span}\{\phi, \psi\}$, as a subspace of $L^2([0,1])$, is
$$\begin{aligned}
f_1 &= \langle f, \phi \rangle \phi + \langle f, \psi \rangle \psi \\
&= \left( \int_0^1 t\phi(t)\,dt \right) \phi + \left( \int_0^1 t\psi(t)\,dt \right) \psi \\
&= \left( \int_0^1 t\,dt \right) \phi + \left( \int_0^{1/2} t\,dt - \int_{1/2}^1 t\,dt \right) \psi \\
&= \phi/2 - \psi/4 \\
&= \begin{cases} 1/4 & 0 \le t < 1/2, \\ 3/4 & 1/2 \le t < 1. \end{cases}
\end{aligned}$$
$\Box$
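Example 23 can be reproduced numerically: approximate the two inner products with a midpoint rule and assemble the piecewise-constant projection. A Python/NumPy sketch (my addition; the grid size is arbitrary):

```python
import numpy as np

n = 100_000
t = (np.arange(n) + 0.5) / n   # midpoint grid on [0, 1]
w = 1.0 / n

phi = np.ones(n)                         # phi(t) = 1 on [0, 1)
psi = np.where(t < 0.5, 1.0, -1.0)       # Haar wavelet on [0, 1)
f = t                                    # f(t) = t, sampled on the grid

c_phi = np.sum(f * phi) * w              # <f, phi> = 1/2
c_psi = np.sum(f * psi) * w              # <f, psi> = -1/4
f1 = c_phi * phi + c_psi * psi           # projection onto span{phi, psi}

print(round(c_phi, 6), round(c_psi, 6))        # 0.5 -0.25
print(round(f1[0], 6), round(f1[-1], 6))       # 0.25 on [0,1/2), 0.75 on [1/2,1)
```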
Definition 24 Let $V_0$ be a subspace of an inner product space V. The orthogonal complement of $V_0$, denoted by $V_0^\perp$, is the set of all vectors in V which are orthogonal to $V_0$. That is,
$$V_0^\perp = \{ v \in V : \langle v, w \rangle = 0 \ \forall w \in V_0 \}.$$
Theorem 25 Let $V_0$ be a finite-dimensional subspace of an inner product space V, and let $v \in V$. Then v has a unique decomposition of the form
$$v = v_0 + v_1,$$
where $v_0 \in V_0$ and $v_1 \in V_0^\perp$. That is,
$$V = V_0 \oplus V_0^\perp.$$
Proof: Let $v_0$ be the orthogonal projection of v onto $V_0$. Then, by Theorem 20, $v_1 = v - v_0$ is orthogonal to every vector in $V_0$, and therefore $v_1 \in V_0^\perp$. The uniqueness of $v_0$ implies the same for $v_1$. $\Box$
Example 26 Consider the plane $V_0 = \{2x - y + 3z = 0\}$. The vectors
$$e_1 = \frac{1}{\sqrt{21}}(1, -4, -2), \qquad e_2 = \frac{1}{\sqrt{6}}(2, 1, -1)$$
form an orthonormal basis for $V_0$. Given any vector $v = (x, y, z) \in \mathbb{R}^3$, the vector
$$v_0 = \langle v, e_1 \rangle e_1 + \langle v, e_2 \rangle e_2 = \frac{x - 4y - 2z}{21}(1, -4, -2) + \frac{2x + y - z}{6}(2, 1, -1)$$
is the orthogonal projection of v onto $V_0$. The vector $e_3 = (2, -1, 3)/\sqrt{14}$ is a unit vector that is perpendicular (orthogonal) to $V_0$. Therefore
$$v_1 = \langle v, e_3 \rangle e_3 = \frac{2x - y + 3z}{14}(2, -1, 3)$$
is the orthogonal projection of v onto $V_0^\perp$. $\Box$
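The decomposition $v = v_0 + v_1$ of Theorem 25 can be verified for this plane directly. A Python/NumPy sketch (my addition; the test vector v is arbitrary):

```python
import numpy as np

# orthonormal basis of the plane V0: 2x - y + 3z = 0 ...
e1 = np.array([1.0, -4.0, -2.0]) / np.sqrt(21)
e2 = np.array([2.0,  1.0, -1.0]) / np.sqrt(6)
# ... and of its orthogonal complement (the normal direction)
e3 = np.array([2.0, -1.0,  3.0]) / np.sqrt(14)

v = np.array([1.0, 2.0, 3.0])  # any vector in R^3

v0 = np.dot(v, e1) * e1 + np.dot(v, e2) * e2   # projection onto V0
v1 = np.dot(v, e3) * e3                        # projection onto V0-perp

assert np.allclose(v0 + v1, v)        # V = V0 (+) V0-perp (Theorem 25)
assert abs(np.dot(v0, e3)) < 1e-12    # v0 lies in the plane
assert abs(np.dot(v1, e1)) < 1e-12    # v1 is orthogonal to the plane
```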
Gram-Schmidt Orthogonalization
Theorem 27 Let $V_0$ be an N-dimensional subspace of an inner product space V, and let $\{v_1, v_2, \dots, v_N\}$ be a basis for $V_0$. Then there exists an orthonormal basis $\{e_1, \dots, e_N\}$ for $V_0$ such that for $j = 1, \dots, N$, $e_j$ is a linear combination of $v_1, \dots, v_j$.
Proof: We proceed by induction. If $N = 1$, then $e_1 = v_1/\|v_1\|$. Now, suppose $N > 1$, and that we have an orthonormal basis $\{e_1, \dots, e_{N-1}\}$ for $\mathrm{span}\{v_1, \dots, v_{N-1}\}$ such that for $j = 1, \dots, N-1$, $e_j$ is a linear combination of $v_1, \dots, v_j$. Then, if we let
$$w_N = v_N - \sum_{j=1}^{N-1} \langle v_N, e_j \rangle e_j, \qquad e_N = \frac{w_N}{\|w_N\|},$$
it can be shown directly that $e_N$ is orthogonal to $e_1, \dots, e_{N-1}$. Clearly, it is also a unit vector, and a linear combination of $v_1, \dots, v_N$. Therefore, $\{e_1, \dots, e_N\}$ is an orthonormal basis for $V_0$. $\Box$
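The construction in the proof translates directly into code: subtract the projections onto the vectors found so far, then normalize. A Python/NumPy sketch of Gram-Schmidt (my addition; the input basis is an arbitrary example):

```python
import numpy as np

def gram_schmidt(vs):
    """Orthonormalize a list of linearly independent vectors (Theorem 27)."""
    es = []
    for v in vs:
        # w = v - sum_j <v, e_j> e_j : remove the components along e_1, ..., e_{j-1}
        w = v - sum(np.dot(v, e) * e for e in es)
        es.append(w / np.linalg.norm(w))
    return es

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
es = gram_schmidt(vs)

# the result is orthonormal: <e_i, e_j> = 1 if i = j, 0 otherwise
E = np.column_stack(es)
assert np.allclose(E.T @ E, np.eye(3))
```

Each $e_j$ depends only on $v_1, \dots, v_j$, exactly as the theorem asserts; note that this classical form of the algorithm can lose orthogonality in floating point for ill-conditioned bases, where a QR factorization is preferred.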
Exercises
Chapter 0: Exercises 9, 10, 11, 13.