Lecture 25: 6.3 Orthonormal Bases
Wei-Ta Chu, 2008/12/24

Theorem 6.3.2
If S is an orthonormal basis for an n-dimensional inner product space, and if (u)_S = (u_1, u_2, ..., u_n) and (v)_S = (v_1, v_2, ..., v_n), then:

$\|u\| = \sqrt{u_1^2 + u_2^2 + \cdots + u_n^2}$

$d(u, v) = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \cdots + (u_n - v_n)^2}$

$\langle u, v \rangle = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$

Remark
By working with orthonormal bases, the computation of general norms and inner products can be reduced to the computation of Euclidean norms and inner products of the coordinate vectors.

Example
If R^3 has the Euclidean inner product, then the norm of the vector u = (1, 1, 1) is $\|u\| = \sqrt{1^2 + 1^2 + 1^2} = \sqrt{3}$. However, if we let R^3 have the orthonormal basis S of the last example, then we know that the coordinate vector of u relative to S is (u)_S = (1, -1/5, 7/5). Computing the norm of u from these coordinates yields

$\|u\| = \sqrt{1^2 + (-1/5)^2 + (7/5)^2} = \sqrt{75/25} = \sqrt{3}$

in agreement with the Euclidean computation.

Coordinates Relative to Orthogonal Bases
If S = {v_1, v_2, ..., v_n} is an orthogonal basis for a vector space V, then normalizing each of these vectors yields the orthonormal basis

$S' = \left\{ \frac{v_1}{\|v_1\|}, \frac{v_2}{\|v_2\|}, \ldots, \frac{v_n}{\|v_n\|} \right\}$

Thus, if u is any vector in V, it follows from Theorem 6.3.1 that

$u = \left\langle u, \frac{v_1}{\|v_1\|} \right\rangle \frac{v_1}{\|v_1\|} + \left\langle u, \frac{v_2}{\|v_2\|} \right\rangle \frac{v_2}{\|v_2\|} + \cdots + \left\langle u, \frac{v_n}{\|v_n\|} \right\rangle \frac{v_n}{\|v_n\|}$

or, equivalently,

$u = \frac{\langle u, v_1 \rangle}{\|v_1\|^2} v_1 + \frac{\langle u, v_2 \rangle}{\|v_2\|^2} v_2 + \cdots + \frac{\langle u, v_n \rangle}{\|v_n\|^2} v_n$

This equation expresses u as a linear combination of the vectors in the orthogonal basis S.

Theorem 6.3.3
If S = {v_1, v_2, ..., v_n} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent.

Proof of Theorem 6.3.3
Assume that k_1 v_1 + k_2 v_2 + ... + k_n v_n = 0. To demonstrate that S is linearly independent, we must prove that k_1 = k_2 = ... = k_n = 0. For each v_i in S,

$\langle k_1 v_1 + k_2 v_2 + \cdots + k_n v_n, v_i \rangle = \langle 0, v_i \rangle = 0$

or, equivalently,

$k_1 \langle v_1, v_i \rangle + k_2 \langle v_2, v_i \rangle + \cdots + k_n \langle v_n, v_i \rangle = 0$

From the orthogonality of S it follows that $\langle v_j, v_i \rangle = 0$ when $j \neq i$, so the equation reduces to $k_i \langle v_i, v_i \rangle = 0$. Since the vectors in S are assumed to be nonzero, $\langle v_i, v_i \rangle \neq 0$, and therefore $k_i = 0$. Since the subscript i is arbitrary, we have k_1 = k_2 = ... = k_n = 0.

Theorem 6.3.4 (Projection Theorem)
If W is a finite-dimensional subspace of an inner product space V, then every vector u in V can be expressed in exactly one way as

u = w_1 + w_2

where w_1 is in W and w_2 is in W^⊥.

[Figure: u decomposed into w_1 lying in W and w_2 perpendicular to W.]

Projection
The vector w_1 is called the orthogonal projection of u on W and is denoted by proj_W u. The vector w_2 is called the component of u orthogonal to W and is denoted by proj_{W^⊥} u, so that

u = proj_W u + proj_{W^⊥} u

Since w_2 = u - w_1, it follows that proj_{W^⊥} u = u - proj_W u, so we can also write

u = proj_W u + (u - proj_W u)

Theorem 6.3.5
Let W be a finite-dimensional subspace of an inner product space V.
(a) If {v_1, ..., v_r} is an orthonormal basis for W, and u is any vector in V, then

$\mathrm{proj}_W u = \langle u, v_1 \rangle v_1 + \langle u, v_2 \rangle v_2 + \cdots + \langle u, v_r \rangle v_r$

(b) If {v_1, ..., v_r} is an orthogonal basis for W, and u is any vector in V, then

$\mathrm{proj}_W u = \frac{\langle u, v_1 \rangle}{\|v_1\|^2} v_1 + \frac{\langle u, v_2 \rangle}{\|v_2\|^2} v_2 + \cdots + \frac{\langle u, v_r \rangle}{\|v_r\|^2} v_r$

(In case (b) the basis vectors are not unit vectors, so dividing by $\|v_i\|^2$ supplies the needed normalization.)
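To make Theorem 6.3.5 concrete in code, here is a minimal NumPy sketch (NumPy and the helper name proj are my own illustrative choices, not part of the lecture) that computes the orthogonal projection from an orthonormal basis as in part (a); it uses the same data as the worked example that follows.

```python
import numpy as np

def proj(u, onb):
    """Orthogonal projection of u onto span(onb), where onb is a list of
    orthonormal vectors, per Theorem 6.3.5(a): sum of <u, v_i> v_i."""
    return sum(np.dot(u, v) * v for v in onb)

v1 = np.array([0.0, 1.0, 0.0])       # orthonormal basis of W
v2 = np.array([-4/5, 0.0, 3/5])
u = np.array([1.0, 1.0, 1.0])

w1 = proj(u, [v1, v2])               # proj_W u = (4/25, 1, -3/25)
w2 = u - w1                          # component of u orthogonal to W
print(w1, w2)
print(np.dot(w2, v1), np.dot(w2, v2))  # both zero (up to rounding)
```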
Example
Let R^3 have the Euclidean inner product, and let W be the subspace spanned by the orthonormal vectors v_1 = (0, 1, 0) and v_2 = (-4/5, 0, 3/5). From Theorem 6.3.5(a), the orthogonal projection of u = (1, 1, 1) on W is

$\mathrm{proj}_W u = \langle u, v_1 \rangle v_1 + \langle u, v_2 \rangle v_2 = (1)(0, 1, 0) + \left( -\frac{1}{5} \right) \left( -\frac{4}{5}, 0, \frac{3}{5} \right) = \left( \frac{4}{25}, 1, -\frac{3}{25} \right)$

The component of u orthogonal to W is

$\mathrm{proj}_{W^\perp} u = u - \mathrm{proj}_W u = (1, 1, 1) - \left( \frac{4}{25}, 1, -\frac{3}{25} \right) = \left( \frac{21}{25}, 0, \frac{28}{25} \right)$

Observe that $\mathrm{proj}_{W^\perp} u$ is orthogonal to both v_1 and v_2, as it must be.

Finding Orthogonal/Orthonormal Bases

Theorem 6.3.6
Every nonzero finite-dimensional inner product space has an orthonormal basis.

Remark
The step-by-step construction in the following proof, which converts an arbitrary basis into an orthogonal basis, is called the Gram-Schmidt process.

Proof of Theorem 6.3.6
Let V be a nonzero finite-dimensional inner product space, and suppose that {u_1, u_2, ..., u_n} is any basis for V. It suffices to show that V has an orthogonal basis, since the vectors in an orthogonal basis can be normalized to produce an orthonormal basis for V. The following sequence of steps produces an orthogonal basis {v_1, v_2, ..., v_n} for V.

Step 1: Let v_1 = u_1.

Step 2: We can obtain a vector v_2 that is orthogonal to v_1 by computing the component of u_2 that is orthogonal to the space W_1 spanned by v_1:

$v_2 = u_2 - \mathrm{proj}_{W_1} u_2 = u_2 - \frac{\langle u_2, v_1 \rangle}{\|v_1\|^2} v_1$

[Figure: v_2 = u_2 - proj_{W_1} u_2, the component of u_2 orthogonal to W_1.]

Of course, if v_2 = 0 then it cannot serve as a basis vector, but this cannot happen: if v_2 = 0, then

$u_2 = \frac{\langle u_2, v_1 \rangle}{\|v_1\|^2} v_1 = \frac{\langle u_2, v_1 \rangle}{\|v_1\|^2} u_1$

which says that u_2 is a multiple of u_1, contradicting the linear independence of the basis S = {u_1, u_2, ..., u_n}.

Step 3: To construct a vector v_3 that is orthogonal to both v_1 and v_2, we compute the component of u_3 orthogonal to the space W_2 spanned by v_1 and v_2. From Theorem 6.3.5(b):

$v_3 = u_3 - \mathrm{proj}_{W_2} u_3 = u_3 - \frac{\langle u_3, v_1 \rangle}{\|v_1\|^2} v_1 - \frac{\langle u_3, v_2 \rangle}{\|v_2\|^2} v_2$

As in Step 2, the linear independence of {u_1, u_2, ..., u_n} ensures that v_3 ≠ 0.

[Figure: v_3 = u_3 - proj_{W_2} u_3, the component of u_3 orthogonal to the plane W_2.]

Step 4: To obtain a vector v_4 that is orthogonal to v_1, v_2, and v_3, we compute the component of u_4 orthogonal to the space W_3 spanned by v_1, v_2, and v_3. Continuing in this way, we obtain, after n steps, an orthogonal set of nonzero vectors {v_1, v_2, ..., v_n}. Since V is n-dimensional and every orthogonal set of nonzero vectors is linearly independent (Theorem 6.3.3), the set {v_1, v_2, ..., v_n} is an orthogonal basis for V.
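Because the proof is constructive, it translates directly into code. The following minimal Python sketch of the Gram-Schmidt process (NumPy and the name gram_schmidt are my own assumptions, not part of the lecture) implements exactly the steps above and reproduces the worked example that follows.

```python
import numpy as np

def gram_schmidt(basis):
    """Turn a list of linearly independent vectors into an orthonormal
    basis, following the steps in the proof of Theorem 6.3.6."""
    vs = []
    for u in basis:
        # Subtract the projection of u onto span(vs), using the
        # orthogonal-basis formula of Theorem 6.3.5(b).
        v = u - sum(np.dot(u, w) / np.dot(w, w) * w for w in vs)
        vs.append(v)                 # v is orthogonal to every earlier v
    # Normalize the orthogonal basis {v_i} to get {q_i = v_i / ||v_i||}.
    return [v / np.linalg.norm(v) for v in vs]

# Data from the example below: u1 = (1,1,1), u2 = (0,1,1), u3 = (0,0,1).
u1, u2, u3 = np.array([1., 1., 1.]), np.array([0., 1., 1.]), np.array([0., 0., 1.])
q1, q2, q3 = gram_schmidt([u1, u2, u3])
print(q1, q2, q3)   # (1,1,1)/sqrt(3), (-2,1,1)/sqrt(6), (0,-1,1)/sqrt(2)
```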
Example (Gram-Schmidt Process)
Consider the vector space R^3 with the Euclidean inner product. Apply the Gram-Schmidt process to transform the basis vectors u_1 = (1, 1, 1), u_2 = (0, 1, 1), u_3 = (0, 0, 1) into an orthogonal basis {v_1, v_2, v_3}; then normalize the orthogonal basis vectors to obtain an orthonormal basis {q_1, q_2, q_3}.

Solution:
Step 1: Let v_1 = u_1. That is, v_1 = u_1 = (1, 1, 1).

Step 2: Let $v_2 = u_2 - \mathrm{proj}_{W_1} u_2$. That is,

$v_2 = u_2 - \frac{\langle u_2, v_1 \rangle}{\|v_1\|^2} v_1 = (0, 1, 1) - \frac{2}{3} (1, 1, 1) = \left( -\frac{2}{3}, \frac{1}{3}, \frac{1}{3} \right)$

Step 3: We now have two vectors spanning W_2. Let $v_3 = u_3 - \mathrm{proj}_{W_2} u_3$. That is,

$v_3 = u_3 - \frac{\langle u_3, v_1 \rangle}{\|v_1\|^2} v_1 - \frac{\langle u_3, v_2 \rangle}{\|v_2\|^2} v_2 = (0, 0, 1) - \frac{1}{3} (1, 1, 1) - \frac{1/3}{2/3} \left( -\frac{2}{3}, \frac{1}{3}, \frac{1}{3} \right) = \left( 0, -\frac{1}{2}, \frac{1}{2} \right)$

Thus v_1 = (1, 1, 1), v_2 = (-2/3, 1/3, 1/3), v_3 = (0, -1/2, 1/2) form an orthogonal basis for R^3. The norms of these vectors are

$\|v_1\| = \sqrt{3}, \qquad \|v_2\| = \frac{\sqrt{6}}{3}, \qquad \|v_3\| = \frac{1}{\sqrt{2}}$

so an orthonormal basis for R^3 is

$q_1 = \frac{v_1}{\|v_1\|} = \left( \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}} \right), \qquad q_2 = \frac{v_2}{\|v_2\|} = \left( -\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}} \right), \qquad q_3 = \frac{v_3}{\|v_3\|} = \left( 0, -\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}} \right)$

Theorem 6.3.7 (QR-Decomposition)
If A is an m×n matrix with linearly independent column vectors, then A can be factored as

A = QR

where Q is an m×n matrix with orthonormal column vectors and R is an n×n invertible upper triangular matrix.

Remark
In recent years the QR-decomposition has assumed growing importance as the mathematical foundation for a wide variety of practical algorithms, including a widely used algorithm for computing the eigenvalues of large matrices.

QR-Decomposition
Suppose that the column vectors of A are u_1, u_2, ..., u_n and the orthonormal column vectors of Q are q_1, q_2, ..., q_n; thus

A = [u_1 | u_2 | ... | u_n] and Q = [q_1 | q_2 | ... | q_n]

From Theorem 6.3.1, each of the vectors u_1, u_2, ..., u_n is expressible in terms of q_1, q_2, ..., q_n as

$u_j = \langle u_j, q_1 \rangle q_1 + \langle u_j, q_2 \rangle q_2 + \cdots + \langle u_j, q_n \rangle q_n, \qquad j = 1, 2, \ldots, n$

Recalling from Section 1.3 that the jth column vector of a matrix product is a linear combination of the column vectors of the first factor with coefficients coming from the jth column of the second factor, these relationships can be written in matrix form as A = QR, where R is the n×n matrix whose (i, j) entry is $\langle u_j, q_i \rangle$.
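To tie the two ideas together, the short sketch below (again using NumPy, my assumption; the lecture itself presents no code) builds Q from the orthonormal vectors found in the Gram-Schmidt example, recovers R as the matrix of coefficients $\langle u_j, q_i \rangle$, and checks that A = QR.

```python
import numpy as np

# Columns of A are u1 = (1,1,1), u2 = (0,1,1), u3 = (0,0,1)
# from the Gram-Schmidt example above.
A = np.array([[1., 0., 0.],
              [1., 1., 0.],
              [1., 1., 1.]])

# Columns of Q are the orthonormal vectors q1, q2, q3 found above.
Q = np.column_stack([np.array([1., 1., 1.]) / np.sqrt(3),
                     np.array([-2., 1., 1.]) / np.sqrt(6),
                     np.array([0., -1., 1.]) / np.sqrt(2)])

# R's (i, j) entry is <u_j, q_i>; it comes out upper triangular because
# each u_j lies in the span of q_1, ..., q_j.
R = Q.T @ A
print(np.round(R, 10))          # upper triangular
print(np.allclose(Q @ R, A))    # True: A = QR
```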