Inner Products and Norms (Part III)

Prof. Dan A. Simovici, UMB

Outline
1. Approximating Subspaces
2. Gram Matrices
3. The Gram-Schmidt Orthogonalization Algorithm
4. QR Decomposition of Matrices
5. Gram-Schmidt Algorithm in R

Approximating Subspaces

Definition. A subspace $T$ of an inner product linear space $L$ is an approximating subspace if for every $x \in L$ there is a unique element of $T$ that is closest to $x$.

Theorem. Let $T$ be a subspace of the inner product space $L$. If $x \in L$ and $t \in T$, then $x - t \in T^{\perp}$ if and only if $t$ is the unique element of $T$ closest to $x$.

Proof. Suppose that $x - t \in T^{\perp}$. Then, for any $u \in T$ with $u \neq t$, we have
$$\|x - u\|^2 = \|(x - t) + (t - u)\|^2 = \|x - t\|^2 + \|t - u\|^2,$$
by observing that $x - t \in T^{\perp}$ and $t - u \in T$ and applying Pythagoras' Theorem to $x - t$ and $t - u$. Therefore, $\|x - u\|^2 > \|x - t\|^2$, so $t$ is the unique element of $T$ closest to $x$.

Conversely, suppose that $t$ is the unique element of $T$ closest to $x$ and that $x - t \notin T^{\perp}$, that is, there exists $u \in T$ such that $(x - t, u) \neq 0$. This implies, of course, that $u \neq 0_L$. We have
$$\|x - (t + au)\|^2 = \|x - t - au\|^2 = \|x - t\|^2 - 2(x - t, au) + |a|^2 \|u\|^2.$$
Since $\|x - (t + au)\|^2 > \|x - t\|^2$ (by the definition of $t$), we have $-2(x - t, au) + |a|^2 \|u\|^2 > 0$ for every nonzero $a \in \mathbb{F}$. For $a = \frac{1}{\|u\|^2}(x - t, u)$ we obtain
$$-2\left(x - t, \frac{(x - t, u)}{\|u\|^2}\,u\right) + \left|\frac{(x - t, u)}{\|u\|^2}\right|^2 \|u\|^2 = -2\,\frac{|(x - t, u)|^2}{\|u\|^2} + \frac{|(x - t, u)|^2}{\|u\|^2} = -\frac{|(x - t, u)|^2}{\|u\|^2} < 0,$$
which is a contradiction.

Theorem. A subspace $T$ of an inner product linear space $L$ is an approximating subspace of $L$ if and only if $L = T \oplus T^{\perp}$.

Proof. Let $T$ be an approximating subspace of $L$ and let $x \in L$. We have $x - t \in T^{\perp}$, where $t$ is the element of $T$ that best approximates $x$. If $y = x - t$, we can write $x$ uniquely as $x = t + y$, where $t \in T$ and $y \in T^{\perp}$, so $L = T \oplus T^{\perp}$.

Conversely, suppose that $L = T \oplus T^{\perp}$, where $T$ is a subspace of $L$. Every $x \in L$ can be uniquely written as $x = t + y$, where $t \in T$ and $y \in T^{\perp}$, so $x - t \in T^{\perp}$. Thus, by the previous theorem, $t$ is the unique element of $T$ closest to $x$, so $T$ is an approximating subspace of $L$.

Theorem. Any subspace $T$ of a finite-dimensional inner product linear space $L$ is an approximating subspace of $L$.

Proof. Let $T$ be a subspace of $L$. It suffices to show that $L = T \oplus T^{\perp}$. If $T = \{0_L\}$, then $T^{\perp} = L$ and the statement is immediate. Therefore, we can assume that $T \neq \{0_L\}$. We need to verify only that every $x \in L$ can be uniquely written as a sum $x = t + v$, where $t \in T$ and $v \in T^{\perp}$.

Let $t_1, \ldots, t_m$ be an orthonormal basis of $T$, that is, a basis such that
$$(t_i, t_j) = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{otherwise,} \end{cases}$$
for $1 \leqslant i, j \leqslant m$. Define $t = (x, t_1)t_1 + \cdots + (x, t_m)t_m$ and $v = x - t$.

The vector $v$ is orthogonal to every vector $t_i$ because
$$(v, t_i) = (x - t, t_i) = (x, t_i) - (t, t_i) = 0.$$
Therefore $v \in T^{\perp}$ and $x$ has the required decomposition. To prove that the decomposition is unique, suppose that $x = s + w$, where $s \in T$ and $w \in T^{\perp}$. Since $s + w = t + v$, we have $s - t = v - w \in T \cap T^{\perp} = \{0_L\}$, which implies $s = t$ and $w = v$.
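The proof above is constructive: the closest element of $T$ to $x$ is obtained by projecting $x$ onto an orthonormal basis of $T$. Below is a minimal R sketch of this computation; the subspace, data, and variable names are illustrative choices, not taken from the slides.

    # Closest element of T = span(columns of B) to x, via an orthonormal basis of T.
    # Illustrative data: a 2-dimensional subspace T of R^4.
    set.seed(1)
    B <- matrix(rnorm(4 * 2), nrow = 4)   # columns span T
    x <- rnorm(4)

    Q <- qr.Q(qr(B))                      # orthonormal basis t_1, t_2 of T
    t_best <- Q %*% crossprod(Q, x)       # t = (x, t_1) t_1 + (x, t_2) t_2
    v <- x - t_best                       # v = x - t

    crossprod(B, v)                       # approximately zero: v is orthogonal to T

Since $v$ lies in $T^{\perp}$ (up to rounding), the theorem guarantees that t_best is the unique element of $T$ closest to $x$.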
Theorem. Let $T$ be a subspace of a finite-dimensional inner product space $L$. We have $(T^{\perp})^{\perp} = T$.

Proof. Observe first that $T \subseteq (T^{\perp})^{\perp}$. Indeed, if $t \in T$, then $(t, z) = 0$ for every $z \in T^{\perp}$, so $t \in (T^{\perp})^{\perp}$.

To prove the reverse inclusion, let $x \in (T^{\perp})^{\perp}$. Since $L = T \oplus T^{\perp}$, we can write $x = u + v$, where $u \in T$ and $v \in T^{\perp}$, so $x - u = v \in T^{\perp}$. Since $T \subseteq (T^{\perp})^{\perp}$, we have $u \in (T^{\perp})^{\perp}$, so $x - u \in (T^{\perp})^{\perp}$. Consequently, $x - u \in T^{\perp} \cap (T^{\perp})^{\perp} = \{0_L\}$, so $x = u \in T$. Thus, $(T^{\perp})^{\perp} \subseteq T$, which concludes the argument.

Corollary. Let $Z$ be a subset of $\mathbb{R}^n$. We have $(Z^{\perp})^{\perp} = \langle Z \rangle$, the subspace spanned by $Z$.

Proof. Since $Z \subseteq \langle Z \rangle$, it follows that $\langle Z \rangle^{\perp} \subseteq Z^{\perp}$. Let now $y \in Z^{\perp}$ and let $z = a_1 z_1 + \cdots + a_p z_p \in \langle Z \rangle$, where $z_1, \ldots, z_p \in Z$. Since
$$(y, z) = a_1(y, z_1) + \cdots + a_p(y, z_p) = 0,$$
it follows that $y \in \langle Z \rangle^{\perp}$. Thus, we have $Z^{\perp} = \langle Z \rangle^{\perp}$. This allows us to write $(Z^{\perp})^{\perp} = (\langle Z \rangle^{\perp})^{\perp}$. Since $\langle Z \rangle$ is a subspace of $\mathbb{R}^n$, we have $(\langle Z \rangle^{\perp})^{\perp} = \langle Z \rangle$, so $(Z^{\perp})^{\perp} = \langle Z \rangle$.

Let $W = \{w_1, \ldots, w_n\}$ be a basis of the real $n$-dimensional inner product space $L$. If $x = x_1 w_1 + \cdots + x_n w_n$ and $y = y_1 w_1 + \cdots + y_n w_n$, then
$$(x, y) = \sum_{i=1}^{n} \sum_{j=1}^{n} x_i y_j (w_i, w_j),$$
due to the bilinearity of the inner product. Let $A = (a_{ij}) \in \mathbb{R}^{n \times n}$ be the matrix defined by $a_{ij} = (w_i, w_j)$ for $1 \leqslant i, j \leqslant n$. The symmetry of the inner product implies that $A$ itself is symmetric. Now the inner product can be expressed as
$$(x, y) = (x_1, \ldots, x_n)\, A \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}.$$
We refer to $A$ as the matrix associated with $W$.

Theorem. Let $S$ be a subspace of $\mathbb{R}^n$ such that $\dim(S) = k$. There exists a matrix $A \in \mathbb{R}^{n \times k}$ having orthonormal columns such that $S = \mathrm{Ran}(A)$.

Proof. Let $v_1, \ldots, v_k$ be an orthonormal basis of $S$ and define the matrix $A = (v_1 \cdots v_k)$ whose columns are these vectors. We have $x \in S$ if and only if $x = a_1 v_1 + \cdots + a_k v_k$, which is equivalent to $x = Aa$ for some $a \in \mathbb{R}^k$. This amounts to $x \in \mathrm{Ran}(A)$, so $S = \mathrm{Ran}(A)$.

For an orthonormal basis of an $n$-dimensional space, the associated matrix is the identity matrix $I_n$. In this case, we have $(x, y) = x_1 y_1 + x_2 y_2 + \cdots + x_n y_n$ for $x, y \in L$.

Observe that if $W = \{w_1, \ldots, w_n\}$ is an orthonormal set and $x \in \langle W \rangle$, which means that $x = a_1 w_1 + \cdots + a_n w_n$, then $a_i = (x, w_i)$ for $1 \leqslant i \leqslant n$.

Let $W = \{w_1, \ldots, w_n\}$ be an orthonormal set and let $x \in \langle W \rangle$. The equality
$$x = (x, w_1) w_1 + \cdots + (x, w_n) w_n$$
is the Fourier expansion of $x$ with respect to the orthonormal set $W$. Furthermore, we have Parseval's equality:
$$\|x\|^2 = (x, x) = \sum_{i=1}^{n} (x, w_i)^2.$$
Thus, if $1 \leqslant q \leqslant n$ we have
$$\sum_{i=1}^{q} (x, w_i)^2 \leqslant \|x\|^2.$$

Gram Matrices

Definition. Let $V = (v_1, \ldots, v_m)$ be a sequence of vectors in an inner product space. The Gram matrix of this sequence is the matrix $G_V = (g_{ij}) \in \mathbb{R}^{m \times m}$ defined by $g_{ij} = (v_i, v_j)$ for $1 \leqslant i, j \leqslant m$. Note that $G_V$ is a symmetric matrix.

Theorem. Let $V = (v_1, \ldots, v_m)$ be a sequence of $m$ vectors in an inner product linear space $(L, (\cdot, \cdot))$. If $\{v_1, \ldots, v_m\}$ is linearly independent, then the Gram matrix $G_V$ is positive definite.

Proof. Suppose that $\{v_1, \ldots, v_m\}$ is linearly independent and let $x \in \mathbb{R}^m$. We have
$$x' G_V x = \sum_{i=1}^{m} \sum_{j=1}^{m} x_i (v_i, v_j) x_j = \left( \sum_{i=1}^{m} x_i v_i, \sum_{j=1}^{m} x_j v_j \right) = \left\| \sum_{i=1}^{m} x_i v_i \right\|^2 \geqslant 0.$$
Therefore, if $x' G_V x = 0$, we have $x_1 v_1 + \cdots + x_m v_m = 0_L$. Since $\{v_1, \ldots, v_m\}$ is linearly independent, it follows that $x_1 = \cdots = x_m = 0$, so $x = 0_m$. Thus, $G_V$ is indeed positive definite.
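For column vectors $v_1, \ldots, v_m \in \mathbb{R}^n$ with the standard inner product, the Gram matrix is $G_V = A'A$, where $A = (v_1 \cdots v_m)$. A small illustrative R check of the theorem (the vectors are randomly generated and, with probability one, linearly independent; the data are not from the slides):

    # Gram matrix of three linearly independent vectors in R^4 (illustrative data).
    set.seed(2)
    A <- matrix(rnorm(4 * 3), nrow = 4)    # columns are v_1, v_2, v_3
    G <- crossprod(A)                      # G[i, j] = (v_i, v_j)

    isSymmetric(G)                         # TRUE
    eigen(G, symmetric = TRUE)$values      # all strictly positive, so G is positive definite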
Example. Let $S = \{x_1, \ldots, x_n\}$ be a finite set, let $L$ be a $\mathbb{C}$-linear space, and let $L^S$ be the linear space that consists of the functions defined on $S$ with values in $L$. We defined a linear basis of this space as the set $\{e_1, \ldots, e_n\}$ consisting of the functions
$$e_i(x) = \begin{cases} 1 & \text{if } x = x_i, \\ 0 & \text{otherwise,} \end{cases}$$
for $1 \leqslant i \leqslant n$. If $E = (e_1, \ldots, e_n)$, the Gram matrix of $E$ is positive definite and the inner product of two functions $f = \sum_{i=1}^{n} a_i e_i$ and $g = \sum_{j=1}^{n} b_j e_j$ is
$$(f, g) = \left( \sum_{i=1}^{n} a_i e_i, \sum_{j=1}^{n} b_j e_j \right) = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i (e_i, e_j) b_j.$$

The Gram matrix of an arbitrary sequence of vectors is positive semidefinite, as the reader can easily verify.
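Continuing the illustrative R sketches above, a linearly dependent sequence gives a Gram matrix that is positive semidefinite but not positive definite, since $x' G_V x = \| x_1 v_1 + \cdots + x_m v_m \|^2$ vanishes for any nontrivial dependency $x$ (again, the particular vectors are an assumed example, not from the slides):

    # A linearly dependent sequence: v3 = v1 + v2 (illustrative data).
    set.seed(3)
    v1 <- rnorm(4); v2 <- rnorm(4); v3 <- v1 + v2
    G <- crossprod(cbind(v1, v2, v3))      # Gram matrix of (v_1, v_2, v_3)

    eigen(G, symmetric = TRUE)$values      # nonnegative, with one eigenvalue ~ 0 (up to rounding)
    x <- c(1, 1, -1)                       # the dependency v_1 + v_2 - v_3 = 0
    drop(t(x) %*% G %*% x)                 # ~ 0, so G is not positive definite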
