
MATH 205 HOMEWORK #5 OFFICIAL SOLUTION

Problem 1: An inner product on a vector space $V$ over $F$ is a bilinear map $\langle \cdot, \cdot \rangle : V \times V \to F$ satisfying the extra conditions
• $\langle v, w \rangle = \langle w, v \rangle$, and
• $\langle v, v \rangle \geq 0$, with equality if and only if $v = 0$.
(a) Show that the standard dot product on $\mathbf{R}^n$ is an inner product.
(b) Show that $(f, g) \mapsto \int_0^1 f(x)g(x)\,dx$ is an inner product on $C([0,1], \mathbf{R})$.
(c) Suppose that $F$ is ordered. Prove that for any $v, w \in V$,
\[
\langle v, w \rangle^2 \leq \langle v, v \rangle \langle w, w \rangle.
\]
When does equality hold? What standard inequality in trigonometry does this reflect when $V = \mathbf{R}^n$?
(d) We say that two vectors $v, w$ in $V$ are orthogonal if $\langle v, w \rangle = 0$. Suppose that $T : V \to V$ is a linear transformation satisfying $\langle Tv, w \rangle = \langle v, Tw \rangle$ for all $v, w$. Show that eigenvectors of $T$ with different eigenvalues are orthogonal.

Solution: Note that if a bilinear form is symmetric, then it is linear in the first variable if and only if it is linear in the second. Thus, once symmetry is established, checking linearity in one variable is enough.

(a) Let $v = (a_1, \ldots, a_n)$ and $w = (b_1, \ldots, b_n)$. The standard dot product is symmetric, as
\[
\langle v, w \rangle = a_1 b_1 + \cdots + a_n b_n = b_1 a_1 + \cdots + b_n a_n = \langle w, v \rangle.
\]
Let $\lambda \in \mathbf{R}$ and $u = (c_1, \ldots, c_n)$. Then
\[
\langle v + \lambda u, w \rangle = (a_1 + \lambda c_1) b_1 + \cdots + (a_n + \lambda c_n) b_n = a_1 b_1 + \cdots + a_n b_n + \lambda (c_1 b_1 + \cdots + c_n b_n) = \langle v, w \rangle + \lambda \langle u, w \rangle.
\]
Thus $\langle \cdot, \cdot \rangle$ is bilinear. In addition,
\[
\langle v, v \rangle = a_1^2 + \cdots + a_n^2 \geq 0,
\]
with equality if and only if $a_i = 0$ for all $i$. Thus the standard dot product is an inner product, as desired.

(b) Note that
\[
\langle f, g \rangle = \int_0^1 f(x)g(x)\,dx = \int_0^1 g(x)f(x)\,dx = \langle g, f \rangle,
\]
so that $\langle \cdot, \cdot \rangle$ is symmetric. Let $\lambda \in \mathbf{R}$ and $h \in C([0,1], \mathbf{R})$. Then
\[
\int_0^1 \bigl(f(x) + \lambda h(x)\bigr) g(x)\,dx = \int_0^1 f(x)g(x)\,dx + \lambda \int_0^1 h(x)g(x)\,dx,
\]
so that $\langle \cdot, \cdot \rangle$ is bilinear. Lastly, suppose $f \neq 0$, so that there exists $x_0$ such that $f(x_0) \neq 0$. By continuity there exists some $\delta > 0$ such that $|f(x)| > |f(x_0)|/2$ for $|x - x_0| < \delta$; shrinking $\delta$ if necessary, we may assume $[x_0, x_0 + \delta] \subseteq [0, 1]$ (or work with $[x_0 - \delta, x_0]$ if $x_0 = 1$). Then
\[
\int_0^1 f(x)^2\,dx \geq \int_{x_0}^{x_0 + \delta} \bigl(|f(x_0)|/2\bigr)^2\,dx = \bigl(|f(x_0)|/2\bigr)^2 \delta > 0.
\]
Thus if $f \neq 0$ then $\langle f, f \rangle > 0$; clearly if $f = 0$ then $\langle f, f \rangle = 0$. Thus the second property of an inner product holds as well, and we are done.

(c) Let $u, v \in V$. First note that if either $u$ or $v$ is $0$ then the inequality is an equality and holds trivially; thus we now consider the case when the two are nonzero. Let $w = u - \bigl(\langle u, v \rangle / \langle v, v \rangle\bigr) v$. Then we have
\[
\langle w, v \rangle = \langle u, v \rangle - \frac{\langle u, v \rangle}{\langle v, v \rangle}\langle v, v \rangle = \langle u, v \rangle - \langle u, v \rangle = 0.
\]
Now consider
\[
0 \leq \langle w, w \rangle = \langle u, u \rangle - 2\frac{\langle u, v \rangle^2}{\langle v, v \rangle} + \frac{\langle u, v \rangle^2}{\langle v, v \rangle} = \langle u, u \rangle - \frac{\langle u, v \rangle^2}{\langle v, v \rangle}.
\]
Rearranging this, we get
\[
\langle u, u \rangle \langle v, v \rangle \geq \langle u, v \rangle^2,
\]
as desired. Equality holds exactly when $w = 0$, which means that $u$ is a scalar multiple of $v$. Note that in the classical case $V = \mathbf{R}^n$, this reduces to the inequality $\cos^2 \theta \leq 1$.

(d) Suppose that $u$ is an eigenvector of $T$ with eigenvalue $\lambda$ and $v$ is an eigenvector of $T$ with eigenvalue $\rho$. Then
\[
\lambda \langle u, v \rangle = \langle Tu, v \rangle = \langle u, Tv \rangle = \rho \langle u, v \rangle.
\]
Thus if $\lambda \neq \rho$ we must have $\langle u, v \rangle = 0$. In other words, $u$ and $v$ are orthogonal.

Problem 2: Show that $(F^{\oplus\infty})^* \cong F^{\times\infty}$. Conclude that it is not the case that $V$ and $V^*$ are always isomorphic.

Solution: Suppose that $f \in (F^{\oplus\infty})^*$. Then $f$ is uniquely determined by its values on a basis of $F^{\oplus\infty}$. Taking the standard basis $\{e_i\}_{i=1}^\infty$, where $e_i$ has a $1$ in the $i$th position and $0$s elsewhere, we see that the data of $f$ consists exactly of a scalar $f(e_i) \in F$ for each $i$. These add and scale pointwise. Thus $f$ corresponds exactly to a vector in $F^{\times\infty}$, and this correspondence is an isomorphism. Since $F^{\times\infty}$ is never isomorphic to $F^{\oplus\infty}$, we see that $V^*$ is not always isomorphic to $V$.
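As a concrete illustration of this correspondence (a supplementary example; the functional $\sigma$ is introduced only for this purpose), consider the "sum of coordinates" functional
\[
\sigma : F^{\oplus\infty} \to F, \qquad \sigma\Bigl(\sum_i a_i e_i\Bigr) = \sum_i a_i.
\]
The sum on the right is finite, since only finitely many $a_i$ are nonzero, so $\sigma$ is a well-defined linear functional. Under the correspondence above it is the vector $(1, 1, 1, \ldots) \in F^{\times\infty}$, which lies outside the subspace $F^{\oplus\infty} \subseteq F^{\times\infty}$.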
Problem 3: Suppose that $F'$ is a field containing $F$ and $V$ is an $F$-vector space. If we consider $F'$ to be an $F$-vector space, we can form the tensor product $F' \otimes V$, which is naturally an $F$-vector space. Show that it is also an $F'$-vector space. This is called the change of base of $V$.

Solution: In order to show that $F' \otimes V$ is an $F'$-vector space we need to define scalar multiplication. For a pure tensor $\alpha \otimes v$ we define scalar multiplication by $\lambda \in F'$ by
\[
\lambda(\alpha \otimes v) = (\lambda\alpha) \otimes v.
\]
If this is well-defined then it is clearly associative, since multiplication in $F'$ is associative. Similarly, it is unital: if $\lambda = 1$ then it clearly returns the same vector. We also have
\[
\lambda(\alpha \otimes v + \beta \otimes w) = (\lambda\alpha) \otimes v + (\lambda\beta) \otimes w
\]
and
\[
(\lambda + \rho)(\alpha \otimes v) = \bigl((\lambda + \rho)\alpha\bigr) \otimes v = (\lambda\alpha + \rho\alpha) \otimes v = (\lambda\alpha) \otimes v + (\rho\alpha) \otimes v.
\]
Thus if scalar multiplication is well-defined then it satisfies the axioms of a vector space.

We have defined scalar multiplication by its action on the generators of $F' \otimes V$. To show that this is well-defined we need to check that it preserves the relations; in other words, that the scalar multiple of any element of $A_0$ stays in $A_0$. We therefore check:
\begin{align*}
\lambda\bigl((\alpha + \beta) \otimes v - \alpha \otimes v - \beta \otimes v\bigr) &= \bigl(\lambda(\alpha + \beta)\bigr) \otimes v - (\lambda\alpha) \otimes v - (\lambda\beta) \otimes v \\
&= \bigl((\lambda\alpha) + (\lambda\beta)\bigr) \otimes v - (\lambda\alpha) \otimes v - (\lambda\beta) \otimes v \in A_0, \\
\lambda\bigl(\alpha \otimes (v + v') - \alpha \otimes v - \alpha \otimes v'\bigr) &= (\lambda\alpha) \otimes (v + v') - (\lambda\alpha) \otimes v - (\lambda\alpha) \otimes v' \in A_0, \\
\lambda\bigl((a\alpha) \otimes v - a(\alpha \otimes v)\bigr) &= (\lambda a\alpha) \otimes v - a\bigl(\lambda(\alpha \otimes v)\bigr) \\
&= (a\lambda\alpha) \otimes v - a\bigl((\lambda\alpha) \otimes v\bigr) \in A_0, \\
\lambda\bigl(\alpha \otimes (av) - a(\alpha \otimes v)\bigr) &= (\lambda\alpha) \otimes (av) - a\bigl(\lambda(\alpha \otimes v)\bigr) \\
&= (\lambda\alpha) \otimes (av) - a\bigl((\lambda\alpha) \otimes v\bigr) \in A_0.
\end{align*}
Thus multiplication by $\lambda$ preserves $A_0$, and is therefore well-defined on $F' \otimes V$. Thus $F' \otimes V$ is an $F'$-vector space.

Problem 4: Prove the universal property for tensor products. In other words, show that for any vector spaces $U$, $V$, and $W$ there is a bijection
\[
\{\text{bilinear maps } U \times V \to W\} \longleftrightarrow \{\text{linear maps } U \otimes V \to W\}.
\]

Solution: Note that there exists a bilinear map $\varphi : U \times V \to U \otimes V$ which maps $(u, v)$ to $u \otimes v$. Thus we have a function
\[
R : \{\text{linear maps } U \otimes V \to W\} \longrightarrow \{\text{bilinear maps } U \times V \to W\}
\]
defined by mapping a linear map $T : U \otimes V \to W$ to the bilinear map $T \circ \varphi$. We need to show that this map is a bijection.

First we check that $R$ is injective. Recall that $S = \{u \otimes v \mid u \in U, v \in V\}$ is a spanning set for $U \otimes V$; thus any linear map $T : U \otimes V \to W$ is uniquely determined by its values on $S$. If $T, T' : U \otimes V \to W$ satisfy $T \circ \varphi = T' \circ \varphi$, then by definition $T|_S = T'|_S$; thus $T = T'$. Thus $R$ is injective.

Now we need to check that $R$ is surjective. Let $T : U \times V \to W$ be a bilinear map. We need to show that there exists $T' : U \otimes V \to W$ such that $T' \circ \varphi = T$. We define
\[
T'(u \otimes v) = T(u, v).
\]
If this is well-defined then it satisfies the desired condition; thus we need to check that it is zero on $A_0$. We check this on the generators of $A_0$:
\begin{align*}
T'\bigl((u + u') \otimes v - u \otimes v - u' \otimes v\bigr) &= T(u + u', v) - T(u, v) - T(u', v) = T(u + u', v) - T(u + u', v) = 0, \\
T'\bigl(u \otimes (v + v') - u \otimes v - u \otimes v'\bigr) &= T(u, v + v') - T(u, v) - T(u, v') = T(u, v + v') - T(u, v + v') = 0, \\
T'\bigl((\lambda u) \otimes v - \lambda(u \otimes v)\bigr) &= T(\lambda u, v) - \lambda T(u, v) = \lambda T(u, v) - \lambda T(u, v) = 0, \\
T'\bigl(u \otimes (\lambda v) - \lambda(u \otimes v)\bigr) &= T(u, \lambda v) - \lambda T(u, v) = \lambda T(u, v) - \lambda T(u, v) = 0.
\end{align*}
Thus $T'$ is well-defined and $R$ is surjective, so $R$ is a bijection.
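As a concrete instance of this bijection (a supplementary example; the maps $\det$ and $D$ are introduced only for illustration), take $U = V = \mathbf{R}^2$ and $W = \mathbf{R}$, and consider the determinant
\[
\det : \mathbf{R}^2 \times \mathbf{R}^2 \to \mathbf{R}, \qquad \det\bigl((a_1, a_2), (b_1, b_2)\bigr) = a_1 b_2 - a_2 b_1.
\]
This map is bilinear, so the argument above produces a unique linear map $D : \mathbf{R}^2 \otimes \mathbf{R}^2 \to \mathbf{R}$ with $D(u \otimes v) = \det(u, v)$. For instance, writing $e_1, e_2$ for the standard basis of $\mathbf{R}^2$, we get $D(e_1 \otimes e_2 - e_2 \otimes e_1) = \det(e_1, e_2) - \det(e_2, e_1) = 2$.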
Problem 5: Let $\{u_i\}_{i=1}^m$ and $\{v_j\}_{j=1}^n$ be bases of $U$ and $V$, respectively. Show that a general element $w = \sum_{i=1}^m \sum_{j=1}^n w_{ij}\, u_i \otimes v_j$ is the sum of $r$ pure tensors if and only if the $m \times n$ matrix $(w_{ij})$ has rank at most $r$.

Solution: Note that if we replace the basis $\{u_i\}_{i=1}^m$ with the basis $(u_1, \ldots, u_{k-1}, \lambda u_k + u_l, u_{k+1}, \ldots, u_m)$, where $\lambda \neq 0$ and $l \neq k$, then the matrix of $w$ in the new basis agrees with $(w_{ij})$ except that row $k$ is scaled by $\lambda^{-1}$ and row $l$ is replaced by $R_l - \lambda^{-1} R_k$; replacing $u_k$ with $\lambda u_k$ alone simply scales row $k$ by $\lambda^{-1}$. These are elementary row operations, so we can perform arbitrary row reductions on the matrix $(w_{ij})$ by replacing the basis $\{u_i\}_{i=1}^m$. Similarly, we can perform arbitrary column reductions by replacing the basis $\{v_j\}_{j=1}^n$.

A matrix has rank at most $r$ if and only if it can be reduced, using row and column reductions, to a matrix with at most $r$ entries equal to $1$, no two in the same row or column, and all other entries $0$. If the reduced matrix has at most $r$ nonzero entries, say $1$s in positions $(i_1, j_1), \ldots, (i_r, j_r)$, then in the corresponding bases $w = u'_{i_1} \otimes v'_{j_1} + \cdots + u'_{i_r} \otimes v'_{j_r}$, a sum of $r$ pure tensors. Conversely, if $w = \sum_{k=1}^r x_k \otimes y_k$ with $x_k = \sum_i a_{ki} u_i$ and $y_k = \sum_j b_{kj} v_j$, then $w_{ij} = \sum_{k=1}^r a_{ki} b_{kj}$, so $(w_{ij})$ is a sum of $r$ rank-one matrices and hence has rank at most $r$.
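To see the rank criterion in action, here is a small supplementary example with $m = n = 2$ (the specific element below is chosen only for illustration). The element
\[
w = u_1 \otimes v_1 + u_2 \otimes v_2 \quad\text{has matrix}\quad (w_{ij}) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},
\]
which has rank $2$, so $w$ cannot be written as a single pure tensor. By contrast,
\[
u_1 \otimes v_1 + u_1 \otimes v_2 + u_2 \otimes v_1 + u_2 \otimes v_2 \quad\text{has matrix}\quad \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}
\]
of rank $1$, and indeed it equals the single pure tensor $(u_1 + u_2) \otimes (v_1 + v_2)$.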