Differential Forms Diff Geom II, WS 2015/16
J.M. Sullivan, TU Berlin

B. DIFFERENTIAL FORMS

We have already seen one-forms (covector fields) on a manifold. In general, a k-form is a field of alternating k-linear forms on the tangent spaces of a manifold. Forms are the natural objects for integration: a k-form can be integrated over an oriented k-submanifold. We start with tensor products and the exterior algebra of multivectors.

B1. Tensor products

Recall that, if V, W and X are vector spaces, then a map b: V × W → X is called bilinear if

    b(v + v′, w) = b(v, w) + b(v′, w),
    b(v, w + w′) = b(v, w) + b(v, w′),
    b(av, w) = ab(v, w) = b(v, aw).

The function b is defined on the set V × W. This Cartesian product of two vector spaces can be given the structure of a vector space V ⊕ W, the direct sum. But a bilinear map b: V × W → X is completely different from a linear map V ⊕ W → X.

The tensor product space V ⊗ W is a vector space designed exactly so that a bilinear map b: V × W → X becomes a linear map V ⊗ W → X. More precisely, it can be characterized abstractly by the following "universal property".

Definition B1.1. The tensor product of vector spaces V and W is a vector space V ⊗ W with a natural bilinear map V × W → V ⊗ W, written (v, w) ↦ v ⊗ w, with the property that any bilinear map b: V × W → X factors uniquely through V ⊗ W. That means there exists a unique linear map L: V ⊗ W → X such that b(v, w) = L(v ⊗ w).

This does not yet show that the tensor product exists, but uniqueness is clear: if X and Y were both tensor products, then each defining bilinear map would factor through the other – we get inverse linear maps between X and Y, showing they are isomorphic.

Note that the elements of the form v ⊗ w must span V ⊗ W, since otherwise L would not be unique. If {e_i} is a basis for V and {f_j} a basis for W, then bilinearity gives

    (Σ_i v^i e_i) ⊗ (Σ_j w^j f_j) = Σ_{i,j} v^i w^j e_i ⊗ f_j.

Clearly then {e_i ⊗ f_j} spans V ⊗ W – indeed one can check that it is a basis. This is a valid construction for the space V ⊗ W – as the span of the e_i ⊗ f_j – but it does depend on the chosen bases. If dim V = m and dim W = n, then we note dim V ⊗ W = mn.
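As a concrete check of this coordinate description (an illustration of ours, not part of the notes), the coefficients of v ⊗ w in the basis {e_i ⊗ f_j} are the mn products v^i w^j, which can be stored as an outer-product array. The sketch below assumes NumPy, and the helper name `tensor` is our own.

```python
import numpy as np

# Coordinates of v (x) w in the basis {e_i (x) f_j}: the (i, j) entry is v^i w^j.
def tensor(v, w):
    return np.outer(v, w)

m, n = 3, 2
v = np.array([1.0, 2.0, 0.5])
vp = np.array([0.0, 1.0, -1.0])   # plays the role of v'
w = np.array([2.0, -3.0])

# dim(V (x) W) = mn: the coefficient array has mn entries.
assert tensor(v, w).size == m * n

# Bilinearity: (v + v') (x) w = v (x) w + v' (x) w, and (av) (x) w = a (v (x) w) = v (x) (aw).
a = 4.0
assert np.allclose(tensor(v + vp, w), tensor(v, w) + tensor(vp, w))
assert np.allclose(tensor(a * v, w), a * tensor(v, w))
assert np.allclose(tensor(a * v, w), tensor(v, a * w))

# A general element of V (x) W is a linear combination of such arrays and
# need not itself be of the form v (x) w.
```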
A much more abstract construction of V ⊗ W goes through a huge infinite-dimensional space. Given any set S, the free vector space on S is the set of all formal finite linear combinations Σ a_i s_i with a_i ∈ R and s_i ∈ S. (This can equally well be thought of as the set of all real-valued functions on the set S which vanish outside some finite subset.) For instance, if S has k elements this gives a k-dimensional vector space with S as basis.

Given vector spaces V and W, let F be the free vector space over the set V × W. (This consists of formal sums Σ a_i (v_i, w_i) but ignores all the structure we have on the set V × W.) Now let R ⊂ F be the linear subspace spanned by all elements of the form:

    (v + v′, w) − (v, w) − (v′, w),
    (v, w + w′) − (v, w) − (v, w′),
    (av, w) − a(v, w),
    (v, aw) − a(v, w).

These correspond of course to the bilinearity conditions we started with. The quotient vector space F/R will be the tensor product V ⊗ W. We have started with all possible v ⊗ w as generators and thrown in just enough relations to make the map (v, w) ↦ v ⊗ w be bilinear.

The tensor product is commutative: there is a natural linear isomorphism V ⊗ W → W ⊗ V such that v ⊗ w ↦ w ⊗ v. (This is easiest to verify using the universal property – simply factor the bilinear map (v, w) ↦ w ⊗ v through V ⊗ W to give the desired isomorphism.)

Similarly, the tensor product is associative: there is a natural linear isomorphism V ⊗ (W ⊗ X) → (V ⊗ W) ⊗ X. Note that any trilinear map from V × W × X factors through this triple tensor product V ⊗ W ⊗ X.

Of special interest are the tensor powers of a single vector space V. We write V^⊗k := V ⊗ · · · ⊗ V (with k factors). If {e_i} is a basis for V, then the products e_{i_1} ⊗ · · · ⊗ e_{i_k} form a basis for V^⊗k. In particular, if V has dimension m, then V^⊗k has dimension m^k. There is a natural k-linear map V^k → V^⊗k, and any k-linear map V^k → W factors uniquely through V^⊗k.

One can check that the dual of a tensor product is the tensor product of duals: (V ⊗ W)* = V* ⊗ W*. In particular, we have (V*)^⊗k = (V^⊗k)*. The latter is of course the set of linear functionals V^⊗k → R, which as we have seen is exactly the set of k-linear maps V^k → R.

Definition B1.2. A graded algebra is a vector space A decomposed as A = ⊕_{k=0}^∞ A_k, together with an associative bilinear multiplication operation A × A → A that respects the grading in the sense that the product ω · η of elements ω ∈ A_k and η ∈ A_ℓ is an element of A_{k+ℓ}. Often we consider graded algebras that are either commutative or anticommutative. Here anticommutative has a special meaning: for ω ∈ A_k and η ∈ A_ℓ as above, we have ω · η = (−1)^{kℓ} η · ω.

Example B1.3. The tensor algebra of a vector space V is

    ⊗*V := ⊕_{k=0}^∞ V^⊗k.

Here of course V^⊗1 = V and V^⊗0 = R. Note that the tensor product is graded, but is neither commutative nor anticommutative.

B2. Exterior algebra

We now want to focus on antisymmetric tensors, to develop the so-called exterior algebra or Grassmann algebra of the vector space V.

Just as we constructed V ⊗ V = V^⊗2 as a quotient of a huge vector space, adding relators corresponding to the rules for bilinearity, we construct the exterior power V ∧ V = Λ^2 V as a further quotient. In particular, letting S ⊂ V ⊗ V denote the span of the elements v ⊗ v for all v ∈ V, we set V ∧ V := (V ⊗ V)/S. We write v ∧ w for the image of v ⊗ w under the quotient map. Thus v ∧ v = 0 for any v. From

    (v + w) ∧ (v + w) = 0

it then follows that v ∧ w = −w ∧ v. If {e_i : 1 ≤ i ≤ m} is a basis for V, then

    {e_i ∧ e_j : 1 ≤ i < j ≤ m}

is a basis for V ∧ V.
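The same kind of coordinate check works for Λ^2 V (again an illustration of ours, not from the notes, assuming NumPy; the helper name `wedge` is our own): a 2-vector can be represented by the antisymmetric array of its coefficients on the e_i ∧ e_j, the wedge of two vectors is the antisymmetrization of their tensor product, and only the entries with i < j are independent, matching the basis just described.

```python
import numpy as np

# Coefficient array of v ^ w on e_i ^ e_j: the (i, j) entry is v^i w^j - v^j w^i.
def wedge(v, w):
    return np.outer(v, w) - np.outer(w, v)

m = 4
rng = np.random.default_rng(0)
v, w = rng.standard_normal(m), rng.standard_normal(m)

# v ^ v = 0, and v ^ w = - w ^ v.
assert np.allclose(wedge(v, v), 0)
assert np.allclose(wedge(v, w), -wedge(w, v))

# Only entries with i < j are independent, so dim Lambda^2(V) = m(m-1)/2,
# i.e. the number of basis elements e_i ^ e_j with i < j.
assert wedge(v, w)[np.triu_indices(m, k=1)].size == m * (m - 1) // 2
```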
Higher exterior powers of V can be constructed in the same way, but formally it is easiest to construct the whole exterior algebra Λ*V = ⊕_k Λ^k V at once, as a quotient of the tensor algebra ⊗*V, this time by the two-sided ideal generated by the same set S = {v ⊗ v} ⊂ V ⊗ V ⊂ ⊗*V. This means the span not just of the elements of S but also of their products (on the left and right) by arbitrary other tensors. Elements of Λ*V are called multivectors and elements of Λ^k V are more specifically k-vectors.

End of Lecture 30 Nov 2015

Again we use ∧ to denote the product on the resulting (still graded) quotient algebra. This product is called the wedge product or, more formally, the exterior product. Each exterior power Λ^k V again has a universal property; that is, alternating k-linear maps from V^k correspond to linear maps from Λ^k V. (One can also phrase the universality for all k together in terms of homomorphisms of anticommutative graded algebras.)

So far we have developed everything abstractly and algebraically. But there is a natural geometric picture of how k-vectors in Λ^k V correspond to k-planes (k-dimensional linear subspaces) in V. More precisely, we should talk about simple k-vectors here: those that can be written in the form v_1 ∧ · · · ∧ v_k. We will see that, for instance, e_12 + e_34 ∈ Λ^2 R^4 (where e_ij denotes e_i ∧ e_j) is not simple.

A nonzero vector v ∈ V lies in a unique oriented 1-plane (line) in V; two vectors represent the same oriented line if and only if they are positive multiples of each other. Now suppose we have vectors v_1, ..., v_k ∈ V. They are linearly independent if and only if 0 ≠ v_1 ∧ · · · ∧ v_k ∈ Λ^k V. Two linearly independent k-tuples (v_1, ..., v_k) and (w_1, ..., w_k) represent the same oriented k-plane if and only if the wedge products v_1 ∧ · · · ∧ v_k and w_1 ∧ · · · ∧ w_k are positive multiples of each other, that is, if they lie in the same ray in Λ^k V. (Indeed, the multiple here is the ratio of k-areas of the parallelepipeds spanned by the two k-tuples, given as the determinant of the change-of-basis matrix for the k-plane.)

We let G_k(V) denote the set of oriented k-planes in V, called the (oriented) Grassmannian. Then the set of simple k-vectors in Λ^k V can be viewed as the cone over G_k(V). (If we pick a norm on Λ^k V, say induced by an inner product on V, then we can think of G_k(V) as the set of "unit" simple k-vectors.)
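The claim that e_12 + e_34 ∈ Λ^2 R^4 is not simple can already be checked with the wedge-square test for 2-vectors: a simple 2-vector ω = v ∧ w always satisfies ω ∧ ω = 0, while (e_12 + e_34) ∧ (e_12 + e_34) = 2 e_1 ∧ e_2 ∧ e_3 ∧ e_4 ≠ 0. (This may or may not be the argument the notes give later.) The sketch below, assuming NumPy and using helper names of our own choosing, verifies this in coordinates and also illustrates the determinant remark about two bases of the same 2-plane.

```python
import numpy as np

# Represent a 2-vector omega = sum_{i<j} a_ij e_i ^ e_j in R^4 by its
# antisymmetric 4x4 coefficient matrix A (so A[i, j] = a_ij = -A[j, i]).
def two_vector(coeffs):
    A = np.zeros((4, 4))
    for (i, j), a in coeffs.items():   # keys are 0-indexed pairs with i < j
        A[i, j], A[j, i] = a, -a
    return A

def wedge_vectors(v, w):
    """Coefficient matrix of the simple 2-vector v ^ w."""
    return np.outer(v, w) - np.outer(w, v)

def wedge_square_coeff(A):
    """Coefficient of e_1 ^ e_2 ^ e_3 ^ e_4 in omega ^ omega."""
    return 2 * (A[0, 1] * A[2, 3] - A[0, 2] * A[1, 3] + A[0, 3] * A[1, 2])

# e_12 + e_34 has nonzero wedge square, so it cannot be written as v ^ w.
omega = two_vector({(0, 1): 1.0, (2, 3): 1.0})
assert wedge_square_coeff(omega) == 2.0

# Any simple 2-vector v ^ w has vanishing wedge square (the Pluecker relation).
rng = np.random.default_rng(1)
v, w = rng.standard_normal(4), rng.standard_normal(4)
assert np.isclose(wedge_square_coeff(wedge_vectors(v, w)), 0.0)

# Two bases of the same 2-plane give wedge products differing by the determinant
# of the change-of-basis matrix: here det [[2, 1], [0, 3]] = 6.
assert np.allclose(wedge_vectors(2 * v, v + 3 * w), 6 * wedge_vectors(v, w))
```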