Chapter II

MULTILINEAR ALGEBRA

1. MULTILINEAR MAPS AND TENSOR PRODUCTS

In Chapter I, we dealt mainly with functions of one variable between vector spaces. Those functions were linear in that variable and were called linear transformations. In this chapter, we examine functions of several variables between vector spaces. If such a function is linear in each of its variables, then the function is called a multilinear mapping. Along with any theory of multilinear maps comes a sequence of universal mapping problems whose solutions are the fundamental ideas in multilinear algebra. In this and the next few sections, we shall give a careful explanation of the principal constructions of the subject. Applications of the ideas discussed here will abound throughout the rest of the book.

Let us first give a careful definition of a multilinear mapping. As usual, F will denote an arbitrary field. Suppose V_1, ..., V_n and V are vector spaces over F. Let φ: V_1 × ··· × V_n → V be a function from the finite product V_1 × ··· × V_n to V. We had seen in Section 4 of Chapter I that a typical vector in V_1 × ··· × V_n is an n-tuple (α_1, ..., α_n) with α_i ∈ V_i. Thus, we can think of φ as a function of the n variable vectors α_1, ..., α_n.

Definition 1.1: A function φ: V_1 × ··· × V_n → V is called a multilinear mapping if for each i = 1, ..., n, we have

(a) φ(α_1, ..., α_i + α_i′, ..., α_n) = φ(α_1, ..., α_i, ..., α_n) + φ(α_1, ..., α_i′, ..., α_n), and

(b) φ(α_1, ..., xα_i, ..., α_n) = xφ(α_1, ..., α_i, ..., α_n).

Here α_j ∈ V_j for j ≠ i, α_i, α_i′ ∈ V_i, and x ∈ F. Thus, φ: V_1 × ··· × V_n → V is a multilinear mapping if for all i ∈ {1, ..., n} and for all vectors α_1 ∈ V_1, ..., α_{i−1} ∈ V_{i−1}, α_{i+1} ∈ V_{i+1}, ..., α_n ∈ V_n, we have φ(α_1, ..., α_{i−1}, ·, α_{i+1}, ..., α_n) ∈ Hom_F(V_i, V). Before proceeding further, let us give a few examples of multilinear maps.

Example 1.2: If n = 1, then a function φ: V_1 → V is a multilinear mapping if and only if φ is a linear transformation. Thus, linear transformations are just special cases of multilinear maps. □

Example 1.3: If n = 2, then a multilinear map φ: V_1 × V_2 → V is what we called a bilinear map in Chapter I. For a concrete example, we have ω: V × V* → F given by ω(α, T) = T(α) (equation 6.6 of Chapter I). □

Example 1.4: The determinant, det(A), of an n × n matrix A can be thought of as a multilinear mapping φ: F^n × ··· × F^n → F in the following way: If α_i = (a_{i1}, ..., a_{in}) ∈ F^n for i = 1, ..., n, then set φ(α_1, ..., α_n) = det(a_{ij}). The fact that φ is multilinear is an easy computation, which we leave as an exercise at the end of this section. □

Example 1.5: Suppose A is an algebra over F with multiplication denoted by αβ for α, β ∈ A. Let n ≥ 2. We can then define a function μ: A^n → A by μ(α_1, ..., α_n) = α_1 α_2 ··· α_n. Clearly μ is a multilinear mapping. □

If φ: V_1 × ··· × V_n → V is a multilinear map and T is a linear transformation from V to W, then clearly, Tφ: V_1 × ··· × V_n → W is again a multilinear map. We can use this idea along with Example 1.5 above to give a few familiar examples from analysis.

Example 1.6: Let I be an open interval in ℝ. Set C^∞(I) = ∩_{n=1}^∞ C^n(I). Thus, C^∞(I) consists of those f ∈ C(I) such that f is infinitely differentiable on I. Clearly, C^∞(I) is an algebra over ℝ when we define vector addition [(f + g)(x) = f(x) + g(x)], scalar multiplication [(γf)(x) = γf(x)], and algebra multiplication [(fg)(x) = f(x)g(x)] in the usual ways. Let D: C^∞(I) → C^∞(I) be the function that sends a given f ∈ C^∞(I) to its derivative f′. Thus, D(f) = f′. Clearly, D ∈ Hom_ℝ(C^∞(I), C^∞(I)).

Let n ∈ ℕ. Define a map φ: {C^∞(I)}^n → C^∞(I) by φ(f_1, ..., f_n) = D(f_1 ··· f_n). Our comments immediately preceding this example imply that φ is a multilinear mapping. □

Example 1.7: Let [a, b] be a closed interval in ℝ and consider C([a, b]). Clearly, C([a, b]) is an ℝ-algebra under the same pointwise operations given in Example 1.6. We can define a multilinear, real-valued function ψ: C([a, b])^n → ℝ by ψ(f_1, ..., f_n) = ∫_a^b f_1 ··· f_n. □
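Example 1.4's claim that the determinant is linear in each row can be checked numerically. The following is a small sanity check, assuming NumPy is available; the helper name det_of_rows is ours, not the text's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
rows = [rng.standard_normal(n) for _ in range(n)]

def det_of_rows(*rs):
    # phi(a_1, ..., a_n) = det of the matrix whose rows are a_1, ..., a_n
    return np.linalg.det(np.array(rs))

# Additivity in the first row, all other rows held fixed:
extra = rng.standard_normal(n)
lhs = det_of_rows(rows[0] + extra, rows[1], rows[2], rows[3])
rhs = (det_of_rows(rows[0], rows[1], rows[2], rows[3])
       + det_of_rows(extra, rows[1], rows[2], rows[3]))
assert np.isclose(lhs, rhs)

# Homogeneity in the first row:
x = 2.5
assert np.isclose(det_of_rows(x * rows[0], *rows[1:]), x * det_of_rows(*rows))
```

Note that det is not linear as a function of the whole matrix (det(A + B) ≠ det(A) + det(B) in general), which is exactly the distinction Theorem 1.8(b) below draws between multilinear maps and linear transformations.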

Let us denote the collection of multilinear mappings from V_1 × ··· × V_n to V by Mul_F(V_1 × ··· × V_n, V). If Z = V_1 × ··· × V_n, then clearly, Mul_F(V_1 × ··· × V_n, V) is a subset of the vector space V^Z. In particular, if f, g ∈ Mul_F(V_1 × ··· × V_n, V) and x, y ∈ F, then xf + yg is a vector in V^Z. A simple computation shows that xf + yg is in fact a multilinear mapping. This proves the first assertion in the following theorem:

Theorem 1.8: Let V_1, ..., V_n and V be vector spaces over F. Set Z = V_1 × ··· × V_n. Then

(a) Mul_F(V_1 × ··· × V_n, V) is a subspace of V^Z.
(b) If n ≥ 2, {Mul_F(V_1 × ··· × V_n, V)} ∩ {Hom_F(V_1 × ··· × V_n, V)} = (0).

Proof: We need prove only (b). Suppose φ: V_1 × ··· × V_n → V is a multilinear mapping that is also a linear transformation. Fix i ∈ {1, ..., n}. Since φ is multilinear, we have φ(α_1, ..., α_i + α_i, ..., α_n) = φ(α_1, ..., α_i, ..., α_n) + φ(α_1, ..., α_i, ..., α_n). Since φ is linear, we have φ(α_1, ..., α_i + α_i, ..., α_n) = φ(α_1, ..., α_i, ..., α_n) + φ(0, ..., α_i, ..., 0). Comparing the two results gives φ(α_1, ..., α_n) = φ(0, ..., α_i, ..., 0). Now n ≥ 2, the α_i are arbitrary, and so is the index i. Since φ is multilinear and n ≥ 2, φ vanishes on any n-tuple having a zero entry; in particular φ(0, ..., α_i, ..., 0) = 0. Therefore, for any (α_1, ..., α_n) ∈ V_1 × ··· × V_n, we have φ(α_1, ..., α_n) = φ(0, α_2, ..., α_n) + φ(α_1, 0, ..., 0) = 0 + 0 = 0. □

Theorem 1.8(b) says that in general (i.e., when n ≥ 2) a nonzero multilinear mapping φ: V_1 × ··· × V_n → V is not a linear transformation from V_1 × ··· × V_n to V, and vice versa. We must always be careful not to confuse these two concepts when dealing with functions from V_1 × ··· × V_n to V.

Question 1.9 is called the universal mapping problem for multilinear mappings on V_1 × ··· × V_n: Can we construct a vector space V and a multilinear map φ: V_1 × ··· × V_n → V with the property that for any multilinear map ψ: V_1 × ··· × V_n → W there exists a unique T ∈ Hom_F(V, W) such that the following diagram is commutative:

1.10:

    V_1 × ··· × V_n ──φ──→ V
              ╲            │
             ψ ╲           │ T
                ↘          ↓
                     W

Notice that a solution to 1.9 consists of a vector space V and a multilinear map φ: V_1 × ··· × V_n → V. The pair (V, φ) must satisfy the following property: If W is any vector space over F and ψ: V_1 × ··· × V_n → W is any multilinear mapping, then there must exist a unique linear transformation T: V → W such that 1.10 commutes.

Before constructing a pair (V, φ) satisfying the properties in 1.9, let us make the observation that any such pair is essentially unique up to isomorphism. To be more precise, we have the following lemma:

Lemma 1.11: Suppose (V, φ) and (V′, φ′) are two solutions to 1.9. Then there exists an isomorphism T_1 ∈ Hom_F(V, V′) such that T_1φ = φ′.

Proof: Since (V, φ) is a solution to 1.9 and φ′: V_1 × ··· × V_n → V′ is a multilinear map, there exists a unique T_1 ∈ Hom_F(V, V′) such that T_1φ = φ′. Similarly, since (V′, φ′) is a solution, there exists a unique T_2 ∈ Hom_F(V′, V) such that T_2φ′ = φ. Putting the two obvious diagrams together, we get

1.13:

    V_1 × ··· × V_n ──φ──→ V
              ╲            │
             φ ╲           │ T_2T_1
                ↘          ↓
                     V

and this diagram is commutative. Now in diagram 1.13, we can replace T_2T_1 with I_V, the identity map on V. Clearly, the diagram stays commutative. Since (V, φ) satisfies 1.9, there can be only one linear transformation from V to V making 1.13 commutative. We conclude that T_2T_1 = I_V. Similarly, T_1T_2 = I_{V′}, and the proof of the lemma is complete. □

Thus, if we find any solution to 1.9, then up to isomorphism we have found them all. We now turn to the matter of constructing a solution. We need to recall a few facts about direct sums.

Suppose Δ is a nonempty set. Then we can construct a vector space U over F and a bijective map from Δ onto a basis of U. To see this, set U = ⊕_{i∈Δ} F. Thus, U is the direct sum of |Δ| copies of F. For each i ∈ Δ, let δ_i be the vector in U defined by δ_i(j) = 0 if j ≠ i and δ_i(i) = 1. We had seen in 4.13 of Chapter I that B = {δ_i | i ∈ Δ} is a basis of U. The map Δ → B given by i ↦ δ_i is clearly bijective.

Now suppose Δ itself is a vector space over F. Then in U = ⊕_{i∈Δ} F, we have vectors of the form δ_{(i_1 + ··· + i_n)} − δ_{i_1} − ··· − δ_{i_n} and δ_{xi} − xδ_i for i_1, ..., i_n, i ∈ Δ and x ∈ F. We shall employ these ideas in the construction of a solution to 1.9.

Let V_1, ..., V_n be vector spaces over F, and, for notational convenience, set Z = V_1 × ··· × V_n. A typical element in the set Z is an n-tuple of the form (α_1, ..., α_n) with α_i ∈ V_i. Set U = ⊕_{(α_1,...,α_n)∈Z} F. Thus, U is the direct sum of |Z| copies of F. As we observed above, U has a basis of the form B = {δ_{(α_1,...,α_n)} | (α_1, ..., α_n) ∈ V_1 × ··· × V_n}. Let U_0 be the subspace of U spanned by all vectors of the following two forms:

1.14:

    δ_{(α_1,...,α_i+α_i′,...,α_n)} − δ_{(α_1,...,α_i,...,α_n)} − δ_{(α_1,...,α_i′,...,α_n)}

    δ_{(α_1,...,xα_i,...,α_n)} − xδ_{(α_1,...,α_i,...,α_n)}

In 1.14, i can be any index between 1 and n, (α_1, ..., α_i, ..., α_n) and (α_1, ..., α_i′, ..., α_n) any elements of Z, and x any scalar in F.

Set V = U/U_0, the quotient space of U by U_0. There is a natural map φ: V_1 × ··· × V_n → V given by φ(α_1, ..., α_n) = δ_{(α_1,...,α_n)} + U_0. Thus, φ(α_1, ..., α_n) is just the coset in U/U_0 containing the vector δ_{(α_1,...,α_n)}. We can now prove the following lemma:

Lemma 1.15: (V, φ) is a solution to 1.9.

Proof: Clearly, V is a vector space over F. We must first argue that φ is a multilinear mapping. This follows immediately from 1.14 and the definition of U_0. We have

    φ(α_1, ..., α_i + α_i′, ..., α_n) = δ_{(α_1,...,α_i+α_i′,...,α_n)} + U_0
        = (δ_{(α_1,...,α_i,...,α_n)} + δ_{(α_1,...,α_i′,...,α_n)}) + U_0
        = (δ_{(α_1,...,α_i,...,α_n)} + U_0) + (δ_{(α_1,...,α_i′,...,α_n)} + U_0)
        = φ(α_1, ..., α_i, ..., α_n) + φ(α_1, ..., α_i′, ..., α_n)

Also,

    φ(α_1, ..., xα_i, ..., α_n) = δ_{(α_1,...,xα_i,...,α_n)} + U_0
        = xδ_{(α_1,...,α_i,...,α_n)} + U_0
        = x(δ_{(α_1,...,α_i,...,α_n)} + U_0)
        = xφ(α_1, ..., α_i, ..., α_n)

Thus, φ is a multilinear mapping.

Now suppose W is another vector space over F and ψ: V_1 × ··· × V_n → W a multilinear mapping. We must construct a unique linear transformation T: V → W such that Tφ = ψ. To do this, we recall that B = {δ_{(α_1,...,α_n)} | (α_1, ..., α_n) ∈ Z} is a basis of U. It follows from 3.23 of Chapter I that there exists a unique linear transformation T_0: U → W such that T_0(δ_{(α_1,...,α_n)}) = ψ(α_1, ..., α_n) for all (α_1, ..., α_n) ∈ Z. Since ψ is multilinear, T_0 vanishes on the generators listed in 1.14, and hence U_0 ⊆ ker T_0. Therefore T_0 induces a linear transformation T: V = U/U_0 → W with T(δ_{(α_1,...,α_n)} + U_0) = ψ(α_1, ..., α_n) for all (α_1, ..., α_n) ∈ Z. Since φ(α_1, ..., α_n) = δ_{(α_1,...,α_n)} + U_0, we have Tφ = ψ.

Finally, suppose T′ ∈ Hom_F(V, W) and T′φ = ψ. We must argue T′ = T. Since T′φ = Tφ, we see T = T′ on Im φ. From our definitions, L(Im φ) = V. Therefore, T = T′, and the proof of Lemma 1.15 is complete. □

Definition 1.16: The vector space U/U_0 is called the tensor product of V_1, ..., V_n (over F) and will henceforth be denoted V_1 ⊗_F ··· ⊗_F V_n.
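The quotient construction above is abstract, but for V_1 = F^m and V_2 = F^n it has a familiar concrete model: represent α ⊗ β as the m × n outer-product matrix. The relations that U_0 enforces then hold on the nose. A sketch, assuming NumPy; the function tensor below is an illustrative model of the construction, not the construction itself:

```python
import numpy as np

def tensor(alpha, beta):
    # Model alpha (x) beta in F^m (x) F^n as the outer product, an m x n matrix.
    return np.outer(alpha, beta)

rng = np.random.default_rng(1)
a, a2 = rng.standard_normal(3), rng.standard_normal(3)
b = rng.standard_normal(4)
x = 1.7

# The two families of relations in 1.14 become exact matrix identities:
assert np.allclose(tensor(a + a2, b), tensor(a, b) + tensor(a2, b))
assert np.allclose(tensor(x * a, b), x * tensor(a, b))

# Not every element of the tensor product is a single alpha (x) beta:
# outer products have matrix rank <= 1, but sums of them need not.
t = tensor([1, 0, 0], [1, 0, 0, 0]) + tensor([0, 1, 0], [0, 1, 0, 0])
assert np.linalg.matrix_rank(t) == 2
```

The last lines preview exercise 9 below: the vector e_1 ⊗ f_1 + e_2 ⊗ f_2 is not decomposable, since a decomposable tensor corresponds to a rank-one matrix in this model.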

When the field F is clear from the context, we shall drop it from our notation and simply write V_1 ⊗ ··· ⊗ V_n for the tensor product of V_1, ..., V_n.

Definition 1.17: A coset δ_{(α_1,...,α_n)} + U_0 in the tensor product U/U_0 = V_1 ⊗ ··· ⊗ V_n will henceforth be written α_1 ⊗ ··· ⊗ α_n.

With these changes in notation, our multilinear map φ: V_1 × ··· × V_n → V_1 ⊗ ··· ⊗ V_n is given by φ(α_1, ..., α_n) = α_1 ⊗ ··· ⊗ α_n. We shall refer to φ as the canonical map of V_1 × ··· × V_n into V_1 ⊗ ··· ⊗ V_n. Since φ is multilinear, we have the following relations in V_1 ⊗ ··· ⊗ V_n:

1.18:

    α_1 ⊗ ··· ⊗ (α_i + α_i′) ⊗ ··· ⊗ α_n = α_1 ⊗ ··· ⊗ α_i ⊗ ··· ⊗ α_n + α_1 ⊗ ··· ⊗ α_i′ ⊗ ··· ⊗ α_n

and

    α_1 ⊗ ··· ⊗ xα_i ⊗ ··· ⊗ α_n = x(α_1 ⊗ ··· ⊗ α_i ⊗ ··· ⊗ α_n)

We also know from our construction of U/U_0 that V_1 ⊗ ··· ⊗ V_n is spanned by the image of φ. Thus, every vector in V_1 ⊗ ··· ⊗ V_n is a finite sum of the form Σ_{i=1}^r (α_{1i} ⊗ ··· ⊗ α_{ni}). Here α_{1i} ∈ V_1, ..., α_{ni} ∈ V_n for all i = 1, ..., r.

Finally, let us restate Lemma 1.15 using our new notation.

Theorem 1.19: Let V_1, ..., V_n and W be vector spaces over F. Suppose ψ: V_1 × ··· × V_n → W is a multilinear mapping. Then there exists a unique linear transformation T ∈ Hom_F(V_1 ⊗ ··· ⊗ V_n, W) such that T(α_1 ⊗ ··· ⊗ α_n) = ψ(α_1, ..., α_n) for all (α_1, ..., α_n) ∈ V_1 × ··· × V_n. □

We shall discuss various functorial properties of V_1 ⊗ ··· ⊗ V_n in Section 2. But at this point, having introduced a new vector space V_1 ⊗ ··· ⊗ V_n, we want to at least give a basis of this space.

Theorem 1.20: Let V_1, ..., V_n be vector spaces over F, and suppose B_1, ..., B_n are bases of V_1, ..., V_n, respectively. Then B = {β_1 ⊗ ··· ⊗ β_n | β_i ∈ B_i} is a basis of V_1 ⊗ ··· ⊗ V_n.

Proof: We prove this theorem by using Lemma 1.11. Consider the set B_1 × ··· × B_n = {(β_1, ..., β_n) | β_i ∈ B_i}. Let V′ = ⊕_{(β_1,...,β_n)∈B_1×···×B_n} F. We have seen from our previous discussion that V′ is a vector space over F with basis {δ_{(β_1,...,β_n)} | (β_1, ..., β_n) ∈ B_1 × ··· × B_n}. We define a function φ_0: B_1 × ··· × B_n → V′ by φ_0(β_1, ..., β_n) = δ_{(β_1,...,β_n)}. Now B_1 × ··· × B_n ⊆ V_1 × ··· × V_n and each V_i is the linear span of B_i. It follows that there exists a unique multilinear function φ′: V_1 × ··· × V_n → V′ such that

    φ′(β_1, ..., β_n) = φ_0(β_1, ..., β_n) for all (β_1, ..., β_n) ∈ B_1 × ··· × B_n.

We claim that (V′, φ′) satisfies 1.9. To see this, let ψ: V_1 × ··· × V_n → W be an arbitrary multilinear mapping. Since {δ_{(β_1,...,β_n)} | (β_1, ..., β_n) ∈ B_1 × ··· × B_n} is a basis of V′, it follows from 3.23 of Chapter I that there exists a unique linear transformation T: V′ → W such that T(δ_{(β_1,...,β_n)}) = ψ(β_1, ..., β_n) for all (β_1, ..., β_n) ∈ B_1 × ··· × B_n. Then Tφ′ = ψ, and clearly T is the unique linear transformation for which this happens. We now have two pairs (V′, φ′) and (V_1 ⊗ ··· ⊗ V_n, φ) satisfying 1.9. Hence, Lemma 1.11 implies there exists an isomorphism S ∈ Hom_F(V′, V_1 ⊗ ··· ⊗ V_n) such that

1.21:

    V_1 × ··· × V_n ──φ′──→ V′
              ╲             │
             φ ╲            │ S
                ↘           ↓
              V_1 ⊗ ··· ⊗ V_n

is commutative. Now for all (β_1, ..., β_n) ∈ B_1 × ··· × B_n, we have β_1 ⊗ ··· ⊗ β_n = φ(β_1, ..., β_n) = Sφ′(β_1, ..., β_n) = S(δ_{(β_1,...,β_n)}). Since S is an isomorphism, it maps any basis of V′ to a basis of V_1 ⊗ ··· ⊗ V_n. We conclude that B = {β_1 ⊗ ··· ⊗ β_n | β_i ∈ B_i} is a basis of V_1 ⊗ ··· ⊗ V_n. □

Corollary 1.22: Suppose V_1, ..., V_n are finite-dimensional vector spaces over F. Let m_i = dim V_i. Then V_1 ⊗ ··· ⊗ V_n is finite dimensional, and dim(V_1 ⊗ ··· ⊗ V_n) = m_1 m_2 ··· m_n. □

In the exercises at the end of this section, the definition of an algebra homomorphism will be needed. The definition of an associative algebra A (with identity) over the field F was given in Definition 4.19 of Chapter I. Suppose A_1 and A_2 are two algebras over F.

Definition 1.23: A function φ: A_1 → A_2 is called an algebra homomorphism if φ ∈ Hom_F(A_1, A_2) and φ(αβ) = φ(α)φ(β) for all α, β ∈ A_1.

Example 1.24: Let V be any vector space over F, and consider the two algebras A_1 = F[X] and A_2 = L(V). Then every T ∈ L(V) determines an algebra homomorphism φ_T: A_1 → A_2 defined as follows:

1.25:

    φ_T(a_0 + a_1X + ··· + a_nX^n) = a_0 I_V + a_1 T + ··· + a_n T^n

The fact that φ_T is an algebra homomorphism is easy. Note that φ_T(1) = T^0 = I_V. Thus, φ_T sends the multiplicative identity 1 of F[X] to the multiplicative identity I_V of L(V). □
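The matrix version of Example 1.24 can be tested numerically: evaluating polynomials at a fixed matrix carries polynomial multiplication to matrix multiplication. A sketch, assuming NumPy; the helper name eval_poly_at_matrix is ours:

```python
import numpy as np

def eval_poly_at_matrix(coeffs, C):
    # phi_C(a_0 + a_1 X + ... + a_n X^n) = a_0 I + a_1 C + ... + a_n C^n,
    # where coeffs[k] is the coefficient of X^k.
    result = np.zeros_like(C, dtype=float)
    power = np.eye(C.shape[0])
    for a in coeffs:
        result = result + a * power
        power = power @ C
    return result

C = np.array([[1.0, 2.0], [0.0, 3.0]])
p = [1.0, -2.0, 1.0]          # 1 - 2X + X^2
q = [0.0, 1.0, 0.0, 4.0]      # X + 4X^3

# Coefficients of the product polynomial pq (convolution of coefficient lists):
pq = np.convolve(p, q)

# phi_C(pq) = phi_C(p) phi_C(q): powers of a single matrix C commute.
assert np.allclose(eval_poly_at_matrix(pq, C),
                   eval_poly_at_matrix(p, C) @ eval_poly_at_matrix(q, C))
# phi_C(1) = I, the multiplicative identity:
assert np.allclose(eval_poly_at_matrix([1.0], C), np.eye(2))
```

The check works because all powers of the one matrix C commute with each other; φ_C is of course not multiplicative if evaluated at two different, non-commuting matrices.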

If the reader prefers matrices to linear transformations, we can construct a similar algebra homomorphism φ_C: A_1 = F[X] → A_3 = M_{n×n}(F) for any matrix C ∈ A_3. Set φ_C(a_0 + a_1X + ··· + a_nX^n) = a_0 I + a_1 C + ··· + a_n C^n. We shall use these two types of algebra homomorphisms extensively in Chapter III. Other examples of algebra maps will be considered in the exercises at the end of this section.

EXERCISES FOR SECTION 1

(1) Complete the details of Example 1.4, that is, argue that φ(α_1, ..., α_n) = det(a_{ij}) is a multilinear mapping.

(2) In the proof of Theorem 1.20, we used the following fact: if B_i is a basis of V_i and φ_0: B_1 × ··· × B_n → V′ is a set map, then φ_0 has a unique extension to a multilinear map φ′: V_1 × ··· × V_n → V′. Give a proof of this fact.

(3) Suppose V_1, ..., V_n and V are finite-dimensional vector spaces over F with dim V_i = m_i and dim V = p. Show that dim_F{Mul_F(V_1 × ··· × V_n, V)} = p m_1 ··· m_n.

(4) Let φ: F^m × F^n → M_{m×n}(F) be defined as follows: If α = (x_1, ..., x_m) ∈ F^m and β = (y_1, ..., y_n) ∈ F^n, let φ(α, β) be the m × n matrix whose (i, j)th entry is x_i y_j. Show that φ is a bilinear mapping.

(5) … Show that φ is a bilinear mapping from V_1 × V_2 to V_3. φ(A, B) is usually written A ⊗ B and is called the Kronecker product of A and B.

(6) Show that Mul_F(V_1 × ··· × V_n, V) ≅ Hom_F(V_1 ⊗ ··· ⊗ V_n, V). Does this give a simple proof of Exercise 3?

(7) Suppose V_1, ..., V_n are vector spaces over F and for each i = 1, ..., n, let f_i ∈ V_i*. Show that ψ: V_1 × ··· × V_n → F given by ψ(α_1, ..., α_n) = f_1(α_1) f_2(α_2) ··· f_n(α_n) is a multilinear mapping.

(8) Give an example of a multilinear mapping φ: V_1 × ··· × V_n → V such that Im φ is not a subspace of V.

(9) A vector α ∈ V_1 ⊗ ··· ⊗ V_n is said to be decomposable if α ∈ Im φ. Here φ is the canonical map from V_1 × ··· × V_n to V_1 ⊗ ··· ⊗ V_n. Are all vectors in V_1 ⊗ ··· ⊗ V_n decomposable? If not, construct an example.

(10) Show that α = α_1 ⊗ ··· ⊗ α_n in V_1 ⊗ ··· ⊗ V_n is zero if and only if some α_i is zero.

(11) Suppose V is a finite-dimensional vector space over F. Let B = {α_1, ..., α_n} be a basis of V. Show that the isomorphism Γ(B, B): L(V) → M_{n×n}(F) given in equation 3.24 of Chapter I is an algebra homomorphism.

(12) Let V be a vector space over F. For each integer n ≥ 0, we define V^⊗n as follows:

    V^⊗n = F if n = 0;  V if n = 1;  V ⊗ ··· ⊗ V (n times) if n ≥ 2.

Set 𝒯(V) = ⊕_{n=0}^∞ V^⊗n. Show that 𝒯(V) is an associative algebra over F when we define multiplication of vectors in 𝒯(V) by (α_1 ⊗ ··· ⊗ α_n)(β_1 ⊗ ··· ⊗ β_m) = α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m. 𝒯(V) is called the tensor algebra of V.

(15) … that is bijective.

(16) With the same notation as Exercise 15, suppose the map θ: M_{n×n}(F) → M_{n×n}(F) given by θ(B) = AB is an algebra homomorphism. What can you say about A in this case?

2. FUNCTORIAL PROPERTIES OF TENSOR PRODUCTS

In this section, we present a series of theorems that tell us how to manipulate tensor products and use them in various applications. Our first theorem says that an iterated tensor product is isomorphic to the tensor product formed all at once.
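Before turning to these theorems, it is worth keeping the coordinate picture in mind: for column vectors, α ⊗ β can be realized as the Kronecker product of exercise 5, under which the dimension count of Corollary 1.22 and the regrouping of parenthesized tensor products become exact array identities. A quick numerical illustration, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal(2)   # a vector in F^2
b = rng.standard_normal(3)   # a vector in F^3
c = rng.standard_normal(4)   # a vector in F^4

# In coordinates, alpha (x) beta for column vectors is np.kron(alpha, beta).
# Corollary 1.22 in coordinates: dim(F^2 (x) F^3 (x) F^4) = 2 * 3 * 4 = 24.
t = np.kron(np.kron(a, b), c)
assert t.shape == (24,)

# Regrouping is harmless: (a (x) b) (x) c and a (x) (b (x) c) give the
# same array, mirroring the associativity results proved next.
assert np.allclose(np.kron(np.kron(a, b), c), np.kron(a, np.kron(b, c)))
```

In the abstract setting the two iterated products are only isomorphic, not equal; the coordinate model happens to make the isomorphism the identity.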

Theorem 2.1: Let V_1, ..., V_n and W_1, ..., W_m be vector spaces over F. Then there exists an isomorphism T: (V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m) → V_1 ⊗ ··· ⊗ V_n ⊗ W_1 ⊗ ··· ⊗ W_m such that T((α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m)) = α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m. Here α_i ∈ V_i and β_j ∈ W_j for all i = 1, ..., n and j = 1, ..., m.

The proofs of the theorems in this section can usually be done in two different ways. We can appeal to Lemma 1.11 or use Theorem 1.20. We shall present a mixture of both types of proof here. Since we are dealing with vector spaces, we could prove every theorem in this section by using Theorem 1.20. The advantage to proceeding via Lemma 1.11 (i.e., a basis-free proof) is that this type of proof is valid in more general situations (e.g., modules over commutative rings).

Proof of 2.1: Let B_i, i = 1, ..., n, be a basis of V_i. Let C_j, j = 1, ..., m, be a basis of W_j. Applying Theorem 1.20, we have the following facts:

(a) Γ_1 = {α_1 ⊗ ··· ⊗ α_n | (α_1, ..., α_n) ∈ B_1 × ··· × B_n} is a basis of V_1 ⊗ ··· ⊗ V_n.
(b) Γ_2 = {β_1 ⊗ ··· ⊗ β_m | (β_1, ..., β_m) ∈ C_1 × ··· × C_m} is a basis of W_1 ⊗ ··· ⊗ W_m.
(c) Γ_3 = {α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m | (α_1, ..., α_n, β_1, ..., β_m) ∈ B_1 × ··· × B_n × C_1 × ··· × C_m} is a basis of V_1 ⊗ ··· ⊗ V_n ⊗ W_1 ⊗ ··· ⊗ W_m.
(d) {(α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m) | (α_1 ⊗ ··· ⊗ α_n, β_1 ⊗ ··· ⊗ β_m) ∈ Γ_1 × Γ_2} is a basis of (V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m).

Using 3.23 of Chapter I, we can construct a linear transformation T: (V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m) → V_1 ⊗ ··· ⊗ V_n ⊗ W_1 ⊗ ··· ⊗ W_m such that T((α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m)) = α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m for all (α_1 ⊗ ··· ⊗ α_n, β_1 ⊗ ··· ⊗ β_m) ∈ Γ_1 × Γ_2. Clearly (d) and (c) imply T is an isomorphism. The fact that T((α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m)) = α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m for any α_i ∈ V_i and β_j ∈ W_j is now a straightforward computation that we leave to the exercises. □

Let us say a word about the proof of Theorem 2.1 via Lemma 1.11. There is a natural multilinear mapping ψ: V_1 × ··· × V_n × W_1 × ··· × W_m → (V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m) given by ψ(α_1, ..., α_n, β_1, ..., β_m) = (α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m). We could then argue that the pair ((V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m), ψ) satisfies the universal mapping property given in 1.9. Lemma 1.11 would then imply (V_1 ⊗ ··· ⊗ V_n) ⊗ (W_1 ⊗ ··· ⊗ W_m) ≅ V_1 ⊗ ··· ⊗ V_n ⊗ W_1 ⊗ ··· ⊗ W_m via a linear transformation T for which T((α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m)) = α_1 ⊗ ··· ⊗ α_n ⊗ β_1 ⊗ ··· ⊗ β_m. We ask the reader to provide the details of this proof in the exercises at the end of this section.

There is a special case of Theorem 2.1 that is worth noting explicitly.

Corollary 2.2: (V_1 ⊗ V_2) ⊗ V_3 ≅ V_1 ⊗ (V_2 ⊗ V_3).

Proof: By Theorem 2.1, both of these vector spaces are isomorphic to V_1 ⊗ V_2 ⊗ V_3. □

The point of Theorem 2.1 and Corollary 2.2 is that we can drop all parentheses when forming successive tensor products of vector spaces. Our next theorem says that forming tensor products is essentially a commutative operation as well.

Theorem 2.3: Let (i_1, ..., i_n) be a permutation of (1, ..., n). Then there exists an isomorphism T: V_1 ⊗ ··· ⊗ V_n ≅ V_{i_1} ⊗ ··· ⊗ V_{i_n} such that T(α_1 ⊗ ··· ⊗ α_n) = α_{i_1} ⊗ ··· ⊗ α_{i_n}.

Proof: The map ψ_1: V_1 × ··· × V_n → V_{i_1} ⊗ ··· ⊗ V_{i_n} given by ψ_1(α_1, ..., α_n) = α_{i_1} ⊗ ··· ⊗ α_{i_n} is clearly multilinear. Hence, using the universal mapping property of (V_1 ⊗ ··· ⊗ V_n, φ), we have a unique linear transformation T: V_1 ⊗ ··· ⊗ V_n → V_{i_1} ⊗ ··· ⊗ V_{i_n} such that Tφ = ψ_1. Thus, α_{i_1} ⊗ ··· ⊗ α_{i_n} = ψ_1(α_1, ..., α_n) = Tφ(α_1, ..., α_n) = T(α_1 ⊗ ··· ⊗ α_n).

Now let φ′: V_{i_1} × ··· × V_{i_n} → V_{i_1} ⊗ ··· ⊗ V_{i_n} be the canonical multilinear map. The map ψ_2: V_{i_1} × ··· × V_{i_n} → V_1 ⊗ ··· ⊗ V_n given by ψ_2(α_{i_1}, ..., α_{i_n}) = α_1 ⊗ ··· ⊗ α_n is clearly multilinear. Hence there exists a unique linear transformation T′: V_{i_1} ⊗ ··· ⊗ V_{i_n} → V_1 ⊗ ··· ⊗ V_n such that T′φ′ = ψ_2. Thus, α_1 ⊗ ··· ⊗ α_n = ψ_2(α_{i_1}, ..., α_{i_n}) = T′φ′(α_{i_1}, ..., α_{i_n}) = T′(α_{i_1} ⊗ ··· ⊗ α_{i_n}). Clearly, T and T′ are inverses. □

We may view F itself as a vector space over F. We can then consider the tensor product V ⊗_F F. Here V is an arbitrary vector space over F. Since {1} is a basis of F as a vector space over itself, Theorem 1.20 implies V ⊗_F F ≅ V under the map sending α ⊗ x to xα. Similar remarks can be made for F ⊗_F V. Thus, we have the following theorem:

Theorem 2.4: V ⊗_F F ≅ V ≅ F ⊗_F V. □

We next turn our attention to the relations between tensor products and homomorphisms. Suppose that for each i = 1, ..., n, we have a linear transformation T_i ∈ Hom_F(V_i, V_i′). Thus, V_i and V_i′ are vector spaces over F, and T_i: V_i → V_i′ is a linear transformation. We then have a natural multilinear mapping ψ: V_1 × ··· × V_n → V_1′ ⊗ ··· ⊗ V_n′ given by ψ(α_1, ..., α_n) = T_1(α_1) ⊗ ··· ⊗ T_n(α_n). Since each T_i is linear, ψ is clearly multilinear. It follows from Theorem 1.19 that there exists a unique linear transformation S: V_1 ⊗ ··· ⊗ V_n → V_1′ ⊗ ··· ⊗ V_n′ such that Sφ = ψ. Here φ: V_1 × ··· × V_n → V_1 ⊗ ··· ⊗ V_n is the canonical multilinear mapping given by φ(α_1, ..., α_n) = α_1 ⊗ ··· ⊗ α_n. The map S is called the tensor product of the T_i and is usually written S = T_1 ⊗ ··· ⊗ T_n. Since Sφ = ψ, we have (T_1 ⊗ ··· ⊗ T_n)(α_1 ⊗ ··· ⊗ α_n) = T_1(α_1) ⊗ ··· ⊗ T_n(α_n) for all (α_1, ..., α_n) ∈ V_1 × ··· × V_n. Let us summarize this discussion with the following definition:

Definition 2.5: Suppose T_i ∈ Hom_F(V_i, V_i′) for i = 1, ..., n. T_1 ⊗ ··· ⊗ T_n is the linear transformation from V_1 ⊗ ··· ⊗ V_n to V_1′ ⊗ ··· ⊗ V_n′ defined by the following equation:
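In coordinates, if T_1 and T_2 are given by matrices A and B, then T_1 ⊗ T_2 is given by the Kronecker product A ⊗ B, and the defining equation of Definition 2.5, together with the mixed-product rule for compositions, becomes a matrix identity. A numerical sketch, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 2))   # T1: F^2 -> F^3
B = rng.standard_normal((4, 5))   # T2: F^5 -> F^4
x = rng.standard_normal(2)        # alpha_1 in F^2
y = rng.standard_normal(5)        # alpha_2 in F^5

# Defining equation of Definition 2.5 in coordinates:
# (T1 (x) T2)(alpha_1 (x) alpha_2) = T1(alpha_1) (x) T2(alpha_2)
assert np.allclose(np.kron(A, B) @ np.kron(x, y), np.kron(A @ x, B @ y))

# Compositions tensor as well (the mixed-product rule):
# (T1 (x) T2)(S1 (x) S2) = (T1 S1) (x) (T2 S2)
S1 = rng.standard_normal((2, 6))
S2 = rng.standard_normal((5, 7))
assert np.allclose(np.kron(A, B) @ np.kron(S1, S2), np.kron(A @ S1, B @ S2))
```

Note that np.kron(A, B) is a 12 × 10 matrix here, matching the dimension count of Corollary 1.22 for the domain and codomain.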

    (T_1 ⊗ ··· ⊗ T_n)(α_1 ⊗ ··· ⊗ α_n) = T_1(α_1) ⊗ ··· ⊗ T_n(α_n)

In our notation, we have deliberately suppressed the field F. When dealing with more than one field, we shall write T_1 ⊗_F ··· ⊗_F T_n: V_1 ⊗_F ··· ⊗_F V_n → V_1′ ⊗_F ··· ⊗_F V_n′ instead of the simpler notation used in 2.5. In our next theorem, we gather together some of the more obvious facts about tensor products of linear transformations.

Theorem 2.6: Let V_i, V_i′, and V_i″, i = 1, ..., n, be vector spaces over F. Suppose T_i, T_i′ ∈ Hom_F(V_i, V_i′) and S_i ∈ Hom_F(V_i″, V_i). Then the following assertions are true:

(a) If each T_i is surjective, so is T_1 ⊗ ··· ⊗ T_n.
(b) If each T_i is injective, so is T_1 ⊗ ··· ⊗ T_n.
(c) If each T_i is an isomorphism, so is T_1 ⊗ ··· ⊗ T_n.
(d) (T_1 ⊗ ··· ⊗ T_n)(S_1 ⊗ ··· ⊗ S_n) = (T_1S_1) ⊗ ··· ⊗ (T_nS_n).
(e) If each T_i is an isomorphism, (T_1 ⊗ ··· ⊗ T_n)^{−1} = T_1^{−1} ⊗ ··· ⊗ T_n^{−1}.
(f) T_1 ⊗ ··· ⊗ (T_i + T_i′) ⊗ ··· ⊗ T_n = T_1 ⊗ ··· ⊗ T_i ⊗ ··· ⊗ T_n + T_1 ⊗ ··· ⊗ T_i′ ⊗ ··· ⊗ T_n.
(g) T_1 ⊗ ··· ⊗ xT_i ⊗ ··· ⊗ T_n = x(T_1 ⊗ ··· ⊗ T_i ⊗ ··· ⊗ T_n).
(h) I_{V_1} ⊗ ··· ⊗ I_{V_n} = I_{V_1 ⊗ ··· ⊗ V_n}.

Proof: (c)–(h) are all straightforward and are left to the reader. We prove (a) and (b).

(a) It follows from our construction of the tensor product that V_1′ ⊗ ··· ⊗ V_n′ is spanned as a vector space over F by all vectors of the form α_1′ ⊗ ··· ⊗ α_n′ with (α_1′, ..., α_n′) ∈ V_1′ × ··· × V_n′. Let (α_1′, ..., α_n′) ∈ V_1′ × ··· × V_n′. Since each T_i is surjective, there exists an α_i ∈ V_i such that T_i(α_i) = α_i′. Thus, α_1′ ⊗ ··· ⊗ α_n′ = T_1(α_1) ⊗ ··· ⊗ T_n(α_n) = (T_1 ⊗ ··· ⊗ T_n)(α_1 ⊗ ··· ⊗ α_n). Thus, V_1′ ⊗ ··· ⊗ V_n′ = L(Im(T_1 ⊗ ··· ⊗ T_n)) = Im(T_1 ⊗ ··· ⊗ T_n). Hence, T_1 ⊗ ··· ⊗ T_n is surjective.

(b) Suppose T_i is injective for each i = 1, ..., n. Let B_i = {α_{ik} | k ∈ Δ_i} be a basis of V_i. Since T_i is injective, T_i(B_i) = {T_i(α_{ik}) | k ∈ Δ_i} is a linearly independent subset of V_i′. In particular, T_i(B_i) is part of a basis of V_i′ (Theorem 2.6, Chapter I). It now follows from Theorem 1.20 that the set

    B = {T_1(α_{1k_1}) ⊗ ··· ⊗ T_n(α_{nk_n}) | (k_1, ..., k_n) ∈ Δ_1 × ··· × Δ_n}

is a linearly independent subset of V_1′ ⊗ ··· ⊗ V_n′. Now let α ∈ ker(T_1 ⊗ ··· ⊗ T_n). Again using Theorem 1.20, α can be written uniquely in the following form:

2.7:

    α = Σ_{(k_1,...,k_n) ∈ Δ_1×···×Δ_n} c_{k_1,...,k_n} (α_{1k_1} ⊗ ··· ⊗ α_{nk_n})

In equation 2.7, every c_{k_1,...,k_n} ∈ F, and all but possibly finitely many of these scalars are zero. If we now apply T_1 ⊗ ··· ⊗ T_n to α and use the fact that B is linearly independent over F, we see c_{k_1,...,k_n} = 0 for every (k_1, ..., k_n) ∈ Δ_1 × ··· × Δ_n. Thus, α = 0 and T_1 ⊗ ··· ⊗ T_n is injective. □

Recall that a complex of vector spaces (over F),

    V″ ──S──→ V ──T──→ V′ ──→ 0

is said to be exact if T is surjective and Im S = ker T. Suppose for each i = 1, ..., n, we have an exact complex of the following form:

2.8:

    V_i″ ──S_i──→ V_i ──T_i──→ V_i′ ──→ 0

Theorem 2.6(a) implies T_1 ⊗ ··· ⊗ T_n: V_1 ⊗ ··· ⊗ V_n → V_1′ ⊗ ··· ⊗ V_n′ is a surjective linear transformation. We want to identify the kernel of T_1 ⊗ ··· ⊗ T_n. For each i = 1, ..., n, we can consider the linear transformation I_{V_1} ⊗ ··· ⊗ S_i ⊗ ··· ⊗ I_{V_n}: V_1 ⊗ ··· ⊗ V_i″ ⊗ ··· ⊗ V_n → V_1 ⊗ ··· ⊗ V_i ⊗ ··· ⊗ V_n. Let W_i = Im(I_{V_1} ⊗ ··· ⊗ S_i ⊗ ··· ⊗ I_{V_n}). Then W_i is the subspace of V_1 ⊗ ··· ⊗ V_n spanned by all vectors of the form {α_1 ⊗ ··· ⊗ S_i(α_i″) ⊗ ··· ⊗ α_n | (α_1, ..., α_i″, ..., α_n) ∈ V_1 × ··· × V_i″ × ··· × V_n}. We can then form the subspace W = W_1 + ··· + W_n ⊆ V_1 ⊗ ··· ⊗ V_n. We can now prove the following lemma:

Lemma 2.9: W = ker(T_1 ⊗ ··· ⊗ T_n).

Proof: Fix i ∈ {1, ..., n}, and consider a typical generator β = α_1 ⊗ ··· ⊗ S_i(α_i″) ⊗ ··· ⊗ α_n of W_i. Then (T_1 ⊗ ··· ⊗ T_n)(β) = T_1(α_1) ⊗ ··· ⊗ T_iS_i(α_i″) ⊗ ··· ⊗ T_n(α_n). Since 2.8 is exact, T_iS_i(α_i″) = 0. Thus, (T_1 ⊗ ··· ⊗ T_n)(β) = 0. Since ker(T_1 ⊗ ··· ⊗ T_n) is a subspace of V_1 ⊗ ··· ⊗ V_n, we conclude that W = W_1 + ··· + W_n ⊆ ker(T_1 ⊗ ··· ⊗ T_n).

The opposite inclusion, ker(T_1 ⊗ ··· ⊗ T_n) ⊆ W, is a bit more difficult to establish. We begin by defining a multilinear mapping ψ: V_1′ × ··· × V_n′ → (V_1 ⊗ ··· ⊗ V_n)/W as follows: If (α_1′, ..., α_n′) ∈ V_1′ × ··· × V_n′, then there exists a vector (α_1, ..., α_n) ∈ V_1 × ··· × V_n such that T_i(α_i) = α_i′ for all i = 1, ..., n. This follows from the fact that each T_i is surjective. We then define ψ(α_1′, ..., α_n′) to be the following coset:

2.10:

    ψ(α_1′, ..., α_n′) = (α_1 ⊗ ··· ⊗ α_n) + W

Now it is not obvious that ψ is well defined. We must check that if (β_1, ..., β_n) is a second vector in V_1 × ··· × V_n with the property that T_i(β_i) = α_i′ for all i = 1, ..., n, then (β_1 ⊗ ··· ⊗ β_n) + W = (α_1 ⊗ ··· ⊗ α_n) + W. Since T_i(β_i) = T_i(α_i) = α_i′ for i = 1, ..., n and each sequence in 2.8 is exact, there exists a μ_i ∈ V_i″ such that S_i(μ_i) = α_i − β_i. In particular, we have the following relations in V_1 ⊗ ··· ⊗ V_n:

2.11:

    α_1 ⊗ α_2 ⊗ ··· ⊗ α_n − β_1 ⊗ α_2 ⊗ ··· ⊗ α_n ∈ W_1
    β_1 ⊗ α_2 ⊗ ··· ⊗ α_n − β_1 ⊗ β_2 ⊗ α_3 ⊗ ··· ⊗ α_n ∈ W_2
    ⋮
    β_1 ⊗ ··· ⊗ β_{n−1} ⊗ α_n − β_1 ⊗ ··· ⊗ β_n ∈ W_n

Adding the relations in 2.11 gives α_1 ⊗ ··· ⊗ α_n − β_1 ⊗ ··· ⊗ β_n ∈ W_1 + ··· + W_n = W. Thus, (α_1 ⊗ ··· ⊗ α_n) + W = (β_1 ⊗ ··· ⊗ β_n) + W, and equation 2.10 gives us a well-defined function ψ: V_1′ × ··· × V_n′ → (V_1 ⊗ ··· ⊗ V_n)/W. The fact that ψ is multilinear is obvious.

Let φ: V_1′ × ··· × V_n′ → V_1′ ⊗ ··· ⊗ V_n′ be the canonical multilinear map. Using the universal mapping property of (V_1′ ⊗ ··· ⊗ V_n′, φ), we conclude that there exists a unique linear transformation T: V_1′ ⊗ ··· ⊗ V_n′ → (V_1 ⊗ ··· ⊗ V_n)/W such that Tφ = ψ. Thus, for all (α_1′, ..., α_n′) ∈ V_1′ × ··· × V_n′ and any (α_1, ..., α_n) ∈ V_1 × ··· × V_n such that T_i(α_i) = α_i′, i = 1, ..., n, we have (α_1 ⊗ ··· ⊗ α_n) + W = ψ(α_1′, ..., α_n′) = Tφ(α_1′, ..., α_n′) = T(α_1′ ⊗ ··· ⊗ α_n′).

Now consider the composite linear transformation given by

    V_1 ⊗ ··· ⊗ V_n ──T_1⊗···⊗T_n──→ V_1′ ⊗ ··· ⊗ V_n′ ──T──→ (V_1 ⊗ ··· ⊗ V_n)/W

Set S = T(T_1 ⊗ ··· ⊗ T_n). Then for all (α_1, ..., α_n) ∈ V_1 × ··· × V_n, we have S(α_1 ⊗ ··· ⊗ α_n) = T(T_1(α_1) ⊗ ··· ⊗ T_n(α_n)) = (α_1 ⊗ ··· ⊗ α_n) + W. Thus, S is nothing but the natural map from V_1 ⊗ ··· ⊗ V_n to (V_1 ⊗ ··· ⊗ V_n)/W. In particular, it follows from 5.13 of Chapter I that ker S = W. Since ker(T_1 ⊗ ··· ⊗ T_n) ⊆ ker S, we conclude that ker(T_1 ⊗ ··· ⊗ T_n) ⊆ W. This completes the proof of the lemma. □

We have now proved the following theorem:

Theorem 2.12: Let

    V_i″ ──S_i──→ V_i ──T_i──→ V_i′ ──→ 0

be an exact complex of vector spaces over F for each i = 1, ..., n. Let W_i = Im(I_{V_1} ⊗ ··· ⊗ S_i ⊗ ··· ⊗ I_{V_n}) and set W = W_1 + ··· + W_n. Then

    0 ──→ W ──→ V_1 ⊗ ··· ⊗ V_n ──T_1⊗···⊗T_n──→ V_1′ ⊗ ··· ⊗ V_n′ ──→ 0

is a short exact sequence. Thus, T_1 ⊗ ··· ⊗ T_n is surjective and W = ker(T_1 ⊗ ··· ⊗ T_n). □

There is a special case of Theorem 2.12 that we present as a separate theorem.

Theorem 2.13: Suppose

    0 ──→ V″ ──S──→ V ──T──→ V′ ──→ 0

is a short exact sequence of vector spaces over F. Then for any vector space W,

2.14:

    0 ──→ V″ ⊗ W ──S⊗I_W──→ V ⊗ W ──T⊗I_W──→ V′ ⊗ W ──→ 0

is a short exact sequence.

Proof: T ⊗ I_W is surjective by Theorem 2.6(a). S ⊗ I_W is injective by Theorem 2.6(b). If we apply Theorem 2.12 to the two exact complexes

    V″ ──S──→ V ──T──→ V′ ──→ 0
    0 ──→ W ──I_W──→ W ──→ 0

we see that ker(T ⊗ I_W) = Im(S ⊗ I_W). Thus 2.14 is exact and the proof of the theorem is complete. □

The next natural question to ask about tensor products is how they behave with respect to direct sums. We answer this question in our next theorem, but leave most of the technical details as exercises at the end of this section.

Theorem 2.15: Suppose {V_i | i ∈ Δ} is a collection of vector spaces over F. Then for any vector space V we have

    {⊕_{i∈Δ} V_i} ⊗ V ≅ ⊕_{i∈Δ} {V_i ⊗ V}

Proof: Let θ_j: V_j → ⊕_{i∈Δ} V_i and π_j: ⊕_{i∈Δ} V_i → V_j be the canonical injections and surjections introduced in Definition 4.2 of Chapter I. Then we have the following facts:

(a) π_jθ_j = I_{V_j} for all j ∈ Δ.
(b) π_jθ_i = 0 if i ≠ j.
(c) For any ξ ∈ ⊕_{i∈Δ} V_i, π_j(ξ) = 0 except possibly for finitely many j ∈ Δ.
(d) Σ_{i∈Δ} θ_iπ_i = I, the identity map on ⊕_{i∈Δ} V_i.

Perhaps we should make a few comments about (d). If ξ ∈ ⊕_{i∈Δ} V_i, then ξ is a Δ-tuple with at most finitely many nonzero components. Thus, Σ_{i∈Δ} θ_iπ_i(ξ) is a finite sum whose value is clearly ξ. This is what the statement Σ_{i∈Δ} θ_iπ_i = I means in (d).

Using (a)–(d), one checks that {⊕_{i∈Δ} V_i} ⊗ V is the internal direct sum of the subspaces {Im(θ_i ⊗ I_V) | i ∈ Δ}. Thus, {⊕_{i∈Δ} V_i} ⊗ V = ⊕_{i∈Δ} Im(θ_i ⊗ I_V). Since each θ_i is injective, Theorem 2.6 implies θ_i ⊗ I_V is injective. Hence V_i ⊗ V ≅ Im(θ_i ⊗ I_V). It now follows that ⊕_{i∈Δ}(V_i ⊗ V) ≅ ⊕_{i∈Δ} Im(θ_i ⊗ I_V) = {⊕_{i∈Δ} V_i} ⊗ V. □

We next study a construction using tensor products that is very useful in field theory. Suppose V is a vector space over F, and let K be a second field containing F. For example, F = ℝ and K = ℂ. We have seen in Chapter I that K is a vector space (even an algebra) over F. Thus, we can form the tensor product V ⊗_F K of the vector spaces V and K over F. V ⊗_F K is a vector space over F. We want to point out that there is a natural K-vector space structure on V ⊗_F K as well. Vector addition in V ⊗_F K as a K-vector space is the same as before. Namely, if ξ = Σ_{i=1}^r (α_i ⊗_F x_i) and η = Σ_{j=1}^m (β_j ⊗_F y_j) are two vectors in V ⊗_F K (thus α_i, β_j ∈ V, and x_i, y_j ∈ K), then

    ξ + η = α_1 ⊗_F x_1 + ··· + α_r ⊗_F x_r + β_1 ⊗_F y_1 + ··· + β_m ⊗_F y_m

We need to define scalar multiplication of vectors in V ⊗_F K with scalars in K. Let x ∈ K, and consider the linear map μ_x ∈ Hom_F(K, K) defined by μ_x(y) = xy. Clearly, μ_x is an F-linear transformation on K. In particular, I_V ⊗_F μ_x is a well-defined F-linear transformation on V ⊗_F K. Now if ξ = Σ_{i=1}^r (α_i ⊗_F x_i) is a typical vector in V ⊗_F K, we define scalar multiplication xξ by the following formula:

2.16:

    xξ = (I_V ⊗_F μ_x)(ξ)

Thus, xξ = Σ_{i=1}^r (α_i ⊗_F xx_i). Our previous discussion in this section implies that equation 2.16 gives us a well-defined function from K × (V ⊗_F K) to V ⊗_F K. The fact that this scalar multiplication satisfies axioms V5–V8 in Definition 1.4 of Chapter I is straightforward. Thus, via the operations defined above, the F-vector space V ⊗_F K becomes a vector space over K.

Throughout the rest of this book, whenever we view V ⊗_F K as a vector space over K, addition and scalar multiplication will be as defined above. The process whereby we pass from a vector space V over F to the vector space V ⊗_F K over K is called extending the scalars to K.

Since F ⊆ K, Theorem 2.6 implies that the natural map I_V ⊗_F i: V ⊗_F F → V ⊗_F K is injective. Here i: F → K is the inclusion map. Now V ≅ V ⊗_F F by Theorem 2.4. Putting these two maps together gives us a natural, injective map ĩ ∈ Hom_F(V, V ⊗_F K). An easy computation shows that Im ĩ = V ⊗_F 1 in V ⊗_F K. We note that V ⊗_F 1 is an F-subspace of V ⊗_F K. This follows immediately from 2.16. For if x ∈ F, then x(α ⊗_F 1) = α ⊗_F x = xα ⊗_F 1 ∈ V ⊗_F 1. Thus, when we extend the scalars from F to K, we produce a K-vector space, V ⊗_F K, which contains V, that is, Im ĩ, as an F-subspace, and such that V ⊗_F K is the K-linear span of V. We can now construct a K-basis of V ⊗_F K.

Theorem 2.17: Let V be a vector space over F, and suppose K is a field containing F. If B is a basis of V, then {α ⊗_F 1 | α ∈ B} is a basis of the K-vector space V ⊗_F K.

Proof: Let Γ = {α ⊗_F 1 | α ∈ B}. Since {1} is a subset of K that is linearly independent over F, Theorem 1.20 implies the vectors in Γ are linearly independent over F. In particular, |Γ| = |B|, and no element of Γ is zero. We must argue that Γ is linearly independent over K and that L_K(Γ) = V ⊗_F K. Here L_K(Γ) is the set of all K-linear combinations of the vectors in Γ.

Let us first argue that Γ is linearly independent over K. Let α_1, ..., α_n ∈ B, k_1, ..., k_n ∈ K, and suppose Σ_{i=1}^n k_i(α_i ⊗_F 1) = 0. Let C = {z_j | j ∈ Λ} be a basis of K over F. Then each k_i can be written uniquely in the following form:

2.18:

k, = ) x.. z •• i = 1. .... n FUNCTORIAL PROPERTIES OF TE.NSOR PRODUCTS n --- ·=

In Equation 2.18, the x11 are scalars in F and each sum on the right-hand side is finite. Thus, for each i = 1, ... , n. x11 = 0 except possibly for finitely many j eA. 78 MULTILINEAR ALGEBRA We now have

X(T,k)=k(T@,I.J. Here TeHomF(V, V), and k e K. From the discussion preceding Definition 25, we know that T ®F IKe HomF(V ®.. K. V ®F K). We f. k~ex1 ®: 1) = I (ex1 ®FkJ = f. {ex1 ®,. ( l: x,,z,)} = I l: xu(cx1 ®Fz,) 1•1 1•1 1•1 \jeA l~alje6 claim T ®.. IK is in fact a K-linear map on V ®.,K. To see thls, we use equation 2.16. We have (T ®F I.J(k(cx ®.,lc')) = (T ®: I.J(a ®F kk') = T(a) ®.. kk' = Since the vectors ( ..t1 ®.. z1 I oc1e B, z1 e C} are linearly independent over F by k(T(a) @, k'] = k[(T ®., I.J(a ®.,lc')]. Thus, T ®F IKe HomK(V @, K, Theorem 1.20, we conclude that x11 = 0 for all i and j. In particular, V ®.. K). Again by 2.16, k(T ®.. I.J is the K-linear transformation on V ®r K kl = ... = kt\ = 0, and r is linearly independent over K. given by [k(T ®r I.J](a ®r k') = k(T(ex) ®, k') = T(a) ®.. kk'. In particular, To complete the proof, we must show LK(r) = V ®,.K.. Since V ®.-K is Imx ~ HomK(V®,.K, V ®FK). xis clearly an F-bilinear mapping, and. thus, spanned as a vector space over F by vectors of the form a ®,k (« e V, k e K) apd factors through the tensor product HomF(V, V) ®FK. So, we have the following F ~ K, it suffices to show that a ®.-k e LK(r). This last inclusion is easy. Write commutative diagram: a = Ir• 1 x1oc1 with «1o ... , a. e B and x 1 , ••• , x. e F. Then 2.21:

a &,k x1a1) ®Fk = I (x,a, ®Fk) = (a1 ~Xjk) =(f.l .. l 1•1 lealf. I ¢ I Hom,{V,~ = r. x1k(a1 ®,.1) E LK(r) 0 7V)@.K l"l
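As a quick numerical illustration of Theorem 2.15, here is a sketch in Python (assuming the usual identifications, under which a direct sum of maps is represented by a block-diagonal matrix and a tensor product of maps by the Kronecker product `numpy.kron`; the helper `block_diag` is ours):

```python
import numpy as np

def block_diag(*blocks):
    """Assemble a block-diagonal matrix: the matrix of a direct sum of maps."""
    n = sum(b.shape[0] for b in blocks)
    m = sum(b.shape[1] for b in blocks)
    out = np.zeros((n, m))
    r = c = 0
    for b in blocks:
        out[r:r + b.shape[0], c:c + b.shape[1]] = b
        r += b.shape[0]
        c += b.shape[1]
    return out

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, (2, 2)).astype(float)   # a map on V_1
B = rng.integers(-3, 4, (3, 3)).astype(float)   # a map on V_2
C = rng.integers(-3, 4, (2, 2)).astype(float)   # a map on V

# {V_1 ⊕ V_2} ⊗ V versus (V_1 ⊗ V) ⊕ (V_2 ⊗ V): the two matrices agree exactly.
lhs = np.kron(block_diag(A, B), C)
rhs = block_diag(np.kron(A, C), np.kron(B, C))
assert np.array_equal(lhs, rhs)
```

The exact equality of the two block layouts reflects the fact that the isomorphism of Theorem 2.15 is the identity once compatible ordered bases are chosen.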

There are two important corollaries to Theorem 2.17 that are worth noting here.

Corollary 2.19: Suppose V is a finite-dimensional vector space over F and K is a field containing F. Then dim_K(V ⊗_F K) = dim_F(V). □

Corollary 2.20: Suppose V is a finite-dimensional vector space over F and K is a field containing F. Then Hom_F(V, V) ⊗_F K ≅ Hom_K(V ⊗_F K, V ⊗_F K) as vector spaces over K.

Proof: If dim_F V = n, then dim_F(Hom_F(V, V)) = n² by Theorem 3.45 of Chapter I. Thus, dim_K(Hom_F(V, V) ⊗_F K) = n² by Corollary 2.19. On the other hand, the same corollary implies dim_K(V ⊗_F K) = n. Consequently, dim_K(Hom_K(V ⊗_F K, V ⊗_F K)) = n² by Theorem 3.25. Since the K-vector spaces Hom_F(V, V) ⊗_F K and Hom_K(V ⊗_F K, V ⊗_F K) have the same dimension, they are isomorphic by Theorem 3.15 of Chapter I. □

A word about Corollary 2.20 is in order here. We proved this result by counting dimensions. This type of argument gives us a quick proof of the corollary but tends to obscure the nature of the isomorphism between the two vector spaces. It is worthwhile to construct an explicit K-linear isomorphism ψ: Hom_F(V, V) ⊗_F K → Hom_K(V ⊗_F K, V ⊗_F K). In diagram 2.21, φ(T, k) = T ⊗_F k, and ψ is the unique F-linear transformation making the diagram commute; in particular, ψ(T ⊗_F 1) = T ⊗_F I_K.

Finally, we must argue that ψ is an isomorphism. We do this by repeated applications of Theorem 2.17. Let g = {α_1, …, α_n} be a basis of V. Define T_ij ∈ Hom_F(V, V) by T_ij(α_p) = α_i if p = j and zero otherwise. It follows from Theorem 3.25 of Chapter I that {T_ij | i, j = 1, …, n} is a basis of Hom_F(V, V). Theorem 2.17 then implies {T_ij ⊗_F 1 | i, j = 1, …, n} is a K-basis of Hom_F(V, V) ⊗_F K. On the other hand, {α_i ⊗_F 1 | i = 1, …, n} is a basis of V ⊗_F K. Thus, {S_ij | i, j = 1, …, n} (where S_ij(α_p ⊗_F 1) = α_i ⊗_F 1 if p = j and zero otherwise) is a K-basis of Hom_K(V ⊗_F K, V ⊗_F K). Now one easily checks that ψ(T_ij ⊗_F 1) = S_ij. Thus, ψ is an isomorphism of K-vector spaces.

Let us rephrase some of our last remarks in terms of matrices. Suppose V is a finite-dimensional vector space over F. Let T ∈ Hom_F(V, V). Let g = {α_1, …, α_n} be a basis of V over F. Set A = Γ(g, g)(T). Thus, A is the matrix representation of T relative to g. Now suppose we extend scalars to a field K ⊇ F by passing to V ⊗_F K. Theorem 2.17 implies g ⊗ 1 = {α_1 ⊗_F 1, …, α_n ⊗_F 1} is a K-basis of V ⊗_F K. The F-linear map T: V → V has a natural extension ψ(T ⊗_F 1) = T ⊗_F I_K to a K-linear map on V ⊗_F K. Here ψ is the isomorphism in diagram 2.21. We have seen that V is imbedded in the extension V ⊗_F K as the subspace V ⊗_F 1. If we identify V with V ⊗_F 1, then T ⊗_F I_K restricted to V is just T. Thus, we may think of T ⊗_F I_K as an extension of T. Clearly, Γ(g ⊗ 1, g ⊗ 1)(T ⊗_F I_K) = A. Thus, the matrix representation of the extension of T relative to the extended basis is the same as the matrix representation of T on V.

One of the most important examples of extending scalars is the complexification of a real vector space. We finish this section with a brief discussion of that notion.

Definition 2.22: Let V be a vector space over ℝ. The tensor product V ⊗_ℝ ℂ is called the complexification of V.

We shall shorten our notation here and let V^ℂ denote the complexification of V. Thus, V^ℂ = {Σ(α_i ⊗_ℝ z_i) | α_i ∈ V, z_i ∈ ℂ}. Our previous discussion implies that V^ℂ is a vector space over ℂ with scalar multiplication given by z′(α ⊗_ℝ z) = α ⊗_ℝ z′z. If B is an ℝ-basis of V, then B ⊗ 1 = {α ⊗_ℝ 1 | α ∈ B} is a basis of V^ℂ over ℂ.

There is an important ℝ-linear map on V^ℂ that comes from complex conjugation on ℂ. Recall that if z = x + iy (x, y ∈ ℝ, i = √−1) is a complex number, then z̄ = x − iy is called the conjugate of z. Clearly the map σ: ℂ → ℂ given by σ(z) = z̄ is an ℝ-linear transformation. Thus, I_V ⊗_ℝ σ ∈ Hom_ℝ(V^ℂ, V^ℂ). Recall that I_V ⊗_ℝ σ is given by the following equation:

2.23: (I_V ⊗_ℝ σ)(Σ_{k=1}^n (α_k ⊗_ℝ z_k)) = Σ_{k=1}^n (α_k ⊗_ℝ z̄_k).

Since σ is an ℝ-isomorphism of ℂ, Theorem 2.6 implies I_V ⊗_ℝ σ is an ℝ-isomorphism of V^ℂ. Note that I_V ⊗_ℝ σ is not a ℂ-linear transformation of V^ℂ.

Definition 2.24: Let V be a vector space over ℝ, and let T ∈ Hom_ℝ(V, V). The extension T ⊗_ℝ I_ℂ will be called the complexification of T and written T^ℂ.

Thus, T^ℂ is the ℂ-linear transformation on V^ℂ given by

2.25: T^ℂ(Σ_{k=1}^n (α_k ⊗_ℝ z_k)) = Σ_{k=1}^n (T(α_k) ⊗_ℝ z_k).

Clearly, T^ℂ = ψ(T ⊗_ℝ 1), where ψ: Hom_ℝ(V, V) ⊗_ℝ ℂ → Hom_ℂ(V^ℂ, V^ℂ) is the ℂ-linear isomorphism given in 2.21. If dim_ℝ(V) < ∞, and g is a basis of V, then Γ(g, g)(T) = Γ(g ⊗ 1, g ⊗ 1)(T^ℂ). Thus, the matrix representation of the complexification of T is the same as that of T (provided we make these statements relative to g and g ⊗ 1).

It is often important to decide when an S ∈ Hom_ℂ(V^ℂ, V^ℂ) is the complexification of some T ∈ Hom_ℝ(V, V).

Theorem 2.26: Let V be a finite-dimensional vector space over ℝ, and let S ∈ Hom_ℂ(V^ℂ, V^ℂ). Then S = T^ℂ for some T ∈ Hom_ℝ(V, V) if and only if the following equation is satisfied:

2.27: S(I_V ⊗_ℝ σ) = (I_V ⊗_ℝ σ)S.

Proof: If S is a ℂ-linear transformation on V^ℂ, then clearly S is an ℝ-linear transformation on V^ℂ. I_V ⊗_ℝ σ is also an ℝ-linear transformation on V^ℂ. Thus, the statement in equation 2.27 is that these two endomorphisms commute as maps in Hom_ℝ(V^ℂ, V^ℂ).

Let us first suppose that S is the complexification of some T ∈ Hom_ℝ(V, V). Thus, S = T ⊗_ℝ I_ℂ. If Σ_{k=1}^n (α_k ⊗_ℝ z_k) ∈ V^ℂ, then

[S(I_V ⊗_ℝ σ)](Σ_{k=1}^n (α_k ⊗_ℝ z_k)) = S(Σ_{k=1}^n (α_k ⊗_ℝ z̄_k)) = Σ_{k=1}^n (T(α_k) ⊗_ℝ z̄_k).

On the other hand,

[(I_V ⊗_ℝ σ)S](Σ_{k=1}^n (α_k ⊗_ℝ z_k)) = (I_V ⊗_ℝ σ)(Σ_{k=1}^n (T(α_k) ⊗_ℝ z_k)) = Σ_{k=1}^n (T(α_k) ⊗_ℝ z̄_k).

Thus, S satisfies equation 2.27.

Conversely, suppose S ∈ Hom_ℂ(V^ℂ, V^ℂ) satisfies equation 2.27. The discussion after Corollary 2.20 implies that S = Σ_{j=1}^m (T_j ⊗_ℝ w_j), where T_j ∈ Hom_ℝ(V, V) and w_j ∈ ℂ. To be more precise, S = ψ(Σ_{j=1}^m T_j ⊗_ℝ w_j), where ψ is the isomorphism in 2.21, F = ℝ, and K = ℂ. We shall suppress ψ here and write S = Σ_{j=1}^m (T_j ⊗_ℝ w_j). Thus, S is given by the following equation:

2.28: S(Σ_{k=1}^n α_k ⊗_ℝ z_k) = Σ_{j=1}^m Σ_{k=1}^n T_j(α_k) ⊗_ℝ w_j z_k.

Let α ⊗_ℝ z ∈ V^ℂ. Then

[S(I_V ⊗_ℝ σ)](α ⊗_ℝ z) = S(α ⊗_ℝ z̄) = Σ_{j=1}^m (T_j(α) ⊗_ℝ w_j z̄).

On the other hand,

[(I_V ⊗_ℝ σ)S](α ⊗_ℝ z) = (I_V ⊗_ℝ σ)(Σ_{j=1}^m T_j(α) ⊗_ℝ w_j z) = Σ_{j=1}^m T_j(α) ⊗_ℝ w̄_j z̄.

Since S satisfies 2.27, we have z̄ Σ_{j=1}^m (T_j(α) ⊗_ℝ (w_j − w̄_j)) = 0. In particular, when z ≠ 0, Σ_{j=1}^m (T_j(α) ⊗_ℝ (w_j − w̄_j)) = 0 for all α ∈ V.

Now suppose the real and imaginary parts of w_j are x_j and y_j, respectively. Thus, x_j, y_j ∈ ℝ and w_j = x_j + iy_j. Then w_j − w̄_j = 2iy_j, and Σ_{j=1}^m (T_j(α) ⊗_ℝ (w_j − w̄_j)) = 0 implies Σ_{j=1}^m (T_j(α) ⊗_ℝ iy_j) = 0 for all α ∈ V. Since {α ⊗_ℝ 1 | α ∈ V} spans V^ℂ as a vector space over ℂ, we can now conclude that Σ_{j=1}^m (T_j ⊗_ℝ iy_j) = 0. Hence

S = Σ_{j=1}^m (T_j ⊗_ℝ x_j) = (Σ_{j=1}^m x_j T_j) ⊗_ℝ I_ℂ = (Σ_{j=1}^m x_j T_j)^ℂ.

Thus, S is the complexification of Σ_{j=1}^m x_j T_j ∈ Hom_ℝ(V, V) and the proof of Theorem 2.26 is complete. □

We shall have more to say about the complexification of a real operator T in Chapter III.

EXERCISES FOR SECTION 2

(5) Complete the details of the proof of Theorem 2.15. Namely, show {⊕_{i∈Δ} V_i} ⊗ V is the internal direct sum of the subspaces {Im(θ_i ⊗ I_V) | i ∈ Δ}.

(7) Is Corollary 2.20 true for infinite-dimensional vector spaces V? If so, give a proof. If not, give an example.

(8) Verify axioms V5-V8 from Definition 1.4 of Chapter I for the scalar multiplication defined in equation 2.16.

(9) Show that (V_1 ⊗_F K) ⊗_K ··· ⊗_K (V_n ⊗_F K) ≅ (V_1 ⊗_F ··· ⊗_F V_n) ⊗_F K as K-vector spaces under the map that sends (α_1 ⊗_F k_1) ⊗_K ··· ⊗_K (α_n ⊗_F k_n) to (α_1 ⊗_F ··· ⊗_F α_n) ⊗_F (k_1 k_2 ··· k_n).
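Theorem 2.26 is easy to test in coordinates. Identifying V^ℂ with ℂⁿ via a basis g ⊗ 1, the map I_V ⊗_ℝ σ becomes entrywise complex conjugation, and S is a complexification T^ℂ exactly when its matrix is real. A minimal sketch, assuming this identification (the names `S_good` and `S_bad` are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

T = rng.integers(-3, 4, (2, 2)).astype(float)   # a real operator T on V = R^2
S_good = T.astype(complex)                       # its complexification T^C on C^2
S_bad = S_good + 1j * np.eye(2)                  # a C-linear map that is no T^C

v = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Equation 2.27: S commutes with conjugation iff S = T^C for some real T.
assert np.allclose(S_good @ np.conj(v), np.conj(S_good @ v))
assert not np.allclose(S_bad @ np.conj(v), np.conj(S_bad @ v))
```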

(1) Complete the details of the proof of Theorem 2.1 by showing … (α_1, …, α_n, β_1, …, β_m) ↦ (α_1 ⊗ ··· ⊗ α_n) ⊗ (β_1 ⊗ ··· ⊗ β_m).

(3) Generalize Theorem 2.13 as follows: Suppose C: ··· → V_{i+1} --d_{i+1}--> V_i --d_i--> V_{i−1} → ··· is an exact chain complex of vector spaces over F. If V is any vector space (over F), show that C ⊗_F V: ··· → V_{i+1} ⊗_F V --(d_{i+1} ⊗ I_V)--> V_i ⊗_F V --(d_i ⊗ I_V)--> V_{i−1} ⊗_F V → ··· is an exact chain complex.

(4) Show by example that if 0 → V′_1 --S_1--> V_1 --T_1--> V″_1 → 0 and 0 → V′_2 --S_2--> V_2 --T_2--> V″_2 → 0 are two short exact sequences of vector spaces, then

0 → V′_1 ⊗ V′_2 --(S_1 ⊗ S_2)--> V_1 ⊗ V_2 --(T_1 ⊗ T_2)--> V″_1 ⊗ V″_2 → 0

is not necessarily exact.

(10) Show that ψ: Hom_F(V_1 ⊗_F V_2, V_3) → Hom_F(V_1, Hom_F(V_2, V_3)) is an isomorphism. Here ψ is defined by [ψ(f)(α_1)](α_2) = f(α_1 ⊗ α_2).

(11) Show that η: Hom_F(V_1, V_2) ⊗_F V_3 → Hom_F(V_1, V_2 ⊗ V_3) is an isomorphism. Here η is defined by η(f ⊗ α_3)(α_1) = f(α_1) ⊗ α_3. We assume dim V…

(12) … (b) Σ_i (α_i ⊗ β_i) = 0 in V_1 ⊗ W_1.

(13) Show that V ⊗ W = 0 if and only if V or W is zero.

Let us return to problems about the Kronecker product A ⊗ B of two matrices (see Exercise 5 of Section 1).

(14) Suppose V and W are finite-dimensional vector spaces over a field F. Let T ∈ Hom_F(V, V) and S ∈ Hom_F(W, W). Suppose A and B are matrix representations of T and S, respectively. Show that A ⊗ B is a matrix representation of T ⊗ S on V ⊗_F W.

(15) If A ∈ M_{n×n}(F) and B ∈ M_{m×m}(F), show that rk(A ⊗ B) = rk(A) rk(B).
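Exercises 14 and 15 (and the determinant identity of the next exercise) lend themselves to numerical experiment. A sketch using `numpy.kron` for A ⊗ B, with small matrices of our own choosing:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])        # det(A) = 7, rank 3  (n = 3)
B = np.array([[2., 1.],
              [1., 1.]])            # det(B) = 1, rank 2  (m = 2)
x = np.array([1., -1., 2.])
y = np.array([3., 1.])

K = np.kron(A, B)                   # matrix of T ⊗ S in the product basis

# Exercise 14: (T ⊗ S)(x ⊗ y) = T(x) ⊗ S(y).
assert np.allclose(K @ np.kron(x, y), np.kron(A @ x, B @ y))

# Exercise 15: rk(A ⊗ B) = rk(A) · rk(B).
assert np.linalg.matrix_rank(K) == np.linalg.matrix_rank(A) * np.linalg.matrix_rank(B)

# Exercise 16: det(A ⊗ B) = det(A)^m · det(B)^n, here 7^2 · 1^3 = 49.
assert np.isclose(np.linalg.det(K), np.linalg.det(A) ** 2 * np.linalg.det(B) ** 3)
```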

(16) In Exercise 15, show that det(A ⊗ B) = (det A)^m (det B)^n.

(17) Let V = F[X]. Show that V ⊗_F V ≅ F[X, Y] under the map that sends f(X) ⊗ g(X) to f(X)g(Y).

(18) Let D: F[X] → F[X] be the formal derivative. Thus, D(Σ_{i=0}^n a_i X^i) = Σ_{i=1}^n i a_i X^{i−1}. Show that D is a linear transformation on F[X] such that D(V_n) ⊆ V_n for all n ∈ ℕ. Here V_n is the vector space defined in Exercise 1 (Section 2 of Chapter I).

(19) Interpret the map D ⊗ D on F[X] ⊗ F[X] using the isomorphism given in Exercise 17. Restrict D ⊗ D to V_n ⊗ V_m and compute a Kronecker product that represents D ⊗ D.

(20) Generalize Exercise 17 to F[X_1, …, X_n].

3. ALTERNATING MAPS AND EXTERIOR POWERS

In this section, we study a special class of multilinear maps that are called alternating. Before we can present the main definitions, we need to discuss permutations. A permutation of Δ = {1, …, n} is a bijection σ: Δ → Δ, which we display in two-row form:

3.1: σ = [  1     2    ···    n   ]
         [ σ(1)  σ(2)  ···  σ(n) ]

Example 3.2: Let n = 5 and

σ = [1 2 3 4 5]
    [2 3 4 1 5]

Then σ is the bijection of Δ = {1, 2, 3, 4, 5} given by σ(1) = 2, σ(2) = 3, σ(3) = 4, σ(4) = 1, and σ(5) = 5. □

Clearly, the number of distinct permutations of Δ is n!. We shall let S_n denote the set of all permutations on Δ = {1, …, n}. Thus, |S_n| = n!.

Since the elements of S_n are functions on Δ, we can compose any two elements σ, τ ∈ S_n, getting a third permutation στ of Δ. Thus, we have a function S_n × S_n → S_n given by (σ, τ) → στ. The action of στ on Δ is computed from σ and τ by using equation 3.1 in the obvious way.

Example 3.3: Let n = 5. Suppose σ, τ ∈ S_5 are given by

σ = [1 2 3 4 5]    τ = [1 2 3 4 5]
    [2 3 4 1 5]        [5 4 2 3 1]

Then

στ = [1 2 3 4 5]    τσ = [1 2 3 4 5]
     [5 1 3 4 2]         [4 2 3 5 1]

Note that στ ≠ τσ. □

The map (σ, τ) → στ on S_n satisfies the following properties:

3.4: (a) σ(τγ) = (στ)γ for all σ, τ, γ ∈ S_n.
(b) There exists an element 1 ∈ S_n such that 1σ = σ1 = σ for all σ ∈ S_n.
(c) For every σ ∈ S_n, there exists a τ ∈ S_n such that στ = τσ = 1.

We say σ ∈ S_n is an r-cycle if there exist r distinct elements i_1, …, i_r ∈ Δ that σ permutes cyclically in the sense that σ(i_1) = i_2, σ(i_2) = i_3, …, σ(i_{r−1}) = i_r, σ(i_r) = i_1, and σ(j) = j for all j ∈ Δ − {i_1, …, i_r}.

Example 3.5: If n = 5, then

σ_1 = [1 2 3 4 5]
      [5 3 4 1 2]

is a 5-cycle, and

σ_2 = [1 2 3 4 5]
      [2 3 1 4 5]

is a 3-cycle, while

σ = [1 2 3 4 5]
    [2 3 1 5 4]

is not a cycle. However, σ is the product of two cycles:

σ = [1 2 3 4 5][1 2 3 4 5]
    [2 3 1 4 5][1 2 3 5 4]  □

When dealing with an r-cycle σ, which permutes i_1, …, i_r and leaves fixed all other elements of Δ, we can shorten our representation of σ and write σ = (i_1, …, i_r). Thus, in Example 3.5, σ_1 = (1, 5, 2, 3, 4), σ_2 = (1, 2, 3), and σ = (1, 2, 3)(4, 5).

We say two cycles (of S_n) are disjoint if they have no common symbol in their representations. Thus, in Example 3.5, σ_1 and σ_2 are not disjoint, but (1, 2, 3) and (4, 5) are disjoint. It is convenient to extend the definition of cycles to the case r = 1. We adopt the convention that for any i ∈ Δ, the 1-cycle (i) is the identity map. Then it should be clear that any σ ∈ S_n is a product of disjoint cycles.

Example 3.6: Let n = 9 and

σ = [1 2 3 4 5 6 7 8 9]
    [2 3 4 1 6 5 8 9 7]

Then σ = (1, 2, 3, 4)(5, 6)(7, 8, 9). □

Any 2-cycle (a, b) ∈ S_n is called a transposition. The reader can easily check that any cycle (i_1, …, i_r) is a product of transpositions, namely, (i_1, …, i_r) = (i_1, i_r)(i_1, i_{r−1}) ··· (i_1, i_3)(i_1, i_2). The factorization of a given cycle as a product of transpositions is not unique. Consider the following example:

Example 3.7: Let n = 4. Then (1, 2, 4, 3) = (1, 3)(1, 4)(1, 2). Also (1, 2, 4, 3) = (4, 3, 1, 2) = (4, 2)(4, 1)(4, 3). □

Since every permutation is a product of disjoint cycles and every cycle is a product of transpositions, we get every permutation is a product of transpositions. We know from Example 3.7 that such a factorization is not unique, but we do have the following fact:

Lemma 3.8: Let σ ∈ S_n. If σ can be written as a product of an even number of transpositions, then any factorization of σ into a product of transpositions must contain an even number of terms. Similarly, if σ can be written as a product of an odd number of transpositions, then any factorization of σ into a product of transpositions must contain an odd number of terms.

Proof: Let X_1, …, X_n denote indeterminates over the field ℝ, and consider the polynomial P(X_1, …, X_n) = Π_{i<j}(X_i − X_j). Here the product is taken over all i and j such that 1 ≤ i < j ≤ n. If σ ∈ S_n, then we define a new polynomial σ(P) by σ(P) = P(X_{σ(1)}, …, X_{σ(n)}) = Π_{i<j}(X_{σ(i)} − X_{σ(j)}). Clearly σ(P) = ±P, and a transposition applied to P changes the sign of P. Thus, σ(P) = P if and only if σ is a product of an even number of transpositions, and σ(P) = −P if and only if σ is a product of an odd number of transpositions. The proof of the lemma is now clear. □

Definition 3.9: A permutation σ ∈ S_n is even if σ can be written as a product of an even number of transpositions. If σ is even, we define the sign of σ, sgn(σ), by sgn(σ) = 1. The permutation σ is odd if σ can be written as a product of an odd number of transpositions. In this case, we set sgn(σ) = −1.

Clearly, a product of two even permutations is again even. A product of two odd permutations is also even. The product of an even and odd permutation is odd. Note that our definition implies that 1, the identity map on Δ, is an even permutation (a product of zero transpositions).

Example 3.10: If n = 4, then σ = (1, 2, 4, 3) = (1, 3)(1, 4)(1, 2) is odd. τ = (1, 2)(3, 4) is even. Thus, sgn(σ) = −1 and sgn(τ) = 1. □

We can now return to our study of multilinear mappings and introduce the concept of an alternating map. Suppose V and W are vector spaces over a field F. Recall that V^n = {(α_1, …, α_n) | α_i ∈ V}. We shall keep n fixed throughout this discussion.

Definition 3.11: A multilinear mapping η: V^n → W is called alternating if η(α_1, …, α_n) = 0 whenever some α_i = α_j for i ≠ j.

Thus, a multilinear mapping η from V^n to W is alternating if η vanishes on all n-tuples (α_1, …, α_n) that contain a repetition. We shall clarify the situation when n = 1 by adopting the convention that all linear transformations from V to W are alternating.

Example 3.12: Let us return to the example in 1.4. The map φ: F^n × ··· × F^n → F given by φ(α_1, …, α_n) = det(a_ij), where α_i = (a_i1, …, a_in), is multilinear. If any two rows of A ∈ M_{n×n}(F) are equal, then det A = 0. Thus, φ is an alternating multilinear map. □

Example 3.13: Suppose φ: V^n → W is an arbitrary multilinear mapping. We can construct an alternating map Alt(φ) from φ with the following definition:

Alt(φ)(α_1, …, α_n) = Σ_{σ∈S_n} sgn(σ) φ(α_σ(1), …, α_σ(n)).

One checks easily that Alt(φ) is alternating. □

We shall denote the set of all alternating multilinear mappings from V^n to W by Alt_F(V^n, W). Clearly, Alt_F(V^n, W) is a subspace of Mul_F(V × ··· × V, W), for if η, η_1 ∈ Alt_F(V^n, W) and x ∈ F, then η + η_1 and xη are clearly alternating.

Suppose τ = (i, j) is a transposition in S_n with i < j. Let η ∈ Alt_F(V^n, W), and α = (α_1, …, α_n) ∈ V^n. Then (α_τ(1), …, α_τ(n)) = (α_1, …, α_j, …, α_i, …, α_n); that is, (α_τ(1), …, α_τ(n)) is just the n-tuple α with its ith and jth components interchanged. Suppose we consider the n-tuple (α_1, …, α_i + α_j, …, α_i + α_j, …, α_n) with α_i + α_j in both the ith and jth positions. Since η is alternating, we have

0 = η(α_1, …, α_i + α_j, …, α_i + α_j, …, α_n)
  = η(α_1, …, α_i, …, α_i, …, α_n) + η(α_1, …, α_i, …, α_j, …, α_n)
  + η(α_1, …, α_j, …, α_i, …, α_n) + η(α_1, …, α_j, …, α_j, …, α_n)
  = η(α_1, …, α_n) + η(α_τ(1), …, α_τ(n)).

Thus

3.14: η(α_τ(1), …, α_τ(n)) = −η(α_1, …, α_n).

Thus, when we interchange two terms in the sequence α_1, …, α_n, the sign of η(α_1, …, α_n) changes. Since every permutation is a product of transpositions, equation 3.14 immediately implies the following theorem:

Theorem 3.15: Let η: V^n → W be an alternating multilinear mapping. Then for all σ ∈ S_n and all (α_1, …, α_n) ∈ V^n, η(α_σ(1), …, α_σ(n)) = sgn(σ) η(α_1, …, α_n). □

Another useful observation concerning alternating maps is given in the following theorem:

Theorem 3.16: A multilinear map η: V^n → W is alternating if and only if η(α_1, …, α_n) = 0 whenever α_i = α_{i+1} for some i ∈ {1, …, n − 1}.

Proof: This is a straightforward computation which we leave to the exercises. □

If η: V^n → W is an alternating multilinear map and T ∈ Hom_F(W, W′), then clearly Tη ∈ Alt_F(V^n, W′). This suggests the following analog of the universal mapping problem posed in 1.9:

3.17: Let V be a vector space over F, and fix n ∈ ℕ. Is there a vector space Z over F and an alternating multilinear map η: V^n → Z such that the pair (Z, η) has the following property? If W is any vector space over F, and ψ ∈ Alt_F(V^n, W), then there exists a unique T ∈ Hom_F(Z, W) such that Tη = ψ.

The question posed in 3.17 is called the universal mapping problem for alternating multilinear maps. This problem has an obvious solution, which we shall construct shortly. First, let us point out that any solution to 3.17 is essentially unique.

Lemma 3.18: Suppose (Z, η) and (Z′, η′) are two solutions to 3.17. Then there exist isomorphisms T: Z → Z′ and T′: Z′ → Z such that Tη = η′ and T′η′ = η.

Proof: This proof is identical to that of Lemma 1.11. □

The next order of business is to construct a solution to 3.17. This is easy using what we already know about tensor products. Consider V^⊗n = V ⊗_F ··· ⊗_F V. Let U be the subspace of V^⊗n generated by all vectors of the form α_1 ⊗ ··· ⊗ α_n, where the n-tuple (α_1, …, α_n) contains a repetition. Set Z = V^⊗n/U, and let η: V^n → Z be the composite of the canonical multilinear map V^n → V^⊗n with the natural projection V^⊗n → Z.

Lemma 3.19: The pair (Z, η) constructed above is a solution to 3.17.

Proof: Suppose ψ: V^n → W is any alternating multilinear map. Then because ψ is multilinear, there exists a unique linear transformation T_0: V^⊗n → W such that the diagram

3.20:  V^n --⊗--> V^⊗n
          \          |
           \ ψ       | T_0
            \        v
             -----> W

is commutative. If (α_1, …, α_n) ∈ V^n contains a repetition, then ψ(α_1, …, α_n) = 0 since ψ is alternating. Thus, 3.20 implies T_0(α_1 ⊗ ··· ⊗ α_n) = 0. Since U is generated by all vectors α_1 ⊗ ··· ⊗ α_n with (α_1, …, α_n) containing a repetition, we conclude that T_0(U) = 0. It now follows from the first isomorphism theorem (Theorem 5.15 of Chapter I) that T_0 induces a linear transformation T: V^⊗n/U = Z → W given by T((α_1 ⊗ ··· ⊗ α_n) + U) = T_0(α_1 ⊗ ··· ⊗ α_n). If (α_1, …, α_n) ∈ V^n, then Tη(α_1, …, α_n) = T((α_1 ⊗ ··· ⊗ α_n) + U) = T_0(α_1 ⊗ ··· ⊗ α_n) = ψ(α_1, …, α_n). Thus, the diagram

3.21:  V^n --η--> Z
          \        |
           \ ψ     | T
            \      v
             ---> W

is commutative. Finally, we must argue that T is unique. Suppose T′ ∈ Hom_F(Z, W) makes 3.21 commute. Then T = T′ on Im η. But Z = L(Im η). Thus, T = T′, and the proof of Lemma 3.19 is complete. □

Definition 3.22: The vector space V^⊗n/U is called the nth exterior power of V (over F) and will henceforth be denoted by Λ^n_F(V).

When the base field F is clear from the context, we shall simplify our notation and write Λ^n(V). Note that Λ^1(V) = V. We define Λ^0(V) = F.

Definition 3.23: The coset (α_1 ⊗ ··· ⊗ α_n) + U in Λ^n(V) will henceforth be denoted by α_1 ∧ ··· ∧ α_n and called a wedge product.

Thus, the alternating multilinear map η: V^n → Λ^n(V) is given by η(α_1, …, α_n) = α_1 ∧ ··· ∧ α_n. We have already noted that Im η spans Λ^n(V). Thus, every vector in Λ^n(V) is a finite linear combination of wedge products α_1 ∧ ··· ∧ α_n. We also have the following relations when dealing with wedge products:

3.24: (a) α_σ(1) ∧ ··· ∧ α_σ(n) = sgn(σ) α_1 ∧ ··· ∧ α_n for all σ ∈ S_n.
(b) α_1 ∧ ··· ∧ (xα_i) ∧ ··· ∧ α_n = x(α_1 ∧ ··· ∧ α_n) for all x ∈ F.
(c) α_1 ∧ ··· ∧ (α_i + α′_i) ∧ ··· ∧ α_n = α_1 ∧ ··· ∧ α_i ∧ ··· ∧ α_n + α_1 ∧ ··· ∧ α′_i ∧ ··· ∧ α_n.

Let us restate Lemma 3.19 using our new notation.

Theorem 3.25: Let V and W be vector spaces over F and suppose ψ: V^n → W is an alternating multilinear map. Then there exists a unique linear transformation T ∈ Hom_F(Λ^n(V), W) such that for all (α_1, …, α_n) ∈ V^n, ψ(α_1, …, α_n) = T(α_1 ∧ ··· ∧ α_n). □

Having constructed the nth exterior power Λ^n(V) of V, the next order of business is to find a basis for this space. Suppose B is a basis of V. As usual, set B^n = {(α_1, …, α_n) | α_i ∈ B}. Let B(n) denote the subset of B^n consisting of those n-tuples which have distinct entries. Thus, B(n) = {(α_1, …, α_n) ∈ B^n | α_i ≠ α_j whenever i ≠ j}. It is possible that B(n) = ∅. In this case, n > |B|. But then every wedge product β_1 ∧ ··· ∧ β_n in Λ^n(V) is zero. Thus, Λ^n(V) = 0, and the empty set is a basis of Λ^n(V). So, we can assume with no loss of generality that n ≤ |B|.

We define an equivalence relation ≡ on the set B(n) by the following formula:

3.26: (α_1, …, α_n) ≡ (α′_1, …, α′_n) if and only if there exists a σ ∈ S_n such that α_σ(1) = α′_1, …, α_σ(n) = α′_n.

Thus, two n-tuples in B(n) are equivalent if some permutation of the entries in the first n-tuple gives the second n-tuple. The fact that ≡ is indeed an equivalence relation, that is, that ≡ satisfies the axioms in 5.1 of Chapter I, is obvious. We shall let B̄(n) denote the set of equivalence classes of B(n). Recall that the elements of B̄(n) are subsets of B(n); B(n) is the disjoint union of the distinct elements of B̄(n). If (α_1, …, α_n) ∈ B(n), we shall let [(α_1, …, α_n)] denote the equivalence class in B̄(n) which contains (α_1, …, α_n). Thus, B̄(n) = {[(α_1, …, α_n)] | (α_1, …, α_n) ∈ B(n)}.

Now for each element x ∈ B̄(n), we can choose an n-tuple (α_1, …, α_n) ∈ B(n) such that [(α_1, …, α_n)] = x. For a given x, there may be many such n-tuples, but they are all representatives of the same equivalence class x. For each x ∈ B̄(n), pick a representative of x, say λ_x, in B(n). Thus, λ_x is an n-tuple (α_1, …, α_n) ∈ B(n) such that [(α_1, …, α_n)] = x. We have now defined a set mapping (x → λ_x) of B̄(n) to B(n). There are of course many choices for such a map. Choose any such map and set C(n) = {λ_x | x ∈ B̄(n)}. Then C(n) is a collection of n-tuples in B(n), one n-tuple for every equivalence class x ∈ B̄(n). We can now state the following theorem:

Theorem 3.27: Let V be a vector space over F. Suppose B is a basis of V. Construct the set C(n) as above. Then Δ̃ = {α_1 ∧ ··· ∧ α_n | (α_1, …, α_n) ∈ C(n)} is a basis of Λ^n_F(V). In particular, dim_F(Λ^n_F(V)) = |B̄(n)|.

Proof: The set C(n) consists of one representative in B(n) for each equivalence class x ∈ B̄(n). Thus, |C(n)| = |B̄(n)|. In particular, if Δ̃ is a basis of Λ^n(V), then dim_F(Λ^n(V)) = |Δ̃| = |C(n)| = |B̄(n)|. So, we need only argue Δ̃ is a basis of Λ^n(V). The proof we give here is analogous to that of Theorem 1.20.

Let W̃ = ⊕_{(α_1,…,α_n)∈C(n)} F. As we have seen in Section 1, W̃ is a vector space over F with basis {δ_{(α_1,…,α_n)} | (α_1, …, α_n) ∈ C(n)}. Here δ_{(α_1,…,α_n)} is the function from C(n) into F given by δ_{(α_1,…,α_n)}((β_1, …, β_n)) = 1 if (β_1, …, β_n) = (α_1, …, α_n) and zero otherwise. We shall show that W̃ ≅ Λ^n_F(V).

We define a map μ_0: B^n → W̃ as follows: If (β_1, …, β_n) ∈ B^n contains a repetition, set μ_0(β_1, …, β_n) = 0. Suppose (β_1, …, β_n) ∈ B(n). Then [(β_1, …, β_n)] = x ∈ B̄(n). If λ_x = (α_1, …, α_n) ∈ C(n), then [(α_1, …, α_n)] = [(β_1, …, β_n)]. So, there exists a unique σ ∈ S_n such that (β_σ(1), …, β_σ(n)) = (α_1, …, α_n). In this case, define μ_0(β_1, …, β_n) = sgn(σ) δ_{(α_1,…,α_n)}. We can extend the map μ_0 in the usual way (see Exercise 2 at the end of Section 1 in this chapter) to a multilinear map μ: V^n → W̃. An easy computation using the definition of μ_0 shows μ is alternating.

We now claim the pair (W̃, μ) satisfies the universal mapping property in 3.17. To see this, suppose W is a vector space over F, and ψ: V^n → W an alternating multilinear map. Using 3.23 of Chapter I, we can define a linear transformation T: W̃ → W by T(δ_{(α_1,…,α_n)}) = ψ(α_1, …, α_n) for all (α_1, …, α_n) ∈ C(n). If (β_1, …, β_n) ∈ B^n contains a repetition, then Tμ(β_1, …, β_n) = T(0) = 0, and since ψ is alternating, ψ(β_1, …, β_n) = 0. Suppose (β_1, …, β_n) ∈ B(n). Then (β_σ(1), …, β_σ(n)) = (α_1, …, α_n) ∈ C(n) for some σ ∈ S_n. Thus, Tμ(β_1, …, β_n) = T(sgn(σ) δ_{(α_1,…,α_n)}) = sgn(σ) T(δ_{(α_1,…,α_n)}) = sgn(σ) ψ(α_1, …, α_n). On the other hand, since ψ is alternating, Theorem 3.15 implies ψ(β_1, …, β_n) = sgn(σ) ψ(β_σ(1), …, β_σ(n)) = sgn(σ) ψ(α_1, …, α_n). Thus, Tμ and ψ are two alternating multilinear maps on V^n that agree on B^n. It follows that Tμ = ψ.

Finally, we must argue T is unique. If T′: W̃ → W is a second linear transformation such that T′μ = ψ, then T′(δ_{(α_1,…,α_n)}) = T′μ(α_1, …, α_n) = ψ(α_1, …, α_n) = Tμ(α_1, …, α_n) = T(δ_{(α_1,…,α_n)}) for all (α_1, …, α_n) ∈ C(n). Thus, T and T′ agree on a basis of W̃ and, consequently, must be equal.

We can now apply Lemma 3.18 to the pairs (Λ^n(V), η) and (W̃, μ). In particular, there exists an isomorphism S: W̃ → Λ^n(V) such that Sμ = η. If (α_1, …, α_n) ∈ C(n), then α_1 ∧ ··· ∧ α_n = η(α_1, …, α_n) = Sμ(α_1, …, α_n) = S(δ_{(α_1,…,α_n)}). Thus, S is an isomorphism taking the basis {δ_{(α_1,…,α_n)} | (α_1, …, α_n) ∈ C(n)} in W̃ to the set Δ̃ in Λ^n_F(V). It follows that Δ̃ is a basis of Λ^n_F(V). □

Corollary 3.28: Suppose V is a finite-dimensional vector space over F with dim_F(V) = N. Then dim_F(Λ^n_F(V)) = C(N, n).

Before giving a proof of 3.28, let us discuss its meaning. If 0 ≤ n ≤ N, then C(N, n) is the binomial coefficient N!/n!(N − n)!. If n > N, then C(N, n) = 0.

Proof of Corollary 3.28: If n = 0, the result is trivial. Suppose 1 ≤ n ≤ N. Let B = {α_1, …, α_N} be a basis of V. Theorem 3.27 implies that Δ̃ = {α_{i_1} ∧ ··· ∧ α_{i_n} | 1 ≤ i_1 < ··· < i_n ≤ N} is a basis of Λ^n_F(V). The cardinality of Δ̃ is clearly the number of ways of picking an n-element subset from the set {1, 2, …, N}. Therefore, |Δ̃| = C(N, n). Suppose n > N. Then we had noted previously that B(n) = ∅, and Λ^n_F(V) = 0. Thus, dim_F(Λ^n_F(V)) = C(N, n) = 0. □

There is another corollary that can be derived from Theorem 3.27. Suppose V is an n-dimensional vector space over F. Let g = {α_1, …, α_n} be a basis of V. Then we have a nonzero, alternating map φ: V^n → F given by φ(ξ_1, …, ξ_n) = det(a_ij), where ξ_i = (a_i1, …, a_in) relative to g. Our next corollary says that φ is essentially the only alternating map from V^n to F.

Corollary 3.29: Let V be an n-dimensional vector space over F. Then dim_F(Alt_F(V^n, F)) = 1.

Proof: The map φ constructed above is alternating. Hence φ ∈ Alt_F(V^n, F). Suppose ψ ∈ Alt_F(V^n, F). If η denotes the canonical map given in Lemma 3.19, then there exists a unique linear transformation T ∈ Hom_F(Λ^n_F(V), F) such that Tη = ψ. Similarly, there exists a T_1 ∈ Hom_F(Λ^n_F(V), F) such that T_1η = φ. Now Corollary 3.28 implies dim_F(Λ^n(V)) = 1. Consequently, dim_F(Hom_F(Λ^n(V), F)) = 1. Since φ ≠ 0, we conclude T_1 ≠ 0. In particular, {T_1} is a basis for Hom_F(Λ^n(V), F). Therefore T = xT_1 for some x ∈ F. We then have ψ = Tη = xT_1η = xφ. Thus, {φ} is a basis of Alt_F(V^n, F) and the proof of Corollary 3.29 is complete. □

At this point, we could begin to discuss the functorial properties of exterior powers. Almost all the results in Section 2 have analogs in our present situation. Since this is not a text in multilinear algebra per se, we shall leave most of these types of results to the exercises at the end of this section. The reader who wishes to read further in this subject matter should consult [5] or [4].

We shall finish this section with a description of the induced map on exterior powers derived from a given T ∈ Hom_F(V, W). Suppose V and W are vector spaces over F and let T be a linear transformation from V to W. Let η: V^n → Λ^n_F(V) be the natural alternating map given by η(α_1, …, α_n) = α_1 ∧ ··· ∧ α_n. The linear transformation T induces an alternating, multilinear mapping ψ_T: V^n → Λ^n_F(W) given by ψ_T(α_1, …, α_n) = T(α_1) ∧ ··· ∧ T(α_n). The fact that ψ_T is indeed alternating is clear. It now follows from Theorem 3.25 that there exists a unique linear transformation S ∈ Hom_F(Λ^n_F(V), Λ^n_F(W)) such that Sη = ψ_T.

Definition 3.30: The unique linear transformation S for which Sη = ψ_T will henceforth be denoted Λ^n(T).

Thus, Λ^n(T) ∈ Hom_F(Λ^n_F(V), Λ^n_F(W)), and for all (α_1, …, α_n) ∈ V^n, Λ^n(T)(α_1 ∧ ··· ∧ α_n) = T(α_1) ∧ ··· ∧ T(α_n). Clearly, Λ^n(T_1T_2) = Λ^n(T_1)Λ^n(T_2) for T_2 ∈ Hom_F(V, W) and T_1 ∈ Hom_F(W, Z). We also have the important analogs of Theorem 2.6.

Definition 3.30: The unique linear transformation S for which s, = ifit will Proof of Corollary 3.28: If n = 0, the result is trivial. Suppose 1 ~ n ~ N. Let henceforth be denoted t\"(11. 1 B = {a1, ... ,aN} be a basis of V. Theorem 3.27 implies that A~ {«,, A • · • A a,. II" i, < ··· < '•" N} U a b..;, of A~V). Th• oudinalit' Thus, A"(11 e HomF(M(V), t\;(W)), and for all (a , ... , ex.) e v•, I l 1 An(T)(a 1 1\ · .. 1\ a,) = T(ex 1) 1\ • .. 1\ T(ex,). ' Clearly, A"(T1T 2)=t\"(T1)A"(T2j tor T 2 eHomr:(V,W) and T1 eHomF(W,Z~ ~ We also have the important analogs of Theorem 2.6. 1
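Definition 3.30 has a concrete matrix description: in the bases of Theorem 3.27, the entries of $\Lambda^n(T)$ are the $n\times n$ minors of the matrix of $T$, and Corollaries 3.28 and 3.29 can be checked numerically. The following Python sketch is not from the text; the helper names (`det`, `exterior_power`) are mine.

```python
from itertools import combinations
from math import comb

def det(M):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def exterior_power(T, n):
    """Matrix of Lambda^n(T) in the bases of Theorem 3.27.

    T is an N x M matrix (a map F^M -> F^N); the (I, J) entry of Lambda^n(T)
    is the minor det(T[I, J]) taken over increasing index tuples I, J.
    """
    N, M = len(T), len(T[0])
    rows = list(combinations(range(N), n))   # indexes a basis of Lambda^n(F^N)
    cols = list(combinations(range(M), n))   # indexes a basis of Lambda^n(F^M)
    return [[det([[T[i][j] for j in J] for i in I]) for J in cols] for I in rows]

T = [[1, 2, 0],
     [0, 1, 3],
     [4, 0, 1]]
L2 = exterior_power(T, 2)
# dim Lambda^2(F^3) = C(3, 2) = 3, so Lambda^2(T) is a 3 x 3 matrix (Corollary 3.28)
assert len(L2) == comb(3, 2) and len(L2[0]) == comb(3, 2)
# Lambda^3(T) is 1 x 1 and equals det(T)
assert exterior_power(T, 3)[0][0] == det(T)
```

For $n=\dim V$, the $1\times 1$ matrix of $\Lambda^n(T)$ recovers $\det(T)$, which is the one-dimensionality phenomenon of Corollary 3.29 in computational form.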


Theorem 3.31: Let $V$ and $W$ be vector spaces over $F$, and let $T\in\operatorname{Hom}_F(V,W)$. Then the following assertions are true:

(a) If $T$ is injective, so is $\Lambda^n(T)$.
(b) If $T$ is surjective, so is $\Lambda^n(T)$.
(c) If $T$ is an isomorphism, so is $\Lambda^n(T)$.

Proof: Consider bases of $V$ and $W$ and apply Theorem 3.27. $\square$

$\Lambda^n(T)$ is usually called the $n$th exterior power of $T$.

EXERCISES FOR SECTION 3

(1) Show that every permutation $\sigma\in S_n$ is a product of disjoint cycles.

(2) Elaborate on the details of Example 3.13. Specifically, show $\operatorname{Alt}(\cdot)\colon\operatorname{Mult}_F(V^n,W)\to\operatorname{Alt}_F(V^n,W)$ is a well-defined linear transformation.

(3) Prove Theorem 3.16.

(4) The exterior algebra $\Lambda(V)$ of $V$ is defined to be the following direct sum: $\Lambda(V)=\bigoplus_{n=0}^{\infty}\Lambda^n(V)$.
(a) Show that $\Lambda(V)$ is an algebra over $F$ when we define the product of two elements $\alpha=\alpha_1\wedge\cdots\wedge\alpha_p\in\Lambda^p(V)$ and $\beta=\beta_1\wedge\cdots\wedge\beta_m\in\Lambda^m(V)$ by $\alpha\beta=\alpha_1\wedge\cdots\wedge\alpha_p\wedge\beta_1\wedge\cdots\wedge\beta_m$.
(b) Show that $\Lambda(V)$ is an anticommutative algebra. This means for all $\alpha\in\Lambda^p(V)$ and $\beta\in\Lambda^m(V)$, $\alpha\beta=(-1)^{pm}\beta\alpha$.
(c) Show that there exists an injective linear transformation $T\colon V\to\Lambda(V)$ such that $(T(\alpha))^2=0$ for all $\alpha\in V$.

(5) Show that $\Lambda(V)$ has the following universal mapping property: If $A$ is any $F$-algebra and $T\in\operatorname{Hom}_F(V,A)$ is such that $(T(\alpha))^2=0$ for all $\alpha\in V$, then there exists a unique $F$-algebra homomorphism $\varphi\colon\Lambda(V)\to A$ such that $\varphi(\alpha)=T(\alpha)$ for all $\alpha\in V$.

(6) If $\dim_F(V)=n$, what is $\dim_F(\Lambda(V))$?

(7) Suppose $V$ is a finite-dimensional vector space over $F$. Show that $(\Lambda_F(V))^*\cong\Lambda_F(V^*)$. Is this true if $V$ is infinite dimensional?

(8) Give an example of a short exact sequence $0\to V\to W\to Z\to 0$ of vector spaces over $F$ such that the corresponding complex $0\to\Lambda^nV\to\Lambda^nW\to\Lambda^nZ\to 0$ is not exact.

(9) Suppose $V$ is a vector space over $F$, and let $K$ be a field containing $F$. Show $\Lambda_F(V)\otimes_F K\cong\Lambda_K(V\otimes_F K)$ as $K$-vector spaces.

(10) Let $V$ and $W$ be vector spaces over $F$. Show that $\Lambda^n_F(V\oplus W)\cong\bigoplus_{i+j=n}(\Lambda^i(V)\otimes_F\Lambda^j(W))$.

4. SYMMETRIC MAPS AND SYMMETRIC POWERS

In this last section on multilinear algebra, we study another special class of multilinear maps. Let $V$ and $W$ be vector spaces over $F$ and let $n\in\mathbb N$.

Definition 4.1: A multilinear mapping $\phi\colon V^n\to W$ is said to be symmetric if $\phi(\alpha_1,\ldots,\alpha_n)=\phi(\alpha_{\sigma(1)},\ldots,\alpha_{\sigma(n)})$ for all $(\alpha_1,\ldots,\alpha_n)\in V^n$ and all $\sigma\in S_n$.

We shall denote the set of all symmetric multilinear mappings from $V^n$ to $W$ by $\operatorname{Sym}_F(V^n,W)$. Clearly, $\operatorname{Sym}_F(V^n,W)$ is a subspace of $\operatorname{Mult}_F(V^n,W)$. Note that $\operatorname{Sym}_F(V^n,W)\supseteq\operatorname{Alt}_F(V^n,W)$ whenever $F\supseteq\mathbb F_2$, that is, whenever $F$ has characteristic 2. Let us consider some examples:

Example 4.2: If $n=2$, then $\operatorname{Sym}_F(V^2,W)$ is just the set of all symmetric bilinear maps from $V\times V$ to $W$. In particular, any inner product is a symmetric multilinear map. $\square$

Example 4.3: If $\phi\colon V^n\to W$ is any multilinear map, we can construct a symmetric map $S(\phi)\in\operatorname{Sym}_F(V^n,W)$ from $\phi$ with the following definition:
$$S(\phi)(\alpha_1,\ldots,\alpha_n)=\sum_{\sigma\in S_n}\phi(\alpha_{\sigma(1)},\ldots,\alpha_{\sigma(n)}).\ \square$$

Note that $\phi\in\operatorname{Mult}_F(V^n,W)$ is in fact symmetric if $\phi(\alpha_1,\ldots,\alpha_n)$ remains unaltered whenever two adjacent terms in $(\alpha_1,\ldots,\alpha_n)$ are interchanged. If $\phi\in\operatorname{Sym}_F(V^n,W)$ and $T\in\operatorname{Hom}_F(W,Z)$, then clearly, $T\phi\in\operatorname{Sym}_F(V^n,Z)$. As with alternating maps, this suggests the following universal mapping problem for symmetric multilinear maps:

4.4: Let $V$ be a vector space over $F$ and $n\in\mathbb N$. Is there a vector space $Z$ over $F$ and a symmetric multilinear map $\phi\colon V^n\to Z$ such that the pair $(Z,\phi)$ has the following property: If $W$ is any vector space over $F$, and $\psi\in\operatorname{Sym}_F(V^n,W)$, then there exists a unique $T\in\operatorname{Hom}_F(Z,W)$ such that $T\phi=\psi$?

As with alternating maps, it is an easy matter to argue that any solution to 4.4 is essentially unique.
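The symmetrization operator of Example 4.3 can be tried out concretely. A minimal Python sketch follows (not from the text; the function names are mine):

```python
from itertools import permutations

def symmetrize(phi, n):
    """Return S(phi): the sum of phi over all permutations of its n arguments
    (Example 4.3). The result is a symmetric multilinear map."""
    def s_phi(*args):
        assert len(args) == n
        return sum(phi(*(args[i] for i in sigma))
                   for sigma in permutations(range(n)))
    return s_phi

# A bilinear but non-symmetric map phi(x, y) = x1 * y2 on F^2:
phi = lambda x, y: x[0] * y[1]
s = symmetrize(phi, 2)
x, y = (1, 2), (3, 4)
assert s(x, y) == s(y, x)                 # symmetric, as Definition 4.1 requires
assert s(x, y) == phi(x, y) + phi(y, x)   # the sum over S_2
```

Note that the definition uses a sum over $S_n$, not an average; in characteristic 0 one could divide by $n!$, but the sum makes sense over every field.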

Lemma 4.5: Suppose $(Z,\phi)$ and $(Z',\phi')$ are two solutions to 4.4. Then there exist isomorphisms $T_1\colon Z\to Z'$ and $T_2\colon Z'\to Z$ such that

(a) $T_1T_2=1_{Z'}$ and $T_2T_1=1_Z$, and
(b) the diagrams relating the two pairs commute; that is, $T_1\phi=\phi'$ and $T_2\phi'=\phi$.

Proof: This proof is identical to that of Lemma 1.11. $\square$

As the reader can see, much of what we do here is completely analogous to the case of alternating maps. For this reason, our treatment of the results for symmetric maps will be much abbreviated.

We now construct a solution to 4.4. Set $Z=V^{\otimes n}/N$, where $N$ is the subspace of $V^{\otimes n}$ generated by all vectors of the form $\alpha_1\otimes\cdots\otimes\alpha_n-\alpha_{\sigma(1)}\otimes\cdots\otimes\alpha_{\sigma(n)}$ ($\sigma\in S_n$). Let $\phi\colon V^n\to Z$ be given by $\phi(\alpha_1,\ldots,\alpha_n)=\alpha_1\otimes\cdots\otimes\alpha_n+N$. The definition of $N$ implies $\phi$ is a symmetric multilinear map.

Lemma 4.6: $(Z,\phi)$ solves 4.4.

Proof: Suppose $\psi\in\operatorname{Sym}_F(V^n,W)$. By Theorem 1.19, there exists a $T_0\in\operatorname{Hom}_F(V^{\otimes n},W)$ such that $T_0(\alpha_1\otimes\cdots\otimes\alpha_n)=\psi(\alpha_1,\ldots,\alpha_n)$. Since $\psi$ is symmetric, $T_0$ vanishes on $N$. Consequently, $T_0$ induces a linear transformation $T\colon Z\to W$ given by $T((\alpha_1\otimes\cdots\otimes\alpha_n)+N)=T_0(\alpha_1\otimes\cdots\otimes\alpha_n)$. Clearly $T\phi=\psi$. The fact that $T$ is unique follows from the same argument as for multilinear or alternating maps. $\square$

Definition 4.7: The vector space $V^{\otimes n}/N$ is called the $n$th symmetric power of $V$ and will henceforth be denoted by $S^n_F(V)$. The cosets $\alpha_1\otimes\cdots\otimes\alpha_n+N$ will be written $[\alpha_1]\cdots[\alpha_n]$.

Theorem 4.8: Let $V$ be a vector space over $F$, and suppose $\psi\in\operatorname{Sym}_F(V^n,W)$. Then there exists a unique linear transformation $T\in\operatorname{Hom}_F(S^n_F(V),W)$ such that $\psi(\alpha_1,\ldots,\alpha_n)=T([\alpha_1]\cdots[\alpha_n])$ for all $(\alpha_1,\ldots,\alpha_n)\in V^n$. $\square$

From our construction of $S^n_F(V)$, we see that the set $\{[\alpha_1]\cdots[\alpha_n]\mid(\alpha_1,\ldots,\alpha_n)\in V^n\}$ spans $S^n_F(V)$ as a vector space over $F$. The following relations are all obvious:

4.9: (a) $[\alpha_1]\cdots[\alpha_i+\alpha_i']\cdots[\alpha_n]=[\alpha_1]\cdots[\alpha_i]\cdots[\alpha_n]+[\alpha_1]\cdots[\alpha_i']\cdots[\alpha_n]$.
(b) $[\alpha_1]\cdots[x\alpha_i]\cdots[\alpha_n]=x([\alpha_1]\cdots[\alpha_i]\cdots[\alpha_n])$.
(c) $[\alpha_1]\cdots[\alpha_n]=[\alpha_{\sigma(1)}]\cdots[\alpha_{\sigma(n)}]$ for all $\sigma\in S_n$.

To construct a basis of $S^n_F(V)$, we proceed in the same spirit as Theorem 3.27. Let $B$ be any basis of $V$. Define an equivalence relation $\varepsilon$ on $B^n$ by $(\alpha_1,\ldots,\alpha_n)\ \varepsilon\ (\alpha_1',\ldots,\alpha_n')$ if and only if $(\alpha_{\sigma(1)},\ldots,\alpha_{\sigma(n)})=(\alpha_1',\ldots,\alpha_n')$ for some $\sigma\in S_n$. From each equivalence class of $\varepsilon$, pick one representative and call this set of representatives $B^*$. An argument completely analogous to that of Theorem 3.27 gives us the following theorem:

Theorem 4.10: Suppose $B$ is a basis of $V$. Form $B^*$ as described above. Then $\{[\alpha_1]\cdots[\alpha_n]\mid(\alpha_1,\ldots,\alpha_n)\in B^*\}$ is a basis of $S^n_F(V)$. $\square$

Now suppose $\dim_F(V)=N<\infty$. Let $B=\{\alpha_1,\ldots,\alpha_N\}$ be a basis of $V$. The number of elements in $B^*$ is the number of distinct monomials of degree $n$ in the symbols $[\alpha_1],\ldots,[\alpha_N]$. Thus, $|B^*|$ is equal to the number of distinct products of the form $[\alpha_1]^{e_1}\cdots[\alpha_N]^{e_N}$ with $e_1+\cdots+e_N=n$. An easy counting argument gives us precisely $\binom{N-1+n}{n}$ of these monomials. Thus, we have proven the following corollary:

Corollary 4.11: If $\dim_F(V)=N$, then $\dim_F(S^n_F(V))=\binom{N-1+n}{n}$. $\square$

Finally, a linear transformation $T\colon V\to W$ induces a map on symmetric powers as follows: Let $\phi\colon V^n\to S^n_F(V)$ be the natural symmetric map given by $\phi(\alpha_1,\ldots,\alpha_n)=[\alpha_1]\cdots[\alpha_n]$. $T$ induces a symmetric multilinear mapping $\psi_T$ from $V^n$ to $S^n_F(W)$ given by $\psi_T(\alpha_1,\ldots,\alpha_n)=[T(\alpha_1)]\cdots[T(\alpha_n)]$. By Theorem 4.8, there exists a unique linear transformation $S^n_F(T)\in\operatorname{Hom}_F(S^n_F(V),S^n_F(W))$ such that $S^n_F(T)\phi=\psi_T$. Thus, for all $(\alpha_1,\ldots,\alpha_n)\in V^n$,
$$S^n_F(T)([\alpha_1]\cdots[\alpha_n])=[T(\alpha_1)]\cdots[T(\alpha_n)].$$
If $T_2\in\operatorname{Hom}_F(V,W)$ and $T_1\in\operatorname{Hom}_F(W,Z)$, then clearly, $S^n_F(T_1T_2)=S^n_F(T_1)S^n_F(T_2)$. We also have the analogs of Theorem 3.31:

(a) If $T$ is injective, then $S^n_F(T)$ is injective.
(b) If $T$ is surjective, then $S^n_F(T)$ is surjective.

The linear transformation $S^n_F(T)$ is called the $n$th symmetric power of $T$.

EXERCISES FOR SECTION 4

(1) Complete the details of Example 4.3; that is, argue $S(\phi)$ is indeed a symmetric multilinear mapping.

(2) Suppose $V$ is a vector space over $F$. Show that the following complex is a short exact sequence:
$$0\to\Lambda^2_F(V)\xrightarrow{\ T\ }V\otimes V\xrightarrow{\ T'\ }S^2_F(V)\to 0.$$
Here $T$ is given by $T(\alpha\wedge\beta)=\alpha\otimes\beta-\beta\otimes\alpha$, and $T'$ is given by $T'(\beta\otimes\delta)=[\beta][\delta]$.

(3) Is Exercise 2 still true if 2 is replaced by $n$?
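Corollary 4.11's counting argument can be replayed directly: degree-$n$ monomials in $N$ symbols correspond to multisets of size $n$ drawn from the symbols. A short Python sketch (not from the text; the function name is mine):

```python
from itertools import combinations_with_replacement
from math import comb

def dim_sym_power(N, n):
    """Number of degree-n monomials in N symbols, i.e. dim S^n_F(V) when
    dim V = N: each monomial is a multiset of n basis symbols."""
    return sum(1 for _ in combinations_with_replacement(range(N), n))

# Check the count against the closed form C(N - 1 + n, n) of Corollary 4.11
for N in range(1, 6):
    for n in range(0, 6):
        assert dim_sym_power(N, n) == comb(N - 1 + n, n)
```

Compare with Corollary 3.28: exterior powers count subsets, symmetric powers count multisets, which is exactly the difference between $\binom{N}{n}$ and $\binom{N-1+n}{n}$.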

(4) Let $A=F[X_1,\ldots,X_n]$ denote the set of all polynomials in the variables $X_1,\ldots,X_n$ with coefficients in the field $F$. Thus, $F[X_1,\ldots,X_n]$ is the set consisting of all finite sums of the form $\sum_{(i_1,\ldots,i_n)}c_{(i_1,\ldots,i_n)}X_1^{i_1}\cdots X_n^{i_n}$ with $c_{(i_1,\ldots,i_n)}\in F$ and $(i_1,\ldots,i_n)\in(\mathbb N\cup\{0\})^n$.
(a) Show that $A$ is an infinite-dimensional vector space over $F$ with basis $B=\{X_1^{i_1}\cdots X_n^{i_n}\mid(i_1,\ldots,i_n)\in(\mathbb N\cup\{0\})^n\}$ when we define addition and scalar multiplication as follows:
$$\sum c_{(i_1,\ldots,i_n)}X_1^{i_1}\cdots X_n^{i_n}+\sum d_{(i_1,\ldots,i_n)}X_1^{i_1}\cdots X_n^{i_n}=\sum\bigl(c_{(i_1,\ldots,i_n)}+d_{(i_1,\ldots,i_n)}\bigr)X_1^{i_1}\cdots X_n^{i_n}$$
and
$$x\sum c_{(i_1,\ldots,i_n)}X_1^{i_1}\cdots X_n^{i_n}=\sum\bigl(xc_{(i_1,\ldots,i_n)}\bigr)X_1^{i_1}\cdots X_n^{i_n}.$$
(b) Suppose we define the product of two monomials $X_1^{e_1}\cdots X_n^{e_n}$ and $X_1^{r_1}\cdots X_n^{r_n}$ in $A$ by the formula $(X_1^{e_1}\cdots X_n^{e_n})(X_1^{r_1}\cdots X_n^{r_n})=X_1^{e_1+r_1}\cdots X_n^{e_n+r_n}$. Show that we can extend this definition of multiplication in a natural way to a product on $A$ such that $A$ becomes a commutative algebra over $F$, that is, $fg=gf$ for all $f,g\in A$.
(c) Let $A_p=L(\{X_1^{e_1}\cdots X_n^{e_n}\mid e_1+\cdots+e_n=p\})$. Show that $\dim_F(A_p)=\binom{n-1+p}{p}$.
(d) Show that $A$ is a graded $F$-algebra, that is, $A=\bigoplus_{p=0}^{\infty}A_p$ and $A_pA_q\subseteq A_{p+q}$ for all $p,q\ge 0$.

(5) Let $V$ be a vector space over $F$. The symmetric algebra $S(V)$ is defined to be the following direct sum: $S(V)=\bigoplus_{n=0}^{\infty}S^n_F(V)$. Here, as usual, $S^0_F(V)=F$.
(a) Show $S(V)$ is a commutative graded algebra over $F$ when we define products by the formula $([\alpha_1]\cdots[\alpha_p])([\beta_1]\cdots[\beta_q])=[\alpha_1]\cdots[\alpha_p][\beta_1]\cdots[\beta_q]$.
(b) Show that there exists a natural, injective linear transformation $T\in\operatorname{Hom}_F(V,S(V))$ such that $T(\alpha)T(\beta)=T(\beta)T(\alpha)$ for all $\alpha,\beta\in V$.

(6) Show that the pair $(S(V),T)$ constructed in Exercise 5 has the following universal mapping property: If $A$ is any $F$-algebra, and $\psi\in\operatorname{Hom}_F(V,A)$ such that $\psi(\alpha)\psi(\beta)=\psi(\beta)\psi(\alpha)$ for all $\alpha,\beta\in V$, then there exists a unique algebra homomorphism $\varphi\colon S(V)\to A$ such that $\varphi T=\psi$.
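Exercise 4(b)'s rule, that multiplying monomials adds exponent vectors, is easy to model on a machine. A small Python sketch (not from the text; the dictionary representation and the name `poly_mul` are mine):

```python
def poly_mul(f, g):
    """Multiply polynomials stored as {exponent tuple: coefficient} dicts,
    using (X^e)(X^r) = X^(e+r) extended bilinearly, as in Exercise 4(b)."""
    h = {}
    for e, c in f.items():
        for r, d in g.items():
            key = tuple(a + b for a, b in zip(e, r))  # add exponent vectors
            h[key] = h.get(key, 0) + c * d
    return {k: v for k, v in h.items() if v != 0}     # drop zero coefficients

# (X1 + X2)(X1 - X2) = X1^2 - X2^2 in F[X1, X2]
f = {(1, 0): 1, (0, 1): 1}
g = {(1, 0): 1, (0, 1): -1}
assert poly_mul(f, g) == {(2, 0): 1, (0, 2): -1}
assert poly_mul(f, g) == poly_mul(g, f)   # A is commutative, as 4(b) asserts
```

Note that exponent-vector addition preserves total degree, which is the content of the grading statement $A_pA_q\subseteq A_{p+q}$ in Exercise 4(d); Exercise 7 below identifies this algebra with $S(V)$ when $\dim_F(V)=n$.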

(7) If $\dim_F(V)=n$, show $S(V)\cong F[X_1,\ldots,X_n]$ as $F$-algebras.

(8) Let $V$ and $W$ be vector spaces over $F$. Show that $S^n_F(V\oplus W)\cong\bigoplus_{i+j=n}\{S^i_F(V)\otimes_F S^j_F(W)\}$.
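The dimension count underlying Exercise 8 can be sanity-checked with Corollary 4.11. A short Python sketch (mine, not the text's; comparing dimensions of course does not by itself prove the isomorphism):

```python
from math import comb

def dim_sym(N, n):
    """dim S^n of an N-dimensional space, per Corollary 4.11."""
    return comb(N - 1 + n, n)

# Both sides of Exercise 8 have the same dimension: a degree-n monomial in
# dV + dW variables splits uniquely by its degree i in the first dV variables.
for dV in range(1, 5):
    for dW in range(1, 5):
        for n in range(0, 6):
            lhs = dim_sym(dV + dW, n)
            rhs = sum(dim_sym(dV, i) * dim_sym(dW, n - i) for i in range(n + 1))
            assert lhs == rhs
```

The analogous dimension check for Exercise 10 of Section 3 uses the Vandermonde identity $\binom{N+M}{n}=\sum_{i+j=n}\binom{N}{i}\binom{M}{j}$.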

(9) If $V$ is a vector space over $F$ and $K$ a field containing $F$, show $S^n_F(V)\otimes_F K\cong S^n_K(V\otimes_F K)$ as $K$-vector spaces.