CONSTRUCTION OF THE TENSOR PRODUCT V ⊗ W

JAMES BROOMFIELD

Abstract. This paper presents a straightforward approach to constructing the tensor product of two vector spaces. We will prove both the existence and uniqueness of this construction. Finally, we will link it to the categorical construction.

Preliminaries

Given two vector spaces V and W (over a field K), can we give a vector space structure to V × W? It turns out the answer is yes if we define vector addition component-wise and scalar multiplication by:

c(~v, ~w) = (c~v, c~w)

If instead we try to define scalar multiplication as

(1) c(~v, ~w) = (c~v, ~w)   or   (2) c(~v, ~w) = (~v, c~w),

we would violate the vector space axioms in the following ways:

(1) (1 + 0)(~v, ~w) = 1(~v, ~w) + 0(~v, ~w) = (~v, ~w + ~w) ≠ (~v, ~w)

(2) (1 + 0)(~v, ~w) = 1(~v, ~w) + 0(~v, ~w) = (~v + ~v, ~w) ≠ (~v, ~w)
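The failure above can be checked numerically. The sketch below models V = W = R², with `smul_first` and `smul_second` standing for the two "wrong" candidate scalar multiplications from the text (the function names are choices made here, not notation from the paper):

```python
import numpy as np

# Addition on V x W is component-wise in every case.
def add(p, q):
    return (p[0] + q[0], p[1] + q[1])

def smul_first(c, p):   # candidate (1): scale only the first component
    return (c * p[0], p[1])

def smul_second(c, p):  # candidate (2): scale only the second component
    return (p[0], c * p[1])

v, w = np.array([1.0, 2.0]), np.array([3.0, 4.0])
p = (v, w)

# Distributivity demands (1 + 0) * p == 1 * p + 0 * p.
lhs1 = smul_first(1 + 0, p)
rhs1 = add(smul_first(1, p), smul_first(0, p))
# The second components disagree: lhs1 has w, rhs1 has w + w.
print(np.allclose(lhs1[1], rhs1[1]))
```

The same check with `smul_second` fails in the first component instead.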

So why should we worry about the details above? Given vector spaces V, W, and X, one is often interested in studying bilinear maps from V × W to X. As it happens, Hom(V × W, X) is not the set of bilinear maps from V × W to X, so a natural question might be:

"In the category of all vector spaces, is there a special vector space, say A, for which the bilinear maps V × W → X correspond to the set Hom(A, X)?"

The answer to this question is also affirmative, and we denote the corresponding vector space by V ⊗ W.
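To see why Hom(V × W, X) is not the set of bilinear maps, consider the dot product g(~v, ~w) = ~v · ~w on R² × R² (a standard example, not one taken from the paper): it is bilinear, but it is not a linear map on the vector space V × W with component-wise operations, since linearity would force g(2(~v, ~w)) = 2g(~v, ~w), while bilinearity gives a factor of 4:

```python
import numpy as np

v, w = np.array([1.0, 0.0]), np.array([1.0, 1.0])

# The dot product is bilinear: linear in each slot separately.
g = lambda v, w: float(np.dot(v, w))

# Scaling the pair (v, w) by 2 scales BOTH slots, so g picks up 2 * 2 = 4.
print(g(2 * v, 2 * w))  # bilinear behavior: 4 * g(v, w)
print(2 * g(v, w))      # what linearity on V x W would require
```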

1. The Tensor Product

Let us start by constructing the tensor product V ⊗ W. After completing this task, we will briefly translate our result into the language of category theory.

Theorem 1.1. Let V, W, and X be vector spaces over a field K. Then there is a unique vector space A over K and a bilinear map f : V × W → A such that for any bilinear map g : V × W → X, there is a unique linear map ḡ : A → X satisfying g = ḡ ◦ f. That is, the following diagram commutes:

[Diagram: f : V × W → A and g : V × W → X, with ḡ : A → X making g = ḡ ◦ f]

Proof: To begin, we denote by FA the free vector space on V × W over K:

FA = { a1(~v1, ~w1) + · · · + aN(~vN, ~wN) : ~vi ∈ V, ~wi ∈ W, ai ∈ K, and N ∈ ℕ }
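As a minimal computational model of FA (an illustration assumed here, not part of the paper's construction), a formal sum can be stored as a finite map from pairs to nonzero coefficients. Crucially, the pairs are opaque keys: (~v1 + ~v2, ~w) and (~v1, ~w), (~v2, ~w) are unrelated basis elements until we pass to the quotient by M.

```python
from collections import Counter

# A formal sum in FA: {basis pair -> coefficient}, zero coefficients dropped.
def free_add(x, y):
    s = Counter(x)
    s.update(y)
    return {k: c for k, c in s.items() if c != 0}

def free_scale(c, x):
    return {k: c * a for k, a in x.items()} if c != 0 else {}

x = {((1, 0), (0, 1)): 2}                       # 2 · ((1,0), (0,1))
y = {((1, 0), (0, 1)): -2, ((3, 3), (1, 1)): 1}
print(free_add(x, y))                           # the first pair cancels
```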

Next we let M be the subspace of FA generated by the following relations

For all ~v, ~v1, ~v2 ∈ V; ~w, ~w1, ~w2 ∈ W; and c ∈ K:

(1) (~v1, ~w) + (~v2, ~w) ∼ (~v1 + ~v2, ~w)

(2) (~v, ~w1) + (~v, ~w2) ∼ (~v, ~w1 + ~w2)

(3) c(~v, ~w) ∼ (c~v, ~w) ∼ (~v, c~w)

Concretely, M is the subspace of FA spanned by vectors of the form

(~v1 + ~v2, ~w) − (~v1, ~w) − (~v2, ~w)

(~v, ~w1 + ~w2) − (~v, ~w1) − (~v, ~w2)

(c~v, ~w) − c(~v, ~w)

(~v, c~w) − c(~v, ~w)
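For finite-dimensional real vector spaces there is a concrete realization (an assumption for illustration, not the abstract construction itself) that sends ~v ⊗ ~w to the outer product np.outer(v, w). Under this realization, every spanning vector of M maps to the zero matrix, i.e. the three defining relations hold on the nose:

```python
import numpy as np

rng = np.random.default_rng(1)
v1, v2, v = rng.random(3), rng.random(3), rng.random(3)
w1, w2, w = rng.random(4), rng.random(4), rng.random(4)
c = 2.5

# Each line is one spanning vector of M, evaluated via outer products.
r1 = np.outer(v1 + v2, w) - np.outer(v1, w) - np.outer(v2, w)
r2 = np.outer(v, w1 + w2) - np.outer(v, w1) - np.outer(v, w2)
r3 = np.outer(c * v, w) - c * np.outer(v, w)
r4 = np.outer(v, c * w) - c * np.outer(v, w)
print(np.allclose(r1, 0), np.allclose(r2, 0),
      np.allclose(r3, 0), np.allclose(r4, 0))
```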

Now we choose the candidate for A to be FA/M. Next, define f : V × W → A by f = π ◦ ι, where ι : V × W ↪ FA is the inclusion map and π : FA → A is the canonical projection map. Notice that the map f is bilinear by the relations above (check!).

Next, if we are given a bilinear map g : V × W → X, there exists a linear extension g̃ : FA → X, i.e.

g̃( a1(~v1, ~w1) + · · · + aN(~vN, ~wN) ) = a1 g(~v1, ~w1) + · · · + aN g(~vN, ~wN)

Since g is bilinear, it is easy to verify that M ⊆ ker g̃; this stems from the relations (1), (2), and (3) (once again, check!). A further consequence of M ⊆ ker g̃ is that g̃ induces a well-defined map ḡ : A → X given by ḡ(π(x)) = g̃(x) for x ∈ FA. Assembling these facts, we have the following diagram:

[Diagram: ι : V × W ↪ FA, π : FA → A, with g̃ : FA → X, ḡ : A → X, and g : V × W → X]

Next we can verify that ḡ is linear and g = ḡ ◦ f. The latter statement follows from the fact that f = π ◦ ι, and the former follows from the linearity of g̃ together with M ⊆ ker g̃. Further, ḡ is unique since the image of f spans the vector space A. This shows the existence of A.
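The universal property can be sketched concretely for V = R³, W = R⁴, X = R, again using the outer-product realization of ~v ⊗ ~w (an assumption of this illustration). Any bilinear g(~v, ~w) = ~vᵀ M ~w then factors as g = ḡ ◦ f, where f(v, w) = np.outer(v, w) and ḡ is the linear map T ↦ Σ M ∘ T on 3×4 matrices; the matrix M below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 4))   # determines one particular bilinear map

g = lambda v, w: float(v @ M @ w)        # bilinear on V x W
f = lambda v, w: np.outer(v, w)          # the canonical map into V ⊗ W
g_bar = lambda T: float(np.sum(M * T))   # LINEAR on V ⊗ W (matrices)

v, w = rng.standard_normal(3), rng.standard_normal(4)
print(np.isclose(g(v, w), g_bar(f(v, w))))  # the diagram commutes
```

The point of the factorization is that ḡ is genuinely linear on the tensor space, even though g itself is not linear on V × W.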

Finally, we must show the uniqueness of A. Suppose that there is another vector space A′ with an associated bilinear map f′ : V × W → A′ having the desired properties. Since f′ is bilinear, there exists a unique linear map T : A → A′ such that f′ = T ◦ f. Likewise, there is a unique linear map S : A′ → A such that f = S ◦ f′. This is summarized below:

[Diagrams: T : A → A′ with f′ = T ◦ f, and S : A′ → A with f = S ◦ f′]

From this we have

(S ◦ T) ◦ f = S ◦ (T ◦ f) = S ◦ f′ = f

(T ◦ S) ◦ f′ = T ◦ (S ◦ f′) = T ◦ f = f′

Since the identity map also satisfies id_A ◦ f = f, the uniqueness of the factoring map forces S ◦ T = id_A, and likewise T ◦ S = id_A′. Therefore T : A → A′ is an isomorphism, and A is unique up to isomorphism. The vector space that we constructed above is generally denoted V ⊗ W rather than A. □

In the language of category theory, a functor F from a locally small category C to the category of sets, S, is representable if and only if there exists an object A ∈ C such that F(X) = Hom(A, X) for all X ∈ C. In the above discussion, we showed that the functor F which takes a vector space X to the set of bilinear maps from V × W to X is representable, and the representing object A in this case is exactly V ⊗ W.

Now that we have the existence of V ⊗ W, we can ask questions about its structure. We will see in the following proposition that a basis of V ⊗ W arises naturally from bases for V and W. However, in doing so we must note that a general tensor in V ⊗ W cannot necessarily be written as the tensor product of two elements, i.e. ~v ⊗ ~w. Such tensors are called pure tensors, and they constitute a subset, but not a subspace, of V ⊗ W.
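In the outer-product realization used above (again an assumption for illustration), pure tensors ~v ⊗ ~w are exactly the matrices of rank at most 1, so a rank-2 sum of pure tensors is a tensor that is not pure:

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# e1 ⊗ e1 + e2 ⊗ e2: a sum of two pure tensors.
t = np.outer(e1, e1) + np.outer(e2, e2)

# A pure tensor v ⊗ w = outer(v, w) always has rank <= 1;
# t has rank 2, so it cannot be written as a single v ⊗ w.
print(np.linalg.matrix_rank(t))
```

This also shows why the pure tensors fail to be a subspace: they are not closed under addition.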

Proposition 1.2. If {~vi}i∈I and {~wj}j∈J are bases for the vector spaces V and W, then {~vi ⊗ ~wj}(i,j)∈I×J is a basis for V ⊗ W.

Proof. To begin, notice that π is surjective; therefore any ~z ∈ V ⊗ W can be written as π(~z′) where ~z′ ∈ FA. However, we know that ~z′ can be written as

~z′ = Σ_{k=1}^N ck (~xk, ~yk)

for some ck ∈ K, ~xk ∈ V, and ~yk ∈ W. Now since {~vi}i∈I and {~wj}j∈J are bases for V and W, we can write ~xk = Σi a(i,k) ~vi and ~yk = Σj b(j,k) ~wj, so that

~z′ = Σ_{k=1}^N ck ( Σi a(i,k) ~vi , Σj b(j,k) ~wj )

Next we use the canonical projection π to obtain

~z = π(~z′) = Σ_{k=1}^N ck · π( Σi a(i,k) ~vi , Σj b(j,k) ~wj )

= Σ_{k=1}^N ck · Σ_{i,j} a(i,k) · b(j,k) · (~vi ⊗ ~wj)

= Σ_{i,j} ( Σ_{k=1}^N ck · a(i,k) · b(j,k) ) · (~vi ⊗ ~wj)

This shows that {~vi ⊗ ~wj}(i,j)∈I×J spans V ⊗ W.
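The expansion above can be made concrete in the matrix realization (an illustration, with V = R², W = R³ chosen here): the coefficient of ~ei ⊗ ~ej in a tensor t is simply the matrix entry t[i, j].

```python
import numpy as np

t = np.arange(6.0).reshape(2, 3)   # an arbitrary tensor in V ⊗ W

# Rebuild t from the basis {e_i ⊗ e_j}, using entries of t as coefficients.
expansion = sum(t[i, j] * np.outer(np.eye(2)[i], np.eye(3)[j])
                for i in range(2) for j in range(3))
print(np.allclose(expansion, t))   # the basis spans
```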

Now we will show that {~vi ⊗ ~wj}(i,j)∈I×J is linearly independent. To begin, suppose that

Σ_{i,j} a(i,j) · ~vi ⊗ ~wj = 0

Then we see that

Σ_{i,j} a(i,j) · ~vi ⊗ ~wj = Σj ( Σi a(i,j) · ~vi ) ⊗ ~wj.

Since the ~wj are linearly independent, each term of the sum over j must be zero. Thus for each j, we have

( Σi a(i,j) · ~vi ) ⊗ ~wj = 0

Since ~wj ≠ 0, this forces

Σi a(i,j) · ~vi = 0

and since the ~vi form a basis, this can only happen if a(i,j) = 0 for all (i, j) ∈ I × J. Thus {~vi ⊗ ~wj}(i,j)∈I×J is linearly independent, which completes the proof. □
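Linear independence can also be verified numerically in the matrix realization (an illustration with dim V = 2, dim W = 3): flattening each basis tensor ~ei ⊗ ~ej to a vector and stacking them gives a matrix of full rank, confirming dim(V ⊗ W) = dim V · dim W = 6.

```python
import numpy as np

# Flatten each e_i ⊗ e_j (a 2x3 matrix) into a length-6 vector.
vecs = [np.outer(np.eye(2)[i], np.eye(3)[j]).ravel()
        for i in range(2) for j in range(3)]

# Full rank 6 means no nontrivial linear combination vanishes.
print(np.linalg.matrix_rank(np.stack(vecs)))
```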