INTRODUCTION TO GROUP REPRESENTATIONS, JULY 1, 2012

LINEAR ALGEBRA REVIEW 3

Here we describe some of the main linear algebra constructions used in representation theory. Since we are working with finite-dimensional vector spaces, a choice of basis identifies each new space with some $\mathbb{C}^n$. What makes these constructions interesting is how they behave with respect to linear transformations, which in turn gives ways to construct new representations from old ones.

Let $V$ and $W$ be finite-dimensional vector spaces over $\mathbb{C}$, and choose bases $B = \{v_1, \dots, v_m\}$ for $V$ and $C = \{w_1, \dots, w_n\}$ for $W$.

Direct Sum: There are two types of direct sums, internal and external. The internal direct sum expresses a given vector space $V'$ in terms of two (or more) subspaces $V$ and $W$.

Definition: We say $V'$ is the internal direct sum of subspaces $V$ and $W$, written $V' = V + W$, if each vector in $V'$ can be written uniquely as a sum $v' = v + w$ with $v$ in $V$ and $w$ in $W$.

Equivalently, this condition holds if and only if $B \cup C$ is a basis for $V'$. In turn, we also have this condition if $\dim(V) + \dim(W) = \dim(V')$ and $V \cap W = \{0\}$.

In the special case where $V'$ admits an inner product and $W = V^{\perp}$, we call $V' = V \oplus W$ an orthogonal direct sum. If $B$ and $C$ are orthonormal bases, then $B \cup C$ is an orthonormal basis for $V'$.

On the other hand, the main idea here can be applied to combine two known vector spaces.

Definition: The external direct sum of $V$ and $W$, also written $V' = V \oplus W$, is first defined as the set of all ordered pairs $(v, w)$ with $v$ in $V$ and $w$ in $W$. Scalar multiplication is defined by $c(v, w) = (cv, cw)$, and addition is defined by $(v, w) + (v', w') = (v + v', w + w')$. One checks the other axioms for a vector space.

Note that the external direct sum of $V$ and $W$ can be expressed as the internal direct sum of $(V, 0)$ and $(0, W)$. A basis for $V \oplus W$ is given by $\{(v_i, 0)\} \cup \{(0, w_j)\}$.

If $S : V \to V'$ and $T : W \to W'$ are linear transformations, we obtain a linear transformation $(S, T) : V \oplus W \to$
$V' \oplus W'$ by $(S, T)(v, w) = (Sv, Tw)$. The reader should verify that $(S, T)$ is linear. This definition allows a natural method for constructing direct sums of representations.

Dual Vector Spaces:

Definition: A linear functional on $V$ is a linear transformation $f : V \to \mathbb{C}$, and the vector space of all linear functionals on $V$ is called the dual vector space $V^*$.

If $V$ also admits an inner product, then one may describe each linear functional $f$ uniquely in the form $f_v(v') = \langle v', v \rangle$ for some $v$ in $V$. In fact, one may choose any nonzero $v$ in $(\mathrm{Ker}\, f)^{\perp}$ and rescale to match values with $f$. In turn, $V^*$ admits an inner product defined by $\langle f_v, f_w \rangle_* = \langle w, v \rangle = \overline{\langle v, w \rangle}$.

If $T : V \to W$ is a linear transformation, then we have an induced linear transformation $T^* : W^* \to V^*$ defined by $(T^* w^*)(v) = w^*(Tv)$. If $V$ and $W$ admit Hermitian inner products, then $(T^* f_w)(v) = f_w(Tv) = \langle Tv, w \rangle = \langle v, T^* w \rangle$, where the latter $T^*$ denotes the adjoint of $T$ with respect to the inner products. If we are working in coordinate spaces $\mathbb{C}^k$, then $T^*$ is given by the conjugate transpose of matrices: $w^* T v = w^* (T^*)^* v = (T^* w)^* v$.

Suppose $S : W \to X$, so that $ST : V \to X$. If $f$ is in $X^*$, then
$$[(ST)^* f](v) = f(STv) = (S^* f)(Tv) = [T^* (S^* f)](v).$$
Thus $(ST)^* = T^* S^*$. For representations, if $\pi^*$ is to be a group action on $V^*$, we need $\pi^*(gh) = \pi^*(g)\pi^*(h)$. We correct the order using inverses.

Definition: The dual basis $B^*$ for $V^*$ with respect to $B$ is the set $\{v_1^*, \dots, v_m^*\}$, where each $v_i^*$ is defined by $v_i^*(v_i) = 1$, and $v_j^*(v_i) = 0$ otherwise.

If we have an inner product space with orthonormal basis $B$, then the corresponding dual basis has vectors $v_i^* = \langle \cdot\,, v_i \rangle$. Note that $\dim(V^*) = \dim(V)$. If $T : V \to$
$W$ is an invertible linear transformation, then $\{Tv_1, \dots, Tv_m\}$ is a basis for $W$ with corresponding dual basis $\{(Tv_1)^*, \dots, (Tv_m)^*\}$, where $(Tv_i)^*(w) = v_i^*(T^{-1} w)$. Note that
$$(Tv_i)^*(Tv_j) = v_i^*(T^{-1} T v_j) = v_i^*(v_j),$$
confirming the dual basis property.

One way to interpret this: suppose we wish to replace $V$ with an isomorphic vector space $W$. To convert from linear functionals on $V$ to linear functionals on $W$, we send $v^*$ to $(T^{-1})^* v^*$. For representations, $W = V$ and $T = \pi(g)$. That is, we define the group action by $[\pi^*(g) v^*](v') = v^*(\pi(g)^{-1} v')$.

Another way to interpret this: a group action expresses symmetries of an object $X$; that is, the group action leaves some quality of $X$ unchanged. If $T : V \to V$ is an isomorphism, then the basis $B = \{v_i\}$ is carried to the basis $BT = \{Tv_i\}$. Note that the definition of $v^*(v')$ requires no basis; this quantity remains unchanged no matter how we pass to coordinates. So if we change $V$ by an isomorphism $T$, interpreted as a change of basis, the dual basis changes by $(T^{-1})^*$, and $v^*(v')$ is unchanged. For representations, we again arrive at the group action on $V^*$.

Example: Consider the linear functional $f : \mathbb{C}^2 \to \mathbb{C}$ defined by $f(x, y) = x - y$. Then
$$f(x, y) = \begin{bmatrix} 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}.$$
If $T : \mathbb{C}^3 \to \mathbb{C}^2$ is represented by the matrix
$$A = \begin{bmatrix} 1 & 1 & -1 \\ 2 & 0 & 3 \end{bmatrix},$$
then
$$(T^* f)(x, y, z) = \begin{bmatrix} 1 & -1 \end{bmatrix} A \begin{bmatrix} x \\ y \\ z \end{bmatrix} = -x + y - 4z.$$

Linear Transformation Spaces:

Definition: The set of all linear transformations $T : V \to W$ is denoted by $\mathrm{Hom}_{\mathbb{C}}(V, W)$. As a vector space, we define scalar multiplication by $(cT)(v) = c(T(v))$ and addition by $(S + T)(v) = (Sv) + (Tv)$. One verifies the other axioms for a vector space.
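The dual-map example above can be checked numerically: in coordinates, $T^* f$ is just the row vector for $f$ multiplied by the matrix $A$. Here is a minimal sketch in plain Python (the helper name `row_times_matrix` is ours, not from the text):

```python
# Numeric check of the dual-map (pullback) example:
# f(x, y) = x - y on C^2 is the row vector [1, -1]; T : C^3 -> C^2 has
# matrix A, and the row vector of T*f is [1, -1] times A.

def row_times_matrix(row, matrix):
    """Multiply a row vector by a matrix (list of rows), giving a row vector."""
    return [sum(row[i] * matrix[i][j] for i in range(len(row)))
            for j in range(len(matrix[0]))]

A = [[1, 1, -1],
     [2, 0,  3]]
f = [1, -1]

pullback = row_times_matrix(f, A)
print(pullback)  # [-1, 1, -4], i.e. (T*f)(x, y, z) = -x + y - 4z
```

The coefficients agree with the computation in the text.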
With respect to the choice of bases $B$ and $C$, we can identify each $T$ with an $n \times m$ matrix $M_T$ such that $[Tv]_C = M_T [v]_B$. This identification yields an isomorphism of $\mathrm{Hom}_{\mathbb{C}}(V, W)$ with the matrix vector space $M(n, m, \mathbb{C})$.

In turn, $\mathrm{Hom}_{\mathbb{C}}(V, W)$ admits a Hermitian inner product defined by $\langle T_1, T_2 \rangle = \mathrm{Trace}(T_1 T_2^*)$. With a choice of orthonormal bases for $V$ and $W$, the norm squared of $T_1$ with associated matrix $M_{T_1} = [c_{i,j}]$ equals $\sum |c_{i,j}|^2$.

On the other hand, each linear functional is a linear transformation into $\mathbb{C}$, and we have seen how to identify $V^*$ with $M(1, m, \mathbb{C})$, the space of row vectors. Now every element of $\mathrm{Hom}_{\mathbb{C}}(V, W)$ may be written uniquely in the form
$$T(v) = \sum_{i,j} c_{i,j} \, v_j^*(v) \, w_i.$$
Again $T$ is identified with the $n \times m$ matrix $[c_{i,j}]$. Thus we see that $\{T_{i,j}\}$ is a basis for $\mathrm{Hom}_{\mathbb{C}}(V, W)$, where $T_{i,j}(v) = v_j^*(v) w_i$, and $\dim(\mathrm{Hom}_{\mathbb{C}}(V, W)) = mn$.

Now suppose $T_1 : V' \to V$ and $T_2 : W \to W'$ are linear transformations. Then $T_2 T T_1$ is an element of $\mathrm{Hom}_{\mathbb{C}}(V', W')$, and, after choosing bases, the associated matrix is $M_{T_2} M_T M_{T_1}$.

Suppose we wish to replace $V$ and $W$ with isomorphic vector spaces $V'$ and $W'$ using $T_1 : V \to V'$ and $T_2 : W \to W'$. Then we replace the element $T$ in $\mathrm{Hom}_{\mathbb{C}}(V, W)$ with $T_2 T T_1^{-1}$ in $\mathrm{Hom}_{\mathbb{C}}(V', W')$. For representations, $V' = V$, $W' = W$, $T_1 = \pi(g)$ acts on $V$, and $T_2 = \pi'(g)$ acts on $W$, to give $\sigma(g) T = \pi'(g) T \pi(g)^{-1}$. For the dual space $V^*$, we use the trivial action on $W = \mathbb{C}$, so $\pi' = I$.

Example: Let $T : \mathbb{C}^3 \to \mathbb{C}^2$ be given by $T(x, y, z) = (x + y - z, \, 2x + 3y)$. Suppose $T_1 : \mathbb{C}^3 \to \mathbb{C}^3$ is given by $T_1(x, y, z) = (3x + y + z, \, 2x + y, \, -y + 2z)$ and $T_2 : \mathbb{C}^2 \to \mathbb{C}^2$ is given by $T_2(x, y) = (x + y, \, 2x - 3y)$. Then
$$T_2 T T_1(x, y, z) = \begin{bmatrix} 1 & 1 \\ 2 & -3 \end{bmatrix} \begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 0 \end{bmatrix} \begin{bmatrix} 3 & 1 & 1 \\ 2 & 1 & 0 \\ 0 & -1 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = (17x + 8y + z, \, -26x - 9y - 8z).$$

Tensor Products: If the direct sum is thought of as a means to add vector spaces, then tensoring is the way we multiply vector spaces.
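Before turning to the details of tensor products, the composition example above can be verified by multiplying the three matrices directly. A minimal pure-Python sketch (the helper `matmul` is ours):

```python
# Check of the composition example: the matrix of T2 T T1 is M_T2 M_T M_T1.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

M_T2 = [[1,  1],
        [2, -3]]        # T2(x, y) = (x + y, 2x - 3y)
M_T  = [[1, 1, -1],
        [2, 3,  0]]     # T(x, y, z) = (x + y - z, 2x + 3y)
M_T1 = [[3,  1, 1],
        [2,  1, 0],
        [0, -1, 2]]     # T1(x, y, z) = (3x + y + z, 2x + y, -y + 2z)

composite = matmul(M_T2, matmul(M_T, M_T1))
print(composite)  # [[17, 8, 1], [-26, -9, -8]]
```

The rows of `composite` are the coefficients of the two output coordinates of $T_2 T T_1$.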
We will take a somewhat naive approach here, as our focus is working with concrete examples of representations. In general, tensors are defined in terms of universal mapping properties without reference to coordinates. Tensors are the natural objects for moving from linearity to multilinearity.

Definition: With our choices of bases $B$ and $C$ for $V$ and $W$, we form the set $B \otimes C = \{v_i \otimes w_j\}$. The set $V \otimes W$ is defined as the set of all formal linear combinations of elements in $B \otimes C$. That is, a tensor in $V \otimes W$, the tensor product of $V$ and $W$, is an element
$$t = \sum_{i,j} c_{i,j} \,(v_i \otimes w_j),$$
and the vector space of tensors has dimension $mn$.

For any $v$ in $V$ and $w$ in $W$, the monomial tensor $v \otimes w$ is defined in terms of the basis by applying the following relations:

(1) $c(v \otimes w) = (cv) \otimes w = v \otimes (cw)$,
(2) $(v + v') \otimes w = v \otimes w + v' \otimes w$, and
(3) $v \otimes (w + w') = v \otimes w + v \otimes w'$.

Often it is sufficient to work with monomials and check that results extend linearly to general tensors.
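In coordinates, a monomial tensor $v \otimes w$ is the vector of all pairwise products $v_i w_j$ (the Kronecker product of the coordinate vectors), and relations (1)-(3) become identities of these coordinate vectors. A minimal sketch, with helper names of our own choosing:

```python
# Coordinate model of the tensor product: v ⊗ w has coordinates given by
# all pairwise products v_i * w_j, so dim(V ⊗ W) = mn.

def tensor(v, w):
    """Monomial tensor v ⊗ w in coordinates: all pairwise products."""
    return [vi * wj for vi in v for wj in w]

def add(s, t):
    return [a + b for a, b in zip(s, t)]

def scale(c, t):
    return [c * x for x in t]

v, v2 = [1, 2], [0, 5]   # vectors in C^2 (m = 2)
w = [3, -1, 4]           # a vector in C^3 (n = 3)

# dim(V ⊗ W) = mn
assert len(tensor(v, w)) == len(v) * len(w)
# relation (1): c(v ⊗ w) = (cv) ⊗ w = v ⊗ (cw)
assert scale(7, tensor(v, w)) == tensor(scale(7, v), w) == tensor(v, scale(7, w))
# relation (2): (v + v') ⊗ w = (v ⊗ w) + (v' ⊗ w)
assert tensor(add(v, v2), w) == add(tensor(v, w), tensor(v2, w))

print(tensor(v, w))  # [3, -1, 4, 6, -2, 8]
```

Relation (3) follows by the symmetric computation in the second factor. Note that not every tensor is a monomial: a general element of $V \otimes W$ is a sum of monomials, as in the definition above.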