
3 The characters of GL(V)

All our vector spaces and linear maps are over $\mathbf{C}$. If $V$ is a vector space, let $\mathrm{GL}(V)$ denote the group of bijective linear maps $V \to V$.

A linear representation of $\mathrm{GL}(V)$ is a vector space $W$ together with a homomorphism $\mathrm{GL}(V) \to \mathrm{GL}(W)$. The dimension of such a representation is $\dim W$. From now on we assume that all representations mentioned are finite-dimensional.

A representation $\varphi\colon \mathrm{GL}(V) \to \mathrm{GL}(W)$ is called polynomial, respectively rational, if for one (hence any) choice of bases of $V$ and $W$, the entries of the matrix $\varphi(A)$ are polynomials, respectively rational functions, in the entries of $A$. We say that it is homogeneous of degree $m$ if $\varphi(tA) = t^m \varphi(A)$ for all $t \in \mathbf{C}^* = \mathbf{C} \smallsetminus \{0\}$.

Sometimes we identify $\mathrm{GL}(V)$ with $\mathrm{GL}(n, \mathbf{C})$ where $n = \dim V$.

Example 26. Define $\varphi\colon \mathrm{GL}(2, \mathbf{C}) \to \mathrm{GL}(3, \mathbf{C})$ by
\[
\varphi\begin{pmatrix} a & b \\ c & d \end{pmatrix}
= \begin{pmatrix} a^2 & 2ab & b^2 \\ ac & ad + bc & bd \\ c^2 & 2cd & d^2 \end{pmatrix}.
\]
You can check directly that $\varphi$ is a group homomorphism (but you don't learn much if you do). The entries of the matrix on the right-hand side are homogeneous polynomials of degree 2 in $a, b, c, d$, and so $\varphi$ is a homogeneous polynomial representation of dimension 3 and degree 2.

Examples 27. Here are some simple examples of representations illustrating the terms defined above. In all these examples we take $A \in \mathrm{GL}(n, \mathbf{C})$.

(a) $\varphi(A) = A$: the defining representation. This is a homogeneous polynomial representation of dimension $n$ and degree 1.

(b) $\varphi(A) = (\det A)^m$ where $m \in \mathbf{Z}$. This is a rational homogeneous representation of dimension 1 and degree $mn$. It is polynomial if and only if $m \geq 0$.

(c) $\varphi(A) = |\det A|^{\sqrt{2}}$. Not a rational representation and not homogeneous.

(d) $\varphi(A) = A^{-1}$. Not a representation.

(e) $\varphi(A) = (\det A)^m (A^{-1})^t$ where $t$ denotes transpose. A homogeneous rational representation of dimension $n$ and degree $mn - 1$. It is polynomial if and only if $m \geq 1$.

(f) $\varphi(A) = \begin{pmatrix} 1 & \log|\det A| \\ 0 & 1 \end{pmatrix}$. A representation that isn't homogeneous or rational, but it is continuous.

(g) $\varphi(A) = (\sigma(a_{ij}))$, where to each entry $a_{ij}$ of $A$ we apply a field automorphism $\sigma$ of $\mathbf{C}$ which is not the identity or complex conjugation (so $\sigma$ is necessarily discontinuous). This representation is not rational and not continuous.

Exercise 28. Let $V^*$ be the dual of a finite-dimensional vector space $V$ and let $\langle\cdot,\cdot\rangle\colon V^* \times V \to \mathbf{C}$ be the pairing. We assume that $\mathrm{GL}(V)$ acts on $V$ from the left.

(a) Prove that there exists a $\mathrm{GL}(V)$-action on $V^*$ from the left, defined by $\langle gx, y\rangle = \langle x, g^{-1}y\rangle$ for all $(x, y) \in V^* \times V$, $g \in \mathrm{GL}(V)$. Prove that it is rational but not polynomial.

(b) Prove that there exists a $\mathrm{GL}(V)$-action on $V^*$ from the right, defined by $\langle xg, y\rangle = \langle x, gy\rangle$. The concept of polynomial representation considers actions from the left only, and therefore the question whether it is polynomial doesn't make sense.
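Both properties of Example 26 (that $\varphi$ is a homomorphism and that it is homogeneous of degree 2), as well as the matrix $(A^{-1})^t$ of the left action on $V^*$ with respect to the dual basis in Exercise 28(a), can be confirmed by a short symbolic computation. The following SymPy sketch is one illustrative way to do this for $n = 2$; the code and the name phi in it are mine, not part of the notes.

\begin{verbatim}
import sympy as sp

a, b, c, d, p, q, r, s, t = sp.symbols('a b c d p q r s t')

def phi(M):
    # The map of Example 26, written out entry by entry.
    a, b, c, d = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    return sp.Matrix([[a**2, 2*a*b, b**2],
                      [a*c, a*d + b*c, b*d],
                      [c**2, 2*c*d, d**2]])

A = sp.Matrix([[a, b], [c, d]])
B = sp.Matrix([[p, q], [r, s]])

# phi is a group homomorphism: phi(A B) = phi(A) phi(B).
assert (phi(A * B) - phi(A) * phi(B)).expand() == sp.zeros(3, 3)

# phi is homogeneous of degree 2: phi(t A) = t^2 phi(A).
assert (phi(t * A) - t**2 * phi(A)).expand() == sp.zeros(3, 3)

# Exercise 28(a): the left action on V* has matrix (A^{-1})^t; its
# entries have denominator det A = a*d - b*c, so the representation
# is rational but not polynomial.
print(sp.simplify(A.inv().T))
\end{verbatim}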
Proposition 29. Let $V, W$ be finite-dimensional complex vector spaces and write $n = \dim V$, $d = \dim W$. Let $\varphi\colon \mathrm{GL}(V) \to \mathrm{GL}(W)$ be a rational representation. Then there exists a $d$-tuple $(f_1, \ldots, f_d)$ of Laurent monomials in $n$ variables $x_1, \ldots, x_n$ such that for all $A \in \mathrm{GL}(V)$, if $A$ has eigenvalues $\theta_1, \ldots, \theta_n$ then the eigenvalues of $\varphi(A)$ are $f_i(\theta_1, \ldots, \theta_n)$ for $1 \leq i \leq d$. The $f_i$ are unique up to reordering.

Proof. For $A \in \mathrm{End}(W)$ and $\lambda \in \mathbf{C}$, define $\ker(A - \lambda)^d$ to be the eigenspace of $A$ at $\lambda$. Then $W$ is a direct sum of the eigenspaces of $A$.

Fix a basis for $V$ and let $T \subset \mathrm{GL}(V)$ denote the subgroup of diagonal elements. Let $W = W_1 \oplus \cdots \oplus W_p$ be such that each $W_i$ is $T$-invariant and nonzero, and $p$ is maximal under these conditions. We claim:
\[
\text{For all } t \in T, \text{ all eigenvalues of the restriction } \varphi(t)|_{W_i} \text{ are equal.} \tag{30}
\]
Suppose not. We may suppose it is false for $i = 1$ and some $t \in T$. Let $W_1 = X_1 \oplus \cdots \oplus X_q$ be the decomposition of $W_1$ into nonzero eigenspaces of $\varphi(t)$. So $q \geq 2$. Since $T$ is abelian, each $X_j$ is $T$-invariant. Then
\[
W = \bigoplus_{j=1}^{q} X_j \oplus \bigoplus_{i=2}^{p} W_i
\]
is a direct decomposition into $p + q - 1 > p$ nonzero $T$-invariant subspaces. This is a contradiction and proves (30).

Next we prove:
\[
\text{For all } t \in T \text{ and all } i, \text{ the restriction } \varphi(t)|_{W_i} \text{ is scalar.} \tag{31}
\]
If not, then there exist $t \in T$ and linearly independent $u, v \in W_i$ for some $i$ such that $\varphi(t)$ preserves the span of $\{u, v\}$ and its matrix with respect to $(u, v)$ is
\[
\lambda \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}
\]
for some $\lambda \in \mathbf{C}$. For all $r \in \mathbf{Z}$ then, the matrix with respect to $(u, v)$ of $\varphi(t^r)$ is
\[
\lambda^r \begin{pmatrix} 1 & r \\ 0 & 1 \end{pmatrix}.
\]
This contradicts our assumption that $\varphi$ is rational (each entry of $\varphi(t^r)$ is the value of a fixed rational function at the entries $\theta_1^r, \ldots, \theta_n^r$ of $t^r$, and the function $r \mapsto r\lambda^r$ is not of this form) and thus proves (31).

We conclude that there exists a basis of $W$ with respect to which all elements of $\varphi(T)$ are diagonal. That is, there are functions $f_i\colon T \to \mathbf{C}^*$ ($1 \leq i \leq d$) such that
\[
\varphi(\mathrm{diag}(\theta_1, \ldots, \theta_n)) = \mathrm{diag}(f_1, \ldots, f_d)
\]
for all $\mathrm{diag}(\theta_1, \ldots, \theta_n) \in T$. We also know that each $f_i$ is given by a rational function, and it is multiplicative; it follows that it is a Laurent monomial in $n$ indeterminates $x_1, \ldots, x_n$.

Let $B$ be the set of elements $A \in \mathrm{GL}(V)$ such that the eigenvalues of $\varphi(A)$ are $f_i(\theta_1, \ldots, \theta_n)$ ($1 \leq i \leq d$), where $\theta_1, \ldots, \theta_n$ are the eigenvalues of $A$. Then $B$ contains all diagonal invertible elements, and since the defining condition is invariant under conjugation, it contains all diagonalisable elements of $\mathrm{GL}(V)$. These are dense in $\mathrm{GL}(V)$, and $B$ is closed in $\mathrm{GL}(V)$, hence $B$ is all of $\mathrm{GL}(V)$.

Definition 32. In the notation of Proposition 29, the character of the representation $\varphi$ is defined to be $\operatorname{char}\varphi = f_1 + \cdots + f_d$. This is a symmetric Laurent polynomial in $n$ variables.

Example 33 (Symmetric powers). Let $V$ be a finite-dimensional vector space. The symmetric group $S_k$ acts on $V^{\otimes k}$ by permuting the factors. The $k$-th symmetric power $\mathrm{Sym}^k V$ is defined to be the subspace of $V^{\otimes k}$ of $S_k$-invariant vectors. For $v_1, \ldots, v_k \in V$ we write
\[
[v_1, \ldots, v_k] := \frac{1}{k!} \sum_{w \in S_k} v_{w(1)} \otimes \cdots \otimes v_{w(k)} \in V^{\otimes k}.
\]
Then $[v_1, \ldots, v_k] \in \mathrm{Sym}^k V$.

Suppose henceforth that $e_1, \ldots, e_n$ is a basis of $V$. Then
\[
B := \bigl\{ [e_{i_1}, \ldots, e_{i_k}] : 1 \leq i_1 \leq \cdots \leq i_k \leq n \bigr\}
\]
is a basis for $\mathrm{Sym}^k V$. We define
\[
[v_1, \ldots, v_k][w_1, \ldots, w_\ell] := [v_1, \ldots, v_k, w_1, \ldots, w_\ell]. \tag{34}
\]
It is easy to check that this is well-defined. This makes the symmetric algebra $\mathrm{Sym}\, V := \bigoplus_{k \geq 0} \mathrm{Sym}^k V$ into a graded associative commutative algebra. It is naturally isomorphic to the algebra of polynomials $\mathbf{C}[t_1, \ldots, t_n]$ where $t_1, \ldots, t_n$ is a basis of the dual of $V$. In line with (34), we usually write $v_1 \cdots v_k$ instead of $[v_1, \ldots, v_k]$.

Let $A \in \mathrm{End}(V)$ be such that $Ae_i = \theta_i e_i$ for some $\theta_i \in \mathbf{C}$. Then every basis vector $e_{i_1} \cdots e_{i_k} \in B$ is an eigenvector of $A$ with eigenvalue $\theta_{i_1} \cdots \theta_{i_k}$. The trace of $A$ on $\mathrm{Sym}^k V$ is therefore the complete symmetric function $h_k(\theta)$, and the character of $\mathrm{Sym}^k V$ (as a $\mathrm{GL}(V)$-module) is $h_k(x_1, \ldots, x_n)$.

More common is to define $\mathrm{Sym}^k V$ as a quotient of $V^{\otimes k}$ rather than a subspace. We have chosen the present construction because it fits better into Theorem 38 below.
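For instance, take $n = 2$ and $k = 2$. The basis of $\mathrm{Sym}^2 \mathbf{C}^2$ is $\{e_1^2,\ e_1e_2,\ e_2^2\}$, and $A = \mathrm{diag}(\theta_1, \theta_2)$ acts on these vectors with eigenvalues $\theta_1^2$, $\theta_1\theta_2$, $\theta_2^2$, so
\[
\operatorname{char} \mathrm{Sym}^2 \mathbf{C}^2 = x_1^2 + x_1 x_2 + x_2^2 = h_2(x_1, x_2).
\]
Indeed, Example 26 is $\mathrm{Sym}^2$ of the defining representation of $\mathrm{GL}(2, \mathbf{C})$, written in a suitable basis: for $A = \mathrm{diag}(a, d)$ the matrix given there is $\mathrm{diag}(a^2, ad, d^2)$.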
Example 35 (Alternating powers). This is rather analogous to symmetric powers. Let $V$ be a finite-dimensional vector space. Then $S_k$ acts on $V^{\otimes k}$. The $k$-th alternating power of $V$ is the subspace $\wedge^k V \subset V^{\otimes k}$ of vectors $v$ such that $s(v) = \varepsilon_s v$ for all $s \in S_k$. Instead of alternating, some say anti-symmetric or skew-symmetric.

For $v_1, \ldots, v_k \in V$ we write
\[
\langle v_1, \ldots, v_k \rangle := \frac{1}{k!} \sum_{w \in S_k} \varepsilon_w\, v_{w(1)} \otimes \cdots \otimes v_{w(k)} \in V^{\otimes k}.
\]
Then $\langle v_1, \ldots, v_k \rangle \in \wedge^k V$.

Suppose from now on that $e_1, \ldots, e_n$ is a basis of $V$. Then
\[
B := \bigl\{ \langle e_{i_1}, \ldots, e_{i_k} \rangle : 1 \leq i_1 < \cdots < i_k \leq n \bigr\}
\]
is a basis for $\wedge^k V$. We define
\[
\langle v_1, \ldots, v_k \rangle \wedge \langle w_1, \ldots, w_\ell \rangle := \langle v_1, \ldots, v_k, w_1, \ldots, w_\ell \rangle.
\]
It is easy to check that this is well-defined. This makes the alternating algebra $\wedge V := \bigoplus_{k \geq 0} \wedge^k V$ into a graded associative algebra. It is super-commutative in the sense that
\[
w \wedge v = (-1)^{k\ell}\, v \wedge w \quad \text{for all } v \in \wedge^k V,\ w \in \wedge^\ell V.
\]
We usually write $v_1 \wedge \cdots \wedge v_k$ instead of $\langle v_1, \ldots, v_k \rangle$.

Let $A \in \mathrm{End}(V)$ be such that $Ae_i = \theta_i e_i$ with $\theta_i \in \mathbf{C}$. Then every basis vector $e_{i_1} \wedge \cdots \wedge e_{i_k} \in B$ is an eigenvector of $A$ whose eigenvalue is $\theta_{i_1} \cdots \theta_{i_k}$. The trace of $A$ on $\wedge^k V$ is therefore the elementary symmetric function $e_k(\theta)$, and the character of $\wedge^k V$ (as a $\mathrm{GL}(V)$-module) is $e_k(x_1, \ldots, x_n)$.
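For instance, with $n = 3$ and $k = 2$ the basis of $\wedge^2 \mathbf{C}^3$ is $\{e_1 \wedge e_2,\ e_1 \wedge e_3,\ e_2 \wedge e_3\}$, so
\[
\operatorname{char} \wedge^2 \mathbf{C}^3 = x_1 x_2 + x_1 x_3 + x_2 x_3 = e_2(x_1, x_2, x_3).
\]
For $k = n$ the space $\wedge^n V$ is one-dimensional with character $e_n = x_1 \cdots x_n$; this is the determinant representation of Example 27(b) with $m = 1$.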
Definition 36. Let $V$ be a finite-dimensional vector space.