Introduction to Orthogonal Polynomials
Introduction to orthogonal polynomials
Michael Anshelevich
November 6, 2003

Let $\mu$ be a probability measure on $\mathbb{R}$ with finite moments

$$m_n(\mu) = \int_{\mathbb{R}} x^n \, d\mu(x) < \infty.$$

It induces a functional on the polynomials $\mathbb{C}[x]$,

$$\varphi_\mu[P(x)] = \int_{\mathbb{R}} P(x) \, d\mu(x).$$

On the polynomials $\mathbb{C}[x]$, define the sesquilinear inner product

$$\langle x^n, x^k \rangle_\mu = \varphi_\mu\left[x^{n+k}\right] = m_{n+k}(\mu).$$

The set $\{x^n\}_{n=0}^{\infty}$ is a basis for $\mathbb{C}[x]$. Applying Gram-Schmidt with respect to the inner product $\langle \cdot, \cdot \rangle_\mu$, we get a family of polynomials $\{P_n\}_{n=0}^{\infty}$. Note that they have real coefficients. They are orthogonal with respect to $\mu$:

$$\varphi_\mu[P_n P_k] = \int_{\mathbb{R}} P_n(x) P_k(x) \, d\mu(x) = 0 \quad \text{if } n \neq k.$$

Hermite polynomials: normal (Gaussian) distribution

$$\frac{1}{\sqrt{2\pi t}} \, e^{-x^2/2t} \, dx.$$

[plot: Gaussian density]

Laguerre polynomials: Gamma distribution

$$\frac{1}{\Gamma(t)} \, e^{-x} x^{t-1} \, \mathbf{1}_{[0,\infty)}(x) \, dx.$$

[plots: Gamma densities]

Jacobi polynomials:

$$\frac{1}{Z(\alpha, \beta)} \, (1-x)^{\alpha} (1+x)^{\beta} \, \mathbf{1}_{[-1,1]}(x) \, dx.$$

In particular:

Ultraspherical (Gegenbauer), $\alpha = \beta = \lambda - \frac{1}{2}$:

$$\frac{1}{Z(\lambda)} \, (1-x^2)^{\lambda - 1/2} \, \mathbf{1}_{[-1,1]}(x) \, dx;$$

Chebyshev of the 1st kind, $\alpha = \beta = -\frac{1}{2}$, $\lambda = 0$:

$$\frac{1}{\pi} \, \frac{1}{\sqrt{1-x^2}} \, \mathbf{1}_{[-1,1]}(x) \, dx;$$

[plot: Chebyshev weight of the 1st kind]

Chebyshev of the 2nd kind, $\alpha = \beta = \frac{1}{2}$, $\lambda = 1$:

$$\frac{2}{\pi} \, \sqrt{1-x^2} \, \mathbf{1}_{[-1,1]}(x) \, dx;$$

[plot: Chebyshev weight of the 2nd kind]

Kravchuk polynomials: binomial distribution, for $q = 1 - p$,

$$\sum_{k=0}^{N} \binom{N}{k} p^k q^{N-k} \, \delta_{2k-N}(x).$$

[plot: binomial weights]

Charlier polynomials: Poisson distribution

$$e^{-t} \sum_{k=0}^{\infty} \frac{t^k}{k!} \, \delta_k(x).$$

[plot: Poisson weights]

Meixner polynomials: negative binomial (Pascal) distribution

$$\sum_{k \geq t} \binom{k-1}{t-1} p^t q^{k-t} \, \delta_k(x).$$

[plot: negative binomial weights]

These families can be described in various ways:

• using the measure of orthogonality;
• by 2nd-order differential or difference equations;
• by explicit formulas using hypergeometric functions;
• by three-term recursion relations.

Favard's theorem (Stone 1932). Let $\{P_n\}_{n=0}^{\infty}$ be a polynomial family, that is, $P_n$ has degree $n$.
Then $\{P_n\}$ are orthogonal with respect to some measure $\mu$, with positive leading coefficients, if and only if they satisfy a three-term recursion relation

$$P_0(x) = 1, \qquad x P_0(x) = \alpha_0 P_1(x) + \beta_0 P_0(x),$$
$$x P_n(x) = \alpha_n P_{n+1}(x) + \beta_n P_n(x) + \gamma_n P_{n-1}(x),$$

where all $\beta_n$ are real, $\alpha_n > 0$, $\gamma_n \geq 0$. Moreover, $\{P_n\}$ are monic iff all $\alpha_n = 1$; $\{P_n\}$ are orthonormal iff $\gamma_n = \alpha_{n-1}$; and $\mu$ has finite support iff $\gamma_n = 0$ for some $n$.

($\Rightarrow$) Suppose the polynomials are orthogonal with respect to $\mu$ and have positive leading coefficients. Then

$$P_n \perp \operatorname{span}\left(P_0, P_1, \ldots, P_{n-1}\right).$$

So $\langle x P_n, P_k \rangle_\mu = 0$ for $k > n + 1$, and also

$$\langle x P_n, P_k \rangle_\mu = \varphi_\mu[x P_n P_k] = \langle P_n, x P_k \rangle_\mu = 0$$

for $k < n - 1$. So

$$x P_n(x) = \alpha_n P_{n+1}(x) + \beta_n P_n(x) + \gamma_n P_{n-1}(x)$$

for some real $\alpha_n, \beta_n, \gamma_n$. The leading coefficients of $P_n, P_{n+1}$ are positive, so $\alpha_n > 0$. Also, (all leading coefficients are $1$) $\Leftrightarrow$ (all $\alpha_n = 1$).

From the recursion relation,

$$\langle x P_n, P_{n-1} \rangle = \gamma_n \langle P_{n-1}, P_{n-1} \rangle.$$

Also, from the recursion

$$x P_{n-1}(x) = \alpha_{n-1} P_n(x) + \beta_{n-1} P_{n-1}(x) + \gamma_{n-1} P_{n-2}(x),$$

it follows that

$$\langle x P_n, P_{n-1} \rangle = \langle P_n, x P_{n-1} \rangle = \alpha_{n-1} \langle P_n, P_n \rangle.$$

So

$$\gamma_n \langle P_{n-1}, P_{n-1} \rangle = \alpha_{n-1} \langle P_n, P_n \rangle.$$

Hence $\gamma_n \geq 0$, and $\{P_n\}$ are orthonormal iff $\gamma_n = \alpha_{n-1}$. Finally, $\gamma_n = 0 \Leftrightarrow \langle P_n, P_n \rangle = 0$.

($\Leftarrow$) Suppose the polynomials satisfy a three-term recursion relation

$$x P_n(x) = \gamma_{n+1} P_{n+1}(x) + \beta_n P_n(x) + \gamma_n P_{n-1}(x).$$

On $\mathbb{C}[x]$, define a functional $\varphi$ by $\varphi[P_n P_k] = \delta_{nk}$ and extend linearly: for

$$A(x) = \sum_n a_n P_n(x), \qquad B(x) = \sum_k b_k P_k(x),$$
$$\varphi[A B] = \sum_n \sum_k a_n \bar{b}_k \, \varphi[P_n P_k] = \sum_n a_n \bar{b}_n.$$

Why is this consistent? Suppose $A_1 B_1 = A_2 B_2$; we need to show

$$\varphi[A_1 B_1] = \varphi[A_2 B_2].$$

Factoring over $\mathbb{C}$, it is enough to show

$$\varphi[(A(x)(x - a)) B(x)] = \varphi[A(x)((x - a) B(x))]$$

for $a \in \mathbb{C}$. By linearity, it is enough to show

$$\varphi[(P_k(x) x) P_n(x)] = \varphi[P_k(x)(x P_n(x))].$$

But from the recursion relation, these are

$$\gamma_{k+1} \delta_{k+1,n} + \beta_k \delta_{k,n} + \gamma_k \delta_{k-1,n} = \gamma_{n+1} \delta_{k,n+1} + \beta_n \delta_{k,n} + \gamma_n \delta_{k,n-1}.$$

It remains to show that $\varphi = \varphi_\mu$ for some measure $\mu$. Note: $\mu$ is not unique.
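The Gram-Schmidt construction and the recursion coefficients above can be checked concretely. The following sketch is illustrative and not part of the talk: it builds the monic orthogonal polynomials of the standard Gaussian measure ($t = 1$) from its moments using exact rational arithmetic, then recovers $\gamma_n = \langle x P_n, P_{n-1} \rangle / \langle P_{n-1}, P_{n-1} \rangle = n$, in line with Favard's theorem. All function names are mine.

```python
from fractions import Fraction

def gaussian_moment(n):
    # m_n for the standard Gaussian (t = 1): odd moments vanish,
    # even moments are (n - 1)!! = 1 * 3 * ... * (n - 1)
    if n % 2:
        return Fraction(0)
    r = Fraction(1)
    for k in range(1, n, 2):
        r *= k
    return r

def inner(p, q):
    # <p, q>_mu computed from moments; p[i] is the coefficient of x^i
    # (real coefficients, so no conjugation is needed)
    return sum(a * b * gaussian_moment(i + j)
               for i, a in enumerate(p) for j, b in enumerate(q))

def monic_orthogonal(n_max):
    # Gram-Schmidt applied to 1, x, x^2, ... without normalizing,
    # which yields the monic orthogonal family
    polys = []
    for n in range(n_max + 1):
        p = [Fraction(0)] * n + [Fraction(1)]          # start from x^n
        for q in polys:
            c = inner(p, q) / inner(q, q)
            p = [a - c * (q[i] if i < len(q) else 0) for i, a in enumerate(p)]
        polys.append(p)
    return polys

P = monic_orthogonal(4)
print(P[3])   # coefficients of the monic Hermite polynomial x^3 - 3x

# since the P_n are monic, alpha_n = 1, and the recursion gives
# gamma_n = <x P_n, P_{n-1}> / <P_{n-1}, P_{n-1}>; for Hermite, gamma_n = t n
for n in [1, 2, 3]:
    x_Pn = [Fraction(0)] + P[n]                        # multiplication by x
    print(inner(x_Pn, P[n - 1]) / inner(P[n - 1], P[n - 1]))   # 1, 2, 3
```

Swapping `gaussian_moment` for the moment sequence of another measure produces that measure's orthogonal family by the same procedure.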
$$\Big\langle \sum_{i=0}^{n} a_i P_i(x), \sum_{i=0}^{n} a_i P_i(x) \Big\rangle_\varphi = \sum_{i=0}^{n} |a_i|^2 > 0,$$

so the induced inner product is positive definite (assuming non-degeneracy). Let $H$ be the Hilbert space completion of $\mathbb{C}[x]$ with respect to the inner product $\langle \cdot, \cdot \rangle_\varphi$, and let $X$ be the operator with dense domain $\mathbb{C}[x]$ defined by

$$X A(x) = x A(x).$$

Then $X$ is an unbounded symmetric operator. Does it have a self-adjoint extension? Following von Neumann, look at the deficiency subspaces

$$K_\pm = \ker(X^* \mp i).$$

A self-adjoint extension exists iff the deficiency indices $d_\pm = \dim(K_\pm)$ are equal.

Let

$$C\Big(\sum_{j=0}^{n} a_j x^j\Big) = \sum_{j=0}^{n} \bar{a}_j x^j.$$

Since $\langle C(P), C(P) \rangle_\varphi = \langle P, P \rangle_\varphi$, $C$ extends to an operator on $H$. It commutes with $X$, so $C(K_\pm) = K_\mp$. Indeed,

$$K_+ = \{\eta \in H : X^* \eta = i \eta\} = \{\eta \in H : \forall P \in \mathbb{R}[x], \ \langle \eta, X P \rangle = \langle i \eta, P \rangle\}.$$

But

$$\langle C(\eta), X(P) \rangle = \langle \eta, C X(P) \rangle = \langle \eta, X C(P) \rangle = \langle i \eta, C(P) \rangle = \langle -i \eta, P \rangle.$$

We conclude that $d_+ = d_-$.

Let $\tilde{X}$ be a self-adjoint extension of $X$. It has a spectral measure: for any Borel subset $S \subset \mathbb{R}$, there is a projection $E(S)$. Define

$$\mu(S) = \langle 1, E(S) 1 \rangle.$$

This is a probability measure. In fact,

$$m_n(\mu) = \int_{\mathbb{R}} x^n \, d\mu(x) = \Big\langle 1, \int_{\mathbb{R}} x^n \, dE(x) \, 1 \Big\rangle = \langle 1, \tilde{X}^n 1 \rangle = \langle 1, x^n \rangle = \varphi[x^n].$$

$\tilde{X}$ is not unique $\Rightarrow$ $\mu$ is not unique.

The three-term recursions for the classical families:

Hermite:
$$x P_n = P_{n+1} + t n P_{n-1}.$$

Laguerre:
$$x L_n = L_{n+1} + (t + 2n) L_n + n(t + n - 1) L_{n-1}.$$

Jacobi:
$$x P_n = P_{n+1} + \frac{\beta^2 - \alpha^2}{(2n + \alpha + \beta)(2n + \alpha + \beta + 2)} P_n + \frac{4n(n + \alpha)(n + \beta)(n + \alpha + \beta)}{(2n + \alpha + \beta - 1)(2n + \alpha + \beta)^2 (2n + \alpha + \beta + 1)} P_{n-1}.$$

Ultraspherical:
$$x P_n = P_{n+1} + \frac{n(n + 2\lambda - 1)}{4(n + \lambda - 1)(n + \lambda)} P_{n-1}.$$

Chebyshev:
$$x P_n = P_{n+1} + \frac{1}{4} P_{n-1}.$$

Kravchuk: for $a = 1 - 2p$ and $b = 4pq$ (so that $a^2 + b = 1$),
$$x K_n = K_{n+1} + a(2n - N) K_n + b n(N - n + 1) K_{n-1}.$$

Charlier:
$$x P_n = P_{n+1} + (t + n) P_n + t n P_{n-1}.$$

Meixner: for $a = p^{-1} - 2^{-1}$ and $b = q p^{-2}$, so that $a^2 - b = \frac{1}{4} > 0$,
$$x M_n = M_{n+1} + a(t + 2n) M_n + b n(t + n - 1) M_{n-1}.$$

Meixner-Pollaczek: the same recursion, with $a^2 - b < 0$. Note that for Laguerre, $a^2 - b = 0$.

One can study zeros, asymptotics, and continued fractions.
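The identity $m_n(\mu) = \langle 1, \tilde{X}^n 1 \rangle = \varphi[x^n]$ can be made concrete: in the basis $\{P_n\}$, multiplication by $x$ acts through the three-term recursion, so moments are read off by iterating that action on $P_0$. Below is a minimal sketch of mine (not from the talk), using the monic Hermite recursion with $t = 1$ and a truncation large enough that the first few moments are exact.

```python
# Sanity check: in the basis {P_n}, multiplication by x acts through the
# monic Hermite recursion (t = 1):  x P_n = P_{n+1} + n P_{n-1}.
# Since phi[P_n] = delta_{n,0}, the moment m_n = phi[x^n] is the coefficient
# of P_0 after applying "multiplication by x" to P_0 a total of n times.
def moment(n, size=12):
    # T[i][j] = coefficient of P_i in x * P_j (truncated tri-diagonal matrix)
    T = [[0] * size for _ in range(size)]
    for j in range(size):
        if j + 1 < size:
            T[j + 1][j] = 1    # the P_{j+1} term
        if j >= 1:
            T[j - 1][j] = j    # the n P_{n-1} term (t = 1)
    v = [0] * size
    v[0] = 1                   # expansion of P_0 = 1
    for _ in range(n):
        v = [sum(T[i][j] * v[j] for j in range(size)) for i in range(size)]
    return v[0]

# Gaussian moments: m_{2k} = (2k - 1)!!, odd moments vanish
print([moment(n) for n in range(9)])   # [1, 0, 1, 0, 3, 0, 15, 0, 105]
```

The truncation is harmless here: after $n$ applications of $x$ the expansion involves only $P_0, \ldots, P_n$, so a $12 \times 12$ block computes $m_0, \ldots, m_8$ exactly.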
Instead: denote by $\{e_n\}$ the basis of $H$ coming from $\{P_n\}$, so that

$$X e_n = \gamma_{n+1} e_{n+1} + \beta_n e_n + \gamma_n e_{n-1}.$$

$X$ has a tri-diagonal matrix

$$\begin{pmatrix} \beta_0 & \gamma_1 & & \\ \gamma_1 & \beta_1 & \gamma_2 & \\ & \gamma_2 & \beta_2 & \gamma_3 \\ & & \gamma_3 & \ddots \end{pmatrix}.$$

Thus $X$ is a sum of three operators: a creation operator

$$A^* e_n = \gamma_{n+1} e_{n+1},$$

an annihilation operator

$$A e_n = \gamma_n e_{n-1},$$

and a preservation operator

$$N e_n = \beta_n e_n.$$

Since $\beta_n$ is real, $N$ is symmetric, and

$$\langle A^* e_n, e_k \rangle = \gamma_{n+1} \langle e_{n+1}, e_k \rangle = \gamma_{n+1} \delta_{n+1,k} = \gamma_k \langle e_n, e_{k-1} \rangle = \langle e_n, A e_k \rangle.$$

So $A$ and $A^*$ are (formal) adjoints of each other. For the Hermite polynomials, $N = 0$ and $A, A^*$ are the usual annihilation and creation operators. For the Charlier polynomials, $A, A^*$ are as before, and $N$ is (almost) the number operator.

Another way to look at this: the Heisenberg Lie algebra $H_t$ has the basis $\{A, A^*, t\}$ subject to the relation

$$[A, A^*] = t, \qquad t \text{ scalar}.$$

Each $H_t$ has a representation on some Hilbert space $H$ with basis $\{e_n\}_{n=0}^{\infty}$:

$$A e_n = \sqrt{t n} \, e_{n-1}, \qquad A^* e_n = \sqrt{t(n+1)} \, e_{n+1},$$

so that

$$(A A^* - A^* A) e_n = [t(n+1) - t n] e_n = t e_n.$$

If we want $A + A^* = X$, where $X P(x) = x P(x)$, we can take $H = L^2(\mathbb{R}, \mu_t)$ for $\mu_t$ the Gaussian measure, and $e_n = P_n(x; t)$ the Hermite polynomials. In this representation,

$$A = t \partial_x, \qquad A^* = x - t \partial_x.$$

The oscillator Lie algebra $\mathrm{Os}_t$ has the basis $\{A, A^*, N, t\}$ subject to the relations

$$[A, A^*] = t, \qquad [A, N] = A, \qquad [N, A^*] = A^*, \qquad t \text{ scalar}.$$

Each $\mathrm{Os}_t$ has a representation on some $H$:

$$A e_n = \sqrt{t n} \, e_{n-1}, \qquad A^* e_n = \sqrt{t(n+1)} \, e_{n+1}, \qquad N e_n = (t + n) e_n.$$

Note that $e_0$ is a cyclic vector; in fact, the representation is irreducible. If we want $A + A^* + N = X$, we can take $H = L^2(\mathbb{R}, \mu_t)$ for $\mu_t$ the Poisson measure, and $e_n = P_n(x; t)$ the Charlier polynomials:

$$A = t \Delta, \qquad \Delta(P) = P(x) - P(x-1),$$
$$A^*(P) = x P(x-1) - t P, \qquad N(P) = x \Delta P + t P(x-1).$$

The Lie algebra $sl(2, \mathbb{R})$ of real traceless $2 \times 2$ matrices has the basis $\{J_-, J_+, J_0\}$,
$$J_- = \begin{pmatrix} 0 & 0 \\ -1 & 0 \end{pmatrix}, \qquad J_+ = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad J_0 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},$$

subject to the relations

$$[J_-, J_+] = J_0, \qquad [J_-, J_0] = 2 J_-, \qquad [J_0, J_+] = 2 J_+.$$

Abstract representations, parameterized by $t > 0$:

$$J_- e_n = \sqrt{n(t + n - 1)} \, e_{n-1}, \qquad J_+ e_n = \sqrt{(n+1)(t + n)} \, e_{n+1}, \qquad J_0 e_n = (t + 2n) e_n.$$

We want

$$J_- + J_+ + c J_0 = X.$$

For each $t, c$, $sl(2, \mathbb{R})$ has a representation on $L^2(\mathbb{R}, \mu_{t,c})$, for

$$\mu_{t,c} = \begin{cases} \text{negative binomial (Meixner)}, & c > 1, \\ \text{Gamma (Laguerre)}, & c = 1, \\ \ldots \text{ (Meixner-Pollaczek)}, & 0 < c < 1. \end{cases}$$

Where are the finite-dimensional representations? Take $t = -N$, with $N > 0$ an integer.
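The commutation relations above can be verified mechanically from the $2 \times 2$ matrices, and the abstract representation satisfies the bracket $[J_-, J_+] = J_0$ eigenvalue-by-eigenvalue on each $e_n$. A small illustrative check (variable names are mine):

```python
import numpy as np

# the defining 2x2 matrices of sl(2, R) in the basis above
J_minus = np.array([[0, 0], [-1, 0]])
J_plus  = np.array([[0, 1], [0, 0]])
J_zero  = np.array([[1, 0], [0, -1]])

comm = lambda A, B: A @ B - B @ A    # matrix commutator [A, B]

print(np.array_equal(comm(J_minus, J_plus), J_zero))       # True
print(np.array_equal(comm(J_minus, J_zero), 2 * J_minus))  # True
print(np.array_equal(comm(J_zero, J_plus), 2 * J_plus))    # True

# the abstract representation obeys the first relation on each e_n:
# (J_- J_+ - J_+ J_-) e_n = [(n+1)(t+n) - n(t+n-1)] e_n = (t + 2n) e_n
for t in [1, 2, 5]:
    for n in range(6):
        assert (n + 1) * (t + n) - n * (t + n - 1) == t + 2 * n
```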