MOPS: Multivariate Orthogonal Polynomials (symbolically)

Ioana Dumitriu, Alan Edelman, and Gene Shuman

February 4, 2008
(arXiv:math-ph/0409066v1, 24 Sep 2004)

Abstract

In this paper we present a Maple library (MOPs) for computing Jack, Hermite, Laguerre, and Jacobi multivariate polynomials, as well as eigenvalue statistics for the Hermite, Laguerre, and Jacobi ensembles of Random Matrix theory. We also compute multivariate hypergeometric functions, and offer both symbolic and numerical evaluations for all these quantities. We prove that all algorithms are well-defined, analyze their complexity, and illustrate their performance in practice. Finally, we also present a few of the possible applications of this library.

Contents

1 Introduction
  1.1 Motivation
  1.2 History and connection to Random Matrix Theory
2 Multivariate Functions: Definitions, Properties, and Algorithms
  2.1 Partitions and Symmetric Polynomials
  2.2 Multivariate Gamma Function
  2.3 Jack Polynomials (the Multivariate Monomials)
  2.4 Algorithm used to compute the Jack Polynomials
  2.5 Multivariate binomial coefficients
  2.6 Algorithm used to compute the multivariate binomial coefficients
  2.7 Multivariate Orthogonal Polynomials
    2.7.1 Jacobi Polynomials
    2.7.2 Algorithm used to compute the Jacobi Polynomials
    2.7.3 Laguerre Polynomials
    2.7.4 Algorithm used to compute the Laguerre Polynomials
    2.7.5 Hermite Polynomials
    2.7.6 Algorithm used to compute the Hermite Polynomials
  2.8 Hypergeometric functions
3 Software
  3.1 The model for MOPS
  3.2 System requirements and installation guide
  3.3 Routines: name, syntax, and description
  3.4 Computing Expected Values / Evaluating Integrals
4 Complexity bounds and running times
  4.1 Algorithms that compute polynomials
  4.2 Conversion algorithms
  4.3 Algorithms that evaluate integrals
  4.4 Numerical algorithms
5 Applications
6 Copyleft

1 Introduction

1.1 Motivation

There is no need for us to review the impact that classical orthogonal polynomial and special function theory has had on applications in mathematics, science, engineering, and computation. By the middle of the last century, handbooks had been compiled that could be found on nearly everyone's bookshelf. In our time, handbooks join forces with mathematical software and new applications, making the subject as relevant today as it was over a century ago.

We believe that the modern-day extensions of these scalar functions are the multivariate orthogonal polynomials, or MOPs, along with their special function counterparts. The multivariate cases are far richer, yet at this time they are understudied, underapplied, and important applications may be being missed. Algorithms for their computation have not been studied systematically, and software suitable for scientific computing hardly exists. At this time there are no handbooks, no collection of software, and no collection of applications, though since April 2004 entries have been introduced into Eric Weisstein's MathWorld website [1]. Development of such software may thus be seen as a whole area of research ripe for study. This paper might be thought of as a first step in this direction; undoubtedly better software will emerge in time.

We recall that scalar orthogonal polynomials are defined by a positive weight function w(x) defined on an interval I ⊂ R. We define the inner product

    ⟨f, g⟩_w = ∫_I f(x) g(x) w(x) dx

and the sequence of polynomials p_0(x), p_1(x), p_2(x), ..., such that p_k(x) has degree k, and such that ⟨p_i, p_j⟩_w = 0 if i ≠ j. This sequence is the sequence of orthogonal polynomials with respect to the weight function w(x).
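The scalar definition above can be carried out mechanically by Gram–Schmidt orthogonalization of the monomials 1, x, x², ... under ⟨·,·⟩_w. The following sketch (plain Python with exact rational arithmetic, not part of the MOPS library) assumes the Hermite weight w(x) = e^{-x²/2}, whose normalized moments are the Gaussian moments (k-1)!!, and produces the monic orthogonal polynomials:

```python
from fractions import Fraction

def moment(k):
    # k-th moment of the (normalized) Hermite weight w(x) = exp(-x^2/2):
    # 0 for odd k, and (k-1)!! = (k-1)(k-3)...1 for even k.
    if k % 2 == 1:
        return Fraction(0)
    r = Fraction(1)
    for j in range(k - 1, 0, -2):
        r *= j
    return r

def inner(p, q):
    # <p, q>_w for polynomials given as coefficient lists (index = power).
    return sum(a * b * moment(i + j)
               for i, a in enumerate(p) for j, b in enumerate(q))

def orthogonal_polynomials(n):
    # Gram-Schmidt on the monomials 1, x, ..., x^(n-1); returns monic p_0 .. p_(n-1).
    polys = []
    for k in range(n):
        p = [Fraction(0)] * k + [Fraction(1)]  # the monomial x^k
        for q in polys:
            c = inner(p, q) / inner(q, q)
            q_padded = q + [Fraction(0)] * (len(p) - len(q))
            p = [a - c * b for a, b in zip(p, q_padded)]
        polys.append(p)
    return polys
```

With this weight one recovers the "probabilists'" Hermite polynomials, e.g. p_2 = x² - 1 and p_3 = x³ - 3x; any other weight only requires replacing `moment`.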
There is a (scalar) complex version of this inner product (I ⊂ C) where we use ḡ instead of g; this induces a different set of orthogonal polynomials.

We now define the multivariate version of the inner product, and the corresponding orthogonal polynomials. We take any weight function w(x) defined on a segment I, and create an n-dimensional weight function which is symmetric in each of its n coordinates, and incorporates a repulsion factor which depends on a "Boltzmann" constant β (or a temperature factor α = 2/β) which is not seen in the univariate case:

    W(x_1, ..., x_n) = ∏_{1 ≤ i < j ≤ n} |x_i - x_j|^{2/α} ∏_{i=1}^n w(x_i).    (1)

We define multivariate orthogonal polynomials p_κ^α(x_1, ..., x_n) with respect to the weight W(x_1, ..., x_n). The polynomials are symmetric: they take the same value for any permutation of the n coordinates x_i, and they satisfy

    ∫_{I^n} p_κ^α(x_1, ..., x_n) p_μ^α(x_1, ..., x_n) ∏_{i<j} |x_i - x_j|^β ∏_{j=1}^n w(x_j) dx_1 ... dx_n = δ_{κμ},

where κ represents the "multivariate degree" of p_κ^α (the exponent of the leading term).

We begin with our fourth example: symmetric multivariate Hermite polynomials. We take w(x) = e^{-x²/2}, so that the integral is over all of R^n. The polynomials are denoted H_κ^α(x). Our second and third examples are w(x) = x^a e^{-x} and w(x) = x^{a_1}(1-x)^{a_2}. These are the Laguerre L_κ^{α,a} and Jacobi J_κ^{α,a_1,a_2} polynomials. Special cases of the Jacobi polynomials are the Chebyshev and Legendre polynomials.

Our first example, the Jack polynomials, generalizes the monomial scalar functions, x^k. These polynomials are orthogonal on the unit circle: w = 1 and I = the unit circle in the complex plane. Therefore I^n may be thought of as an n-dimensional torus. The orthogonality of the Jack polynomials may be found in formula (10.35) in Macdonald's book [26, page 383].

[1] MathWorld, URL http://mathworld.wolfram.com/
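The n-dimensional weight W of equation (1) above is straightforward to evaluate numerically. The sketch below (illustrative Python, not part of the MOPS library; the Hermite weight e^{-t²/2} is supplied as a default choice of w) makes the role of the repulsion factor explicit:

```python
import math

def multivariate_weight(xs, alpha, w=lambda t: math.exp(-t * t / 2)):
    # W(x_1,...,x_n) = prod_{i<j} |x_i - x_j|^(2/alpha) * prod_i w(x_i),  eq. (1).
    repulsion = 1.0
    n = len(xs)
    for i in range(n):
        for j in range(i + 1, n):
            repulsion *= abs(xs[i] - xs[j]) ** (2.0 / alpha)
    return repulsion * math.prod(w(x) for x in xs)
```

At α = 2 (β = 1) the repulsion term is |x_i - x_j|; at α = 1 (β = 2) it is squared; as α → ∞ the coordinates decouple and W degenerates to a product of scalar weights.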
Tables 1, 2, 3, and 4 give the coefficients of the Jack, Hermite, Laguerre, and Jacobi polynomials in terms of the monomial symmetric functions (for the first) and the Jack polynomials (for the last three). We take all degrees up to total degree 4 for the Jack polynomials, up to total degree 3 for the Hermite polynomials, and up to degree 2 for the Laguerre and Jacobi polynomials; the coefficients can be seen by a simple call to the procedures, e.g. [2],

> jack(a, [2], 'J');
> hermite(a, [1, 1, 1], n, 'C');
> laguerre(a, [1, 1], g, n, 'C');
> jacobi(a, [1], g1, g2, n, 'C');

[2] Note the use of a for α and g for γ in the calls.

1.2 History and connection to Random Matrix Theory

The Jack polynomials have a very rich history. They represent a family of orthogonal polynomials dependent on a positive parameter α, and some of them are more famous than others. There are three values of α which have been studied independently, namely α = 2, 1, 1/2. The Jack polynomials corresponding to α = 1 are better known as the Schur functions; the α = 2 Jack polynomials are better known as the zonal polynomials; and the Jack polynomials corresponding to α = 1/2 are known as the quaternion zonal polynomials. In an attempt to evaluate the integral (2) in connection with the non-central Wishart distribution, James [15] discovered the zonal polynomials in
1960.

Table 1: Coefficients of the Jack "J" polynomial expressed in the monomial basis

k = 1:
               m[1]
J^α_[1]        1

k = 2:
               m[2]    m[1,1]
J^α_[2]        1+α     2
J^α_[1,1]      0       2

k = 3:
               m[3]           m[2,1]    m[1,1,1]
J^α_[3]        (1+α)(1+2α)    3(1+α)    6
J^α_[2,1]      0              2+α       6
J^α_[1,1,1]    0              0         6

k = 4:
               m[4]                m[3,1]         m[2,2]         m[2,1,1]    m[1,1,1,1]
J^α_[4]        (1+α)(1+2α)(1+3α)   4(1+α)(1+2α)   6(1+α)²        12(1+α)     24
J^α_[3,1]      0                   2(1+α)²        4(1+α)         2(5+3α)     24
J^α_[2,2]      0                   0               2(2+α)(1+α)   4(2+α)      24
J^α_[2,1,1]    0                   0               0             2(3+α)      24
J^α_[1,1,1,1]  0                   0               0             0           24

Table 2: Coefficients of the Hermite polynomial expressed in the Jack "C" polynomial basis

k = 1:
               C^α_[1]
H^α_[1]        1

k = 2:
               C^α_[2]   C^α_[1,1]   1 = C^α_[]
H^α_[2]        1         0           -n(n+α)/α
H^α_[1,1]      0         1           n(n-1)/(1+α)

k = 3:
               C^α_[3]   C^α_[2,1]   C^α_[1,1,1]   C^α_[1]
H^α_[3]        1         0           0             -3(n+α)(n+2α)/((1+2α)(1+α))
H^α_[2,1]      0         1           0             -6(n-1)(n+α)(α-1)/((1+2α)(2+α))
H^α_[1,1,1]    0         0           1             3α(n-1)(n-2)/((2+α)(1+α))

Table 3: Coefficients of the Laguerre polynomial expressed in the Jack "C" polynomial basis

k = 1:
               C^α_[1]   1 = C^α_[]
L^{α,γ}_[1]    1         -(γα+n+α-1)n/α

k = 2:
               C^α_[2]   C^α_[1,1]   C^α_[1]                         1 = C^α_[]
L^{α,γ}_[2]    1         0           -2(γα+n+2α-1)(n+α)/(α(1+α))     (γα+n+α-1)(γα+n+2α-1)n(n+α)/(α²(1+α))
L^{α,γ}_[1,1]  0         1           -2(γα+n+α-2)(n-1)/(1+α)         (γα+n+α-1)(γα+n+α-2)n(n-1)/(α(1+α))

Table 4: Coefficients of the Jacobi polynomial expressed in the Jack "C" polynomial basis

k = 1:
                   C^α_[1]   1 = C^α_[]
J^{α,g1,g2}_[1]    1         -(g1α+n+α-1)n/(g1α+g2α+2n-2+2α)

k = 2:
                   C^α_[2]   C^α_[1,1]   C^α_[1]                                          1 = C^α_[]
J^{α,g1,g2}_[2]    1         0           -2(g1α+n+2α-1)(n+α)/((g1α+g2α+2n-2+4α)(1+α))     (g1α+n+α-1)(g1α+n+2α-1)n(n+α)/((g1α+g2α+2n-2+4α)(g1α+g2α+2n-2+3α)α(1+α))
J^{α,g1,g2}_[1,1]  0         1           -2α(g1α+n+α-2)(n-1)/((g1α+g2α+2n-4+2α)(1+α))     2α(g1α+n+α-1)(g1α+n+α-2)n(n-1)/((g1α+g2α+2n-4+2α)(g1α+g2α+2n-3+2α)(1+α))
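A quick numerical spot-check of Table 1 (illustrative Python, not part of MOPS): evaluate J^α_[3] from its monomial expansion, using the standard degree-3 coefficients (1+α)(1+2α), 3(1+α), 6. At α = 1 the Jack "J" polynomial must reduce to the Schur function scaled by the product of its hook lengths, here J¹_[3] = 6 s_[3]:

```python
from itertools import permutations

def m_eval(lam, xs):
    # Monomial symmetric polynomial m_lambda at the point xs: the sum of
    # x^mu over all distinct rearrangements mu of lam (padded with zeros).
    exps = list(lam) + [0] * (len(xs) - len(lam))
    total = 0
    for p in set(permutations(exps)):
        term = 1
        for x, e in zip(xs, p):
            term *= x ** e
        total += term
    return total

def jack_J3(alpha, xs):
    # J^alpha_[3] = (1+a)(1+2a) m[3] + 3(1+a) m[2,1] + 6 m[1,1,1]   (Table 1, k = 3)
    a = alpha
    return ((1 + a) * (1 + 2 * a) * m_eval([3], xs)
            + 3 * (1 + a) * m_eval([2, 1], xs)
            + 6 * m_eval([1, 1, 1], xs))
```

For example, at (x_1, x_2, x_3) = (1, 2, 3) one finds jack_J3(1, ...) = 540 = 6 · h_3(1, 2, 3), where h_3 = s_[3] is the complete homogeneous symmetric polynomial.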