A Study of Viennot's Combinatorial Models of Orthogonal Polynomials
Krista L. Smith
with Advisor Dr. Gabor Hetyei
Department of Mathematics and Statistics
The University of North Carolina at Charlotte
9201 University City Blvd., Charlotte, NC 28223
April 25, 2014

1 Introduction

This research project explores Orthogonal Polynomial Sequences and algebraic methods for determining their corresponding moments using Xavier Viennot's numerous combinatorial models. Orthogonal polynomials include the Hermite, Laguerre, and Jacobi families, among others. Throughout this project, we keep the focus on the Chebyshev and Laguerre polynomial sequences.

Orthogonal Polynomial Sequences (OPS) are bases of the vector space of polynomials that are orthogonal with respect to an inner product, defined in terms of a linear functional called the moment functional. For the Chebyshev polynomials of the 1st kind, the moment functional is given by

    \mathcal{L}[x^n] = \int_{-1}^{1} x^n \frac{1}{\sqrt{1 - x^2}}\, dx.    (1)

The inner product of two polynomials is found by substituting their product into the moment functional. The moment functional is uniquely determined by the moments L[x^n], obtained by substituting the powers of x into it. An OPS must satisfy the following conditions for n ≥ 1:

    P_{-1}(x) = 0,
    P_0(x) = 1,
    \mathcal{L}[P_m P_n] = 0 for m ≠ n.

Favard's Theorem, a key result of the theory of OPS, characterizes monic OPS as exactly the sequences satisfying a three-term recurrence

    P_{n+1}(x) = (x - b_n) P_n(x) - \lambda_n P_{n-1}(x).    (2)

It states that if a sequence of monic polynomials satisfies such a recurrence, then there is a quasi-definite moment functional L with respect to which P_n(x) is a monic OPS. A recurrence of this form holds for any orthogonal polynomial sequence taken with respect to a positive weight function.

2 Preliminaries

2.1 Xavier Viennot

Xavier Viennot, Emeritus Research Director at the French government's National Center for Scientific Research (CNRS), works on combinatorics with applications to pure and applied mathematics, computer science, and physics. His research on this subject exists mainly as scans of French-language lecture notes posted on the author's website, although a brief overview of these notes was published in English in 2003 by Dennis Stanton in his book Orthogonal Polynomials and Special Functions. We are mainly interested in the two chapters of the lecture notes entitled "Moments et récurrence linéaire" and "Moments de familles particulières de polynômes orthogonaux," translating as "Moments and linear recurrence" and "Moments of particular families of orthogonal polynomials," respectively.

2.2 Quadratic Forms

Given a symmetric bilinear form F on a real vector space V, we define a map Q : V \to \mathbb{R} by

    Q(v) = F(v, v),

where Q is the quadratic form associated with the symmetric bilinear form F [1]. Quadratic forms are homogeneous quadratic polynomials in n variables; the term refers to Q being a homogeneous quadratic function of the coordinates, in which every term has degree two [1]. Each quadratic form has a unique symmetric matrix representation. A quadratic form that is positive (respectively, nonnegative) on all nonzero vectors is positive definite (respectively, positive semidefinite). Every real symmetric matrix represents a quadratic form, and vice versa. Notice that the quadratic function ax^2 + bx + c of one variable is not a quadratic form because it is not homogeneous. In general, the positive (semi)definiteness of a quadratic form may be characterized in terms of the leading principal minors of its matrix. An analogous description exists for moment functionals: a moment functional L is positive definite if L[π(x)] > 0 for every polynomial π(x) that is not identically zero and is nonnegative for all real x [3]. If L is positive definite, then L has real moments and a corresponding OPS consisting of real polynomials exists.
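To make the analogy concrete, the following Python sketch (a minimal sketch, assuming NumPy and SciPy are available) applies the leading-principal-minor test to the Hankel matrix of moments (\mu_{i+j}) of the Chebyshev weight from equation (1); for a positive definite moment functional all of these minors are positive, mirroring the test for quadratic forms.

```python
import numpy as np
from scipy.integrate import quad

def chebyshev_moment(n):
    """n-th moment of the weight 1/sqrt(1 - x^2) on [-1, 1].
    Substituting x = cos(theta) turns equation (1) into the integral
    of cos(theta)**n over [0, pi]."""
    value, _ = quad(lambda t: np.cos(t) ** n, 0.0, np.pi)
    return value

def leading_principal_minors(A):
    """Determinants of the upper-left k x k blocks of a square matrix A."""
    return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

# Hankel matrix of moments: entry (i, j) is mu_{i+j}.
N = 4
H = np.array([[chebyshev_moment(i + j) for j in range(N)] for i in range(N)])

# All leading principal minors come out positive, reflecting the positive
# definiteness of the Chebyshev moment functional.
print(leading_principal_minors(H))
```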
2.3 Polynomials as a Vector Space

The set of polynomials with coefficients in F is a vector space over F, with vector addition and scalar multiplication defined in the obvious manner. If we restrict to polynomials of degree at most n, we obtain a vector space of dimension n + 1. The concept of an orthogonal basis applies to any vector space V (over any field) equipped with a symmetric bilinear form. In linear algebra, a linear functional (or linear form) is a linear map from a vector space to its field of scalars. Every vector space equipped with a symmetric bilinear form has an orthogonal basis, which may be found using the Gram-Schmidt process. Orthogonal Polynomial Sequences are orthogonal bases with respect to a special symmetric bilinear form, defined by the moment functional (see Section 1).

2.4 Orthogonal Polynomials

Definition 2.1. A sequence {P_n(x)}_{n=0}^{\infty} is called an Orthogonal Polynomial Sequence with respect to a moment functional L provided, for all nonnegative integers m and n:

    (i) P_n(x) is a polynomial of degree n,
    (ii) L[P_m(x) P_n(x)] = 0 for m ≠ n,
    (iii) L[P_n^2(x)] ≠ 0.

Orthogonal polynomials include the Hermite, Laguerre, and Jacobi families, among others. Orthogonal Polynomial Sequences (OPS) are bases of the vector space of polynomials that are orthogonal with respect to an inner product defined in terms of a linear functional called the moment functional. All orthogonal polynomials can be expressed in terms of their moments, and their monic variants satisfy the recurrence of Favard's Theorem [3]:

    P_n(x) = (x - c_n) P_{n-1}(x) - \lambda_n P_{n-2}(x).    (3)

Conversely, if a sequence of polynomials satisfies a recurrence of the form of equation (3), then it is an Orthogonal Polynomial Sequence. Moments are computed recursively.

2.4.1 Chebyshev Polynomials of the 1st Kind

Chebyshev polynomials appear in many branches of mathematics beyond combinatorics, including differential equations, geometry, statistics, numerical analysis, number theory, and approximation theory, as well as in physics. Of the 13 classical OPS listed by Abramowitz and Stegun [4], 6 are distinct Chebyshev polynomial sequences. The Chebyshev polynomials of the 1st kind are obtained by the change of variables

    T_n(x) = \cos(n\theta), where x = \cos(\theta).    (4)

Favard's recurrence for the monic variant P_n(x) of the Chebyshev polynomials, with its particular constants b_n and \lambda_n, is

    for n = 1:    P_{n+1}(x) = x P_n(x) - \frac{1}{2} P_{n-1}(x),    (5)

    for n ≥ 2:    P_{n+1}(x) = x P_n(x) - \frac{1}{4} P_{n-1}(x).    (6)

We see that all b_n values are zero in Favard's recurrence for the Chebyshev polynomials, and that there are two possible values for \lambda_n. We will meet these constants again when we explore Viennot's combinatorial model for calculating OPS moments, where we use the Chebyshev sequence as an example. The generating series for the Chebyshev polynomials of the 1st kind is

    \sum_{n \ge 0} T_n(x)\, t^n = \frac{1 - xt}{1 - 2xt + t^2}.    (7)

The first six terms of the normalized Chebyshev polynomial sequence are listed in Figure 1 below; as a quick check, the sketch that follows reproduces them from recurrences (5) and (6).
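The following Python sketch (a minimal sketch, assuming SymPy is available) iterates the monic recurrence with \lambda_1 = 1/2 and \lambda_n = 1/4 for n ≥ 2 and reproduces the polynomials shown in Figure 1.

```python
import sympy as sp

x = sp.symbols('x')

def monic_chebyshev(n_max):
    """Monic Chebyshev polynomials of the 1st kind generated by the recurrence
    P_{n+1} = x*P_n - lambda_n*P_{n-1} from equations (5) and (6), where all
    b_n are zero, lambda_1 = 1/2, and lambda_n = 1/4 for n >= 2."""
    P = [sp.Integer(1), x]  # P_0 = 1, P_1 = x
    for n in range(1, n_max):
        lam = sp.Rational(1, 2) if n == 1 else sp.Rational(1, 4)
        P.append(sp.expand(x * P[n] - lam * P[n - 1]))
    return P

for n, p in enumerate(monic_chebyshev(5)):
    print(f"P_{n}(x) = {p}")
# Output ends with P_5(x) = x**5 - 5*x**3/4 + 5*x/16, matching Figure 1.
```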
Figure 1: First 6 terms of the normalized Chebyshev polynomial sequence of the 1st kind.

    P_0(x) = 1
    P_1(x) = x
    P_2(x) = x^2 - \frac{1}{2}
    P_3(x) = x^3 - \frac{3}{4} x
    P_4(x) = x^4 - x^2 + \frac{1}{8}
    P_5(x) = x^5 - \frac{5}{4} x^3 + \frac{5}{16} x

Classically, the moments of the Chebyshev polynomial sequence of the 1st kind can be determined using the moment functional from equation (1),

    \mathcal{L}[x^n] = \int_{-1}^{1} x^n \frac{1}{\sqrt{1 - x^2}}\, dx.

We determine the first six moments as follows:

    \mu_0 = \mathcal{L}[x^0] = \int_{-1}^{1} \frac{1}{\sqrt{1 - x^2}}\, dx = \pi
    \mu_1 = \mathcal{L}[x^1] = \int_{-1}^{1} \frac{x}{\sqrt{1 - x^2}}\, dx = 0
    \mu_2 = \mathcal{L}[x^2] = \int_{-1}^{1} \frac{x^2}{\sqrt{1 - x^2}}\, dx = \frac{\pi}{2}
    \mu_3 = \mathcal{L}[x^3] = \int_{-1}^{1} \frac{x^3}{\sqrt{1 - x^2}}\, dx = 0
    \mu_4 = \mathcal{L}[x^4] = \int_{-1}^{1} \frac{x^4}{\sqrt{1 - x^2}}\, dx = \frac{3\pi}{8}
    \mu_5 = \mathcal{L}[x^5] = \int_{-1}^{1} \frac{x^5}{\sqrt{1 - x^2}}\, dx = 0

We notice that all odd moments are zero for the Chebyshev OPS, as they should be for an odd function integrated over an interval symmetric about the origin.

3 Viennot's Combinatorial Models for Moments and Recurrence

Viennot's chapter entitled "Moments and linear recurrence" introduces an alternative method for calculating the moments of orthogonal polynomial sequences. The traditional approach is to evaluate the inner product given by condition (ii) of Definition 2.1, which is manageable for the first few terms in any OPS. But for higher-order polynomials in the sequence, the number of terms in each polynomial grows considerably: calculating the inner product of a ninth- and a tenth-degree polynomial (including cross terms, since this is a product) becomes a formidable task. Viennot's model is based on the constants b_n and \lambda_n in Favard's Theorem, discussed in the Introduction, which are unique to each monic OPS.

3.1 Viennot's Favard Paths

Viennot has devised a combinatorial model for determining the next polynomial in an OPS using weighted lattice paths, as an alternative to using the recurrence relation given by Favard's Theorem. Although the recurrence relation for the Chebyshev polynomials is very straightforward, this is not always the case; for instance, the recurrence for determining the next polynomial in the sequence can be very cumbersome when working with the Legendre polynomial sequence. For this reason, Viennot has presented

Figure 2: The weighted steps for the Favard path of the Chebyshev polynomials of the 1st kind.
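As a small preview of the moment model this section introduces: in Viennot's theory the normalized moments \mu_n / \mu_0 can be computed as sums over weighted Motzkin paths whose step weights are the Favard constants b_n and \lambda_n. The Python sketch below (an illustrative sketch of that standard weighting, not a transcription of the notes, using the Chebyshev values b_n = 0, \lambda_1 = 1/2, \lambda_n = 1/4 from equations (5) and (6)) recovers \mu_n / \mu_0 = 1, 0, 1/2, 0, 3/8, 0, in agreement with the classical moments \pi, 0, \pi/2, 0, 3\pi/8, 0 computed above.

```python
from fractions import Fraction
from functools import lru_cache

# Favard constants for the monic Chebyshev polynomials of the 1st kind
# (equations (5) and (6)): b_n = 0, lambda_1 = 1/2, lambda_n = 1/4 for n >= 2.
def b(k):
    return Fraction(0)

def lam(k):
    return Fraction(1, 2) if k == 1 else Fraction(1, 4)

@lru_cache(maxsize=None)
def weighted_paths(steps, height):
    """Total weight of Motzkin paths with `steps` steps that start at `height`,
    end at height 0, and never go below 0.  Up steps have weight 1, a level
    step at height k has weight b_k, and a down step from height k has
    weight lambda_k."""
    if steps == 0:
        return Fraction(1) if height == 0 else Fraction(0)
    total = weighted_paths(steps - 1, height + 1)                     # up step
    total += b(height) * weighted_paths(steps - 1, height)            # level step
    if height > 0:
        total += lam(height) * weighted_paths(steps - 1, height - 1)  # down step
    return total

# Normalized moments mu_n / mu_0 of the Chebyshev weight: 1, 0, 1/2, 0, 3/8, 0
print([weighted_paths(n, 0) for n in range(6)])
```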