
U.U.D.M. Project Report 2020:29

Orthogonal polynomials and special functions

Elsa Graneland

Examensarbete i matematik, 15 hp
Handledare: Jörgen Östensson
Examinator: Veronica Crispin Quinonez
Juni 2020

Department of Mathematics, Uppsala University

Abstract

This thesis is a brief introduction to the theory of orthogonal polynomials, which is based on the theory of inner product spaces. We also consider some special functions, especially the Bessel function. We present definitions of orthonormal and monic orthogonal polynomials, and discuss the three-term recurrence relation, the Jacobi matrix, and a result concerning the zeros of orthogonal polynomials. Furthermore, the Sturm-Liouville problem is presented, in particular in connection with the Bessel function. Many polynomials and special functions are solutions to differential equations describing phenomena in nature. Lastly, some applications to physics, e.g. quantum mechanics, are presented.

Contents

1 Introduction
2 Inner product and inner product space
3 Gram-Schmidt process
4 Orthogonal polynomials
  4.1 Orthonormal and monic orthogonal polynomials
  4.2 Three-term recurrence relation
  4.3 The Jacobi matrix
  4.4 Zeros of orthogonal polynomials
5 Classical orthogonal polynomials
  5.1 Legendre polynomials
  5.2 Hermite polynomials
  5.3 Laguerre polynomials
6 Sturm-Liouville problems
7 Applications
  7.1 Eigenvalues of the Laplace operator
    7.1.1 Eigenvalue problem in a disk
    7.1.2 Eigenvalue problem in a ball
  7.2 Schrödinger equation
    7.2.1 Harmonic oscillator
    7.2.2 Hydrogen atom
8 References

1 Introduction

The theory of orthogonal polynomials and special functions is of intrinsic interest to many parts of mathematics. Moreover, it can be used to explain many physical and chemical phenomena. For example, the vibrations of a drum head can be explained in terms of special functions known as Bessel functions.
Likewise, the solutions of the Schrödinger equation for a harmonic oscillator can be described using orthogonal polynomials known as Hermite polynomials. Furthermore, the eigenfunctions of the Schrödinger operator associated with the hydrogen atom are described in terms of orthogonal polynomials known as Laguerre polynomials.

In Section 2 the definitions of inner product and inner product space are presented, and in Section 3 the Gram-Schmidt algorithm is described. In Section 4 orthonormal and monic orthogonal polynomials are defined. This is followed by a discussion of the three-term recurrence relation, the Jacobi matrix, and the zeros of orthogonal polynomials. This thesis considers three of the classical orthogonal polynomials, which are presented in Section 5. In Section 6 we discuss Sturm-Liouville problems. The last section, Section 7, presents the applications mentioned above.

2 Inner product and inner product space

The standard scalar products of $\mathbb{R}^n$ and $\mathbb{C}^n$ satisfy some calculation rules. These are taken as axioms in a general inner product space.

Definition 2.1. The inner product $\langle \cdot\,, \cdot \rangle$ on a vector space $X$ is a mapping of $X \times X$ into the scalar field $K$ ($= \mathbb{R}$ or $\mathbb{C}$) satisfying the following. For all vectors $x$, $y$ and $z$ and scalars $\alpha$ we have:

1. $\langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$.
2. $\langle \alpha x, y \rangle = \alpha \langle x, y \rangle$.
3. $\langle x, y \rangle = \overline{\langle y, x \rangle}$.
4. $\langle x, x \rangle \geq 0$, with equality if and only if $x = 0$.

The inner product on $X$ defines a norm on $X$ given by $\|x\| = \sqrt{\langle x, x \rangle}$ and a distance function, or metric, on $X$ given by $d(x, y) = \|x - y\| = \sqrt{\langle x - y, x - y \rangle}$. When an inner product space is complete, the space is called a Hilbert space and is usually denoted by $H$.

Remark. It follows from the definition above that
$$\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle,$$
$$\langle x, \alpha y + \beta z \rangle = \bar{\alpha} \langle x, y \rangle + \bar{\beta} \langle x, z \rangle.$$
Due to the conjugation in the second variable, one says that the inner product is sesquilinear, meaning "$1\tfrac{1}{2}$ times linear".

Definition 2.2.
Two elements $x$ and $y$ in an inner product space $X$ are said to be orthogonal if $\langle x, y \rangle = 0$. A set of vectors is called an orthonormal set if these vectors are pairwise orthogonal and of norm 1.

Example 1. Euclidean space $\mathbb{R}^n$. Given vectors $x = (x_1, x_2, \ldots, x_n)$ and $y = (y_1, y_2, \ldots, y_n)$ in $\mathbb{R}^n$, an inner product is defined by
$$\langle x, y \rangle = x_1 y_1 + x_2 y_2 + \ldots + x_n y_n = \sum_{j=1}^{n} x_j y_j. \tag{2.1}$$
This makes $\mathbb{R}^n$ into a Hilbert space.

Example 2. Unitary space $\mathbb{C}^n$. The standard inner product on $\mathbb{C}^n$ is given by
$$\langle x, y \rangle = \sum_{j=1}^{n} x_j \overline{y_j}. \tag{2.2}$$
This makes $\mathbb{C}^n$ into a Hilbert space.

Example 3. The space $C[a, b]$. Let $C[a, b]$ denote the space of complex-valued continuous functions defined on the interval $[a, b]$. An inner product on $C[a, b]$ is given by
$$\langle f, g \rangle = \int_a^b f(x)\, \overline{g(x)}\, dx. \tag{2.3}$$
This space is not complete in the metric induced by the scalar product.

Example 4. The space $L^2(\mathbb{R}, d\mu)$. Given a positive Borel measure $d\mu$ on $\mathbb{R}$, we let $L^2(\mathbb{R}, d\mu)$ denote the set of equivalence classes of all Borel measurable functions $f$ such that
$$\int_{\mathbb{R}} |f|^2\, d\mu < \infty.$$
Two functions are considered equivalent if they agree $\mu$-almost everywhere. An inner product on $L^2(\mathbb{R}, d\mu)$ is given by
$$\langle f, g \rangle = \int_{\mathbb{R}} f(x)\, \overline{g(x)}\, d\mu. \tag{2.4}$$
This makes $L^2(\mathbb{R}, d\mu)$ into a Hilbert space.

For the classical orthogonal polynomials it holds that $d\mu = \omega(x)\, dx$. Moreover, the so-called weight function $\omega$ usually vanishes outside some interval $[a, b]$. In this case we write $L^2([a, b], \omega(x)dx)$, and the scalar product is given by
$$\langle f, g \rangle = \int_a^b f(x)\, \overline{g(x)}\, \omega(x)\, dx.$$
In case $\langle f, g \rangle = 0$ we say that $f$ and $g$ are orthogonal with respect to the weight $\omega$ on $[a, b]$. For more about the theory of inner product spaces and Hilbert spaces, see [3].

3 Gram-Schmidt process

The Gram-Schmidt process is an algorithm used to turn any linearly independent set of vectors into an orthonormal set of vectors. We denote the linearly independent set of vectors by $x_j$, and the resulting orthonormal set of vectors by $e_j$.
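As a small numerical aside (not part of the original text), the weighted scalar product of Section 2 can be approximated by quadrature; the sketch below uses a simple midpoint rule, and the function name `inner_product` is our own illustrative choice.

```python
# Sketch: approximate <f, g> = \int_a^b f(x) g(x) w(x) dx for real-valued f, g
# by the midpoint rule. For complex-valued g one would conjugate g(x).

def inner_product(f, g, omega, a, b, n=100_000):
    """Approximate the weighted inner product of f and g on [a, b]."""
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        x = a + (k + 0.5) * h  # midpoint of the k-th subinterval
        total += f(x) * g(x) * omega(x) * h
    return total

# With the trivial weight w = 1 on [-1, 1], x and x^2 are orthogonal,
# since the integrand x^3 is odd on a symmetric interval.
ip = inner_product(lambda x: x, lambda x: x * x, lambda x: 1.0, -1.0, 1.0)
```

For instance, `inner_product(lambda x: x, lambda x: x, lambda x: 1.0, -1.0, 1.0)` approximates $\|x\|^2 = 2/3$, the value used in Example 6 below.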
The steps of the algorithm are as follows:

• First step: The first element of the orthonormal sequence, $e_1$, is obtained from
$$e_1 = \frac{1}{\|x_1\|}\, x_1.$$

• Second step: All the following steps consist of two parts: first create a vector orthogonal to the previous vector(s), then normalize it. We create $v_2$ as
$$v_2 = x_2 - \langle x_2, e_1 \rangle e_1,$$
and then normalize it:
$$e_2 = \frac{1}{\|v_2\|}\, v_2.$$

• Third step: We create $v_3$ as
$$v_3 = x_3 - \langle x_3, e_1 \rangle e_1 - \langle x_3, e_2 \rangle e_2,$$
and then normalize:
$$e_3 = \frac{1}{\|v_3\|}\, v_3.$$

The algorithm proceeds by induction.

• $n$th step: Suppose that $\{e_1, \ldots, e_{n-1}\}$ is an orthonormal set such that $\operatorname{span}\{e_1, \ldots, e_{n-1}\} = \operatorname{span}\{x_1, \ldots, x_{n-1}\}$. The vector $v_n$ is defined by
$$v_n = x_n - \sum_{k=1}^{n-1} \langle x_n, e_k \rangle e_k.$$
Note that $v_n$ is a non-zero vector; otherwise $x_n$ would belong to the span of $\{x_1, \ldots, x_{n-1}\}$, and the set $\{x_1, \ldots, x_n\}$ would be linearly dependent. Note also that $v_n$ is orthogonal to all the vectors $e_1, \ldots, e_{n-1}$. Normalizing $v_n$,
$$e_n = \frac{1}{\|v_n\|}\, v_n,$$
we therefore obtain an orthonormal set $\{e_1, \ldots, e_n\}$ with $\operatorname{span}\{e_1, \ldots, e_n\} = \operatorname{span}\{x_1, \ldots, x_n\}$.

Example 5. Gram-Schmidt procedure on vectors in $\mathbb{R}^3$. Consider the two vectors $x_1 = (1, 1, 0)$ and $x_2 = (1, 2, 1)$. The Gram-Schmidt procedure can be used to obtain a set $\{e_1, e_2\}$ that is orthonormal with respect to the standard scalar product in $\mathbb{R}^3$.

• First step: The vector $e_1$ is obtained by normalizing $x_1$:
$$e_1 = \frac{1}{\|x_1\|}\, x_1 = \frac{1}{\sqrt{2}}\, (1, 1, 0).$$

• Second step: We create $v_2$ as
$$v_2 = x_2 - \langle x_2, e_1 \rangle e_1 = (1, 2, 1) - \frac{3}{2}\, (1, 1, 0) = \frac{1}{2}\, (-1, 1, 2),$$
and now we normalize:
$$e_2 = \frac{1}{\|v_2\|}\, v_2 = \frac{1}{\sqrt{6}}\, (-1, 1, 2).$$

Example 6. Gram-Schmidt process on polynomials. Consider the set $u = \{1, x, x^2\}$, and let $u_1 = 1$, $u_2 = x$ and $u_3 = x^2$.
The Gram-Schmidt process can be used to obtain a set $\{e_1, e_2, e_3\}$ that is orthonormal with respect to the inner product
$$\langle f, g \rangle = \int_{-1}^{1} f(x)\, g(x)\, dx.$$

• First step: The first element of the orthonormal sequence, $e_1$, is obtained from
$$e_1 = \frac{1}{\|u_1\|}\, u_1 = \frac{1}{\sqrt{2}}.$$

• Second step: We create $v_2$ as
$$v_2 = u_2 - \langle u_2, e_1 \rangle e_1 = x - \left( \int_{-1}^{1} x \cdot \frac{1}{\sqrt{2}}\, dx \right) \frac{1}{\sqrt{2}} = x,$$
and then normalize:
$$e_2 = \frac{1}{\|v_2\|}\, v_2 = \frac{x}{\sqrt{2/3}} = \sqrt{\frac{3}{2}}\, x.$$

• Third step: We create $v_3$ as
$$v_3 = u_3 - \langle u_3, e_1 \rangle e_1 - \langle u_3, e_2 \rangle e_2 = x^2 - \left( \int_{-1}^{1} x^2 \cdot \frac{1}{\sqrt{2}}\, dx \right) \frac{1}{\sqrt{2}} - \left( \int_{-1}^{1} x^2 \cdot \sqrt{\frac{3}{2}}\, x\, dx \right) \sqrt{\frac{3}{2}}\, x = x^2 - \frac{1}{2} \cdot \frac{2}{3} - 0 = x^2 - \frac{1}{3}.$$
Note that
$$\|v_3\|^2 = \int_{-1}^{1} \left( x^2 - \frac{1}{3} \right)^2 dx = \frac{8}{45},$$
and therefore
$$e_3 = \frac{1}{\|v_3\|}\, v_3 = \frac{3\sqrt{5}}{2\sqrt{2}} \left( x^2 - \frac{1}{3} \right).$$

These are, up to a multiplicative factor, the first three so-called Legendre polynomials.

4 Orthogonal polynomials

4.1 Orthonormal and monic orthogonal polynomials

Let $d\mu$ be a positive Borel measure on $\mathbb{R}$ having finite moments, that is,
$$\int_{\mathbb{R}} |x|^m\, d\mu < \infty \tag{4.1}$$
for all integers $m \geq 0$.
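The steps of the algorithm above can be sketched in code. The following minimal implementation (an illustration, not part of the original text) runs Gram-Schmidt on vectors of $\mathbb{R}^3$ with the standard scalar product and reproduces Example 5; applying the same loop with the integral inner product of Example 6, e.g. via quadrature, would yield the Legendre polynomials in the same way.

```python
# Sketch of the Gram-Schmidt process from Section 3: for each x_n, subtract the
# projections onto the previously built e_1, ..., e_{n-1}, then normalize.
from math import sqrt

def dot(u, v):
    """Standard scalar product on R^n, equation (2.1)."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Turn a linearly independent list of vectors into an orthonormal list."""
    basis = []
    for x in vectors:
        v = list(x)
        for e in basis:
            c = dot(x, e)  # <x_n, e_k>
            v = [vi - c * ei for vi, ei in zip(v, e)]
        norm = sqrt(dot(v, v))  # v is non-zero by linear independence
        basis.append([vi / norm for vi in v])
    return basis

# Example 5: x1 = (1, 1, 0), x2 = (1, 2, 1) give
# e1 = (1/sqrt(2))(1, 1, 0) and e2 = (1/sqrt(6))(-1, 1, 2).
e1, e2 = gram_schmidt([(1, 1, 0), (1, 2, 1)])
```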