Characteristic functions

1/13/12

Literature

- Rick Durrett, Probability: Theory and Examples, Duxbury 2010.
- Olav Kallenberg, Foundations of Modern Probability, Springer 2001.
- Eugene Lukacs, Characteristic Functions, Hafner 1970.
- Robert G. Bartle, The Elements of Real Analysis, Wiley 1976.
- Michel Loève, Probability Theory, Van Nostrand 1963.

Contents

1 Definition and basic properties
2 Continuity
3 Inversion
4 CLT
  4.1 The basic CLT
  4.2 Lindeberg-Feller condition
  4.3 Poisson convergence
  4.4 Exercises
5 Poisson random measure
  5.1 Poisson measure and integral
  5.2 About stochastic processes
  5.3 Classical Poisson process
  5.4 Transformations of the Poisson process
    5.4.1 Nonhomogeneous Poisson process
    5.4.2 Reward or compound Poisson process
  5.5 A few constructions of the Poisson random measure
    5.5.1 Adding new atoms
    5.5.2 Gluing the pieces
    5.5.3 Using a density of a random element
  5.6 Exercises
  5.7 Non-positive rewards
  5.8 SS$\alpha$ - symmetric $\alpha$-stable processes
  5.9 Exercises
6 Infinitely divisible distributions
  6.1 Preliminaries
  6.2 A few theorems
  6.3 A side trip: decomposable distributions
  6.4 ID of Poisson type
  6.5 Lévy-Khinchin formula
  6.6 Exercises

1 Definition and basic properties

Let $\mu$ be the probability distribution of a random variable $X$. The characteristic function, a.k.a. Fourier transform, is the complex-valued one-parameter function
$$\hat\mu(t) = \varphi(t) = \int_{\mathbb{R}} e^{itx}\,\mu(dx) = \mathbb{E}\,e^{itX}.$$
Similarly we define the ch.f. of a probability distribution $\mu = \mathcal{L}(X)$ on $\mathbb{R}^d$ or on a Hilbert space, where $tx = \langle t, x \rangle$ is the inner product. The definition applies also to finite measures, even to signed measures of bounded variation, but the term "characteristic function" is reserved for probability measures.

Proposition 1.1 Every ch.f. $\varphi(t) = \hat\mu(t) = \mathbb{E}\,e^{itX}$ has the following properties:

1. $\varphi(0) = 1$;

2. $|\varphi| \le 1$;

3. $\varphi$ is uniformly continuous on $\mathbb{R}$;

4.
$\varphi$ is positive semi-definite, i.e.,
$$\sum_j \sum_k \varphi(t_j - t_k)\, z_j \bar z_k \ge 0 \quad \text{for all finite sets } \{t_j\} \subset \mathbb{R},\ \{z_j\} \subset \mathbb{C}.$$

Proof. (3): $|\varphi(s) - \varphi(t)| \le \mathbb{E}\,|e^{isX} - e^{itX}| = \mathbb{E}\,|1 - e^{i(s-t)X}| \le \mathbb{E}\,\big(2 \wedge |s-t|\,|X|\big)$.

(4): $0 \le \mathbb{E}\,\Big|\sum_j z_j e^{it_j X}\Big|^2 = \sum_j \sum_k \varphi(t_j - t_k)\, z_j \bar z_k$.

A probabilist should know the ch.fs. of basic probability distributions by heart, and how they behave under simple transformations. To wit:

Proposition 1.2

1. $\varphi_{aX}(t) = \varphi_X(at)$; hence $\varphi_{-X} = \overline{\varphi}$.

2. A convex combination of ch.fs. is a ch.f.

3. Hence, given a ch.f. $\varphi$, $\Re\varphi = (\varphi + \overline{\varphi})/2$ is a ch.f.

4. A finite product of ch.fs. is a ch.f. Namely,
$$\varphi_{X_1} \cdots \varphi_{X_n} = \varphi_{X_1' + \cdots + X_n'},$$
where the $X_k'$ are independent copies of the $X_k$. In other words, $\hat\mu_1 \cdots \hat\mu_n = (\mu_1 * \cdots * \mu_n)^{\wedge}$.

5. Hence, given a ch.f. $\varphi$, $|\varphi|^2$ and the natural powers $\varphi^n$ and $|\varphi|^{2n}$ are ch.fs.

6. A ch.f. is real if and only if $X$ is symmetric, i.e., $X \overset{d}{=} -X$ (only the "if" part is obvious at this point).

We will present examples as needed.

Example 1.3 A "duality". The triangular density $(1-|x|)_+$ has the ch.f. $\dfrac{2(1-\cos t)}{t^2}$. The Pólya density $\dfrac{1-\cos x}{\pi x^2}$ has the ch.f. $(1-|t|)_+$. The symmetrized exponential distribution with the density $e^{-|x|}/2$ has the ch.f. $\dfrac{1}{1+t^2}$. The Cauchy density $\dfrac{1}{\pi(1+x^2)}$ has the ch.f. $e^{-|t|}$.

Using the idea from the proof of (3) of Proposition 1.1, for a family $\mu_\alpha = \mathcal{L}(X_\alpha)$ we obtain an upper estimate that involves the standard $L^0$-metric $\|Y\|_0 = \mathbb{E}(1 \wedge |Y|)$:
$$\sup_\alpha\, |\varphi_\alpha(s) - \varphi_\alpha(t)| \le 2\,\sup_\alpha\, \|(s-t)X_\alpha\|_0.$$

Corollary 1.4 If the family $\{\mu_\alpha\}$ is tight (i.e., $\{X_\alpha\}$ is bounded in $L^0$), then $\{\varphi_\alpha\}$ is uniformly equicontinuous. The opposite implication is also true.

Lemma 1.5 Consider $\mu = \mathcal{L}(X)$, $\varphi = \hat\mu$. Then, for $r > 0$,

(1) $\displaystyle P(|X| \ge r) = \mu([-r,r]^c) \le \frac{r}{2} \int_{-2/r}^{2/r} (1 - \varphi(t))\,dt;$

(2) $\displaystyle P(|X| \le r) = \mu[-r,r] \le 2r \int_{-1/r}^{1/r} |\varphi(t)|\,dt.$

Proof. W.l.o.g. we may and do assume that $r = 1$ (just consider $X/r$ and change the variable in the right-hand-side integrals).
(1): By Fubini's theorem the right hand side equals
$$\frac{1}{2} \int_{-2}^{2} \mathbb{E}\,(1 - e^{itX})\,dt = \mathbb{E}\,2\Big(1 - \frac{\sin 2X}{2X}\Big) \ge \mathbb{E}\,\mathbf{1}_{\{|X| \ge 1\}} = P(|X| \ge 1),$$
since $2(1 - \sin u / u) \ge 1$ for $|u| \ge 2$.

(2): In virtue of Fubini's theorem the left hand side is estimated as follows, using the formula for the ch.f. of the triangular density and the bound $2(1-\cos x)/x^2 \ge 1/2$ for $|x| \le 1$:
$$\mathbb{E}\,\mathbf{1}_{\{|X| \le 1\}} \le 2\,\mathbb{E}\,\frac{2(1 - \cos X)}{X^2} = 2\,\mathbb{E} \int_{\mathbb{R}} (1-|t|)_+\, e^{itX}\,dt = 2 \int_{\mathbb{R}} (1-|t|)_+\, \varphi(t)\,dt \le 2 \int_{-1}^{1} |\varphi(t)|\,dt.$$

Corollary 1.6 If a family $\{\varphi_\alpha\}$ of ch.fs. is equicontinuous at 0, then $\{\mu_\alpha\}$ is tight.

Proof. Let $\epsilon > 0$ and let $\delta > 0$ be such that $\sup_\alpha |1 - \varphi_\alpha(t)| < \epsilon/2$ whenever $|t| < \delta$. Let $r_0 = 2/\delta$. Then (1) in the Lemma entails
$$\sup_\alpha P(|X_\alpha| \ge r) \le \epsilon \quad \text{for } r > r_0.$$

2 Continuity

Theorem 2.1 (Lévy Continuity Theorem) For ch.fs. $\varphi_n = \hat\mu_n$ and $\varphi_0 = \hat\mu_0$ the following are equivalent:

1. $\varphi_n \to \varphi_0$ pointwise;

2. $\mu_n \overset{w}{\to} \mu_0$;

3. $\varphi_n \to \varphi_0$ uniformly on every interval.

Proof. (2) $\Rightarrow$ (1) follows by the definition of weak convergence, and (3) $\Rightarrow$ (1) is obvious. The remaining nontrivial implications (1) $\Rightarrow$ (2) and (2) $\Rightarrow$ (3) would be much easier to prove if the measures had a common bounded support, i.e., if the underlying random variables were bounded. However, each of the assumptions implies that the family $\{\mu_n\}$ is tight, i.e., the measures are almost supported by a compact set.

(1) $\Rightarrow$ (2): Assume the pointwise convergence of ch.fs., which means that $\mu_n e_t \to \mu_0 e_t$ for the special functions $e_t(x) = e^{itx}$, and thus for their finite linear combinations, which form an algebra $\mathcal{A}$. We must show that $\mu_n f \to \mu_0 f$ for every continuous bounded function $f$ on $\mathbb{R}$.

First we infer that $\{\mu_n\}$ is tight. Indeed, let $\epsilon > 0$ and choose $r > 0$ such that $|1 - \varphi_0(t)| < \epsilon/4$ for every $t \in [-2/r, 2/r]$. By Lemma 1.5(1) and the Dominated Convergence Theorem,
$$\limsup_n\, \mu_n([-r,r]^c) \le \limsup_n\, \frac{r}{2} \int_{-2/r}^{2/r} (1 - \varphi_n(t))\,dt = \frac{r}{2} \int_{-2/r}^{2/r} (1 - \varphi_0(t))\,dt < \epsilon/2.$$
Then there is $n_0$ such that
$$\sup_{n > n_0} \mu_n([-r,r]^c) < \epsilon.$$
At the same time there is $r' > 0$ such that
$$\sup_{n \le n_0} \mu_n([-r',r']^c) < \epsilon.$$
Taking $R = r \vee r'$,
$$\sup_n \mu_n([-R,R]^c) < \epsilon.$$
For any continuous bounded complex function $h$ on $\mathbb{R}$,
$$\Big| \int_{[-R,R]^c} h\,d\mu_n - \int_{[-R,R]^c} h\,d\mu_0 \Big| \le 2\,\|h\|_\infty\,\epsilon. \tag{2.1}$$

The Stone-Weierstrass Theorem (cf., e.g., Bartle, Theorem 26.2) says: if $K \subset \mathbb{R}^d$ is compact, and $\mathcal{A}$ is a separating algebra with unit that consists of complex functions on $K$, then every continuous function on $K$ can be uniformly approximated by members of $\mathcal{A}$. Take $K = [-R,R]$ and the algebra $\mathcal{A}$ defined above. In virtue of the Stone-Weierstrass Theorem there is $g \in \mathcal{A}$ such that $\|f - g\|_K < \epsilon$. Hence, using (2.1) for $h = f$ and for $h = g$,
$$|(\mu_n - \mu_0)f| \le |(\mu_n - \mu_0)(f\mathbf{1}_{K^c})| + |(\mu_n - \mu_0)((f-g)\mathbf{1}_K)| + |(\mu_n - \mu_0)(g\mathbf{1}_{K^c})| + |(\mu_n - \mu_0)g|$$
$$\le \big(2\|f\|_\infty + 2 + 2\|g\|_\infty\big)\,\epsilon + |(\mu_n - \mu_0)g|.$$
Let $n \to \infty$ and then $\epsilon \to 0$.

(2) $\Rightarrow$ (3): Assume the weak convergence, consider an interval $[-T,T]$, and let $\epsilon > 0$. The family $\{\mu_n : n \ge 0\}$ is tight, hence there is $r > 0$ such that
$$\sup_{n \ge 0}\, \Big| \int_{[-r,r]^c} e^{itx}\,\mu_n(dx) \Big| \le \sup_{n \ge 0}\, \mu_n([-r,r]^c) \le \frac{\epsilon}{4}. \tag{2.2}$$
Then
$$|\varphi_n(t) - \varphi_0(t)| \le \Big| \int_{[-r,r]^c} e^{itx}\,\mu_n(dx) \Big| + \Big| \int_{[-r,r]^c} e^{itx}\,\mu_0(dx) \Big| + \Big| \int_{-r}^{r} e^{itx}\,(\mu_n - \mu_0)(dx) \Big| = A + B + C.$$
By (2.2), the sum $A + B$ of the first two terms is bounded by $\epsilon/2$. To estimate the third term, consider a partition $(x_k)$ of $[-r,r]$ of mesh $< \epsilon/(8T)$, chosen from the continuity set of $\mu_0$. In particular, we may enlarge the interval $[-r,r]$, so that $-r$, the first point of the partition, and $r$, the last point of the partition, are also continuity points. In short,
$$\int_{-r}^{r} = \sum_k \int_{x_{k-1}}^{x_k}.$$
Adding and subtracting the term $e^{itx_k}$ on each interval $(x_{k-1}, x_k]$, $C$ is bounded from above by the following expression:
$$\sum_k \Big| \int_{x_{k-1}}^{x_k} \big(e^{itx} - e^{itx_k}\big)\,\mu_n(dx) \Big| + \sum_k \Big| \int_{x_{k-1}}^{x_k} \big(e^{itx} - e^{itx_k}\big)\,\mu_0(dx) \Big| + \sum_k \big| \mu_n(x_{k-1}, x_k] - \mu_0(x_{k-1}, x_k] \big|.$$
Since $|e^{itx} - e^{itx_k}| \le |t|\,|x - x_k| \le \epsilon/8$ for $|t| \le T$,
$$\sum_k \int_{x_{k-1}}^{x_k} \big| e^{itx} - e^{itx_k} \big|\,\mu_n(dx) \le \frac{\epsilon}{8} \sum_k \int_{x_{k-1}}^{x_k} \mu_n(dx) \le \frac{\epsilon}{8},$$
so the sum of the first two terms in the latter estimate is less than $\epsilon/4$.
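The tail estimate of Lemma 1.5(1) is easy to check numerically. The following sketch (an illustration added here, not part of the original notes) verifies the inequality for the standard normal distribution, whose ch.f. $\varphi(t) = e^{-t^2/2}$ is known in closed form; the function names are ad hoc.

```python
import math
import numpy as np

# Sanity check of Lemma 1.5 (1):
#   P(|X| >= r) <= (r/2) * integral_{-2/r}^{2/r} (1 - phi(t)) dt,
# for X ~ N(0,1), whose ch.f. phi(t) = exp(-t^2/2) is real (N(0,1) is symmetric).

def normal_tail(r):
    """P(|X| >= r) for a standard normal X, via the error function."""
    return 1.0 - math.erf(r / math.sqrt(2.0))

def chf_bound(r, n=200_001):
    """Right-hand side of Lemma 1.5 (1), by the trapezoidal rule on [-2/r, 2/r]."""
    t = np.linspace(-2.0 / r, 2.0 / r, n)
    integrand = 1.0 - np.exp(-t**2 / 2.0)
    dt = t[1] - t[0]
    return (r / 2.0) * (np.sum(integrand) - 0.5 * (integrand[0] + integrand[-1])) * dt

# The bound holds (and is rather loose) for a range of thresholds r.
for r in (0.5, 1.0, 2.0, 4.0):
    assert normal_tail(r) <= chf_bound(r)
```

For $r = 2$, say, the true tail probability is about $0.046$ while the right-hand side evaluates to roughly $0.29$: the lemma trades sharpness for the convenience of needing $\varphi$ only near the origin, which is exactly what the tightness arguments above exploit.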