APPENDIX A

Some Facts from Probability Theory

1. Convergence of Moments. Uniform Integrability

In all probability courses we are told that convergence in probability and convergence in distribution do not imply that moments converge (even if they exist).

EXAMPLE 1.1. The standard example is $\{X_n,\ n \ge 1\}$ defined by
$$P\{X_n = 0\} = 1 - \frac{1}{n} \quad\text{and}\quad P\{X_n = n\} = \frac{1}{n}. \tag{1.1}$$
Then $X_n \xrightarrow{p} 0$ as $n \to \infty$, but, for example, $EX_n \to 1$ and $\operatorname{Var} X_n \to \infty$ as $n \to \infty$; that is, the expected value converges, but not to the expected value of the limiting random variable, and the variance diverges.

The reason for this behavior is that the distribution mass escapes to infinity in a forbidden way. The adequate concept in this context is the notion of uniform integrability. A sequence of random variables, $\{X_n,\ n \ge 1\}$, is said to be uniformly integrable if
$$\lim_{\alpha \to \infty} E|X_n| I\{|X_n| > \alpha\} = 0 \quad\text{uniformly in } n. \tag{1.2}$$
It is now easy to see that the sequence defined by (1.1) is not uniformly integrable.

Another way to check uniform integrability is given by the following criterion (see e.g. Chung (1974), Theorem 4.5.3).

Lemma 1.1. A sequence $\{Y_n,\ n \ge 1\}$ is uniformly integrable iff

(i) $\sup_n E|Y_n| < \infty$,
(ii) for every $\varepsilon > 0$ there exists $\delta > 0$ such that for all events $A$ with $P\{A\} < \delta$ we have
$$E|Y_n| I\{A\} < \varepsilon \quad\text{for all } n. \tag{1.3}$$

The following is an important result connecting uniform integrability and moment convergence.

Theorem 1.1. Let $0 < r < \infty$, suppose that $E|X_n|^r < \infty$ for all $n$ and that $X_n \xrightarrow{p} X$ as $n \to \infty$. The following are equivalent:

(i) $X_n \to X$ in $L^r$ as $n \to \infty$,
(ii) $E|X_n|^r \to E|X|^r < \infty$ as $n \to \infty$,
(iii) $\{|X_n|^r,\ n \ge 1\}$ is uniformly integrable.

Furthermore, if $X_n \xrightarrow{p} X$ and one of (i)-(iii) holds, then

(iv) $E|X_n|^p \to E|X|^p$ as $n \to \infty$ for all $p$, $0 < p \le r$.

For proofs, see e.g. Loève (1977), pp. 165-166, where this result (apart from (iv)) is called an $L^r$-convergence theorem, and Chung (1974), Theorem 4.5.4.

Remark 1.1.
The theorem remains obviously true if one assumes convergence almost surely instead of convergence in probability, but also (except for (i)) if one assumes convergence in distribution. For the latter, see Billingsley (1968), Theorem 5.4.

Remark 1.2. The implications (iii) $\Rightarrow$ (i), (ii), (iv) remain true for families of random variables.

Lemma 1.2. Let $U$ and $V$ be positive random variables such that $EU^r < \infty$ and $EV^r < \infty$ for some $r > 0$. Then
$$E(U+V)^r I\{U+V > a\} \le 2^r EU^r I\{U > a/2\} + 2^r EV^r I\{V > a/2\} \quad (a > 0). \tag{1.4}$$

PROOF. Since $U + V \le 2\max\{U, V\}$,
$$E(U+V)^r I\{U+V > a\} \le E(\max\{2U, 2V\})^r I\{\max\{2U, 2V\} > a\} \le 2^r EU^r I\{U > a/2\} + 2^r EV^r I\{V > a/2\}. \quad\square$$

Lemma 1.3. Let $\{U_n,\ n \ge 1\}$ and $\{V_n,\ n \ge 1\}$ be sequences of positive random variables such that, for some $p > 0$,
$$\{U_n^p,\ n \ge 1\}\ \text{and}\ \{V_n^p,\ n \ge 1\}\ \text{are uniformly integrable.} \tag{1.5}$$
Then
$$\{(U_n + V_n)^p,\ n \ge 1\}\ \text{is uniformly integrable.} \tag{1.6}$$

PROOF. We have to show that
$$\lim_{\alpha \to \infty} E(U_n + V_n)^p I\{U_n + V_n > \alpha\} = 0 \quad\text{uniformly in } n. \tag{1.7}$$
But this follows from Lemma 1.2 and (1.5). $\square$

Remark 1.3. Lemmas 1.2 and 1.3 can be extended to more than two random variables (sequences) in an obvious way.

2. Moment Inequalities for Martingales

The most technical convergence results are those where moments of stopped sums are involved, in particular, convergence of such moments. For those results we need some moment inequalities for martingales. The study of such inequalities was initiated in Burkholder (1966). Two further important references are Burkholder (1973) and Garsia (1973). Just as martingales in some sense are generalizations of sequences of partial sums of independent random variables with mean zero, the martingale inequalities we consider below are generalizations of the corresponding inequalities for such sequences of partial sums obtained by Marcinkiewicz and Zygmund (1937, 1938).
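Before turning to the martingale inequalities, the failure of uniform integrability in Example 1.1 can be made concrete with a small numerical sketch (an illustration added here, not part of the original text). For $X_n$ as in (1.1), the truncated expectation in (1.2) has a closed form, and its supremum over $n$ stays at 1 for every truncation level $\alpha$:

```python
def truncated_abs_mean(n, alpha):
    # X_n of Example 1.1: X_n = n with probability 1/n, X_n = 0 otherwise.
    # Hence E|X_n| I{|X_n| > alpha} = n * (1/n) = 1 whenever n > alpha,
    # and 0 otherwise -- the unit of expectation sits entirely on the atom at n.
    return 1.0 if n > alpha else 0.0

# For every alpha there are n (namely n > alpha) with truncated mean 1,
# so the limit in (1.2) is not 0 uniformly in n: {X_n} is not uniformly integrable.
for alpha in (10, 100, 1000):
    assert max(truncated_abs_mean(n, alpha) for n in range(1, 10 * alpha)) == 1.0
```

This is exactly the "mass escaping to infinity" described above: the atom at $n$ carries expectation 1 beyond any fixed truncation level once $n > \alpha$.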
As a preparation for the proofs of the moment convergence results for stopped random walks we prove moment convergence in the ordinary central limit theorem (see Theorem 1.4.2). Since the martingale inequalities below, in the proofs of moment convergence for stopped random walks, will play the role played by the Marcinkiewicz-Zygmund inequalities in the proof of moment convergence in the ordinary central limit theorem, we present the latter inequalities before we derive the former. Finally, since there is equality for the first and second moments of stopped sums just as in the classical case (Theorem 1.5.3), we close by quoting an application of Doob's optional sampling theorem to stopped sums, which will be useful. However, we begin, for completeness, with the definition of a martingale and the most basic convergence result.

Thus, let $(\Omega, \mathscr{F}, P)$ be a probability space, let $\{\mathscr{F}_n,\ n \ge 1\}$ be an increasing sequence of sub-$\sigma$-algebras of $\mathscr{F}$ and set $\mathscr{F}_\infty = \sigma\{\bigcup_{n=1}^\infty \mathscr{F}_n\}$ and $\mathscr{F}_0 = \{\emptyset, \Omega\}$. Further, let $\{X_n,\ n \ge 1\}$ be a sequence of integrable random variables. We say that $\{(X_n, \mathscr{F}_n),\ n \ge 1\}$ is a martingale if

(i) $\{X_n,\ n \ge 1\}$ is adapted, that is, $X_n$ is $\mathscr{F}_n$-measurable for all $n$,
(ii) $E(X_n \mid \mathscr{F}_m) = X_m$ for $m \le n$.

Recall that a simple example of a martingale is obtained by $X_n = \sum_{k=1}^n Y_k$, where $\{Y_k,\ k \ge 1\}$ are independent random variables with mean 0 and $\mathscr{F}_n = \sigma\{Y_k,\ k \le n\}$.

The classical reference for martingale theory is Doob (1953). For more recent books, see e.g. Chung (1974) and Neveu (1975).

We say that $\{(X_n, \mathscr{G}_n),\ n \ge 1\}$ is a reversed martingale if $\{\mathscr{G}_n,\ n \ge 1\}$ is a decreasing sequence of sub-$\sigma$-algebras of $\mathscr{F}$ and

(i) $X_n$ is $\mathscr{G}_n$-measurable for all $n$,
(ii) $E(X_n \mid \mathscr{G}_m) = X_m$ for $n \le m$.

Alternatively one can define a reversed martingale exactly like a martingale, but with the index set being the negative integers. A common example of a reversed martingale is $X_n = (1/n) \sum_{k=1}^n Y_k$, where $\{Y_k,\ k \ge 1\}$ are i.i.d.
random variables with finite mean and $\mathscr{G}_n = \sigma\{X_k,\ k \ge n\}$. If the equality sign in (ii) of the definitions is replaced by $\ge$ ($\le$) we have sub(super)martingales.

The classical martingale convergence theorems state that an $L^1$-bounded martingale is a.s. convergent and that all reversed martingales converge a.s. and in $L^1$. The following is an important result on moment convergence for martingales.

Theorem 2.1. Let $\{(X_n, \mathscr{F}_n),\ n \ge 1\}$ be a martingale. The following are equivalent:

(i) $\{X_n,\ n \ge 1\}$ is uniformly integrable,
(ii) $X_n$ converges in $L^1$ as $n \to \infty$,
(iii) $X_n \xrightarrow{a.s.} X_\infty$ as $n \to \infty$, where $X_\infty$ is integrable and such that $\{(X_n, \mathscr{F}_n),\ 1 \le n \le \infty\}$ is a martingale,
(iv) there exists a random variable $Y$ such that $X_n = E(Y \mid \mathscr{F}_n)$ for all $n$.

A similar result is true for reversed martingales, but, since every reversed martingale is, in fact, uniformly integrable, all corresponding conditions are automatically satisfied.

Moment Inequalities for Sums of Independent Random Variables

Let $\{X_k,\ k \ge 1\}$ be a sequence of random variables with the same distribution, such that $E|X_1|^r < \infty$ for some $r \ge 1$, and let $\{S_n,\ n \ge 1\}$ be their partial sums. It is easy to see, by applying Minkowski's inequality, that
$$E|S_n|^r \le n^r E|X_1|^r, \tag{2.1}$$
which gives an upper bound for $E|S_n|^r$ of the order of magnitude of $n^r$. Now, if, for example, the summands are i.i.d. with mean 0 and finite variance $\sigma^2$, the central limit theorem states that
$$\frac{S_n}{\sqrt{n}} \xrightarrow{d} N(0, \sigma^2) \quad\text{as } n \to \infty.$$
In view of the previous section it would therefore follow that $E|S_n/\sqrt{n}|^r$ converges as $n \to \infty$ provided $\{|S_n/\sqrt{n}|^r,\ n \ge 1\}$ is uniformly integrable, in which case $E|S_n|^r$ would be $\mathcal{O}(n^{r/2})$ instead of $\mathcal{O}(n^r)$. Below we shall see that this is, indeed, the correct order of magnitude when $r \ge 2$.
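The gap between the crude rate $n^r$ of (2.1) and the true rate $n^{r/2}$ can be checked exactly in a simple special case (a numerical sketch added here, not part of the original text): when the summands are i.i.d. Rademacher signs, $S_n = 2K - n$ with $K$ binomial, so $E|S_n|^r$ is a finite sum.

```python
from math import comb

def abs_moment(n, r):
    # E|S_n|^r for S_n a sum of n i.i.d. Rademacher (+/-1) signs:
    # S_n takes the value 2k - n with probability C(n, k) / 2^n.
    return sum(comb(n, k) * abs(2 * k - n) ** r for k in range(n + 1)) / 2 ** n

# E S_n^2 = n exactly, and E S_n^4 = 3n^2 - 2n = O(n^2) = O(n^{r/2}) with r = 4,
# far below the crude bound n^4 E|X_1|^4 = n^4 coming from (2.1).
assert abs_moment(10, 2) == 10.0
assert abs_moment(10, 4) == 3 * 10**2 - 2 * 10
```

The ratio $E S_n^4 / n^2 = 3 - 2/n$ moreover tends to $3 = EZ^4$ for a standard normal $Z$, in line with the moment convergence in the central limit theorem discussed above.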
By starting with Khintchine's inequality and Rademacher functions, Marcinkiewicz and Zygmund (1937, 1938) proved (without using the central limit theorem) the following celebrated inequalities: Let $\{X_k,\ k \ge 1\}$ be independent (not necessarily identically distributed) random variables with mean 0 and a finite moment of some order $r \ge 1$. Let $\{S_n,\ n \ge 1\}$ denote the partial sums. Then there exist numerical constants $A_r$ and $B_r$, depending on $r$ only, such that
$$A_r E\Big(\sum_{k=1}^n X_k^2\Big)^{r/2} \le E|S_n|^r \le B_r E\Big(\sum_{k=1}^n X_k^2\Big)^{r/2}. \tag{2.2}$$
If, in particular, the summands are i.i.d., the rightmost member can be estimated by Minkowski's inequality if $r/2 \ge 1$ and by the $c_r$-inequality if $r/2 \le 1$, and one obtains
$$E|S_n|^r \le \begin{cases} B_r\, n\, E|X_1|^r & \text{for } 1 \le r \le 2, \\ B_r\, n^{r/2} E|X_1|^r & \text{for } r \ge 2. \end{cases} \tag{2.3}$$
For $r = 2$ we have, of course, $ES_n^2 = nEX_1^2$ and, for $r = 1$, $E|S_n| \le nE|X_1|$.

Note that (2.3), for $r \ge 2$, can be rewritten as $E|S_n/\sqrt{n}|^r \le B_r E|X_1|^r$. Since, by Billingsley (1968), Theorem 5.3,
$$\liminf_{n \to \infty} E|S_n/\sqrt{n}|^r \ge E|Z|^r,$$
where $Z$ is normal with mean 0 and variance $\sigma^2$, we have established the correct order of magnitude for these moments. That they actually converge is proved in Theorem 1.4.2, by showing that the sequence $\{|S_n/\sqrt{n}|^r,\ n \ge 1\}$ is uniformly integrable.

Moment Inequalities for Martingales

We now turn our attention to martingale extensions of the Marcinkiewicz-Zygmund inequalities (2.2). The first extension is due to Burkholder (1966), Theorem 9, where it is shown that (2.2) remains true for $r > 1$ if $\{X_k,\ k \ge 1\}$ is a sequence of martingale differences and $\{S_n,\ n \ge 1\}$ is a martingale.
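To make the notion of martingale differences concrete, here is a small sketch (the particular predictable multiplier $1 + |X_{k-1}|$ is an arbitrary choice for illustration, not from the text): differences $d_k = \varepsilon_k v_k$, with $\varepsilon_k$ fair $\pm 1$ signs and $v_k$ a function of the past only, satisfy $E(X_k \mid \mathscr{F}_{k-1}) = X_{k-1}$, which can be verified by exhaustive enumeration of sign paths.

```python
from itertools import product

def is_martingale(depth=5):
    # X_k = X_{k-1} + eps_k * v_k, where eps_k is a fair +/-1 sign and the
    # multiplier v_k = 1 + |X_{k-1}| is predictable (a function of the past only).
    # Check E(X_k | eps_1, ..., eps_{k-1}) = X_{k-1} for every possible past.
    for k in range(depth):
        for past in product((-1, 1), repeat=k):
            x = 0.0
            for e in past:                      # evolve X along the given past
                x += e * (1 + abs(x))
            # average the next value over the fair sign eps_k
            avg = sum(x + e * (1 + abs(x)) for e in (-1, 1)) / 2
            if abs(avg - x) > 1e-12:            # conditional mean must equal X_{k-1}
                return False
    return True

assert is_martingale()
```

The differences $d_k$ here are neither independent nor identically distributed, which is exactly the setting in which Burkholder's extension of (2.2) applies, whereas the original Marcinkiewicz-Zygmund inequalities do not.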