Notes from Limit Theorems 2 Mihai Nica
Notes. These are my notes from the class Limit Theorems 2 taught by Professor McKean in Spring 2012. I have tried to carefully go over the bigger theorems from the course and fill in all the details explicitly. There is also a lot of information folded in from other sources.

• The section on Martingales is supplemented with some notes from "A First Look at Rigorous Probability Theory" by Jeffrey S. Rosenthal, which has a really nice introduction to Martingales.
• The section on the law of the iterated logarithm is supplemented with some inequalities which I looked up on the internet, mostly Wikipedia and PlanetMath.
• In the section on the Ergodic theorem, I use a notation I found on Wikipedia that I like for continued fractions. In my pen-and-paper notes there is also a little section about Ergodic theory for geodesics on surfaces, which is really cute. However, I couldn't figure out a good way to draw the pictures, so it hasn't been typed up yet.
• The section on Brownian Motion is supplemented by the book Brownian Motion and Martingales in Analysis by Richard Durrett, which is really wonderful. Some of the slick results are taken straight from there.
• I also include an appendix with results that I found myself reviewing as I went through this material.

Contents

Chapter 1. Martingales
1. Definitions and Examples
2. Stopping Times
3. Martingale Convergence Theorem
4. Applications

Chapter 2. The Law of the Iterated Logarithm
1. First Half of the Law of the Iterated Logarithm
2. Second Half of the Law of the Iterated Logarithm

Chapter 3. Ergodic Theorem
1. Motivation
2. Birkhoff's Theorem
3. Continued Fractions

Chapter 4. Brownian Motion
1. Motivation
2. Lévy's Construction
3. Construction from Durrett's Book
4. Some Properties

Chapter 5. Appendix
1. Conditional Random Variables
2. Extension Theorems

CHAPTER 1

Martingales

1. Definitions and Examples

This section on Martingales makes heavy use of conditional random variables; I give a quick review of this topic from Limit Theorems 1 in the appendix.

Definition 1.1. A sequence of random variables $X_0, X_1, \ldots$ is called a martingale if $E(|X_n|) < \infty$ for all $n$ and, with probability 1,
$$E(X_{n+1} \mid X_0, X_1, \ldots, X_n) = X_n.$$
Intuitively, this says that the average value of $X_{n+1}$ is the same as that of $X_n$, even if we are given the values of $X_0$ through $X_n$. Note that conditioning on $X_0, \ldots, X_n$ is just different notation for conditioning on $\sigma(X_0, \ldots, X_n)$, the sigma algebra generated by preimages of Borel sets through $X_0, \ldots, X_n$. One can make more general martingales by replacing $\sigma(X_0, \ldots, X_n)$ with an arbitrary increasing chain of sigma algebras $\mathcal{F}_n$; the results here carry over to that setting too.

Example 1.2. Sometimes martingales are called "fair games". The analogy is that the random variable $X_n$ represents the bankroll of a gambler at time $n$. The game is fair because, at any point in time, the gambler's expected bankroll after the next round equals his current bankroll.

Definition 1.3. A submartingale is when $E(X_{n+1} \mid X_0, X_1, \ldots, X_n) \ge X_n$ (i.e. the capital is increasing), and a supermartingale is when $E(X_{n+1} \mid X_0, X_1, \ldots, X_n) \le X_n$ (i.e. the capital is decreasing). Most of the theorems for martingales work for submartingales; just change the inequality in the right place. To avoid confusion between sub-, super-, and ordinary martingales, we will sometimes call a martingale a "fair martingale".

Example 1.4. The symmetric random walk, $X_n = Z_0 + Z_1 + \cdots + Z_n$ with each $Z_n = \pm 1$ with probability $\frac{1}{2}$, is a martingale. In terms of the fair game, this is gambling on the outcome of a fair coin.
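The martingale property of Example 1.4 is easy to check numerically. The following is a minimal simulation sketch, not part of the original course notes; the path count, horizon, and random seed are arbitrary choices. It confirms that $E(X_n)$ stays at $E(X_0) = 0$ for every $n$, up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)
num_paths, horizon = 100_000, 50   # arbitrary simulation parameters

# Each Z_n = +1 or -1 with probability 1/2; X_n is the running sum.
steps = rng.choice([-1, 1], size=(num_paths, horizon))
paths = np.cumsum(steps, axis=1)

# E(X_n) should stay at 0 = E(X_0) for every n, up to Monte Carlo error.
print(np.round(paths.mean(axis=0)[:10], 3))
```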
Remark. Using the properties of conditional expectation, we see that
$$E(X_{n+2} \mid X_0, X_1, \ldots, X_n) = E\big(E(X_{n+2} \mid X_0, X_1, \ldots, X_{n+1}) \mid X_0, \ldots, X_n\big) = E(X_{n+1} \mid X_0, \ldots, X_n) = X_n.$$
With a simple argument by induction, we get that in general, for $m \ge n$:
$$E(X_m \mid X_0, X_1, \ldots, X_n) = X_n.$$
In particular, $E(X_n) = E(X_0)$ for every $n$. If $\tau$ is a random "time" (a non-negative integer) that is independent of the $X_n$'s, then $E(X_\tau)$ is a weighted average of the $E(X_n)$'s, so we still have $E(X_\tau) = E(X_0)$. What if $\tau$ is dependent on the $X_n$'s? In general we cannot have equality: in the example of the simple symmetric random walk (coin-flip betting), $\tau =$ the first time that $X_n = -1$ has $E(X_\tau) = -1 \ne 0 = E(X_0)$. The next section gives some conditions under which equality does hold.

2. Stopping Times

Definition 2.1. For a martingale $\{X_n\}$, a non-negative integer valued random variable $\tau$ is a stopping time if it has the property that:
$$\{\tau = n\} \in \sigma(X_1, X_2, \ldots, X_n).$$
Intuitively, this says that one can determine whether $\tau = n$ just by looking at the first $n$ steps of the martingale.

Example 2.2. In the example of random coin flipping, if we let $\tau$ be the first time such that $X_n = 10$, then $\tau$ is a stopping time.

Example 2.3. We are often interested in $X_\tau$, the value of the martingale at the random time $\tau$. This is precisely defined as $X_\tau(\omega) = X_{\tau(\omega)}(\omega)$. Another handy rewriting is $X_\tau = \sum_k X_k 1_{\{\tau = k\}}$.

Lemma 2.4. If $\{X_n\}$ is a submartingale and $\tau_1, \tau_2$ are bounded stopping times, i.e. $\exists M$ s.t. $0 \le \tau_1 \le \tau_2 \le M$ with probability 1, then $E(X_{\tau_1}) \le E(X_{\tau_2})$, with equality for fair martingales.

Proof. For fixed $k$, the event $\{\tau_1 < k \le \tau_2\}$ can be written as
$$\{\tau_1 < k \le \tau_2\} = \{\tau_1 \le k-1\} \cap \{\tau_2 \le k-1\}^C,$$
from which we see that $\{\tau_1 < k \le \tau_2\} \in \sigma(X_0, X_1, \ldots, X_{k-1})$ because $\tau_1$ and $\tau_2$ are both stopping times. We then have the following manipulation, using a telescoping sum, linearity of expectation, the fact that $E(Y 1_A) = E(E(Y \mid X_0, X_1, \ldots, X_{k-1}) 1_A)$ for events $A \in \sigma(X_0, X_1, \ldots, X_{k-1})$, and finally the fact that $E(X_k \mid X_0, X_1, \ldots, X_{k-1}) - X_{k-1} \ge 0$ since $X_n$ is a (sub)martingale (with equality for fair martingales):
$$\begin{aligned}
E(X_{\tau_2}) - E(X_{\tau_1}) &= E(X_{\tau_2} - X_{\tau_1}) \\
&= E\Big(\sum_{k=\tau_1+1}^{\tau_2} X_k - X_{k-1}\Big) \\
&= E\Big(\sum_{k=1}^{M} (X_k - X_{k-1})\, 1_{\{\tau_1 < k \le \tau_2\}}\Big) \\
&= \sum_{k=1}^{M} E\big((X_k - X_{k-1})\, 1_{\{\tau_1 < k \le \tau_2\}}\big) \\
&= \sum_{k=1}^{M} E\big((E(X_k \mid X_0, X_1, \ldots, X_{k-1}) - X_{k-1})\, 1_{\{\tau_1 < k \le \tau_2\}}\big) \\
&\ge \sum_{k=1}^{M} E\big(0 \cdot 1_{\{\tau_1 < k \le \tau_2\}}\big) = 0,
\end{aligned}$$
where the inequality is an equality in the case of a fair martingale.

Theorem 2.5. Say $\{X_n\}$ is a martingale and $\tau$ a bounded stopping time (that is, $\exists M$ s.t. $0 \le \tau \le M$ with probability 1). Then:
$$E(X_\tau) = E(X_0).$$

Proof. Let $\upsilon$ be the random variable which is constantly 0. This is a stopping time! So by the above lemma, since $0 \le \upsilon \le \tau \le M$, we have that $E(X_\tau) = E(X_\upsilon) = E(X_0)$.

Theorem 2.6. For $\{X_n\}$ a martingale and $\tau$ a stopping time which is almost surely finite (that is, $P(\tau < \infty) = 1$), we have:
$$E(X_\tau) = E(X_0) \iff E\Big(\lim_{n\to\infty} X_{\min(\tau,n)}\Big) = \lim_{n\to\infty} E\big(X_{\min(\tau,n)}\big).$$

Proof. It suffices to show that $E(X_\tau) = E(\lim_{n\to\infty} X_{\min(\tau,n)})$ and $E(X_0) = \lim_{n\to\infty} E(X_{\min(\tau,n)})$. The first equality holds since $P(\tau < \infty) = 1$ gives $P(\lim_{n\to\infty} X_{\min(\tau,n)} = X_\tau) = 1$, so the two agree almost surely. The second holds by the above theorem concerning bounded stopping times: for any $n$, $\min(\tau, n)$ is a bounded stopping time, so $E(X_{\min(\tau,n)}) = E(X_0)$, and hence equality holds in the limit too.
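Theorem 2.6 pinpoints why the coin-flip example with $\tau =$ first hit of $-1$ fails: the exchange of limit and expectation is not allowed there. Here is a small simulation sketch of this phenomenon (my own illustration, not from the lectures; all parameters are arbitrary): $E(X_{\min(\tau,n)})$ stays at 0 for every $n$ by Theorem 2.5, yet $X_{\min(\tau,n)} \to -1$ almost surely.

```python
import numpy as np

rng = np.random.default_rng(1)
num_paths, horizon = 100_000, 400  # arbitrary simulation parameters

steps = rng.choice([-1, 1], size=(num_paths, horizon))
paths = np.cumsum(steps, axis=1)

# tau = first time the walk hits -1: almost surely finite, but unbounded.
# Paths that never hit within the horizon get the last index as a stand-in.
hit = paths == -1
tau = np.where(hit.any(axis=1), hit.argmax(axis=1), horizon - 1)

for n in (10, 100, horizon - 1):
    t = np.minimum(tau, n)                    # the bounded time min(tau, n)
    stopped = paths[np.arange(num_paths), t]  # X_{min(tau,n)}, zero-indexed
    print(n, stopped.mean())                  # approximately 0 for every n
# Yet X_{min(tau,n)} -> -1 pointwise as n grows, so
# E(lim X_{min(tau,n)}) = -1 != 0 = lim E(X_{min(tau,n)}).
```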
Remark. The above theorem can be combined with results like the monotone convergence theorem or the Lebesgue dominated convergence theorem to justify switching the limits and conclude that $E(X_\tau) = E(X_0)$. Here are some examples:

Example 2.7. If $\{X_n\}$ is a martingale and $\tau$ a stopping time such that $P(\tau < \infty) = 1$, $E(|X_\tau|) < \infty$, and $\lim_{n\to\infty} E(X_n 1_{\tau > n}) = 0$, then $E(X_\tau) = E(X_0)$.

Proof. For any $n$ we have:
$$X_{\min(\tau,n)} = X_n 1_{\tau > n} + X_\tau 1_{\tau \le n}.$$
Taking expectations and then the limit as $n \to \infty$ gives:
$$\lim_{n\to\infty} E(X_{\min(\tau,n)}) = \lim_{n\to\infty} E(X_n 1_{\tau > n}) + \lim_{n\to\infty} E(X_\tau 1_{\tau \le n}) = 0 + E(X_\tau),$$
where the first term is 0 by hypothesis, and the second limit is justified since $X_\tau 1_{\tau \le n} \to X_\tau$ pointwise almost surely (because $P(\tau < \infty) = 1$), and $|X_\tau|$ with $E(|X_\tau|) < \infty$ serves as a dominating majorant, so the Lebesgue dominated convergence theorem gives convergence of the expectations. Since $E(X_{\min(\tau,n)}) = E(X_0)$ for every $n$ by Theorem 2.5, the left-hand side equals $E(X_0)$, which gives the conclusion.

Example 2.8. Suppose $\{X_n\}$ is a martingale and $\tau$ a stopping time such that $E(\tau) < \infty$ and $|X_{n+1} - X_n| \le M < \infty$ for some fixed $M$ and for every $n$. Then $E(X_\tau) = E(X_0)$.

Proof. Let $Y = |X_0| + M\tau$. Then $E(Y) < \infty$ and $|X_{\min(\tau,n)}| \le Y$ for every $n$, so $Y$ can be used as a dominating majorant in an application of the Lebesgue dominated convergence theorem very similar to the above example to get the conclusion.

3. Martingale Convergence Theorem

The proof of the martingale convergence theorem relies on the famous upcrossing lemma:

Lemma 3.1 (The Upcrossing Lemma). Let $\{X_n\}$ be a submartingale. For fixed $\alpha, \beta \in \mathbb{R}$ with $\beta > \alpha$, and $M \in \mathbb{N}$, let $U_M^{\alpha,\beta}$ be the number of "upcrossings" that the martingale $\{X_n\}$ makes of the interval $[\alpha, \beta]$ in the time period $1 \le n \le M$.
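For concreteness, here is a rough sketch (again my own illustration, not from the notes; the interval, walk, and function name are arbitrary choices) of what the upcrossing count measures: $U_M^{\alpha,\beta}$ counts the completed passages of the path from below $\alpha$ to above $\beta$.

```python
import numpy as np

def count_upcrossings(path, alpha, beta):
    """Count completed passages of the path from <= alpha to >= beta."""
    count, below = 0, False
    for x in path:
        if not below and x <= alpha:
            below = True        # path has dipped below alpha; watch for beta
        elif below and x >= beta:
            count += 1          # one upcrossing of [alpha, beta] completed
            below = False
    return count

rng = np.random.default_rng(2)
path = np.cumsum(rng.choice([-1, 1], size=1000))  # a sample random walk path
print(count_upcrossings(path, alpha=-2, beta=3))
```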