Probability Theory Oral Exam Study Notes
Probability Theory Oral Exam study notes

Notes transcribed by Mihai Nica

Abstract. These are some study notes that I made while studying for my oral exam on the topic of Probability Theory. I took these notes from a few different sources, and they are not in any particular order; they tend to move around a lot. They also skip some of the basics of measure theory, which are covered in the real analysis notes. Please be extremely cautious with these notes: they are rough notes and were originally only meant to help me study. They are not complete and likely contain errors. I have made them available to help other students on their oral exams. (Note: I specialized in probability theory, so these go a bit further into a few topics than most people would for their oral exam.) See also the sections on Conditional Expectation and the Law of the Iterated Logarithm from my Limit Theorems II notes.

Contents

Independence and Weak Law of Large Numbers
  1.1. Independence
  1.2. Weak Law of Large Numbers
Borel-Cantelli Lemmas
  2.3. Borel-Cantelli Lemmas
  2.4. Bounded Convergence Theorem
Central Limit Theorems
  3.5. The De Moivre-Laplace Theorem
  3.6. Weak Convergence
  3.7. Characteristic Functions
  3.8. The Moment Problem
  3.9. The Central Limit Theorem
  3.10. Other Facts about CLT Results
  3.11. Law of the Iterated Logarithm
Moment Methods
  4.12. Basics of the Moment Method
  4.13. Poisson RVs
  4.14. Central Limit Theorem
Martingales
  5.15. Martingales
  5.16. Stopping Times
Uniform Integrability
  6.17. An 'absolute continuity' property
  6.18. Definition of a UI family
  6.19. Two simple sufficient conditions for the UI property
  6.20. UI property of conditional expectations
  6.21. Convergence in Probability
  6.22. Elementary Proof of the Bounded Convergence Theorem
  6.23. Necessary and Sufficient Conditions for L1 convergence
UI Martingales
  7.24. UI Martingales
  7.25. Levy's 'Upward' Theorem
  7.26. Martingale Proof of the Kolmogorov 0-1 Law
  7.27. Levy's 'Downward' Theorem
  7.28. Martingale Proof of the Strong Law
  7.29. Doob's Submartingale Inequality
Kolmogorov's Three Series Theorem using L2 Martingales
A Few Different Proofs of the LLN
  9.30. Truncation Lemma
  9.31. Truncation + K3 Theorem + Kronecker Lemma
  9.32. Truncation + Sparsification
  9.33. Levy's Downward Theorem and the 0-1 Law
  9.34. Ergodic Theorem
  9.35. Non-convergence for infinite mean
Ergodic Theorems
  10.36. Definitions and Examples
  10.37. Birkhoff's Ergodic Theorem
  10.38. Recurrence
  10.39. A Subadditive Ergodic Theorem
  10.40. Applications
Large Deviations
  10.41. LDP for Finite Dimensional Spaces
  10.42. Cramer's Theorem
Bibliography

Independence and Weak Law of Large Numbers

These are notes from Chapter 2 of [2].

1.1. Independence

Definition. Events A and B are independent if P(A ∩ B) = P(A)P(B). Random variables X and Y are independent if P(X ∈ C, Y ∈ D) = P(X ∈ C)P(Y ∈ D) for all Borel sets C, D ⊆ ℝ. Two σ-algebras F and G are independent if every A ∈ F and B ∈ G are independent.

Exercise. (2.1.1) i) Show that if X and Y are independent then σ(X) and σ(Y) are. ii) Show that if X is F-measurable, Y is G-measurable, and F and G are independent, then X and Y are independent.

Proof. This is immediate from the definitions.

Exercise. (2.1.2) i) Show that if A and B are independent then Aᶜ and B are independent too. ii) Show that A and B are independent iff 1_A and 1_B are independent.

Proof. i) P(B) = P(A ∩ B) + P(Aᶜ ∩ B) = P(A)P(B) + P(Aᶜ ∩ B); rearranging gives P(Aᶜ ∩ B) = (1 − P(A))P(B) = P(Aᶜ)P(B). ii) Simple, using that {1_A ∈ C} equals A when 1 ∈ C and 0 ∉ C, and equals Aᶜ, Ω, or ∅ in the other cases.

Remark. By the above quick exercises, all of independence is defined by independence of σ-algebras, so we will view that as the central object.

Definition. F_1, F_2, … are independent if for any finite index set I ⊂ ℕ and any sets A_i ∈ F_i for i ∈ I, we have P(∩_{i∈I} A_i) = ∏_{i∈I} P(A_i). X_1, X_2, … are independent if σ(X_1), σ(X_2), … are independent. A_1, A_2, … are independent if 1_{A_1}, 1_{A_2}, … are independent.

Exercise. (2.1.3)
Same as previous exercise with more than two sets.

Example. (2.1.1) The usual example of three events which are pairwise independent but not independent, on the space of three fair coin flips.

1.1.1. Sufficient Conditions for Independence. We will work our way to Theorem 2.1.3, which is the main result for this subsection.

Definition. We call a collection of classes of sets A_1, A_2, …, A_n independent if any collection A_i ∈ A_i, for i in an index set I ⊂ {1, …, n}, is independent. (Just like the definition for σ-algebras, only we don't require each A_i to be a σ-algebra.)

Lemma. (2.1.1) If we suppose that each A_i contains Ω, then the criterion for independence works with I = {1, …, n}.

Proof. Putting A_k = Ω doesn't change the intersection, and it doesn't change the product since P(A_k) = 1.

Definition. A π-system is a collection A which is closed under intersections, i.e. A, B ∈ A ⇒ A ∩ B ∈ A. A λ-system is a collection L that satisfies: i) Ω ∈ L; ii) A, B ∈ L and A ⊂ B ⇒ B − A ∈ L; iii) A_n ∈ L and A_n ↑ A ⇒ A ∈ L.

Remark. (Mihai - from Wiki) An equivalent definition of a λ-system is: i) Ω ∈ L; ii) A ∈ L ⇒ Aᶜ ∈ L; iii) A_1, A_2, … ∈ L disjoint ⇒ ∪_{n=1}^∞ A_n ∈ L. In this form, the definition of a λ-system is more comparable to the definition of a σ-algebra, and we can see that it is strictly easier to be a λ-system than a σ-algebra (we only need closure under disjoint unions rather than arbitrary countable unions). The first definition, presented by Durrett, is however more useful since it is easier to check in practice!

Theorem. (2.1.2) (Dynkin's π-λ Theorem) If P is a π-system and L is a λ-system that contains P, then σ(P) ⊂ L.

Proof. In the appendix, apparently. Will come back to this when I do measure theory.

Theorem. (2.1.3) Suppose A_1, …, A_n are independent and each A_i is a π-system. Then σ(A_1), …, σ(A_n) are independent.

Proof.
(You can basically reduce to the case n = 2.) Fix any A_2, …, A_n in A_2, …, A_n respectively and let F = A_2 ∩ … ∩ A_n. Let L = {A : P(A ∩ F) = P(A)P(F)}. Then we verify that L is a λ-system by using basic properties of P. For A ⊂ B, both in L, we have:

P((B − A) ∩ F) = P(B ∩ F) − P(A ∩ F) = P(B)P(F) − P(A)P(F) = P(B − A)P(F)

(increasing limits are easy by continuity of probability). By the π-λ theorem, σ(A_1) ⊂ L; since this works for any such F, we then have that σ(A_1), A_2, A_3, …, A_n are independent. Iterating the argument n − 1 more times gives the desired result.

Remark. (Durrett) The reason the π-λ theorem is helpful here is that it is hard to check, for A, B ∈ L, that A ∩ B ∈ L or A ∪ B ∈ L. However, it is easy to check that if A, B ∈ L with A ⊂ B then B − A ∈ L. The π-λ theorem converts these π- and λ-systems (which are easier to work with) into σ-algebras (which are harder to work with, but more useful).

Theorem. (2.1.4) In order for X_1, X_2, …, X_n to be independent, it is sufficient that for all x_1, x_2, …, x_n ∈ (−∞, ∞]:

P(X_1 ≤ x_1, …, X_n ≤ x_n) = ∏_{i=1}^n P(X_i ≤ x_i)

Proof. Let A_i be the collection of sets of the form {X_i ≤ x_i}. It is easy to check that this is a π-system, so the result is a direct application of the previous theorem.

Exercise. (2.1.4) Suppose (X_1, …, X_n) has density f(x_1, …, x_n) and f can be written as a product g_1(x_1) · … · g_n(x_n). Show that the X_i's are independent.

Proof. Let g̃_i(x_i) = c_i g_i(x_i), where c_i is chosen so that ∫ g̃_i = 1. One can verify that ∏ c_i = 1 from f = ∏ g_i and ∫ f = 1. Then integrate along the margins to see that each g̃_i is in fact a pdf for X_i. Can then apply Theorem 2.1.4 after replacing the g's by the g̃'s.

Exercise. (2.1.5) Same as 2.1.4 but on a discrete space, with a probability mass function instead of a probability density function.

Proof. Works the same but with sums instead of integrals.
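The normalization step in the proof of Exercise 2.1.4 can be sanity-checked numerically. The sketch below is my own illustration, not from the notes (the specific factors g1, g2 and the helper `trapezoid` are illustrative choices): it takes a joint density that factors as f(x, y) = g1(x)g2(y), computes the constants c_i = 1/∫g_i by quadrature, and confirms that c1 · c2 = 1, so that each g̃_i = c_i g_i is a genuine marginal pdf.

```python
import math

# Illustration (not from the notes) of the normalization in Exercise 2.1.4:
# if a joint density factors as f(x, y) = g1(x) * g2(y), put c_i = 1 / (integral of g_i);
# then g~_i = c_i * g_i is a genuine pdf, and c1 * c2 = 1 since the double
# integral of f equals (integral of g1) * (integral of g2) = 1.

def trapezoid(h, a, b, n=100_000):
    """Trapezoidal-rule approximation of the integral of h over [a, b]."""
    dx = (b - a) / n
    total = 0.5 * (h(a) + h(b))
    for k in range(1, n):
        total += h(a + k * dx)
    return total * dx

g1 = lambda x: math.exp(-2.0 * x)      # unnormalized factor in x (illustrative choice)
g2 = lambda y: 2.0 * math.exp(-y)      # unnormalized factor in y (illustrative choice)
# f(x, y) = g1(x) * g2(y) integrates to 1 over [0, inf)^2, so it is a joint pdf.

I1 = trapezoid(g1, 0.0, 50.0)          # integral of g1, about 1/2 (tail past 50 is negligible)
I2 = trapezoid(g2, 0.0, 50.0)          # integral of g2, about 2
c1, c2 = 1.0 / I1, 1.0 / I2

print(c1 * c2)                          # should be 1 up to quadrature error
```

Here c1 ≈ 2 and c2 ≈ 1/2, so g̃1 = c1 g1 is the Exp(2) density and g̃2 = c2 g2 is the Exp(1) density, matching the factorization f(x, y) = 2e^{−2x} e^{−y} into its true marginals.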
We will now prove that functions of independent random variables are still independent (to be made more precise in a bit).

Theorem. (2.1.5) Suppose F_{i,j}, 1 ≤ i ≤ n and 1 ≤ j ≤ m(i), are independent σ-algebras and let G_i = σ(∪_j F_{i,j}).
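The point of Theorem 2.1.5 is that σ-algebras generated by disjoint groups of independent σ-algebras are themselves independent; in particular, a function of (X_1, X_2) is independent of a function of (X_3, X_4) when the X_i are independent. This can be verified exactly in a small finite example. The sketch below is my own illustration, not from the notes (the choice of four fair bits and the functions XOR and AND are arbitrary):

```python
# Exact finite check (illustration, not from the notes) of the spirit of
# Theorem 2.1.5: with four independent fair bits, U = X1 XOR X2 is
# measurable with respect to the sigma-algebra generated by (X1, X2), and
# V = X3 AND X4 with respect to the one generated by (X3, X4); since the
# two groups of coordinates are disjoint, U and V are independent.
from itertools import product
from fractions import Fraction

outcomes = list(product([0, 1], repeat=4))   # 16 equally likely sample points
p = Fraction(1, 16)

def prob(event):
    """Exact probability of {w : event(w)} under the uniform measure."""
    return sum(p for w in outcomes if event(w))

U = lambda w: w[0] ^ w[1]       # a function of (X1, X2)
V = lambda w: w[2] & w[3]       # a function of (X3, X4)

for u in (0, 1):
    for v in (0, 1):
        joint = prob(lambda w: U(w) == u and V(w) == v)
        marginal_product = prob(lambda w: U(w) == u) * prob(lambda w: V(w) == v)
        assert joint == marginal_product     # exact equality with Fractions

print("U and V are independent")
```

Using `Fraction` makes the check exact rather than approximate: every event probability is a rational number with denominator 16, so the factorization P(U = u, V = v) = P(U = u)P(V = v) is tested with no floating-point tolerance.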