Introduction to Probability
Ariel Yadin
Lecture 7
1. Independence Revisited  *** Dec. 6 ***

1.1. Some reminders. Let (Ω, F, P) be a probability space. Given a collection of subsets K ⊂ F, recall that the σ-algebra generated by K is

    σ(K) = ∩ { G : G is a σ-algebra, K ⊂ G },

and this σ-algebra is the smallest σ-algebra containing K. σ(K) can be thought of as all possible information that can be generated by the sets in K.

Recall that a collection of events (An)n is mutually independent if for any finite number of these events A1, …, An we have

    P(A1 ∩ ··· ∩ An) = P(A1) ··· P(An).

We can also define independence of families of events:

Definition 7.1. Let (Ω, F, P) be a probability space. Let (Kn)n be a collection of families of events in F. Then (Kn)n are mutually independent if for any finite number of families K1, …, Kn from this collection and any events A1 ∈ K1, A2 ∈ K2, …, An ∈ Kn, we have

    P(A1 ∩ ··· ∩ An) = P(A1) ··· P(An).

We would like to think that (An) are mutually independent exactly when the information carried by each event in the sequence is independent of the information carried by the other events in the sequence. This is the content of the next proposition. Note that σ(A) = {∅, A, A^c, Ω}, which is why the proof below only needs to check four cases.

Proposition 7.2. Let (An) be a sequence of events on a probability space (Ω, F, P). Then (An) are mutually independent if and only if the σ-algebras (σ(An))n are mutually independent.

It is not difficult to prove this by induction:

Proof. By induction on n we show that (σ(A1), …, σ(An), {A_{n+1}}, {A_{n+2}}, …) are mutually independent. The base case n = 0 is just the assumption.

So assume that (σ(A1), …, σ(A_{n−1}), {An}, {A_{n+1}}, …) are mutually independent. Let n_1 < n_2 < ··· < n_k < n_{k+1} < ··· < n_m be a finite number of indices such that n_k < n < n_{k+1}. Let Bj ∈ σ(A_{n_j}) for j = 1, …, k, and Bj = A_{n_j} for j = k+1, …, m. Let B ∈ σ(An).

If B = ∅ then P(B1 ∩ ··· ∩ Bm ∩ B) = 0 = P(B1) ··· P(Bm) · P(B).
If B = Ω then P(B1 ∩ ··· ∩ Bm ∩ B) = P(B1 ∩ ··· ∩ Bm) = P(B1) ··· P(Bm), by the induction hypothesis. If B = An then this also holds by the induction hypothesis. So we only have to deal with the case B = An^c. In this case, writing X = B1 ∩ ··· ∩ Bm,

    P(X ∩ B) = P(X \ (X ∩ An)) = P(X) − P(X ∩ An) = P(X)(1 − P(An)) = P(X) P(B),

where we have used the induction hypothesis to say that X and An are independent. Since this holds for any choice of a finite number of events, we have the induction step. □

However, we will take a more winding road in order to get stronger results (Corollary 7.8).

2. π-systems and Independence

Definition 7.3. Let K be a family of subsets of Ω.
• We say that K is a π-system if ∅ ∈ K and K is closed under intersections; that is, for all A, B ∈ K, A ∩ B ∈ K.
• We say that K is a Dynkin system (or λ-system) if ∅ ∈ K and K is closed under set complements and countable disjoint unions; that is, for any A ∈ K and any sequence (An) in K of pairwise disjoint subsets, we have A^c ∈ K and ∪n An ∈ K.

The main goal now is to show that probability measures are uniquely determined once they are defined on a π-system. This is the content of Theorem 7.7.

Proposition 7.4. If F is a Dynkin system on Ω and F is also a π-system on Ω, then F is a σ-algebra.

Proof. Since F is a Dynkin system, ∅ ∈ F and F is closed under complements, so Ω = ∅^c ∈ F.

Let (An) be a sequence of subsets in F. Set B1 = A1, C1 = ∅, and for n > 1,

    Cn = ∪_{j=1}^{n−1} Aj   and   Bn = An \ Cn.

Since F is a π-system and closed under complements, Cn^c = ∩_{j=1}^{n−1} Aj^c ∈ F, and so Bn = An ∩ Cn^c ∈ F. Since (Bn) is a sequence of pairwise disjoint sets, and since F is a Dynkin system,

    ∪n An = ∪n Bn ∈ F.  □

Proposition 7.5. If (Dα)α is a collection of Dynkin systems (not necessarily countable), then D = ∩α Dα is a Dynkin system. *** leave as exercise ***

Proof. Since ∅ ∈ Dα for all α, we have that ∅ ∈ D.

If A ∈ D, then A ∈ Dα for all α. So A^c ∈ Dα for all α, and thus A^c ∈ D.
If (An)n is a countable sequence of pairwise disjoint sets in D, then An ∈ Dα for all α and all n. Thus, for any α, ∪n An ∈ Dα. So ∪n An ∈ D. □

Lemma 7.6 (Dynkin's Lemma). If a Dynkin system D contains a π-system K, then σ(K) ⊂ D.

Proof. Let

    F = ∩ { D′ : D′ is a Dynkin system containing K }.

By Proposition 7.5, F is a Dynkin system, and K ⊂ F ⊂ D. We will show that F is a σ-algebra, so that σ(K) ⊂ F ⊂ D.

Suppose we knew that F is closed under intersections (which is Claim 3 below). Since ∅ ∈ K ⊂ F, we would then have that F is a π-system. Being both a Dynkin system and a π-system, F would be a σ-algebra by Proposition 7.4. Thus, to show that F is a σ-algebra, it suffices to show that F is closed under intersections. Note that F is closed under complements (because all Dynkin systems are).

Claim 1. If A ⊂ B are sets in F, then B \ A ∈ F.

Proof. If A, B ∈ F then, since F is a Dynkin system, also B^c ∈ F. Since A ⊂ B, the sets A and B^c are disjoint, so A ∪ B^c ∈ F, and so B \ A = A^c ∩ B = (A ∪ B^c)^c ∈ F. □

Claim 2. For any K ∈ K, if A ∈ F then A ∩ K ∈ F.

Proof. Let E = { A : A ∩ K ∈ F }. First, ∅ ∈ E, since ∅ ∩ K = ∅ ∈ F. Let A ∈ E and let (An) be a sequence of pairwise disjoint sets in E. Since K ∈ F and A ∩ K ∈ F, by Claim 1 we have A^c ∩ K = K \ (A ∩ K) ∈ F. So A^c ∈ E. Since (An ∩ K)n is a sequence of pairwise disjoint sets in F, we get that

    (∪n An) ∩ K = ∪n (An ∩ K) ∈ F,

so ∪n An ∈ E. We conclude that E is a Dynkin system. Since K is closed under intersections, E contains K. Thus, by the definition of F, F ⊂ E. So for any A ∈ F we have A ∈ E, that is, A ∩ K ∈ F. □

Claim 3. For any B ∈ F, if A ∈ F then A ∩ B ∈ F.

Proof. Let E = { A : A ∩ B ∈ F }. Again ∅ ∈ E, since ∅ ∩ B = ∅ ∈ F. Let A ∈ E and let (An) be a sequence of pairwise disjoint sets in E. Since B ∈ F and A ∩ B ∈ F, by Claim 1 we have A^c ∩ B = B \ (A ∩ B) ∈ F. So A^c ∈ E. Since (An ∩ B)n is a sequence of pairwise disjoint sets in F, we get that

    (∪n An) ∩ B = ∪n (An ∩ B) ∈ F,

so ∪n An ∈ E. We conclude that E is a Dynkin system. By Claim 2, K is contained in E. So, by the definition of F, F ⊂ E.
□

Since F is closed under intersections, this completes the proof. □

The next theorem tells us that a probability measure on (Ω, F) is determined by its values on a π-system generating F.

Theorem 7.7 (Uniqueness of Extension). Let K be a π-system on Ω, and let F = σ(K) be the σ-algebra generated by K. Let P, Q be two probability measures on (Ω, F) such that P(A) = Q(A) for all A ∈ K. Then P(B) = Q(B) for all B ∈ F.

Proof. Let D = { A ∈ F : P(A) = Q(A) }. So K ⊂ D. We will show that D is a Dynkin system; since it contains K, by Dynkin's Lemma it must then contain F = σ(K).

Of course P(∅) = 0 = Q(∅) and P(Ω) = 1 = Q(Ω), so ∅, Ω ∈ D. If A ∈ D, then P(A^c) = 1 − P(A) = 1 − Q(A) = Q(A^c), so A^c ∈ D. Let (An) be a sequence of pairwise disjoint sets in D. Then, by countable additivity,

    P(∪n An) = Σn P(An) = Σn Q(An) = Q(∪n An).

So ∪n An ∈ D. □

Corollary 7.8. Let (Ω, F, P) be a probability space. Let (Πn)n be a sequence of π-systems, and let Fn = σ(Πn). Then (Πn)n are mutually independent if and only if (Fn)n are mutually independent.

Proof. We prove by induction on n that for any n ≥ 0, the collection (F1, F2, …, Fn, Π_{n+1}, Π_{n+2}, …) is mutually independent. For n = 0 this is the assumption.

For n ≥ 1, let n_1 < n_2 < ··· < n_k < n_{k+1} < ··· < n_m be a finite number of indices such that n_k < n < n_{k+1}. Let Aj ∈ F_{n_j} for j = 1, …, k, and Aj ∈ Π_{n_j} for j = k+1, …, m. For any A ∈ Fn, if P(A1 ∩ ··· ∩ Am) = 0 then A is independent of A1 ∩ ··· ∩ Am. So assume that P(A1 ∩ ··· ∩ Am) > 0.
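The π-system hypothesis in Theorem 7.7 is essential, and on a finite sample space this is easy to check by brute force. The following sketch (plain Python; the helper names generated_sigma and measure are ours, not from the notes) exhibits two probability measures on Ω = {1, 2, 3, 4} that agree on the generating sets {1, 2} and {2, 3}, which do not form a π-system, yet disagree on the intersection {2}, which lies in the generated σ-algebra:

```python
def generated_sigma(omega, gens):
    """Sigma-algebra generated by gens on a finite set omega:
    close under complement and pairwise union (the space is finite,
    so this closure already equals sigma(gens))."""
    O = frozenset(omega)
    fam = {frozenset(), O} | {frozenset(g) for g in gens}
    while True:
        new = {O - A for A in fam} | {A | B for A in fam for B in fam}
        if new <= fam:
            return fam
        fam |= new

def measure(weights):
    """Probability measure determined by point weights on a finite space."""
    return lambda A: sum(weights[x] for x in A)

omega = {1, 2, 3, 4}
gens = [{1, 2}, {2, 3}]   # NOT a pi-system: the intersection {2} is missing

P = measure({1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25})   # uniform
Q = measure({1: 0.50, 2: 0.00, 3: 0.50, 4: 0.00})

# P and Q agree on both generating sets...
assert P(frozenset({1, 2})) == Q(frozenset({1, 2})) == 0.5
assert P(frozenset({2, 3})) == Q(frozenset({2, 3})) == 0.5

# ...but disagree on {2} = {1,2} ∩ {2,3}, an element of sigma(gens).
F = generated_sigma(omega, gens)
assert frozenset({2}) in F
print(P(frozenset({2})), Q(frozenset({2})))   # 0.25 0.0
```

Had we enlarged gens to the π-system {∅, {2}, {1, 2}, {2, 3}}, agreement on it would pin down both measures on every singleton, and hence on all of σ(gens), exactly as Theorem 7.7 predicts.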