
Topic 6
Conditional Probability and Independence

One of the most important concepts in the theory of probability is based on the question: How do we modify the probability of an event in light of the fact that something new is known? What is the chance that we will win the game now that we have taken the first point? What is the chance that I am a carrier of a genetic disease now that my first child does not have the genetic condition? What is the chance that a child smokes if the household has two parents who smoke? This question leads us to the concept of conditional probability.

6.1 Restricting the Sample Space - Conditional Probability

Toss a fair coin 3 times. Let winning be "at least two heads out of three".

    HHH  HHT  HTH  HTT
    THH  THT  TTH  TTT

Figure 6.1: Outcomes on three tosses of a coin, with the winning event {HHH, HHT, HTH, THH} indicated.

If we now know that the first coin toss is heads, then only the top row is possible and we would like to say that the probability of winning is

    #(outcomes that result in a win and also have a heads on the first coin toss) / #(outcomes with heads on the first coin toss)

    = #{HHH, HHT, HTH} / #{HHH, HHT, HTH, HTT} = 3/4.

We can take this idea to create a formula, in the case of equally likely outcomes, for the statement "the conditional probability of A given B":

    P(A | B) = the proportion of outcomes in B that are also in A = #(A ∩ B) / #(B).

We can turn this into a more general statement using only the probability, P, by dividing both the numerator and the denominator in this fraction by #(Ω):

    P(A | B) = [#(A ∩ B)/#(Ω)] / [#(B)/#(Ω)] = P(A ∩ B) / P(B).    (6.1)

We thus take this version (6.1) of the identity as the general definition of conditional probability for any pair of events A and B as long as the denominator P(B) > 0.

Figure 6.2: Two Venn diagrams to illustrate conditional probability. For the top diagram P(A) is large but P(A | B) is small. For the bottom diagram P(A) is small but P(A | B) is large.

Exercise 6.1. Pick an event B so that P(B) > 0. Define, for every event A, Q(A) = P(A | B). Show that Q satisfies the three axioms of a probability. In words, a conditional probability is a probability.

Exercise 6.2. Roll two dice. Find P{sum is 8 | first die shows 3} and P{sum is 8 | first die shows 1}.

    (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
    (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
    (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
    (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
    (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
    (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

Figure 6.3: Outcomes on the roll of two dice. The event {first roll is 3} is indicated.

Exercise 6.3. Roll two four-sided dice. With the numbers 1 through 4 on each die, the value of the roll is the number on the side facing downward. Assuming all 16 outcomes are equally likely, find P{sum is at least 5}, P{first die is 2}, and P{sum is at least 5 | first die is 2}.

6.2 The Multiplication Principle

The defining formula (6.1) for conditional probability can be rewritten to obtain the multiplication principle,

    P(A ∩ B) = P(A | B) P(B).    (6.2)

Now, we can complete an earlier problem:

    P{ace on first two cards} = P{ace on second card | ace on first card} · P{ace on first card}
                              = 3/51 × 4/52 = 1/17 × 1/13.
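To see the multiplication principle numerically, here is a minimal Python sketch (not part of the text; the function name estimate_two_aces, the rank-only deck encoding, and the number of trials are choices made purely for illustration). It estimates P{ace on first two cards} by repeatedly shuffling a 52-card deck and compares the estimate with the exact value given by (6.2).

```python
import random

def estimate_two_aces(num_trials=200_000, seed=0):
    """Estimate P{ace on first two cards} by shuffling a 52-card deck."""
    rng = random.Random(seed)
    # Represent the deck by ranks only: four copies of each of 13 ranks.
    # Rank 0 plays the role of the ace.
    deck = [rank for rank in range(13) for _ in range(4)]
    hits = 0
    for _ in range(num_trials):
        rng.shuffle(deck)
        if deck[0] == 0 and deck[1] == 0:  # ace on both of the first two cards
            hits += 1
    return hits / num_trials

exact = (3 / 51) * (4 / 52)  # multiplication principle (6.2); equals 1/221
print(f"exact value     : {exact:.5f}")
print(f"simulated value : {estimate_two_aces():.5f}")
```

With a couple hundred thousand shuffles the simulated value should land close to 1/221 ≈ 0.0045.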
We can continue this process to obtain a chain rule:

    P(A ∩ B ∩ C) = P(A | B ∩ C) P(B ∩ C) = P(A | B ∩ C) P(B | C) P(C).

Thus,

    P{ace on first three cards}
      = P{ace on third card | ace on first and second cards} · P{ace on second card | ace on first card} · P{ace on first card}
      = 2/50 × 3/51 × 4/52 = 1/25 × 1/17 × 1/13.

Extending this to 4 events, we consider the following question:

Example 6.4. In an urn with b blue balls and g green balls, the probability of green, blue, green, blue (in that order) is

    g/(b+g) · b/(b+g−1) · (g−1)/(b+g−2) · (b−1)/(b+g−3) = (g)_2 (b)_2 / (b+g)_4,

where (g)_2 = g(g−1), (b)_2 = b(b−1), and (b+g)_4 = (b+g)(b+g−1)(b+g−2)(b+g−3). Notice that any choice of 2 green and 2 blue would result in the same probability. There are \binom{4}{2} = 6 such choices. Thus, with 4 balls chosen without replacement,

    P{2 blue and 2 green} = \binom{4}{2} (g)_2 (b)_2 / (b+g)_4.

Exercise 6.5. Show that

    \binom{4}{2} (g)_2 (b)_2 / (b+g)_4 = \binom{b}{2} \binom{g}{2} / \binom{b+g}{4}.

Explain in words why P{2 blue and 2 green} is the expression on the right.

We will later extend this idea when we introduce sampling without replacement in the context of the hypergeometric random variable.

6.3 The Law of Total Probability

If we know the fraction of the population in a given state of the United States that has a given attribute - is diabetic, over 65 years of age, has an income of $100,000, owns their own home, is married - then how do we determine what fraction of the total population of the United States has this attribute? We address this question by introducing a concept - partitions - and an identity - the law of total probability.

Definition 6.6. A partition of the sample space Ω is a finite collection of pairwise mutually exclusive events {C_1, C_2, ..., C_n} whose union is Ω.

Thus, every outcome ω ∈ Ω belongs to exactly one of the C_i. In particular, distinct members of the partition are mutually exclusive (C_i ∩ C_j = ∅ if i ≠ j).

If we know the fraction of the population from 18 to 25 that has been infected by the H1N1 influenza A virus in each of the 50 states, then we cannot just average these 50 values to obtain the fraction of this population infected in the whole country. This method fails because it gives equal weight to California and Wyoming. The law of total probability shows that we should weight these conditional probabilities by the probability of residence in a given state and then sum over all of the states.

Figure 6.4: A partition {C_1, ..., C_9} of the sample space Ω. The event A can be written as the union (A ∩ C_1) ∪ ··· ∪ (A ∩ C_9) of mutually exclusive events.

Theorem 6.7 (law of total probability). Let P be a probability on Ω and let {C_1, C_2, ..., C_n} be a partition of Ω chosen so that P(C_i) > 0 for all i. Then, for any event A ⊂ Ω,

    P(A) = Σ_{i=1}^{n} P(A | C_i) P(C_i).    (6.3)

Because {C_1, C_2, ..., C_n} is a partition, {A ∩ C_1, A ∩ C_2, ..., A ∩ C_n} are pairwise mutually exclusive events. By the distributive property of sets, their union is the event A. (See Figure 6.4.)

To refer to the example above, the C_i are the residents of state i, and A ∩ C_i are those residents who are from 18 to 25 years old and have been infected by the H1N1 influenza A virus. Thus, distinct A ∩ C_i are mutually exclusive - individuals cannot reside in 2 different states. Their union is A, all individuals in the United States between the ages of 18 and 25 years old who have been infected by the H1N1 virus. Thus,

    P(A) = Σ_{i=1}^{n} P(A ∩ C_i).    (6.4)

Finish by using the multiplication identity (6.2),

    P(A ∩ C_i) = P(A | C_i) P(C_i),    i = 1, 2, ..., n,

and substituting into (6.4) to obtain the identity in (6.3).
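The following short Python sketch illustrates the weighting in (6.3) with three made-up "states"; every residence share and per-state fraction below is a hypothetical number chosen for illustration, not data from the text. It also shows why the unweighted average of the conditional probabilities is not P(A).

```python
# P(C_i): probability that a randomly chosen individual resides in state i (hypothetical shares).
residence = {"state 1": 0.70, "state 2": 0.20, "state 3": 0.10}

# P(A | C_i): fraction of each state's population with the attribute A (hypothetical values).
attribute_given_state = {"state 1": 0.02, "state 2": 0.05, "state 3": 0.01}

# Law of total probability (6.3): P(A) = sum over i of P(A | C_i) * P(C_i).
p_A = sum(attribute_given_state[s] * residence[s] for s in residence)

# The unweighted average gives equal weight to every state and is generally wrong.
naive_average = sum(attribute_given_state.values()) / len(attribute_given_state)

print(f"P(A) by the law of total probability: {p_A:.4f}")             # 0.0250
print(f"unweighted average of the fractions : {naive_average:.4f}")   # 0.0267
```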
The most frequent use of the law of total probability comes in the case of a partition of the sample space into two events, {C, C^c}. In this case the law of total probability becomes the identity

    P(A) = P(A | C) P(C) + P(A | C^c) P(C^c).    (6.5)

Figure 6.5: A partition into two events C and C^c.

Exercise 6.8. The problem of points is a classical problem in probability theory. The problem concerns a series of games with two sides who have equal chances of winning each game. The winning side is one that first reaches a given number n of wins. Let n = 4 for a best of seven playoff. Determine

    p_{ij} = P{winning the playoff after i wins vs j opponent wins}.

(Hint: p_{ii} = 1/2 for i = 0, 1, 2, 3.)

6.4 Bayes formula

Let A be the event that an individual tests positive for some disease and C be the event that the person actually has the disease.
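As a preview of how the definition (6.1) and the two-event law of total probability (6.5) combine in this testing setup, here is a small Python sketch. The prevalence, sensitivity, and false-positive rate below are hypothetical values chosen only to illustrate the calculation, not figures from the text.

```python
# Hypothetical inputs for the disease-testing setup (illustration only).
p_C = 0.01                 # P(C): prevalence of the disease
p_pos_given_C = 0.98       # P(A | C): probability of a positive test given disease
p_pos_given_not_C = 0.05   # P(A | C^c): probability of a positive test without disease

# Two-event law of total probability (6.5): P(A) = P(A|C)P(C) + P(A|C^c)P(C^c).
p_A = p_pos_given_C * p_C + p_pos_given_not_C * (1 - p_C)

# Conditional probability (6.1) with the multiplication principle (6.2):
# P(C | A) = P(A ∩ C) / P(A) = P(A | C) P(C) / P(A).
p_C_given_A = p_pos_given_C * p_C / p_A

print(f"P(A)     = {p_A:.4f}")          # 0.0593
print(f"P(C | A) = {p_C_given_A:.4f}")  # approximately 0.1653
```

Even with a sensitive test, P(C | A) can be small when the disease is rare; this is the calculation that the section develops as Bayes formula.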