SOLUTIONS FOR HOMEWORK 1, STAT 6329

Well, welcome to your first homework. In my solutions you may find some seeded mistakes (this is why it is not a good idea just to copy my solutions). If you find them, please do not e-mail or call me. Instead, write them down on the first page of your solutions and you may give yourself extra credit, but keep in mind that the total for your homework cannot exceed 20 points.

Now let us look at your problems.

1. Problem 1.4 (page 15). This is a nice exercise to polish your understanding of combinations of events. DeMorgan's Law, which states that (∪_i E_i)^c = ∩_i E_i^c and, similarly, (∩_i E_i)^c = ∪_i E_i^c, is useful. Also, using complementary events may be useful. Here we are dealing with three arbitrary events E, F and G from the same sample space S. Remember that we may write EF := E ∩ F to make formulae simpler.

(a) Only F occurs is the event

F(E ∪ G)^c = F(E^c ∩ G^c) = F E^c G^c.

(b) Both E and F but not G occur can be written “as you read it”

EFG^c.

(c) at least one event occurs is simply

E ∪ F ∪ G

(d) at least two events occur you may write as

EF ∪ FG ∪ EG

(e) All three events occur is simply EFG.

(f) None occurs is the complementary event to at least one occurs,

(E ∪ F ∪ G)^c = E^c F^c G^c.

In the last equality I used DeMorgan's Law.

(g) At most one event occurs is the complementary event to at least two occur, discussed in (d). So via DeMorgan's Law we get

(EF)^c (FG)^c (EG)^c.

(h) At most two occur is the complementary event to all three events occur, discussed in (e), so we get (EFG)^c.
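To see these identities in action you can enumerate a small finite sample space on a computer. Below is a short Python sketch (my own illustration, not part of the problem: the sample space S and the events E, F, G are arbitrary toy choices).

    # Sanity-check the event identities of Problem 1.4 on a toy finite sample space.
    S = set(range(12))
    E, F, G = {0, 1, 2, 3}, {2, 3, 4, 5}, {3, 5, 6, 7}

    def comp(A):
        """Complement of A within S."""
        return S - A

    def count(w):
        """How many of E, F, G contain the outcome w."""
        return sum(w in A for A in (E, F, G))

    # (a) only F occurs
    assert {w for w in S if w in F and count(w) == 1} == F & comp(E) & comp(G)
    # (f) none occurs = complement of "at least one occurs" (DeMorgan's Law)
    assert {w for w in S if count(w) == 0} == comp(E) & comp(F) & comp(G)
    # (g) at most one occurs = complement of "at least two occur"
    assert {w for w in S if count(w) <= 1} == comp(E & F) & comp(F & G) & comp(E & G)
    # (h) at most two occur = complement of "all three occur"
    assert {w for w in S if count(w) <= 2} == comp(E & F & G)
    print("all identities hold on this toy sample space")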

2. Problem 1.10. Well, let us understand the Hint. Write

A ∪ B = A ∪ (A^c B)

and note that the events A and A^c B are mutually exclusive. Now, if B = C ∪ D then we can continue:

A ∪ B = A ∪ (A^c B) = A ∪ (A^c (C ∪ D)) = A ∪ (A^c (C ∪ C^c D)).

To continue, I should remember the DISTRIBUTIVE LAWS

(E ∪ F)G = EG ∪ FG,   EF ∪ G = (E ∪ G)(F ∪ G).

It is a good idea to know these laws because they help a lot in solving different problems. Then we continue:

A ∪ B = A ∪ CA^c ∪ DA^c C^c.

For n = 3 this is exactly the Hint, and you may continue to prove it (or use induction). As a result, using the notation of the Hint,

P(∪_{i=1}^n E_i) = P(∪_{i=1}^n F_i) = Σ_{i=1}^n P(F_i) ≤ Σ_{i=1}^n P(E_i).

In the second equality I used the third axiom, namely that the probability of a union of mutually exclusive events is the sum of the probabilities of these events, and in the last inequality I used the fact that F_i ⊂ E_i and hence P(F_i) ≤ P(E_i).
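If you want to see the construction of the mutually exclusive events F_i (each F_i is E_i with everything from the earlier E_j removed) on a computer, here is a small Python sketch of mine. It uses random subsets of a finite sample space with equally likely outcomes, so probabilities are just proportional to set sizes; this is only an illustration of the argument, not a proof.

    import random

    # Problem 1.10 illustration: build F_i = E_i \ (E_1 ∪ ... ∪ E_{i-1}) and check
    # that the F_i are pairwise disjoint, have the same union as the E_i, and give
    # sum P(F_i) <= sum P(E_i) for equally likely outcomes.
    random.seed(0)
    S = list(range(30))
    E = [set(random.sample(S, random.randint(1, 15))) for _ in range(5)]

    F, seen = [], set()
    for Ei in E:
        F.append(Ei - seen)          # F_i = E_i minus everything that came before
        seen |= Ei

    union_E = set().union(*E)
    assert set().union(*F) == union_E                                # same union
    assert all(F[i].isdisjoint(F[j])                                 # mutually exclusive
               for i in range(len(F)) for j in range(i + 1, len(F)))

    n = len(S)                       # P(A) = |A| / n for equally likely outcomes
    assert sum(len(Fi) for Fi in F) == len(union_E)                  # P(∪ E_i) = sum P(F_i)
    assert sum(len(Fi) for Fi in F) <= sum(len(Ei) for Ei in E)      # sum P(F_i) <= sum P(E_i)
    print(len(union_E) / n, "<=", sum(len(Ei) for Ei in E) / n)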

3. Problem 1.23. Remember that for any two events A and B we have

P(AB) = P(A)P(B|A).

Further, the conditional probability can be treated as a traditional probability (please look at how I prove this in the last problem, 1.47); that is, for instance, we have

P(AB|E) = P(A|E)P(B|AE)

as long as you work with the same “given” event, here E. Using these results we can write

P(E_1 E_2 ... E_n) = P(E_1 ∩ (E_2 E_3 ... E_n)) = P(E_1) P(E_2 E_3 ... E_n | E_1) = P(E_1) P(E_2|E_1) P(E_3 E_4 ... E_n | E_2 E_1),

and continuing these steps we arrive at

P(E_1 E_2 ... E_n) = P(E_1) P(E_2|E_1) P(E_3|E_2 E_1) ... P(E_n|E_{n-1} E_{n-2} ... E_1).
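As a small illustration of the multiplication rule (my own example, not part of the assigned problem), you can compute the chance that the first three cards dealt from a standard 52-card deck are all aces and compare it with a direct counting argument.

    from fractions import Fraction
    from math import comb

    # Chain rule: P(E1 E2 E3) = P(E1) P(E2|E1) P(E3|E1 E2) for the events
    # E_i = "the ith card dealt is an ace".
    chain = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

    # Counting: all-ace 3-card hands divided by all 3-card hands.
    counting = Fraction(comb(4, 3), comb(52, 3))

    assert chain == counting
    print(chain)                     # 1/5525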

4. Problem 1.28. Given P(A|B) > P(A), which is equivalent to P(AB) > P(A)P(B).

Using the given inequality we can write

P(B|A) = P(AB)/P(A) > P(B)P(A)/P(A) = P(B).

Answer: the conjecture is correct.
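A quick numerical illustration of this implication (my own toy joint distribution over the four outcomes AB, AB^c, A^cB, A^cB^c, chosen only so that P(A|B) > P(A) holds):

    from fractions import Fraction as F

    # An arbitrary joint distribution with P(A|B) > P(A).
    p_AB, p_ABc, p_AcB, p_AcBc = F(3, 10), F(1, 10), F(2, 10), F(4, 10)

    P_A = p_AB + p_ABc
    P_B = p_AB + p_AcB

    assert p_AB / P_B > P_A          # the given: P(A|B) > P(A)
    assert p_AB / P_A > P_B          # the conclusion: P(B|A) > P(B)
    print(p_AB / P_B, ">", P_A, "and", p_AB / P_A, ">", P_B)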

5. Problem 1.33. Suppose that we add x sophomore girls. Denote: F - the selected student is a freshman, B - the selected student is a boy. Then

P(FB) = 4/(4 + 6 + x + 6),

and

P(F) = (4 + 6)/(4 + 6 + x + 6),   P(B) = (4 + 6)/(4 + 6 + x + 6).

We have P(FB) = P(F)P(B) if (solve the corresponding equation with respect to x) x = 9. Please check that then we also have P(F^c B) = P(F^c)P(B) and P(F B^c) = P(F)P(B^c). Can you prove that this is a general result for this setting?
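If you prefer to let the computer do the algebra, here is a Python sketch of mine that searches for x and then checks the other two independence relations (the search range 0 to 50 is an arbitrary choice).

    from fractions import Fraction as F

    # Counts: 4 freshman boys, 6 freshman girls, 6 sophomore boys, x sophomore girls.
    for x in range(0, 51):
        n = 4 + 6 + 6 + x
        P_FB = F(4, n)               # a freshman boy is selected
        P_F = F(4 + 6, n)            # a freshman is selected
        P_B = F(4 + 6, n)            # a boy is selected
        if P_FB == P_F * P_B:
            print("x =", x)                              # prints x = 9
            assert F(6, n) == (1 - P_F) * P_B            # P(F^c B) = P(F^c) P(B)
            assert F(6, n) == P_F * (1 - P_B)            # P(F B^c) = P(F) P(B^c)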

6. Problem 1.38. This is a classical Bayes problem. You have a two-stage experiment, the outcome of the second stage is given, and you need to recalculate the probabilities of the outcomes of the first stage. Denote by W1 and B1 the complementary events that a white (respectively black) ball is transferred from urn 1, and by W2 the event that the ball drawn from urn 2 is white. Using the Bayes formula write

P(W1|W2) = P(W1 W2)/P(W2) = P(W1 W2)/[P(W1)P(W2|W1) + P(B1)P(W2|B1)]

= P(W1)P(W2|W1)/[P(W1)P(W2|W1) + P(B1)P(W2|B1)] = (2/3)(2/7)/[(2/3)(2/7) + (1/3)(1/7)] = 4/5.
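You can also verify the answer by simulation. In the sketch below I read the urn contents off the numbers 2/3, 2/7 and 1/7 used above (urn 1: 2 white and 1 black ball; urn 2: 1 white and 5 black balls before the transfer); treat this reconstruction as my own illustration.

    import random
    from fractions import Fraction as F

    # Simulate the two-stage experiment and keep only the runs where the ball
    # drawn from urn 2 is white; the conditional frequency should be near 4/5.
    random.seed(0)
    draws_white = transfer_white = 0
    for _ in range(200_000):
        transferred = random.choice(["w", "w", "b"])             # draw from urn 1
        urn2 = ["w", "b", "b", "b", "b", "b"] + [transferred]    # urn 2 after transfer
        if random.choice(urn2) == "w":                           # condition on W2
            draws_white += 1
            transfer_white += (transferred == "w")

    print(transfer_white / draws_white)                          # close to 0.8
    exact = (F(2, 3) * F(2, 7)) / (F(2, 3) * F(2, 7) + F(1, 3) * F(1, 7))
    print(exact)                                                 # exactly 4/5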

7. Problem 1.42. This is again a problem for the Bayes formula (a two-stage experiment, and you need to recalculate the probability of an outcome of the first stage given the outcome of the second stage). Denote the events for the first stage by T - the two-headed coin, F - the fair coin, U - the unfair coin that shows heads 75% of the time. Let H2 be the event that on the second stage we observe heads. For the conditional probability in question we can write

P(T|H2) = P(T H2)/P(H2) = P(T)P(H2|T)/[P(T)P(H2|T) + P(F)P(H2|F) + P(U)P(H2|U)]
= (1/3)(1)/[(1/3)(1) + (1/3)(1/2) + (1/3)(3/4)] = 4/9.
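The same computation with exact fractions (a short sketch of mine, using only the priors 1/3 and the heads probabilities 1, 1/2, 3/4 from above):

    from fractions import Fraction as F

    prior   = {"two-headed": F(1, 3), "fair": F(1, 3), "unfair": F(1, 3)}
    p_heads = {"two-headed": F(1),    "fair": F(1, 2), "unfair": F(3, 4)}

    evidence = sum(prior[c] * p_heads[c] for c in prior)         # P(H2)
    posterior = prior["two-headed"] * p_heads["two-headed"] / evidence
    print(posterior)                                             # 4/9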

8. Problem 1.43. It is again a Bayes formula problem. Let I = i be the event that the ith coin is flipped (I will use the notation P(A ∩ B) =: P(A, B), which will appear often). Then

P(I = 5|H) = P(I = 5, H)/P(H) = P(I = 5)P(H|I = 5)/[Σ_{i=1}^{10} P(I = i)P(H|I = i)]
= (1/10)(5/10)/[Σ_{i=1}^{10} (1/10)(i/10)] = 1/11.

9. Problem 1.44. This is again an application of the Bayes formula. Let T and H mean that tails or heads is the outcome of flipping the coin, and let W and B be the events that a white or a black ball is selected from urn 2 or urn 1 (depending on the toss of the coin). Then the probability in question is

P(T|W) = P(T W)/P(W) = P(T)P(W|T)/[P(T)P(W|T) + P(H)P(W|H)]
= (1/2)(1/5)/[(1/2)(1/5) + (1/2)(5/12)] = 12/37.
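Both Bayes computations can be checked with exact fractions (a small sketch of mine, using only the numbers that appear above):

    from fractions import Fraction as F

    # Problem 1.43: coin i shows heads with probability i/10, i = 1, ..., 10.
    num = F(1, 10) * F(5, 10)
    den = sum(F(1, 10) * F(i, 10) for i in range(1, 11))
    print(num / den)                 # 1/11

    # Problem 1.44: P(W|T) = 1/5, P(W|H) = 5/12, fair coin.
    num = F(1, 2) * F(1, 5)
    den = F(1, 2) * F(1, 5) + F(1, 2) * F(5, 12)
    print(num / den)                 # 12/37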

10. Problem 1.47. What the author wants to say is that the conditional probability is a regular probability and thus it satisfies the three axioms of probability. Indeed, for the first axiom, P(A|B) = P(AB)/P(B) is nonnegative and it is at most 1 because P(AB) ≤ P(B). For the second axiom, P(S|B) = P(SB)/P(B) = P(B)/P(B) = 1, so it is satisfied. For the third axiom, for a collection E_i, i = 1, 2, ..., of mutually exclusive events we have

P(∪_{i=1}^∞ E_i | B) = P(∪_{i=1}^∞ (E_i B))/P(B).

For the numerator we can use the third axiom because the events E_i B are again mutually exclusive, so we continue:

P(∪_{i=1}^∞ E_i | B) = Σ_{i=1}^∞ P(E_i B)/P(B) = Σ_{i=1}^∞ P(E_i|B).

This verifies the third axiom for the conditional probability. As a result, since we know (the total probability formula) that

P(A) = P(C)P(A|C) + P(C^c)P(A|C^c),

we can use it for the conditional probability P(·|B) and get

P(A|B) = P(C|B)P(A|CB) + P(C^c|B)P(A|C^c B),

which is what we wished to show.
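To convince yourself numerically, here is a small Python sketch (my own toy example with an arbitrary discrete distribution and arbitrary events A, B, C) checking that A ↦ P(A|B) behaves like a probability and that the conditional total probability formula holds.

    import random
    from fractions import Fraction as F

    # A toy discrete probability space with exact rational probabilities.
    random.seed(2)
    S = list(range(10))
    w = [F(random.randint(1, 4)) for _ in S]
    p = {s: w[s] / sum(w) for s in S}

    def P(A):
        return sum(p[s] for s in A)

    def Pc(A, B):                    # conditional probability P(A|B)
        return P(A & B) / P(B)

    A, B, C = {0, 1, 2, 5, 7}, {1, 2, 3, 4, 7, 8}, {0, 2, 4, 6, 8}
    Cc = set(S) - C

    assert Pc(set(S), B) == 1                                    # second axiom
    assert Pc(A, B) == Pc(A & C, B) + Pc(A & Cc, B)              # additivity (third axiom)
    assert Pc(A, B) == Pc(C, B) * Pc(A, C & B) + Pc(Cc, B) * Pc(A, Cc & B)
    print(Pc(A, B))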
