
1) Suppose P(A) = 0.5 and P(B) = 0.4

a) Find P(A OR B) if A and B are independent.

b) Find P(A OR B) if A and B are mutually exclusive

[We didn’t talk about mutually exclusive, also known as disjoint, in class, but this simply means the two events cannot both happen, thus P(A AND B) = 0.]

2) The route used by a motorist contains 2 intersections with traffic signals. The probability that he must stop at the 1st signal is 0.2 and the probability that he must stop at the 2nd signal is 0.3. The probability that he must stop at at least one of the signals is 0.4.

a) Find the probability that he must stop at both signals.

b) Find the probability that he must stop at the 1st signal but not at the 2nd signal.

c) Find the probability that he must stop at the 2nd signal given that he stops at the 1st signal.

d) Are the events “stop at the 1st signal” and “stop at the 2nd signal” independent? Explain.

3) A bag contains 20 slips of paper: 12 are red, 6 are blue, and 2 are green. Three slips are selected at random.

a) What is the probability that all three selected slips are red?

b) What is the probability that none of the three selected slips are red?

c) What is the probability that at least one of the selected slips is red?

4) A system is made up of 3 components, A, B, and C. The system fails if any of the 3 components fail. Component A fails 5% of the time, component B fails 10% of the time, component C fails 15% of the time, and the 3 components operate independently. What is the probability that the system fails?

5) Two players, Player A and Player B, are playing rock-paper-scissors. For those unfamiliar with the game, rock wins against scissors, paper wins against rock, scissors wins against paper. Player A selects rock 30% of the time, paper 20% of the time, and scissors 50% of the time, while Player B selects rock 40% of the time, paper 20% of the time, and scissors 40% of the time. Assume that on each round of the game, the players make selections at random according to the above probabilities, and that the players’ selections are made independently of one another.

a) What is the probability that Player A wins?

b) What is the probability that Player A wins given that Player B selects rock?

c) What is the probability that Player A wins given that Player B does not select rock?

d) What is the probability of a tie?

e) What is the probability that Player B wins?

f) What is the probability that Player A selected rock given that Player A wins?

Solutions

1) Suppose P(A) = 0.5 and P(B) = 0.4

a) Find P(A OR B) if A and B are independent.

P(A OR B) = P(A) + P(B) – P(A AND B)

If they are independent, then P(A AND B) = P(A) ∙ P(B), so

P(A OR B) = 0.5 + 0.4 – (0.5)(0.4) = 0.7

b) Find P(A OR B) if A and B are mutually exclusive

P(A OR B) = P(A) + P(B) – P(A AND B)

If they are mutually exclusive then P(A AND B) = 0, so

P(A OR B) = 0.5 + 0.4 – 0.0 = 0.9
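As a quick sanity check, both cases of problem 1 can be reproduced in a few lines of Python (a sketch for verification, not part of the assignment; variable names are just for illustration):

```python
# Problem 1: P(A) = 0.5, P(B) = 0.4
p_a, p_b = 0.5, 0.4

# a) independent, so P(A AND B) = P(A) * P(B)
p_or_indep = p_a + p_b - p_a * p_b

# b) mutually exclusive, so P(A AND B) = 0
p_or_disjoint = p_a + p_b

print(p_or_indep, p_or_disjoint)  # inclusion-exclusion in both cases
```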

2) The route used by a motorist contains 2 intersections with traffic signals. The probability that he must stop at the 1st signal is 0.2 and the probability that he must stop at the 2nd signal is 0.3. The probability that he must stop at at least one of the signals is 0.4.

Let A = stop at 1st signal, let B = stop at 2nd signal, so

P(A) = 0.2, P(B) = 0.3, P(A OR B) = 0.4

[“both” = AND, “at least one” = OR]

a) Find the probability that he must stop at both signals.

P(A OR B) = P(A) + P(B) – P(A AND B)

0.4 = 0.2 + 0.3 – P(A AND B)

P(A AND B) = 0.1

b) Find the probability that he must stop at the 1st signal but not at the 2nd signal.

P(A AND Bᶜ) = P(A) – P(A AND B) = 0.2 – 0.1 = 0.1

(A Venn diagram makes this one trivial.)

[Venn diagram: circles A and B overlap; A-only region = 0.1, overlap A AND B = 0.1, B-only region = 0.2, outside both circles = 0.6]

Notice that P(A) = 0.2 is split into two pieces: the part that is also in B, P(A AND B) = 0.1, and the part that is not in B, P(A AND Bᶜ) = 0.1. The Venn diagram makes some other probabilities trivial that sound difficult when worded. “What is the probability that he must stop at the 2nd signal but not at the 1st signal?” P(B AND Aᶜ) = 0.2 [in B and not in A]. “What is the probability that he must stop at neither signal?” P(Aᶜ AND Bᶜ) = 0.6.

c) Find the probability that he must stop at the 2nd signal given that he stops at the 1st signal.

P(B | A) = P(B AND A) / P(A) = 0.1 / 0.2 = 0.5

d) Are the events “stop at the 1st signal” and “stop at the 2nd signal” independent? Explain.

By definition of independence:

P(B | A) = 0.5 ≠ P(B) = 0.3, so not independent

OR

By checking multiplication rule for independent events:

P(A) P(B) = 0.2 x 0.3 = 0.06 ≠ P(A AND B) = 0.1, so not independent
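All of problem 2 follows from the three given numbers; here is the same arithmetic as a short Python sketch (for checking only, with illustrative variable names):

```python
# Problem 2: given probabilities
p_a, p_b, p_a_or_b = 0.2, 0.3, 0.4

p_a_and_b = p_a + p_b - p_a_or_b   # a) stop at both = 0.1
p_a_not_b = p_a - p_a_and_b        # b) 1st but not 2nd = 0.1
p_b_given_a = p_a_and_b / p_a      # c) P(B | A) = 0.5
# d) independent only if P(A AND B) equals P(A) * P(B)
independent = abs(p_a_and_b - p_a * p_b) < 1e-9

print(p_a_and_b, p_a_not_b, p_b_given_a, independent)
```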

3) A bag contains 20 slips of paper: 12 are red, 6 are blue, and 2 are green. Three slips are selected at random.

a) What is the probability that all three selected slips are red?

By general multiplication rule:

P(red AND red AND red) = P(1st = red AND 2nd = red AND 3rd = red)

= P(1st = red) ∙ P(2nd = red | 1st = red) ∙ P(3rd = red | 1st = red AND 2nd = red)

= (12/20) (11/19) (10/18) = 0.1930

OR

By counting

P(red AND red AND red)

= number of ways to select three red / number of ways to select three slips

= 12C3 / 20C3 = 0.1930

b) What is the probability that none of the three selected slips are red?

P(redᶜ AND redᶜ AND redᶜ) = (8/20) (7/19) (6/18) = 0.0491

OR

P(redᶜ AND redᶜ AND redᶜ) = 8C3 / 20C3 = 0.0491

c) What is the probability that at least one of the selected slips is red?

P(at least one is red) = 1 – P(none are red) = 1 – 0.0491 = 0.9509
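The counting approach maps directly onto Python's `math.comb` (a verification sketch, not part of the solution):

```python
from math import comb  # comb(n, k) = n choose k

p_all_red = comb(12, 3) / comb(20, 3)   # a) 220/1140, about 0.1930
p_none_red = comb(8, 3) / comb(20, 3)   # b) 56/1140, about 0.0491
p_at_least_one = 1 - p_none_red         # c) complement rule, about 0.9509

print(p_all_red, p_none_red, p_at_least_one)
```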

4) A system is made up of 3 components, A, B, and C. The system fails if any of the 3 components fail. Component A fails 5% of the time, component B fails 10% of the time, component C fails 15% of the time, and the 3 components operate independently. What is the probability that the system fails?

P(system fails) = 1 – P(system does not fail)
= 1 – P(A doesn’t fail AND B doesn’t fail AND C doesn’t fail)
= 1 – P(A doesn’t fail) ∙ P(B doesn’t fail) ∙ P(C doesn’t fail)
= 1 – (1 – 0.05)(1 – 0.10)(1 – 0.15) = 1 – 0.72675 = 0.27325

We get to multiply the probabilities of each one NOT failing because the failures are independent.
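The complement-and-multiply pattern generalizes to any number of independent components; a minimal Python sketch (the list of failure rates comes from the problem, the loop structure is just one way to write it):

```python
# Problem 4: independent component failure rates
p_fail_each = [0.05, 0.10, 0.15]

# Multiply the survival probabilities, then take the complement.
p_system_works = 1.0
for p in p_fail_each:
    p_system_works *= (1 - p)

p_system_fails = 1 - p_system_works   # 1 - 0.72675 = 0.27325
print(p_system_fails)
```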

5) Two players, Player A and Player B, are playing rock-paper-scissors. For those unfamiliar with the game, rock wins against scissors, paper wins against rock, scissors wins against paper. Player A selects rock 30% of the time, paper 20% of the time, and scissors 50% of the time, while Player B selects rock 40% of the time, paper 20% of the time, and scissors 40% of the time. Assume that on each round of the game, the players make selections at random according to the above probabilities, and that the players’ selections are made independently of one another.

a) What is the probability that Player A wins?

P(A wins) = P( [A = rock AND B = scissors] OR [A = paper AND B = rock] OR [A = scissors AND B = paper] )
= (0.3)(0.4) + (0.2)(0.4) + (0.5)(0.2) = 0.12 + 0.08 + 0.10 = 0.30

There is no subtraction in calculating the OR probabilities because the three ways A can win are mutually exclusive.

Also, we get to simply multiply probabilities, e.g. P(A = rock AND B = scissors) = P(A = rock) ∙ P(B = scissors), because we were told the selections by Player A and Player B are independent.

b) What is the probability that Player A wins given that Player B selects rock?

P(A wins | B = rock) = P(A wins AND B = rock) / P(B = rock)
= P(A = paper AND B = rock) / P(B = rock)
= P(A = paper) ∙ P(B = rock) / P(B = rock)
= P(A = paper) = 0.2

Note: given that Player B selects rock, Player A can only win by selecting paper, which happens 20% of the time, regardless of what Player B does (since their selections are independent). Also note that Player A winning is NOT independent of what Player B picks: in part a the probability of Player A winning is 0.3, but here in part b the probability that Player A wins given that Player B has selected rock is only 0.2. So the probability of Player A winning changes once we know what Player B has chosen. However, Player A’s selection is independent of Player B’s selection.

c) What is the probability that Player A wins given that Player B does not select rock?

P(A wins | B ≠ rock) = P(A wins AND B ≠ rock) / P(B ≠ rock)
= P( [A = rock AND B = scissors] OR [A = scissors AND B = paper] ) / P(B ≠ rock)
= [ (0.3)(0.4) + (0.5)(0.2) ] / 0.6 = 0.22 / 0.6 ≈ 0.3667

d) What is the probability of a tie?

P(tie) = P( [A = rock AND B = rock] OR [A = paper AND B = paper] OR [A = scissors AND B = scissors] )
= P(A = rock) ∙ P(B = rock) + P(A = paper) ∙ P(B = paper) + P(A = scissors) ∙ P(B = scissors)
= (0.3)(0.4) + (0.2)(0.2) + (0.5)(0.4) = 0.36

e) What is the probability that Player B wins?

P(A wins) + P(B wins) + P(tie) = 1
0.3 + P(B wins) + 0.36 = 1
P(B wins) = 0.34

f) What is the probability that Player A selected rock given that Player A wins?

P(A = rock | A wins) = P(A = rock AND A wins) / P(A wins)

= P(A = rock AND B = scissors) / P(A wins)

= P(A = rock) ∙ P(B = scissors) / P(A wins)

= (0.3)(0.4) / 0.3 = 0.4

Note: If A selects rock AND A wins, the only way A can win with rock is if B has selected scissors, which happens 40% of the time. Here is yet another way to see the same probability…

                              Player A
                   Rock (0.3)          Paper (0.2)         Scissors (0.5)
  Player B
  Rock (0.4)       (0.3)(0.4) = 0.12   (0.2)(0.4) = 0.08   (0.5)(0.4) = 0.20
                   Tie                 A wins              B wins
  Paper (0.2)      (0.3)(0.2) = 0.06   (0.2)(0.2) = 0.04   (0.5)(0.2) = 0.10
                   B wins              Tie                 A wins
  Scissors (0.4)   (0.3)(0.4) = 0.12   (0.2)(0.4) = 0.08   (0.5)(0.4) = 0.20
                   A wins              B wins               Tie

From this table you can see that tie is 0.36, A wins is 0.30 and B wins is 0.34. Also, given that A has won, the only way that happens with A selecting rock is when B selects scissors. And there’s your 0.12 / 0.3 = 0.4
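The table can also be built programmatically by enumerating all nine outcomes; this Python sketch recovers every answer in problem 5 (the dictionaries and variable names are just for illustration):

```python
from itertools import product

# Selection probabilities from the problem statement
p_a = {"rock": 0.3, "paper": 0.2, "scissors": 0.5}
p_b = {"rock": 0.4, "paper": 0.2, "scissors": 0.4}
# (x, y) in beats means x beats y
beats = {("rock", "scissors"), ("paper", "rock"), ("scissors", "paper")}

p_a_wins = p_b_wins = p_tie = 0.0
p_a_rock_and_a_wins = 0.0
for a, b in product(p_a, p_b):
    p = p_a[a] * p_b[b]           # independent selections: multiply
    if a == b:
        p_tie += p
    elif (a, b) in beats:
        p_a_wins += p
        if a == "rock":
            p_a_rock_and_a_wins += p
    else:
        p_b_wins += p

# f) P(A = rock | A wins) = P(A = rock AND A wins) / P(A wins)
p_rock_given_a_wins = p_a_rock_and_a_wins / p_a_wins

print(p_a_wins, p_tie, p_b_wins, p_rock_given_a_wins)
```

Summing the nine mutually exclusive cells reproduces 0.30 for A wins, 0.36 for a tie, 0.34 for B wins, and 0.12 / 0.3 = 0.4 for part f.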