18.310 lecture notes February 10, 2015

Probability Theory

Lecturer: Michel Goemans

These notes cover the basic definitions of discrete probability theory, and then present some results including Bayes' rule, the inclusion-exclusion formula, Chebyshev's inequality, and the weak law of large numbers.

1 Sample spaces and events

To treat probability rigorously, we define a sample space S whose elements are the possible outcomes of some process or experiment. For example, the sample space might be the outcomes of the roll of a die, or flips of a coin. To each element x of the sample space, we assign a probability p(x), a non-negative number between 0 and 1. We require that
\[ \sum_{x \in S} p(x) = 1, \]
so the total probability of the elements of our sample space is 1. Intuitively, this means that when we perform our process, exactly one of the outcomes in our sample space will happen.

Example. The sample space could be S = {a, b, c}, and the probabilities could be p(a) = 1/2, p(b) = 1/3, p(c) = 1/6.

If all elements of our sample space have equal probabilities, we call this the uniform probability distribution on our sample space. For example, if our sample space were the outcomes of a die roll, it could be denoted S = {x_1, x_2, ..., x_6}, where the outcome x_i corresponds to rolling i. The uniform distribution, in which every outcome x_i has probability 1/6, describes the situation for a fair die. Similarly, if we consider tossing a fair coin, the outcomes would be H (heads) and T (tails), each with probability 1/2. In this situation we have the uniform probability distribution on the sample space S = {H, T}.

We define an event A to be a subset of the sample space. For example, in the roll of a die, if the event A is rolling an even number, then A = {x_2, x_4, x_6}. The probability of an event A, denoted by P(A), is the sum of the probabilities of the corresponding elements in the sample space.
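These definitions are easy to mirror in code. The following sketch is not from the notes; the dictionary representation and the `prob` helper are our own choices. It represents a finite sample space as a map from outcomes to probabilities, and computes the probability of an event by summing over its elements:

```python
from fractions import Fraction

# A finite sample space: each outcome maps to its probability p(x).
# Here we model a fair die; outcome i corresponds to rolling i.
die = {i: Fraction(1, 6) for i in range(1, 7)}

# The probabilities must sum to 1.
assert sum(die.values()) == 1

def prob(event, space):
    """P(A): the sum of p(x) over the outcomes x in the event A."""
    return sum(space[x] for x in event)

even = {2, 4, 6}            # the event "roll an even number"
print(prob(even, die))      # prints 1/2
```

Using `Fraction` rather than floating point keeps the arithmetic exact, so equalities like "the probabilities sum to 1" can be checked without rounding error.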
For rolling an even number, we have
\[ P(A) = p(x_2) + p(x_4) + p(x_6) = \frac{1}{2}. \]

Given an event A of our sample space, there is a complementary event which consists of all points in our sample space that are not in A. We denote this event by ¬A. Since the probabilities of all the points in a sample space S add to 1, we see that
\[ P(A) + P(\neg A) = \sum_{x \in A} p(x) + \sum_{x \notin A} p(x) = \sum_{x \in S} p(x) = 1, \]
and so P(¬A) = 1 - P(A).

Note that, although two elements of our sample space cannot happen simultaneously, two events can happen simultaneously. That is, if we define A as rolling an even number, and B as rolling a small number (1, 2, or 3), then it is possible for both A and B to happen (this would require a roll of a 2), neither of them to happen (this would require a roll of a 5), or one or the other to happen. We call the event that both A and B happen "A and B", denoted by A ∧ B (or sometimes A ∩ B), and the event that at least one happens "A or B", denoted by A ∨ B (or sometimes A ∪ B).

Suppose that we have two events A and B. These divide our sample space into four disjoint parts, corresponding to the cases where both events happen, where one event happens and the other does not, and where neither event happens; see Figure 1. These cases cover the sample space, accounting for each element in it exactly once, so we get
\[ P(A \wedge B) + P(A \wedge \neg B) + P(\neg A \wedge B) + P(\neg A \wedge \neg B) = 1. \]

Figure 1: Two events A and B as subsets of the state space S.

2 Conditional probability and independence

Let A be an event with non-zero probability. We define the probability of an event B conditioned on event A, denoted by P(B | A), to be
\[ P(B \mid A) = \frac{P(A \wedge B)}{P(A)}. \]
Why is this an interesting notion? Let's give an example. Suppose we roll a fair die, and we ask: what is the probability of getting an odd number, conditioned on having rolled a number that is at most 3?
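This question can be checked numerically by applying the definition P(B | A) = P(A ∧ B)/P(A) directly to the fair-die distribution. The sketch below is ours, not from the notes; the `prob` and `cond` helpers are assumed names:

```python
from fractions import Fraction

# Fair die: uniform distribution on {1, ..., 6}.
die = {i: Fraction(1, 6) for i in range(1, 7)}

def prob(event):
    """P(A): sum of the probabilities of the outcomes in A."""
    return sum(die[x] for x in event)

def cond(B, A):
    """P(B | A) = P(A and B) / P(A), defined when P(A) > 0."""
    return prob(A & B) / prob(A)

A = {1, 2, 3}     # the event "outcome is at most 3"
B = {1, 3, 5}     # the event "outcome is odd"
print(cond(B, A))  # prints 2/3
```

The four disjoint parts of Figure 1 can be checked the same way: `prob(A & B) + prob(A - B) + prob(B - A) + prob(set(die) - (A | B))` evaluates to 1.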
Since we know that our roll is 1, 2, or 3, and that these are equally likely (since we started with the uniform distribution corresponding to a fair die), the probability of each of these outcomes must be 1/3. Thus the probability of getting an odd number (that is, of getting 1 or 3) is 2/3. Thus if A is the event "outcome is at most 3" and B is the event "outcome is odd", then we would like the mathematical definition of the "probability of B conditioned on A" to give P(B | A) = 2/3. And indeed, mathematically we find
\[ P(B \mid A) = \frac{P(A \wedge B)}{P(A)} = \frac{2/6}{1/2} = \frac{2}{3}. \]
The intuitive reason why our definition of P(B | A) gives the answer we wanted is that the probability of every outcome in A gets multiplied by 1/P(A) when one conditions on the event A.

It is a simple calculation to check that if we have two events A and B, then
\[ P(A) = P(A \mid B) P(B) + P(A \mid \neg B) P(\neg B). \]
Indeed, the first term is P(A ∧ B) and the second term is P(A ∧ ¬B). Adding these together, we get
\[ P(A \wedge B) + P(A \wedge \neg B) = P(A). \]

If we have two events A and B, we say that they are independent if the probability that both happen is the product of the probability that the first happens and the probability that the second happens, that is, if
\[ P(A \wedge B) = P(A) \cdot P(B). \]

Example. For a die roll, the events A of rolling an even number and B of rolling a number less than or equal to 3 are not independent, since P(A) · P(B) ≠ P(A ∧ B); indeed, 1/2 · 1/2 = 1/4 ≠ 1/6. However, if you define C to be the event of rolling a 1 or 2, then A and C are independent, since P(A) = 1/2, P(C) = 1/3, and P(A ∧ C) = 1/6.

Let us now show on an example that our mathematical definition of independence does capture the intuitive notion of independence. Let's assume that we toss two coins (not necessarily fair coins). The sample space is S = {HH, HT, TH, TT} (where the first letter represents the result of the first coin).
Let us denote the event of the first coin being a tail by T◦, the event of the second coin being a tail by ◦T, and so on. By definition, we have P(T◦) = p(TH) + p(TT), and so on. Suppose that knowing that the first coin is a tail doesn't change the probability that the second coin is a tail. This gives
\[ P(\,{\circ}T \mid T{\circ}\,) = P(\,{\circ}T\,). \]
Moreover, by definition of conditional probability,
\[ P(\,{\circ}T \mid T{\circ}\,) = \frac{P(TT)}{P(T{\circ})}. \]
Combining these equations gives
\[ P(TT) = P(T{\circ}) P(\,{\circ}T\,), \]
or equivalently
\[ P(T{\circ} \wedge {\circ}T) = P(T{\circ}) P(\,{\circ}T\,), \]
which is the condition we took to define independence. Conclusion: "knowing that the first coin is a tail doesn't change the probability that the second coin is a tail" is the same as what we defined as "independence" between the events T◦ and ◦T.

More generally, suppose that A and B are independent. In this case, we have
\[ P(B \mid A) = \frac{P(A \wedge B)}{P(A)} = \frac{P(A)P(B)}{P(A)} = P(B). \]
That is, if two events are independent, then the probability of B happening, conditioned on A happening, is the same as the probability of B happening without the conditioning. It is straightforward to check that the reasoning can be reversed as well: if the probability of B does not change when you condition on A, then the two events are independent.

We define k events A_1, ..., A_k to be independent if the probability of the intersection of any subset of these events is equal to the product of their probabilities, that is, if for all 1 ≤ i_1 < i_2 < ... < i_s ≤ k,
\[ P(A_{i_1} \wedge A_{i_2} \wedge \cdots \wedge A_{i_s}) = P(A_{i_1}) P(A_{i_2}) \cdots P(A_{i_s}). \]
It is possible to have a set of three events such that any two of them are independent, but all three are not independent. It is an interesting exercise to try to find such an example.

If we have k probability distributions on sample spaces S_1, ..., S_k, we can construct a new probability distribution, called the product distribution, by assuming that these k processes are independent. Our new sample space is made of all the k-tuples (s_1, s_2, ..., s_k) where s_i ∈ S_i.
The probability distribution on this sample space is defined by
\[ p(s_1, s_2, \ldots, s_k) = \prod_{i=1}^{k} p(s_i). \]
For example, if you roll k dice, your sample space will be the set of tuples (s_1, ..., s_k) where s_i ∈ {x_1, x_2, ..., x_6}. The value of s_i represents the result of the i-th die (for instance, s_i = x_3 means that the i-th die rolled 3).
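The product construction for k fair dice can be sketched as follows. This is our own minimal illustration (the tuple-keyed dictionary is an assumed representation, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Product distribution for k independent fair dice: the probability
# of a k-tuple of outcomes is the product of the individual
# probabilities p(s_1) * p(s_2) * ... * p(s_k).
k = 3
die = {i: Fraction(1, 6) for i in range(1, 7)}

space = {}
for tup in product(die, repeat=k):
    p = Fraction(1)
    for s in tup:
        p *= die[s]
    space[tup] = p

# Every tuple has probability (1/6)^k, and the whole space sums to 1.
assert all(p == Fraction(1, 6) ** k for p in space.values())
assert sum(space.values()) == 1
print(len(space))   # prints 216, i.e. 6**k tuples
```

The two assertions confirm that the product distribution is itself a valid probability distribution on the new sample space.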