
MTH 453: Basic Random Processes
Lecture 1: Basic Probability Review

References:
- Probability for Statistics and Machine Learning: Fundamentals and Advanced Topics by Anirban DasGupta.
- Probability With Applications and R by Robert Dobrow.
- Introduction to Stochastic Processes with R by Robert Dobrow.

Outline:
1) Experiments, Sample Spaces and Random Variables
2) A First Look into Simulation
3) Integer-Valued and Discrete Random Variables
4) Continuous Random Variables
5) Expectation, Moments and Simulation of Arbitrary Continuous RVs
6) Joint Distributions and Independence of Random Variables
7) Properties of the Most Common Distributions (will be taught/reviewed on the go)
8) Normal Approximations and the Central Limit Theorem (will be taught/reviewed on the go)

Part 1. Experiments, Sample Spaces and Random Variables

The treatment of probability theory starts with the consideration of a sample space. The sample space is the set of all possible outcomes of some experiment. For example, if a coin is tossed twice and after each toss the face that shows is recorded, then the possible outcomes of this particular coin-tossing experiment, say X, are {HH, HT, TH, TT}, with H denoting the occurrence of heads and T denoting the occurrence of tails. We call Ω = {HH, HT, TH, TT} the sample space of the experiment X. In general, a sample space is a general set Ω, finite or infinite. An easy example where the sample space Ω is infinite: toss a coin until the first time heads shows up and record the number of the trial at which the first head appeared. In this case, the sample space is the countably infinite set Ω = {1, 2, 3, ...}. Sample spaces can also be uncountably infinite; for example, consider the experiment of measuring the time until a light bulb burns out. The sample space of this experiment is Ω = R⁺, an uncountably infinite set. In all cases, individual elements of a sample space are denoted by ω.
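Finite sample spaces like the two-toss example can be enumerated directly on a computer. The following is a minimal sketch in Python (the course textbooks use R, but the idea carries over unchanged): we build Ω for two coin tosses as all ordered pairs of H and T.

```python
from itertools import product

# Sample space for two coin tosses: all ordered pairs of H and T,
# recorded in the order the tosses occur.
omega_two_tosses = ["".join(p) for p in product("HT", repeat=2)]
print(omega_two_tosses)  # ['HH', 'HT', 'TH', 'TT']
```

Changing `repeat=2` to `repeat=3` gives the 8-point sample space for three tosses, and so on.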
The first task is to define events and to explain the meaning of the probability of an event.

Definition 1.1. Let Ω be the sample space of an experiment X. Then any subset A of Ω, including the empty set ∅ and the entire sample space Ω, is called an event. An event may contain even a single sample point ω, in which case it is a singleton set {ω}.

We want to assign probabilities to events, and we want to assign them in a way that is logically consistent. In fact, this cannot be done in general if we insist on assigning probabilities to arbitrary collections of sample points, that is, arbitrary subsets of the sample space. We can only define probabilities for such subsets of Ω that are tied together like a family, the exact concept being that of a σ-field. In most applications, including those cases where the sample space Ω is infinite, the events that we would normally want to think about will be members of such an appropriate σ-field. Here is a definition of what counts as a legitimate probability on events.

Definition 1.2. Given a sample space Ω, a probability or a probability measure on Ω is a function P on subsets of Ω such that

(a) P[A] ≥ 0 for all A ⊆ Ω. In particular, A can be a singleton A = {ω} with ω ∈ Ω.

(b) P[Ω] = Σ_{ω∈Ω} P[ω] = 1.

(c) Given disjoint subsets A₁, A₂, ... of Ω,

    P[∪_{i=1}^{∞} A_i] = Σ_{i=1}^{∞} P[A_i].

In particular, considering an event A and all the possible elements in A,

    P[A] = Σ_{ω∈A} P[ω].

You may not be familiar with some of the notation in this definition. The symbol ∈ means "is an element of", so ω ∈ Ω means ω is an element of Ω. We are also using a generalized Σ-notation: Σ_{ω∈Ω} means that the sum is over all ω that are elements of the sample space, that is, all outcomes in the sample space. In the case of a finite sample space Ω = {ω₁, ..., ω_k}, the equation in (b) becomes

    P[Ω] = Σ_{ω∈Ω} P[ω] = Σ_{n=1}^{k} P[ω_n] = P[ω₁] + P[ω₂] + ... + P[ω_k] = 1.

Definition 1.3. Let Ω be a finite sample space consisting of N sample points.
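For a finite sample space, Definition 1.2 can be made concrete with a short sketch: represent P by a dictionary mapping each outcome ω to P[ω], and compute P[A] by summing over the sample points in A, exactly as in axiom (c). This is an illustrative fragment, not part of the course material; the names `prob` and `P` are ours.

```python
# A probability measure on a finite sample space, stored as a dict
# mapping each outcome omega to P[omega].
def prob(measure, event):
    """P[A] = sum of P[omega] over omega in A (Definition 1.2(c))."""
    return sum(measure[w] for w in event)

# Two fair coin tosses: each of the four outcomes has probability 1/4.
P = {"HH": 0.25, "HT": 0.25, "TH": 0.25, "TT": 0.25}

# Axiom (b): summing over the whole sample space gives 1.
assert abs(prob(P, P.keys()) - 1.0) < 1e-12
print(prob(P, {"HH", "HT"}))  # P[first toss is heads] = 0.5
```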
We say that the sample points are equally likely, or that the probability distribution is uniform, if P[ω] = 1/N for each sample point ω. An immediate consequence, due to the additivity axiom, is the following useful formula.

Proposition 1.4. Let Ω be a finite sample space consisting of N equally likely sample points. Let A be any event and suppose A contains n distinct sample points. Then

    P[A] = (number of sample points favorable to A) / (total number of sample points) = n/N.

Example 1.5. Roll a pair of dice. Find the sample space, identify the event that the sum of the two dice is equal to 7, and compute its probability.

Solution: The random experiment is rolling two dice. Keeping track of the roll of each die gives the sample space

    Ω = {(1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), ..., (6,5), (6,6)}.

The event is A = {Sum is 7} = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}, and its probability, using Proposition 1.4, is 6/36 = 1/6.

Homework Problem 1. Yolanda and Zach are running for president of the student association. Ten students will be voting. Identify (i) the sample space and (ii) the event that Yolanda beats Zach by at least 7 votes.

Homework Problem 2. Joe will continue to flip a coin until heads appears. Identify the sample space and the event that it will take Joe at least three coin flips to get a head.

Example 1.6. A college has six majors: biology, geology, physics, dance, art, and music. The numbers of students taking these majors are 45, 30, 15, 10, 10, and 35, respectively. Choose a random student. What is the probability that they are a science major?

Solution: The random experiment is choosing a student. The sample space is Ω = {Bio, Geo, Phy, Dan, Art, Mus}. The probability function is given by the number of students in each major divided by the total number of students (45 + 30 + 15 + 10 + 10 + 35 = 145).
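Proposition 1.4 reduces probability computations on equally likely sample points to counting, which a computer does easily. A quick Python sketch of Example 1.5, enumerating all 36 outcomes and counting those favorable to the event:

```python
from itertools import product
from fractions import Fraction

# Sample space for rolling a pair of dice: all 36 ordered pairs.
omega = list(product(range(1, 7), repeat=2))

# Event A: the sum of the two dice is 7.
A = [w for w in omega if sum(w) == 7]

# Proposition 1.4: P[A] = n / N with equally likely outcomes.
p = Fraction(len(A), len(omega))
print(p)  # 1/6
```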
That is,

    P[Bio] = 45/145 ≈ 0.31,  P[Geo] = 30/145 ≈ 0.21,  P[Phy] = 15/145 ≈ 0.10,
    P[Dan] = 10/145 ≈ 0.07,  P[Art] = 10/145 ≈ 0.07,  P[Mus] = 35/145 ≈ 0.24.

The event in question is A = {Science major} = {Bio, Geo, Phy} = {Bio} ∪ {Geo} ∪ {Phy}, where the last equality holds because the events are disjoint (we are not considering "double majors"). Finally,

    P[A] = P[{Bio, Geo, Phy}] = P[{Bio} ∪ {Geo} ∪ {Phy}] = P[Bio] + P[Geo] + P[Phy] ≈ 0.31 + 0.21 + 0.10 = 0.62.

Homework Problem 3. In three coin tosses, what is the probability of getting at least two tails?

In order to compute some probabilities, we also need to know how probabilities behave or compare when two events are related in certain ways. The following proposition makes this precise:

Proposition 1.7 (Properties of Probabilities).

1. If A implies B, that is, if A ⊆ B, then P[A] ≤ P[B]. Ex: A = roll of a die being 2, B = roll of a die being even.

2. P[A does not occur] = P[Aᶜ] = 1 − P[A]. Ex: A = roll of a die being 2 or 4, Aᶜ = roll of a die being 1, 3, 5, or 6.

3. For any events A and B, P[A or B] = P[A ∪ B] = P[A] + P[B] − P[A ∩ B]. Ex:

    P[roll being 2 or 3, or roll being 2 or 6] = P[roll being 2 or 3] + P[roll being 2 or 6] − P[roll being 2] = 1/3 + 1/3 − 1/6 = 1/2.

Homework Problem 4. In a city, 75% of the population have brown hair, 40% have brown eyes, and 25% have both brown hair and brown eyes. A person is chosen at random. What is the probability that they

1. have brown eyes or brown hair?
2. have neither brown eyes nor brown hair?

Often the outcomes of a random experiment take on numerical values. For instance, we might be interested in how many heads occur in three coin tosses. Let X be the number of heads. Then X is equal to 0, 1, 2, or 3, depending on the outcome of the coin tosses. The object X is called a random variable. The possible values of X are 0, 1, 2, and 3.

Definition 1.8.
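The inclusion-exclusion rule in Proposition 1.7(3) can be checked mechanically on the single-die example, with A = {2, 3} and B = {2, 6}. A small sketch (our own illustration, not from the text):

```python
from fractions import Fraction

# One roll of a fair die: six equally likely outcomes.
omega = set(range(1, 7))
A, B = {2, 3}, {2, 6}  # A: roll is 2 or 3; B: roll is 2 or 6

def prob(event):
    # Proposition 1.4: count favorable outcomes over total outcomes.
    return Fraction(len(event), len(omega))

lhs = prob(A | B)                          # P[A or B]
rhs = prob(A) + prob(B) - prob(A & B)      # P[A] + P[B] - P[A and B]
print(lhs, rhs)  # 1/2 1/2
```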
A random variable X is a (measurable) function from the sample space Ω to the real numbers R. You can think of a random variable simply as assigning numerical values to the outcomes of a random experiment. Random variables are enormously useful and allow us to use algebraic expressions, equalities, and inequalities when manipulating events. In many of the previous examples, we have been working with random variables without using the name, for example, the number of threes in rolls of a die, the number of votes received, the number of heads in repeated coin tosses, etc.

Example 1.9. If we throw two dice, what is the probability that the sum of the dice is greater than or equal to four?

Solution: We can, of course, find the probability by direct counting.
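The direct count for Example 1.9 can be sketched in Python, together with a first taste of the simulation theme from the outline: estimate the same probability by repeating the experiment many times and taking the relative frequency. The sample size and seed below are arbitrary choices of ours.

```python
import random
from itertools import product
from fractions import Fraction

# Exact answer by direct counting over the 36 equally likely outcomes.
omega = list(product(range(1, 7), repeat=2))
exact = Fraction(sum(1 for w in omega if sum(w) >= 4), len(omega))
print(exact)  # 11/12 -- only (1,1), (1,2), (2,1) have sum < 4

# Monte Carlo estimate: simulate many rolls of two dice and record
# the fraction of trials in which the sum is at least 4.
random.seed(0)
n = 100_000
hits = sum(1 for _ in range(n)
           if random.randint(1, 6) + random.randint(1, 6) >= 4)
print(hits / n)  # close to 11/12 ≈ 0.9167
```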