Basic Concepts of Probability Theory (Part I)


ECE270: Handout 1

Outline:
1. a random experiment,
2. an experiment outcome,
3. the sample space and its three different types,
4. events,
5. review of set theory, Venn diagrams, and DeMorgan's laws.

⋆ Random Experiment

• A random experiment is an experiment in which the outcome varies in an unpredictable fashion when the experiment is repeated under the same conditions.
• A random experiment is denoted by E and is specified by i) stating an experimental procedure and ii) stating exactly what is (are) measured or observed.

EXAMPLE 1
– An information block of length 5 is transmitted repeatedly over a noisy channel until an error-free block arrives at the receiver. Count the number of transmissions required. Count the number of errors in each transmitted block.
– Measure the lifetime of a given computer memory chip in a specified environment.
– Pick two numbers between 0 and 1.
– Pick a number x between 0 and 1 randomly and then pick a number y between x and 1.
– Flip a coin twice. Note the sequence of heads and tails. Note the number of tails.

⋆ An Experiment Outcome

• The result of a random experiment is called an outcome. An outcome is denoted by lowercase letters x, y, t, ξ, .... When we perform a random experiment, one and only one outcome occurs; outcomes are therefore mutually exclusive, in the sense that they cannot occur simultaneously.

⋆ Sample Space

• A random experiment does not always result in the same outcome, so we must determine the set of all possible outcomes. The set of all possible outcomes is called the sample space and is denoted by S. An outcome ξ is an element of the set S, i.e., in set-theory notation, ξ ∈ S.
• We can view the result of a random experiment as a random selection of a single outcome ξ from the set S.

EXAMPLE 2
• A hospital administrator codes incoming patients according to whether they have insurance (coded 1 if they do and 0 if they don't) and their condition (rated as good, fair, or serious). Coding the patients is a random experiment with sample space S = {(0, g), (0, f), (0, s), (1, g), (1, f), (1, s)}; ξ = (1, f) ∈ S is an outcome.

Exercise: What are the sample spaces S of the random experiments in Example 1?

⋆ Different Types of Sample Space

• A sample space S can be i) finite, ii) countably infinite, or iii) uncountably infinite.
• A sample space S is either discrete or continuous.
• A discrete sample space S is either finite or countably infinite. A continuous sample space S is uncountably infinite.

EXAMPLE 3 (different types of S)
– Finite S, e.g., S = {1, 2, 3, 4, cos(π/6), sin(π/3)}, 6 ∉ S; S = {TT, TH, HT, HH}, TH ∈ S.
– Countably infinite S, e.g., S = {0, 1, 2, 3, ...}, 270 ∈ S; S = {T, HT, HHT, HHHT, ...}, TH ∉ S, HHHHT ∈ S.
– Uncountably infinite S, e.g., S = {θ : 0 ≤ θ ≤ 2π}, π ∈ S; S = {(x, y) : 0 ≤ x ≤ y ≤ 2π}, (π, 5.4) ∈ S (typical examples are: S is the real line, an interval, the plane, or a region).

Exercise: Can you think of a random experiment corresponding to each of the sample spaces in Example 3?

Exercise: Consider the sample spaces of the random experiments in Example 1. Determine the type of each sample space (i.e., whether it is finite, countably infinite, or uncountably infinite).
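To make the outcome and sample-space notation concrete, here is a minimal Python sketch of the patient-coding experiment of Example 2; the tuple encoding of outcomes and the helper name `perform_experiment` are illustrative choices, not part of the handout.

```python
import random

# Sample space of Example 2: outcomes are (insurance, condition) pairs,
# with insurance in {0, 1} and condition in {'g', 'f', 's'}.
S = {(ins, cond) for ins in (0, 1) for cond in ('g', 'f', 's')}

def perform_experiment():
    """Perform the random experiment once: exactly one outcome occurs."""
    return random.choice(sorted(S))

xi = perform_experiment()
print(xi in S)         # True: every outcome is an element of S
print((1, 'f') in S)   # True: the outcome xi = (1, f) belongs to S
print(len(S))          # 6, so this sample space is finite (hence discrete)
```

The simulation above draws outcomes with equal likelihood purely for illustration; nothing in the definition of a sample space requires the outcomes to be equally likely.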
⋆ Events

• An event is a set consisting of outcome(s) and is denoted by capital letters A, B, E, F, .... Any event E is a subset of the sample space S, i.e., E ⊂ S. (Recall: in set theory, for two sets A and B, we say A ⊂ B if every element of A also belongs to B.)
• We say the event E happens if the outcome ξ of the experiment belongs to the set E.
• An event from a discrete sample space S (S is either finite or countably infinite) may consist of a single outcome. Such an event is called an elementary event.
• Two events of special interest are: i) the certain event S, which always occurs, and ii) the null or impossible event ∅, which includes no outcome and hence never occurs.

Exercise: Consider the random experiments in Example 1. For each random experiment specify i) the null event, ii) the certain event, and iii) an event that is neither certain nor impossible.

EXAMPLE 4
Let us consider Example 2 again.
Suppose A is the event that the patient is in serious condition ⇒ A = {(0, s), (1, s)}; we have A ⊂ S, (0, s) ∈ A, (0, f) ∉ A.
Suppose B is the event that the patient is uninsured ⇒ B = {(0, g), (0, f), (0, s)}; we have B ⊂ S, (0, f) ∈ B, (1, f) ∉ B.
Suppose C is the event that the patient is uninsured and is in serious condition ⇒ C = {(0, s)}; we have (0, s) ∈ C (C is an elementary event).

⋆ Review of Set Theory

• In probability theory we are interested in determining the probabilities of events. Since each event is represented by a set, we need to know the basics of set theory.
• For two events E and F, the new event E ∪ F (called the union of E and F) consists of all outcomes that are in E, in F, or in both.
• For two events E and F, the new event E ∩ F (or EF, called the intersection of E and F) consists of all outcomes that are in both E and F. If E ∩ F = ∅, then E and F are said to be mutually exclusive.
• For any event E we define the new event E^c (called the complement of E). The event E^c consists of all outcomes in S that are not in E, i.e., E^c occurs if and only if E does not occur.
• For any event E we have E ∪ E^c = S and E ∩ E^c = ∅.
• For any event E we have (E^c)^c = E.
• If E = S then E^c = ∅. Also, if E = ∅ then E^c = S.
• For two events E and F we write E ⊂ F if all outcomes in E are also in F.
• For two events E and F, suppose E ⊂ F. Then the occurrence of E necessarily implies the occurrence of F. (Why? Suppose E occurs. This implies that the outcome of the random experiment belongs to E. Therefore the outcome belongs to F as well; hence F occurs.)
• For two events E and F, if E ⊂ F and F ⊂ E then E = F, i.e., the two events are equal.

EXAMPLE 5
Let us consider Example 4 again. We have C = A ∩ B, {(1, s)} = A ∩ B^c, C ⊂ B, C ⊂ A, and B = B ∪ C.
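As a quick sanity check, the following minimal Python sketch verifies the relations of Example 5 using the built-in set operations (| for union, & for intersection, and set difference from S for the complement); the variable names simply mirror Examples 4 and 5 and are chosen for illustration only.

```python
# Sample space and events of Examples 4 and 5.
S = {(0, 'g'), (0, 'f'), (0, 's'), (1, 'g'), (1, 'f'), (1, 's')}
A = {(0, 's'), (1, 's')}            # patient is in serious condition
B = {(0, 'g'), (0, 'f'), (0, 's')}  # patient is uninsured
C = {(0, 's')}                      # uninsured and in serious condition

def complement(E):
    """Complement of E relative to the sample space S."""
    return S - E

print(C == A & B)                       # True: C = A ∩ B
print(A & complement(B) == {(1, 's')})  # True: A ∩ B^c = {(1, s)}
print(C <= B and C <= A)                # True: C ⊂ B and C ⊂ A
print(B == B | C)                       # True: B = B ∪ C, since C ⊂ B
```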
{ suppose » 2 (E [ F )c ) »2 = E [ F ) »2 = E and »2 = F ) » 2 Ec and » 2 F c ) » 2 Ec \ F c ) (E [ F )c ½ Ec \ F c { suppose » 2 Ec \ F c ) » 2 Ec and » 2 F c ) »2 = E and »2 = F ) »2 = E [ F ) » 2 (E [ F )c ) Ec \ F c ½ (E [ F )c. Proof of b): we use rule a) (Ec [ F c)c = (Ec)c \ (F c)c = E \ F . By taking complement we ¯nd (E \ F )c = Ec [ F c. Page 4.