Probability Theory (UMPT001)


PROBABILITY

Let us first look at some of the basic terminology and concepts used in probability.

Experiment:- An experiment is an action or operation which can produce some well-defined result/outcome(s). There are two types of experiment: (1) Deterministic experiment, (2) Indeterministic/Random/Probabilistic/Stochastic experiment.

Deterministic Experiment:- Experiments which, when repeated under identical conditions, always give the same result, no matter where and no matter how many times they are performed, are called deterministic experiments.

Random (Indeterministic/Probabilistic/Stochastic) Experiment:- Experiments which, when repeated under identical conditions, do not always give the same result, are called random experiments.

Trial:- An experiment once performed is called a trial.

Sample Space:- The set of all possible outcomes of a random experiment is called the sample space of that experiment and is generally denoted by S. For example, in tossing a coin the sample space is S={H,T}, where H=Head and T=Tail. Similarly, in rolling a die the sample space is S={1,2,3,4,5,6}. Each element of a sample space is called a sample point or an elementary event.

Event:- Any subset of a sample space is called an event. For example, in rolling a die the sample space is S={1,2,3,4,5,6}. Let E1={1}, E2={2,3}, E3={2,4,6}, E4={1,3,5}. Then E1, E2, E3 and E4 are all events, as each is a subset of S. If the outcome of a random experiment is an element of an event, we say that the event has occurred; otherwise it has not. For example, if rolling a die gives the outcome 1, then E1 and E4 have occurred but E2 and E3 have not, since 1∈E1 and 1∈E4 but 1∉E2 and 1∉E3.

Let S be the sample space of an experiment. Since the null (empty) set is a subset of every set, Ø⊆S, so Ø is an event.
This event is called the impossible event. Since every set is a subset of itself, S⊆S, so S is an event; this event is called the sure event.

Equally Likely Events:- Two events are called equally likely if the occurrence of one is not preferred over the other. For example, in tossing a coin the events of getting a head and getting a tail are equally likely. Similarly, in rolling a die, the events of getting each of the six faces are equally likely.

Exclusive Events:- Two events are said to be exclusive if the occurrence of one excludes the possibility of the occurrence of the other. Such events are disjoint. Thus two events A and B are exclusive if and only if A∩B=Ø.

Exhaustive Events:- Let S be the sample space of an experiment, and let E1,E2,…,En be events such that E1∪E2∪…∪En=S. Then these events are called exhaustive events.

Mutually Exclusive and Exhaustive Events:- Let S be the sample space of an experiment. Events E1,E2,…,En such that (1) E1∪E2∪…∪En=S and (2) Ei∩Ej=Ø for i≠j are called mutually exclusive and exhaustive events.

Complementary Events:- Two events A and B of the sample space S are said to be complementary if (1) A∪B=S and (2) A∩B=Ø. The complement of A is generally denoted by A'.

Independent Events:- Two events are said to be independent if the happening of one does not affect the happening of the other.

Dependent Events:- Two events are said to be dependent if the occurrence of one affects the occurrence of the other.

Probability:- Probability is a concept which numerically measures the degree of uncertainty, and therefore the degree of certainty, of the occurrence of events. Let S be the sample space associated with some random experiment and let E⊆S be any event. Then the probability of occurrence of E is denoted P(E) and is defined as P(E)=n(E)/n(S), where n stands for the number of elements,
= (number of outcomes favourable to E)/(total number of possible outcomes).

For example, consider the experiment of rolling a die, with sample space S={1,2,3,4,5,6}. Let E={2,4} be an event; then P(E)=n(E)/n(S)=2/6=1/3.

The probability of the impossible event is zero: P(Ø)=n(Ø)/n(S)=0/n(S)=0.
The probability of the sure event is always 1: P(S)=n(S)/n(S)=1.
The probability of any event lies between 0 and 1: 0≤P(E)≤1.

Odds in favour of and odds against an event: If an event A can occur in m ways and fail in n ways, the odds in favour of A are given by

m/n = (no. of favourable choices)/(no. of unfavourable choices)
    = (no. of successes)/(no. of failures)
    = (probability of occurrence, m/(m+n))/(probability of non-occurrence, n/(m+n))
    = P(A)/P(A').

For example, the odds in favour of rolling a 6 on a fair six-sided die are 1/5.

The odds against A are given by

n/m = (no. of unfavourable choices)/(no. of favourable choices)
    = (no. of failures)/(no. of successes)
    = (probability of non-occurrence, n/(m+n))/(probability of occurrence, m/(m+n))
    = P(A')/P(A).

For example, the odds against rolling a 6 on a fair six-sided die are 5/1.

Q. Find the odds in favour of getting exactly two heads when three coins are tossed.
Sol.- The sample space of tossing three coins is S={HHH,HHT,HTH,THH,HTT,THT,TTH,TTT}. Exactly two heads occur in 3 of the 8 outcomes and fail to occur in the remaining 5. Thus, the odds in favour of exactly two heads are 3/5.
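The worked example above can be checked by brute-force enumeration. The following sketch (not part of the original notes; variable names are ours) builds the three-coin sample space and computes both P(E) and the odds in favour:

```python
from fractions import Fraction
from itertools import product

# Sample space for tossing three fair coins: all 3-letter strings over {H, T}.
S = ["".join(p) for p in product("HT", repeat=3)]

# Event E: exactly two heads.
E = [s for s in S if s.count("H") == 2]

# Classical probability: P(E) = n(E)/n(S).
p = Fraction(len(E), len(S))

# Odds in favour = (no. of successes) : (no. of failures) = P(E)/P(E').
odds_in_favour = Fraction(len(E), len(S) - len(E))

print(p)               # 3/8
print(odds_in_favour)  # 3/5
```

Using Fraction keeps the ratios exact, matching the hand computation 3/8 and 3/5.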
Theorems of Probability

Addition Rule of Probability:- Let S be the sample space associated with a random experiment and let A and B be two events. Then, from the Venn diagram,

n(A∪B)=n(A)+n(B)-n(A∩B)
⇒ n(A∪B)/n(S) = n(A)/n(S) + n(B)/n(S) - n(A∩B)/n(S)
⇒ P(A∪B)=P(A)+P(B)-P(A∩B).

If A and B are mutually exclusive events, then A∩B=Ø, so P(A∩B)=P(Ø)=0. Thus, P(A∪B)=P(A)+P(B).

For complementary events, A∪A'=S and A∩A'=Ø, so
P(A∪A')=P(S) ⇒ P(A)+P(A')=1 ⇒ P(A')=1-P(A).

Let S be the sample space associated with a random experiment and let A, B and C be three events associated with it. Then
P(A∪B∪C)=P(A)+P(B)+P(C)-P(A∩B)-P(B∩C)-P(C∩A)+P(A∩B∩C).
If A, B and C are mutually exclusive events, then A∩B=B∩C=C∩A=A∩B∩C=Ø, so P(A∩B)=P(B∩C)=P(C∩A)=P(A∩B∩C)=P(Ø)=0, and hence P(A∪B∪C)=P(A)+P(B)+P(C).

Multiplication Rule of Probability (Theorem of Compound Probability):- Let S be the sample space associated with some random experiment and let A and B be two of its events. Suppose event B has occurred and B≠Ø. After the occurrence of B, the number of cases favourable to A is n(A∩B), and the sample space for A is no longer S but B, with n(B) elements. P(A | B) denotes the probability of event A given that event B has occurred; this is also known as the conditional probability of A given B and is defined as

P(A | B)=n(A∩B)/n(B)
        ={n(A∩B)/n(S)}/{n(B)/n(S)}
        =P(A∩B)/P(B)
⇒ P(A∩B)=P(A | B)*P(B).

Similarly, P(B | A)=P(A∩B)/P(A) and P(A∩B)=P(B | A)*P(A).

If A and B are two independent events, then P(A∩B)=P(A)*P(B).
Proof- From the multiplication theorem, for any two events A and B,
P(A∩B)=P(A | B)*P(B). ……(i)
But A and B are independent events, so the occurrence of B does not affect the occurrence of A. Hence P(A | B)=P(A). Substituting this into eqn. (i) gives P(A∩B)=P(A)*P(B).
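The addition and multiplication rules can be verified numerically on a small sample space. This sketch (our own illustration; the events A and B are chosen for the example) uses a single die roll:

```python
from fractions import Fraction

# Rolling one fair die: S = {1, ..., 6}.
S = set(range(1, 7))
A = {2, 4, 6}   # event: an even number
B = {1, 2, 3}   # event: at most three

def P(event):
    """Classical probability P(E) = n(E)/n(S)."""
    return Fraction(len(event), len(S))

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
assert P(A | B) == P(A) + P(B) - P(A & B)

# Conditional probability: P(A | B) = P(A ∩ B)/P(B).
p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)  # 1/3

# Multiplication rule: P(A ∩ B) = P(A | B) * P(B).
assert P(A & B) == p_A_given_B * P(B)
```

Here A∩B={2}, so P(A∩B)=1/6 and P(A | B)=(1/6)/(1/2)=1/3, exactly as the formulas above predict.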
If A and B are not independent events, then P(A∩B)≠P(A)*P(B).

If A, B and C are three independent events, then P(A∩B∩C)=P(A)*P(B)*P(C).
Proof- Let E=B∩C. Since A, B and C are independent events, A and E are also independent, so
P(A∩E)=P(A)*P(E)=P(A)*P(B∩C)=P(A)*P(B)*P(C),
since B and C are independent and hence P(B∩C)=P(B)*P(C).
⇒ P(A∩B∩C)=P(A)*P(B)*P(C).
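Independence can likewise be checked by counting. The following sketch (our own example, not from the notes) uses the classic card-draw events: A = "drawing a king" and B = "drawing a spade" are independent, since the king of spades is exactly 1 of the 52 cards:

```python
from fractions import Fraction
from itertools import product

# A standard 52-card deck as (rank, suit) pairs; 13 = king.
ranks = list(range(1, 14))
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = list(product(ranks, suits))

def P(event):
    """Classical probability over the deck."""
    return Fraction(len(event), len(deck))

A = [c for c in deck if c[0] == 13]        # drawing a king
B = [c for c in deck if c[1] == "spades"]  # drawing a spade
AB = [c for c in A if c in B]              # king of spades only

# A and B are independent: P(A ∩ B) = P(A) * P(B).
print(P(AB), P(A) * P(B))  # 1/52 1/52
assert P(AB) == P(A) * P(B)
```

Here P(A)=4/52, P(B)=13/52, and P(A∩B)=1/52=(4/52)*(13/52), so the product rule holds.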