CHAPTER 2

The probability set-up

2.1. Basic theory of probability

We will have a sample space, denoted by S (sometimes Ω), that consists of all possible outcomes. For example, if we roll two dice, the sample space would be all possible pairs made up of the numbers one through six. An event is a subset of S.

Another example is to toss a coin 2 times, and let S = {HH, HT, TH, TT}; or to let S be the possible orders in which 5 horses finish a horse race; or S the possible prices of some stock at closing time today; or S = [0, ∞), the age at which someone dies; or S the points in a circle, the possible places a dart can hit. We should also keep in mind that the same setting can be described using different sample sets. For example, the two solutions to Example 1.30 used two different sample sets.

2.1.1. Sets. We start by describing elementary operations on sets. By a set we mean a collection of distinct objects called elements of the set, and we consider a set as an object in its own right.

Set operations. Suppose S is a set. We say that A ⊂ S, that is, A is a subset of S, if every element of A is contained in S; A ∪ B is the union of sets A ⊂ S and B ⊂ S and denotes the points of S that are in A or B or both; A ∩ B is the intersection of sets A ⊂ S and B ⊂ S and is the set of points that are in both A and B; ∅ denotes the empty set; A^c is the complement of A, that is, the points in S that are not in A. We extend this definition to ⋃_{i=1}^n A_i, the union of the sets A_1, …, A_n, and similarly ⋂_{i=1}^n A_i.

An exercise is to show De Morgan's laws:

\left( \bigcup_{i=1}^{n} A_i \right)^c = \bigcap_{i=1}^{n} A_i^c \qquad \text{and} \qquad \left( \bigcap_{i=1}^{n} A_i \right)^c = \bigcup_{i=1}^{n} A_i^c.

We will also need the notion of pairwise disjoint sets {E_i}_{i=1}^∞, which means that E_i ∩ E_j = ∅ unless i = j.

There are no restrictions on the sample space S. The collection of events, F, is assumed to be a σ-field, which means that it satisfies the following.

Definition (σ-field). A collection F of sets in S is called a σ-field if
(i) both ∅ and S are in F,
(ii) if A is in F, then A^c is in F,
(iii) if A_1, A_2, … are in F, then ⋃_{i=1}^∞ A_i and ⋂_{i=1}^∞ A_i are in F.

Typically we will take F to be all subsets of S, and then (i)–(iii) are automatically satisfied. The only times we won't have F be all subsets is for technical reasons or when we talk about conditional expectation.

2.1.2. Probability axioms. So now we have a sample space S and a σ-field F, and we need to talk about what a probability is.

Probability axioms.
(1) 0 ≤ P(E) ≤ 1 for all events E ∈ F.
(2) P(S) = 1.
(3) If the {E_i}_{i=1}^∞, E_i ∈ F, are pairwise disjoint, then P(⋃_{i=1}^∞ E_i) = ∑_{i=1}^∞ P(E_i).

Note that probabilities are probabilities of subsets of S, not of points of S. However, it is common to write P(x) for P({x}).

Intuitively, the probability of E should be the fraction of times E occurs in n experiments, taking a limit as n tends to infinity. This is hard to use. It is better to start with these axioms, and then to prove that the probability of E is the limit we hoped for.

Below are some easy consequences of the probability axioms.

Proposition 2.1 (Properties of probability).
(1) P(∅) = 0.
(2) If E_1, …, E_n ∈ F are pairwise disjoint, then P(⋃_{i=1}^n E_i) = ∑_{i=1}^n P(E_i).
(3) P(E^c) = 1 − P(E) for any E ∈ F.
(4) If E ⊂ F, then P(E) ≤ P(F), for any E, F ∈ F.
(5) For any E, F ∈ F,

(2.1.1)  P(E ∪ F) = P(E) + P(F) − P(E ∩ F).

The last property is sometimes called the inclusion-exclusion identity.
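Before giving the proof, note that identity (2.1.1) is easy to check numerically on a small finite sample space with equally likely outcomes. The following Python sketch is an illustration added here (not part of the notes); the two events chosen are arbitrary.

```python
from fractions import Fraction
from itertools import product

# Sample space for two fair dice: all ordered pairs (i, j), equally likely.
S = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) under the uniform measure on S."""
    return Fraction(len(event), len(S))

# Two illustrative events; any events E, F would do.
E = {(i, j) for (i, j) in S if i + j == 7}   # the sum is 7
F = {(i, j) for (i, j) in S if i == j}       # doubles

lhs = prob(E | F)
rhs = prob(E) + prob(F) - prob(E & F)
assert lhs == rhs                            # identity (2.1.1)
print(lhs, rhs)                              # 1/3 1/3
```

The same check works for any pair of events in a finite sample space; the proof below shows why it must.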
Proof. To show (1), choose E_i = ∅ for each i. These are clearly disjoint, so P(∅) = P(⋃_{i=1}^∞ E_i) = ∑_{i=1}^∞ P(E_i) = ∑_{i=1}^∞ P(∅). If P(∅) were strictly positive, then the last term would be infinity, contradicting the fact that probabilities are between 0 and 1. So the probability of ∅ must be zero.

Part (2) follows if we let E_{n+1} = E_{n+2} = ··· = ∅. Then {E_i}_{i=1}^∞, E_i ∈ F, are still pairwise disjoint, ⋃_{i=1}^∞ E_i = ⋃_{i=1}^n E_i, and by (1) we have ∑_{i=1}^∞ P(E_i) = ∑_{i=1}^n P(E_i).

To prove (3), use S = E ∪ E^c. By (2), P(S) = P(E) + P(E^c). By axiom (2), P(S) = 1, so (3) follows.

To prove (4), write F = E ∪ (F ∩ E^c), so P(F) = P(E) + P(F ∩ E^c) ≥ P(E) by (2) and axiom (1).

Similarly, to prove (5), we have P(E ∪ F) = P(E) + P(E^c ∩ F) and P(F) = P(E ∩ F) + P(E^c ∩ F). Solving the second equation for P(E^c ∩ F) and substituting into the first gives the desired result.

It is common for a probability space to consist of finitely many points, all with equally likely probabilities. For example, in tossing a fair coin, we have S = {H, T}, with P(H) = P(T) = 1/2. Similarly, in rolling a fair die, the probability space consists of {1, 2, 3, 4, 5, 6}, each point having probability 1/6.

Example 2.1. What is the probability that if we roll 2 dice, the sum is 7?

Solution: There are 36 possibilities, of which 6 have a sum of 7: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1). Since they are all equally likely, the probability is 6/36 = 1/6.

Example 2.2. What is the probability that in a poker hand (5 cards out of 52) we get exactly 4 of a kind?

Solution: We have four suits: clubs, diamonds, hearts and spades. Each suit includes an ace, a king, a queen, a jack, and the ranks two through ten. For example, the probability of 4 aces and 1 king is

\frac{\binom{4}{4}\binom{4}{1}}{\binom{52}{5}}.

The probability of 4 jacks and one 3 is the same. There are 13 ways to pick the rank that we have 4 of, and then 12 ways to pick the rank we have one of, so the answer is

13 \cdot 12 \cdot \frac{\binom{4}{4}\binom{4}{1}}{\binom{52}{5}}.

Example 2.3. What is the probability that in a poker hand we get exactly 3 of a kind (and the other two cards are of different ranks)?

Solution: For example, the probability of 3 aces, 1 king and 1 queen is

\frac{\binom{4}{3}\binom{4}{1}\binom{4}{1}}{\binom{52}{5}}.

We have 13 choices for the rank we have 3 of, and \binom{12}{2} choices for the other two ranks, so the answer is

13 \binom{12}{2} \frac{\binom{4}{3}\binom{4}{1}\binom{4}{1}}{\binom{52}{5}}.

Example 2.4. In a class of 30 people, what is the probability everyone has a different birthday? (We assume each day is equally likely.)

Solution: We assume that it is not a leap year. Let the first person have a birthday on some day. The probability that the second person has a different birthday will be 364/365. The probability that the third person has a different birthday from the first two people is 363/365. So the answer is

\frac{364}{365} \cdot \frac{363}{365} \cdots \frac{336}{365}.
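The counting in Examples 2.1–2.4 can be reproduced with exact arithmetic. Here is a short Python sketch, added for illustration (it is not part of the original solutions), using math.comb for the binomial coefficients and fractions.Fraction to keep the answers exact.

```python
from fractions import Fraction
from math import comb, prod

# Example 2.1: probability the sum of two dice is 7.
p_sum7 = Fraction(6, 36)
print(p_sum7)                                    # 1/6

# Example 2.2: exactly four of a kind in a 5-card poker hand.
p_four = Fraction(13 * 12 * comb(4, 4) * comb(4, 1), comb(52, 5))
print(p_four)                                    # 13/54145, about 0.00024

# Example 2.3: exactly three of a kind, other two cards of different ranks.
p_three = Fraction(13 * comb(12, 2) * comb(4, 3) * comb(4, 1) * comb(4, 1),
                   comb(52, 5))
print(p_three)                                   # 88/4165, about 0.0211

# Example 2.4: all 30 birthdays different; same value as the displayed
# product 364/365 * 363/365 * ... * 336/365 (the k = 0 factor is 1).
p_birthdays = prod(Fraction(365 - k, 365) for k in range(30))
print(float(p_birthdays))                        # about 0.294
```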
2.2. Further examples and applications

2.2.1. Sets revisited. A visual way to represent set operations is given by Venn diagrams.

[Figure: Venn diagrams, from http://www.onlinemathlearning.com/shading-venn-diagrams.html]

Example 2.5. Roll two dice. We can describe the sample set S as the ordered pairs of numbers 1, 2, …, 6; that is, S has 36 elements. Examples of events are

E = the two dice come up equal and even = {(2,2), (4,4), (6,6)},
F = the sum of the two dice is 8 = {(2,6), (3,5), (4,4), (5,3), (6,2)},
E ∪ F = {(2,2), (2,6), (3,5), (4,4), (5,3), (6,2), (6,6)},
E ∩ F = {(4,4)},
F^c = the 31 pairs other than (2,6), (3,5), (4,4), (5,3), (6,2).

Example 2.6. Let S = [0, ∞) be the space of all possible ages at which someone can die. Possible events are

A = the person dies before reaching 30 = [0, 30),
A^c = [30, ∞) = the person dies after turning 30,
A ∪ A^c = S,
B = the person lives either less than 15 or more than 45 years = [0, 15) ∪ (45, ∞).

2.2.2. Axioms of probability revisited.

Example 2.7 (Coin tosses). In this case S = {H, T}, where H stands for heads and T stands for tails.
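In this finite setting, where F consists of all subsets of S and all outcomes are equally likely, the probability axioms and the properties of Proposition 2.1 can be checked mechanically. The Python sketch below is illustrative only (the helper all_subsets is ours); it builds the uniform probability on the coin-toss space S = {H, T} and verifies the axioms and inclusion-exclusion over every pair of events.

```python
from fractions import Fraction
from itertools import combinations

S = {"H", "T"}                      # sample space for one coin toss

def all_subsets(s):
    """The sigma-field F: here, all subsets of S."""
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

F = all_subsets(S)

def P(event):
    """Uniform probability: each outcome has probability 1/|S|."""
    return Fraction(len(event), len(S))

# Axioms: 0 <= P(E) <= 1 for every event, and P(S) = 1.
assert all(0 <= P(E) <= 1 for E in F)
assert P(frozenset(S)) == 1

# Proposition 2.1: complements and inclusion-exclusion, for all events.
for E in F:
    assert P(frozenset(S) - E) == 1 - P(E)
    for G in F:
        assert P(E | G) == P(E) + P(G) - P(E & G)
```

The same check runs unchanged for any finite sample space with equally likely outcomes, such as the two-dice space of Example 2.5.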