Basic Statistics for SGPE Students, Part I: Probability Theory


Basic Statistics for SGPE Students
Part I: Probability theory
Nicolai Vitt ([email protected])
University of Edinburgh, September 2019
Thanks to Achim Ahrens, Anna Babloyan and Erkal Ersoy for creating these slides and allowing me to use them.

Outline
1. Probability theory
   - Conditional probabilities and independence
   - Bayes' theorem
2. Probability distributions
   - Discrete and continuous probability functions
   - Probability density function & cumulative distribution function
   - Binomial, Poisson and Normal distributions
   - E[X] and V[X]
3. Descriptive statistics
   - Sample statistics (mean, variance, percentiles)
   - Graphs (box plot, histogram)
   - Data transformations (log transformation, unit of measure)
   - Correlation vs. causation
4. Statistical inference
   - Population vs. sample
   - Law of large numbers
   - Central limit theorem
   - Confidence intervals
   - Hypothesis testing and p-values

Probability

Example II.1. A fair coin is tossed three times.

Sample space and event. The (mutually exclusive and exhaustive) list of possible outcomes of an experiment is known as the sample space and is denoted by S. An event E is a single outcome or group of outcomes in the sample space; that is, E is a subset of S.

In this example,
S = {HHH, THH, HTH, HHT, HTT, THT, TTH, TTT},
where H and T denote head and tail. Suppose we are interested in the event 'at least two heads'. The corresponding subset is
E = {HHH, THH, HTH, HHT}.
What is the probability of the event E?

Let's take a step back: what is probability?

Classical interpretation (Jacob Bernoulli, Pierre-Simon Laplace). If outcomes are equally likely, they must have the same probability. For example, when a coin is tossed, there are two possible outcomes: head and tail. More generally, if there are n equally likely outcomes, then the probability of each outcome is 1/n.
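The classical interpretation can be checked directly for Example II.1 by enumerating the sample space. A minimal sketch in Python (not part of the original slides):

```python
from itertools import product
from fractions import Fraction

# Sample space for three tosses of a fair coin: all 2^3 = 8 outcomes.
S = list(product("HT", repeat=3))
assert len(S) == 8

# Event E: 'at least two heads'.
E = [s for s in S if s.count("H") >= 2]

# Classical interpretation: equally likely outcomes, so P(E) = |E| / |S|.
p_E = Fraction(len(E), len(S))
print(p_E)  # 1/2
```

This anticipates the answer derived formally below: P(E) = 4/8 = 1/2.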
Frequency interpretation. The probability that a specific outcome of a process will be obtained is the relative frequency with which that outcome would be obtained if the process were repeated a large number of times under the same conditions.

[Figure: relative frequency of heads plotted against the number of tosses (0 to 100) for two trials. As we make more and more tosses, the proportion of tosses that produce head approaches 0.5. We say that 0.5 is the probability of head.]

Subjective interpretation (Bayesian approach). The probability that a person assigns to a possible outcome represents his own judgement (based on the person's beliefs and information). Another person, who may have different beliefs or different information, may assign a different probability to the same outcome. Distinction between prior and posterior beliefs.

Thinking about randomness. "[Carl Friedrich] Gauss's conversation turned to chance, the enemy of all knowledge, and the thing he had always wished to overcome. Viewed from up close, one could detect the infinite fineness of the web of causality behind every event. Step back and larger patterns appeared: Freedom and Chance were a question of distance, a point of view. Did he understand? Sort of, said Eugen wearily, looking at his pocket watch." (from Measuring the World by Daniel Kehlmann)

Properties of probability

Rule 1. For any event A, 0 ≤ P(A) ≤ 1. Furthermore, P(S) = 1.

Rule 2 (complement rule). Ac denotes the complement of event A. Then P(Ac) = 1 − P(A).

Rule 3 (multiplication rule). Two events A and B are independent of each other if and only if P(AB) = P(A and B) = P(A ∩ B) = P(A)P(B).

Rule 4 (addition rule). If two events A and B are mutually exclusive, then P(A or B) = P(A ∪ B) = P(A) + P(B).

Rule 5. If event B is a subset of event A, then P(B) ≤ P(A).
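The frequency interpretation is easy to illustrate by simulation. The sketch below (an illustration, not from the slides) repeats the coin-toss experiment and reports the relative frequency of heads, which settles near 0.5 as the number of tosses grows:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def relative_frequency_of_heads(n_tosses):
    """Simulate n fair coin tosses and return the proportion of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# As in the figure: the proportion of heads approaches 0.5.
for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency_of_heads(n))
```

With a million tosses the relative frequency is typically within a fraction of a percent of 0.5, which is what the figure's converging curves depict.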
What is the probability of E?

Example II.1 (continued). A fair coin is tossed three times.
S = {HHH, THH, HTH, HHT, HTT, THT, TTH, TTT}
E = {HHH, THH, HTH, HHT}
What is P(E)?

First, note that, because the coin is fair, P(H) = P(T) = 1/2.

Second, since each toss is independent of the previous ones, we can use Rule 3 (multiplication rule):
P(HHH) = P(H)P(H)P(H) = 1/2 × 1/2 × 1/2 = 1/8,
and, following the same reasoning, P(THH) = P(HTH) = ... = 1/8.

Third, using Rule 4 (addition rule),
P(E) = P(HHH) + P(THH) + P(HTH) + P(HHT) = 4/8 = 1/2.

Generalised addition rule

Example II.2. A fair six-sided die is rolled. The sample space is given by S = {1, 2, 3, 4, 5, 6}. Let E1 be the event 'obtain 3 or 4' and let E2 denote the event 'smaller than 4'. Thus, E1 = {3, 4} and E2 = {1, 2, 3}.

It is immediately clear that P(E1) = 2/6 and P(E2) = 3/6. But what is the probability that either E1 or E2 occurs? That is, what is P(E1 ∪ E2)? Since E1 and E2 are not mutually exclusive, we cannot apply Rule 4 (addition rule). But we can generalise Rule 4.

Rule 4' (general addition rule). For any two events A and B,
P(A or B) = P(A ∪ B) = P(A) + P(B) − P(AB).
Note that if A and B are mutually exclusive, P(AB) = 0. Therefore, Rule 4 is a special case of Rule 4'.

Applying Rule 4', we get
P(E1 ∪ E2) = P(E1) + P(E2) − P(E1E2)
           = [P(3) + P(4)] + [P(1) + P(2) + P(3)] − P(3)
           = 2/6 + 3/6 − 1/6 = 4/6.

Conditional probability

Example II.3. Suppose that, on any particular day, Anna is either in a good mood (A) or in a bad mood (Ac). Also, on any particular day, the sun is either shining (B) or not (Bc). Anna's mood depends on the weather, such that she is more likely to be in a good mood when the sun is shining.

In the Venn diagram, the area of A, which represents the probability that Anna is in a good mood, is rather small compared to the full rectangle (≈ 35%).
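Rule 4' can be verified mechanically for Example II.2 by treating events as sets of equally likely outcomes. A small sketch (illustrative, not from the slides):

```python
from fractions import Fraction

# Equally likely outcomes of one roll of a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}

def P(event):
    """Classical probability: |event| / |S| for equally likely outcomes."""
    return Fraction(len(event), len(S))

E1 = {3, 4}     # 'obtain 3 or 4'
E2 = {1, 2, 3}  # 'smaller than 4'

# General addition rule: P(E1 or E2) = P(E1) + P(E2) - P(E1 and E2).
lhs = P(E1 | E2)                     # direct: P of the union
rhs = P(E1) + P(E2) - P(E1 & E2)     # via Rule 4'
assert lhs == rhs == Fraction(4, 6)  # 4/6, as computed above
```

The set operators `|` and `&` mirror the union and intersection in the rule, so the code reads almost exactly like the formula.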
In general, it is more likely that Anna is in a bad mood. A Venn diagram of the example shows both events, A and B, and their overlap AB (with the regions ABc, AB and AcB).

Now, suppose the sun is shining. We can discard the rest of the sample space and focus on B. The area AB takes up most of the area in the circle B. That is, given that B occurred, it is more likely that Anna is in a good mood, although, in general, she is more often in a bad mood.

Rule 3' (general multiplication rule). If A and B are any two events and P(B) > 0, then
P(AB) = P(A)P(B|A) = P(B)P(A|B).
P(A|B) is the conditional probability of the event A given that the event B has occurred.

Conditional probability. From Rule 3' follows the definition of conditional probability:
P(A|B) = P(AB) / P(B).
Note that, if A and B are independent, then
P(A|B) = P(A)P(B) / P(B) = P(A).
Thus, Rule 3 is a special case of Rule 3'.
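The definition P(A|B) = P(AB)/P(B) can be made concrete with hypothetical joint probabilities for Anna's mood and the weather. The numbers below are illustrative assumptions chosen to match the slides' ≈35% figure, not data from the source:

```python
# Hypothetical joint probabilities (illustrative only):
# A = good mood, Ac = bad mood; B = sun shining, Bc = not shining.
p = {
    ("A", "B"): 0.25,   # good mood and sunny
    ("A", "Bc"): 0.10,  # good mood, not sunny
    ("Ac", "B"): 0.05,  # bad mood and sunny
    ("Ac", "Bc"): 0.60, # bad mood, not sunny
}
assert abs(sum(p.values()) - 1.0) < 1e-9

P_A = p[("A", "B")] + p[("A", "Bc")]   # unconditional P(good mood) = 0.35
P_B = p[("A", "B")] + p[("Ac", "B")]   # P(sun shining) = 0.30
P_A_given_B = p[("A", "B")] / P_B      # conditional probability P(A|B)

# Conditioning on sunshine flips the picture: P(A|B) > P(A).
print(P_A, P_A_given_B)
```

Under these assumed numbers P(A) = 0.35 but P(A|B) ≈ 0.83: knowing the sun is shining makes a good mood much more likely, exactly the point of the Venn-diagram sequence above.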
Example II.4. The following table contains counts (in thousands) of persons aged 25 and older, classified by educational attainment and employment status:

Education                      Employed   Unemployed   Not in labor force     Total
Did not finish high school       11,521          886               14,226    26,633
High school degree               36,857        1,682               22,834    61,373
Some college                     34,612        1,275               13,944    49,831
Bachelor's degree or higher      43,182          892               12,546    56,620
Total                           126,172        4,735               63,550   194,457

Is employment status independent of educational attainment?

Suppose we randomly draw a person from the population. What is the probability that the person is employed?
P(employed) = 126,172 / 194,457 = 0.6488.

Now, suppose we randomly draw another person and are given the information that the person did not finish high school. What is the probability that the person is employed given that the person did not finish high school?
P(employed | did not finish high school) = 11,521 / 26,633 = 0.4326.

We can display the relationship between education and employment in a probability table.

Education                      Employed   Unemployed   Not in labor force     Total
Did not finish high school      0.05925      0.00456              0.07316   0.13696
High school degree              0.18954      0.00865              0.11742   0.31561
Some college                    0.17800      0.00656              0.07171   0.25626
Bachelor's degree or higher     0.22206      0.00459              0.06452   0.29117
Total                           0.64884      0.02435              0.32681   1.00000

The probabilities in the central enclosed rectangle are joint probabilities; the row and column totals are the corresponding marginal probabilities.
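Both probabilities in Example II.4, and the independence check, follow directly from the table of counts. A short sketch reproducing the calculation:

```python
# Counts (in thousands) from the table above:
# (employed, unemployed, not in labor force) per education level.
counts = {
    "did not finish high school": (11_521, 886, 14_226),
    "high school degree":         (36_857, 1_682, 22_834),
    "some college":               (34_612, 1_275, 13_944),
    "bachelor's degree or higher": (43_182, 892, 12_546),
}

total = sum(sum(row) for row in counts.values())          # 194,457
employed = sum(row[0] for row in counts.values())         # 126,172

p_employed = employed / total                             # ≈ 0.6488
no_hs = counts["did not finish high school"]
p_employed_given_no_hs = no_hs[0] / sum(no_hs)            # ≈ 0.4326

# Independence would require P(employed | education) = P(employed)
# for every education level; here the two probabilities differ.
print(round(p_employed, 4), round(p_employed_given_no_hs, 4))
```

Since 0.4326 ≠ 0.6488, employment status is not independent of educational attainment.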