Lecture 10: Central Limit Theorem and CDFs
Sta230 / Mth 230
Colin Rundel
February 25, 2014

Moments of Distributions

  Raw moment:                        µ'_n = E(X^n)
  Central moment:                    µ_n = E[(X − µ)^n]
  Normalized / standardized moment:  µ_n / σ^n

Moment Generating Function

The moment generating function of a random variable X is defined for all real values of t by

  M_X(t) = E[e^{tX}] = Σ_x e^{tx} P(X = x)    if X is discrete
  M_X(t) = E[e^{tX}] = ∫ e^{tx} f(x) dx       if X is continuous

This is called the moment generating function because we can obtain the raw moments of X by successively differentiating M_X(t) and evaluating at t = 0:

  M_X(0) = E[e^0] = 1 = µ'_0

  M'_X(t) = d/dt E[e^{tX}] = E[ d/dt e^{tX} ] = E[X e^{tX}]
  M'_X(0) = E[X e^0] = E[X] = µ'_1

  M''_X(t) = d/dt M'_X(t) = d/dt E[X e^{tX}] = E[X² e^{tX}]
  M''_X(0) = E[X² e^0] = E[X²] = µ'_2

Moment Generating Function - Properties

If X and Y are independent random variables then the moment generating function for the distribution of X + Y is

  M_{X+Y}(t) = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}] = E[e^{tX}] E[e^{tY}] = M_X(t) M_Y(t)

Similarly, the moment generating function for S_n, the sum of iid random variables X_1, X_2, …, X_n, is

  M_{S_n}(t) = [M_{X_i}(t)]^n

Moment Generating Function - Unit Normal

Let Z ∼ N(0, 1). Then

  M_Z(t) = E[e^{tZ}] = (1/√(2π)) ∫ e^{tz} e^{−z²/2} dz
         = e^{t²/2} (1/√(2π)) ∫ e^{−(z−t)²/2} dz      (completing the square)
         = e^{t²/2}

since the last integrand is the density of a N(t, 1) random variable and therefore integrates to 1.
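The moment-derivative relationship above can be checked numerically. A minimal sketch, not from the slides: the Bernoulli(p) MGF M(t) = (1 − p) + p e^t is my example, and the finite-difference scheme is my choice. Differentiating at t = 0 should recover E[X] = p and E[X²] = p.

```python
import math

p = 0.3  # Bernoulli success probability; example value, my choice
M = lambda t: (1 - p) + p * math.exp(t)    # MGF: E[e^{tX}]

h = 1e-5
# central differences approximating M'(0) and M''(0)
m1 = (M(h) - M(-h)) / (2 * h)              # ≈ M'(0)  = E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2      # ≈ M''(0) = E[X^2]

print(m1, m2)  # both ≈ 0.3, since E[X] = E[X^2] = p for a Bernoulli
```

For a Bernoulli random variable X² = X, so both raw moments equal p, which the two difference quotients reproduce to several decimal places.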
Central Limit Theorem

Let X_1, …, X_n be a sequence of independent and identically distributed random variables, each having mean µ and variance σ². Then the distribution of

  (X_1 + ··· + X_n − nµ) / (σ√n)

tends to the unit normal as n → ∞. That is, for −∞ < a < ∞,

  P( (X_1 + ··· + X_n − nµ) / (σ√n) ≤ a ) → (1/√(2π)) ∫_{−∞}^{a} e^{−x²/2} dx = Φ(a)   as n → ∞

Sketch of Proof

Proposition. Let X_1, X_2, … be a sequence of independent and identically distributed random variables and S_n = X_1 + ··· + X_n. The distribution of S_n is given by the distribution function F_{S_n}, which has moment generating function M_{S_n}, n ≥ 1. Let Z be a random variable with distribution function F_Z and moment generating function M_Z. If M_{S_n}(t) → M_Z(t) for all t, then F_{S_n}(t) → F_Z(t) for all t at which F_Z is continuous.

We can prove the CLT by letting Z ∼ N(0, 1), so that M_Z(t) = e^{t²/2}, and then showing for any S_n that M_{S_n/√n}(t) → e^{t²/2} as n → ∞.

Proof of the CLT

Some simplifying assumptions and notation:

  E(X_i) = 0
  Var(X_i) = 1
  M_{X_i}(t) exists and is finite
  L(t) = log M_{X_i}(t)

Also, remember L'Hôpital's rule (for limits of indeterminate form):

  lim_{x→∞} f(x)/g(x) = lim_{x→∞} f'(x)/g'(x)

Proof of the CLT, cont.

The moment generating function of X_i/√n is given by

  M_{X_i/√n}(t) = E[ exp(t X_i/√n) ] = M_{X_i}(t/√n)

and thus the moment generating function of S_n/√n = Σ_{i=1}^{n} X_i/√n is given by

  M_{S_n/√n}(t) = [ M_{X_i}(t/√n) ]^n

Therefore, in order to show M_{S_n/√n}(t) → M_Z(t) we need to show

  [ M_{X_i}(t/√n) ]^n → e^{t²/2}

Proof of the CLT, cont.

Since M_{S_n/√n}(t) = exp[ n L(t/√n) ], it suffices to show n L(t/√n) → t²/2. Note that

  L(0) = log M(0) = 0
  L'(0) = M'(0)/M(0) = E(X_i) = 0
  L''(0) = [ M''(0) M(0) − M'(0)² ] / M(0)² = E(X_i²) = 1

Treating n as a continuous variable and applying L'Hôpital's rule twice (each limit has the form 0/0):

  lim_{n→∞} L(t n^{−1/2}) / n^{−1}
    = lim_{n→∞} [ −L'(t n^{−1/2}) t n^{−3/2} / 2 ] / [ −n^{−2} ]
    = lim_{n→∞} L'(t n^{−1/2}) t / (2 n^{−1/2})
    = lim_{n→∞} [ −L''(t n^{−1/2}) t² n^{−3/2} / 2 ] / [ −n^{−3/2} ]
    = lim_{n→∞} L''(t n^{−1/2}) t² / 2
    = t²/2

Hence M_{S_n/√n}(t) → e^{t²/2} = M_Z(t), and by the Proposition the distribution of S_n/√n tends to the unit normal.
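The limiting statement can be illustrated with a quick simulation, not part of the slides. The summand distribution Unif(0,1), n = 30, the replication count, and the seed are all my choices: the standardized sum should fall at or below a = 1 with probability close to Φ(1) ≈ 0.841.

```python
import math
import random

def phi(a):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + math.erf(a / math.sqrt(2)))

def standardized_sum(n, rng):
    """(X_1 + ... + X_n - n*mu) / (sigma*sqrt(n)) for X_i ~ Unif(0,1),
    which has mu = 1/2 and sigma^2 = 1/12."""
    mu, sigma = 0.5, math.sqrt(1 / 12)
    s = sum(rng.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

rng = random.Random(42)       # fixed seed for reproducibility
n, reps, a = 30, 20_000, 1.0
hits = sum(standardized_sum(n, rng) <= a for _ in range(reps))
print(hits / reps)            # close to phi(1.0) ≈ 0.841
```

With 20,000 replications the Monte Carlo error is on the order of 0.003, so the empirical frequency and Φ(1) agree to roughly two decimal places.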
Proof of the CLT, Final Comments

The preceding proof assumes that E(X_i) = 0 and Var(X_i) = 1. We can generalize this result to any collection of iid random variables Y_i by considering the standardized form Y_i* = (Y_i − µ)/σ:

  (Y_1 + ··· + Y_n − nµ) / (σ√n) = [ (Y_1 − µ)/σ + ··· + (Y_n − µ)/σ ] / √n = (Y_1* + ··· + Y_n*) / √n

where

  E(Y_i*) = 0
  Var(Y_i*) = 1

Cumulative Distribution Function

We have already seen a variety of problems where we find P(X ≤ x) or P(X > x), etc. The former is given a special name: the cumulative distribution function.

If X is discrete with probability mass function f(x) then

  P(X ≤ x) = F(x) = Σ_{z ≤ x} f(z)

If X is continuous with probability density function f(x) then

  P(X ≤ x) = F(x) = ∫_{−∞}^{x} f(z) dz

The CDF is defined for all −∞ < x < ∞ and follows these rules:

  lim_{x→−∞} F(x) = 0
  lim_{x→∞} F(x) = 1
  x < y ⇒ F(x) ≤ F(y)    (F is non-decreasing)
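The three CDF rules can be made concrete with a small check, not from the slides; the fair six-sided die is my example. The CDF of a discrete random variable is just the running sum of its pmf.

```python
from fractions import Fraction

# pmf of a fair six-sided die (exact arithmetic via Fraction)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def F(x):
    """CDF: F(x) = P(X <= x) = sum of pmf(z) over z <= x."""
    return sum(p for z, p in pmf.items() if z <= x)

# F is 0 to the left of the support, 1 to the right, and non-decreasing
xs = [0, 1, 1.5, 2, 3, 6, 10]
assert F(0) == 0 and F(6) == 1
assert all(F(a) <= F(b) for a, b in zip(xs, xs[1:]))
print(F(3))  # 1/2
```

Note F(1.5) = F(1): between support points a discrete CDF is flat, which is why it is a step function.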
Binomial CDF

Let X ∼ Binom(n, p). Then:

  Probability mass function:        P(X = k) = f(k) = (n choose k) p^k (1 − p)^{n−k}
  Cumulative distribution function: P(X ≤ x) = F(x) = Σ_{k=0}^{⌊x⌋} (n choose k) p^k (1 − p)^{n−k}

Uniform CDF

Let X ∼ Unif(a, b). Then:

  Probability density function:
    f(x) = 1/(b − a)    for x ∈ [a, b]
    f(x) = 0            otherwise

  Cumulative distribution function:
    F(x) = 0                  for x ≤ a
    F(x) = (x − a)/(b − a)    for x ∈ [a, b]
    F(x) = 1                  for x ≥ b

Normal CDF

Let X ∼ N(µ, σ²). Then:

  Probability density function:     f(x) = φ(x) = (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)}
  Cumulative distribution function: F(x) = Φ((x − µ)/σ)

(For the unit normal F(x) = Φ(x); Φ has no closed form and is evaluated numerically or from tables.)

Exponential Distribution

We derive the Exponential distribution by thinking of it as a random variable that describes the waiting time between events which occur continuously at rate λ.

λ here has the same meaning as in the Poisson distribution: it is the expected number of events in a given unit of time. Consider one such unit of time; we expect that there will be λ events in this time span. If we subdivide that unit of time into n subintervals, then the probability that one of the events falls within a given subinterval should be approximately λ/n.
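The λ/n approximation for a subinterval can be checked directly, in a sketch that is not part of the slides (the rate λ = 2 and n = 1000 are my choices). The event count in one subinterval is Poisson with mean λ/n, so the chance of at least one event there is 1 − e^{−λ/n}, which should be very close to λ/n when n is large.

```python
import math

lam, n = 2.0, 1000  # rate (events per unit time) and subdivision count, my choices

# count of events in one subinterval ~ Poisson(lam/n), so
# P(at least one event in the subinterval) = 1 - exp(-lam/n) ≈ lam/n
p_exact = 1 - math.exp(-lam / n)
p_approx = lam / n

print(p_exact, p_approx)  # agree to about (lam/n)^2 / 2
```

The discrepancy is second order in λ/n (from the Taylor expansion of e^{−λ/n}), which is exactly why the approximation becomes exact in the n → ∞ limit used below.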
Exponential Distribution, cont.

Let X ∼ Exp(λ). We start by examining P(X ≤ b), where b is a positive integer. This is in essence asking: what is the probability that we do not have to wait longer than b units of time before the first event occurs? Since we have divided each unit of time into n subintervals, this is the same as asking for the probability that the first event occurs within the first nb subintervals.

Since we have the (approximate) probability of the event for each subinterval, we can model this probability with a geometric random variable Y with p = λ/n:

  P(X ≤ b) ≈ P(Y ≤ nb) = Σ_{k=0}^{bn−1} P(Y = k) = Σ_{k=0}^{bn−1} (λ/n)(1 − λ/n)^k

From calculus, remember that

  Σ_{k=0}^{m} a^k = (1 − a^{m+1}) / (1 − a)

Therefore,

  P(X ≤ b) ≈ (λ/n) · [1 − (1 − λ/n)^{bn}] / [1 − (1 − λ/n)] = 1 − (1 − λ/n)^{bn} → 1 − e^{−λb}   as n → ∞

so F(b) = P(X ≤ b) = 1 − e^{−λb}.

Exponential Distribution, cont.

In this case we have the CDF but not the PDF. How do we get the PDF? Differentiate the CDF:

  f(x) = d/dx F(x) = d/dx (1 − e^{−λx}) = λ e^{−λx}

Exponential Distribution, cont.

Let X be a random variable that reflects the time between events which occur continuously at rate λ, i.e. X ∼ Exp(λ). Then

  f(x | λ) = λ e^{−λx}
  P(X ≤ x) = F(x | λ) = 1 − e^{−λx}
  M_X(t) = (1 − t/λ)^{−1}    (for t < λ)
  E(X) = λ^{−1}
  Var(X) = λ^{−2}
  Median(X) = (log 2) / λ

Exponential Distribution - Memoryless Property

Let X ∼ Exp(λ) (assume λ has units of events/min). Then, if we have already waited s minutes without an event, the remaining wait is distributed exactly as the original wait:

  P(X > s + t | X > s) = P(X > s + t) / P(X > s) = e^{−λ(s+t)} / e^{−λs} = e^{−λt} = P(X > t)

Exponential Distribution - Example

Strontium 90 is a radioactive component of fallout from nuclear explosions.
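Both the geometric limit and the memoryless property above can be verified numerically. This is a sketch, not part of the slides; the rate λ = 1.5, the horizon b = 2, and the wait times s, t are my choices.

```python
import math

lam, b = 1.5, 2.0  # rate and time horizon; example values, my choices

# geometric approximation from the derivation: Y ~ Geom(p) with p = lam/n,
# P(X <= b) ≈ P(Y <= nb) = sum_{k=0}^{bn-1} p (1-p)^k
def geom_approx(n):
    p = lam / n
    return sum(p * (1 - p) ** k for k in range(int(n * b)))

exact = 1 - math.exp(-lam * b)            # F(b) = 1 - e^{-lam*b}
print(abs(geom_approx(10_000) - exact))   # shrinks toward 0 as n grows

# memoryless property: P(X > s+t | X > s) = P(X > t)
S = lambda x: math.exp(-lam * x)          # survival function P(X > x)
s, t = 0.7, 1.3
assert abs(S(s + t) / S(s) - S(t)) < 1e-12
```

Increasing n tightens the geometric approximation (the error comes only from replacing (1 − λ/n)^{bn} by e^{−λb}), while the memoryless identity holds exactly because the survival function is a pure exponential.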