More Continuous Distributions

Keegan Korthauer, Department of Statistics, UW Madison

Some Common Distributions

Probability Distributions

• Discrete with finite support: Bernoulli, Binomial
• Discrete with infinite support: Poisson, Geometric
• Continuous with finite support: Uniform
• Continuous with infinite support: Normal, Exponential

CONTINUOUS DISTRIBUTIONS

• Normal distribution
• Continuous Uniform distribution
• Exponential distribution

Uniform Distribution

• A normally distributed RV tends to be close to its mean
• Instead, consider a random variable X that is equally likely to take on any value in a continuous interval (a, b)
– Example: waiting time for a train if we know it will arrive within a certain time range but have no prior knowledge beyond that
• Then X is distributed uniformly on the interval (a, b)
• The Uniform distribution is especially useful in computer simulations

X ~ Uniform(a,b)

• Probability Density Function

$$f(x) = \begin{cases} \dfrac{1}{b-a} & \text{if } a < x < b \\ 0 & \text{otherwise} \end{cases}$$

• Mean and Variance

$$\mu_X = \frac{a+b}{2}, \qquad \sigma_X^2 = \frac{(b-a)^2}{12}$$
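As an earlier slide notes, the Uniform distribution is especially useful in computer simulations. Here is a minimal Python sketch (illustrative only; scipy, numpy, and the interval (5, 11) are arbitrary choices, not from the slides) that checks the mean and variance formulas above:

```python
# A minimal sketch (not from the slides): checking the Uniform(a, b) mean and
# variance formulas against scipy.stats and a simple simulation.
import numpy as np
from scipy import stats

a, b = 5.0, 11.0
X = stats.uniform(loc=a, scale=b - a)   # scipy parameterizes Uniform by loc and scale = b - a

print("theoretical mean:", (a + b) / 2, " scipy mean:", X.mean())
print("theoretical var: ", (b - a) ** 2 / 12, " scipy var: ", X.var())

# Monte Carlo check with a large simulated sample
rng = np.random.default_rng(0)
sample = rng.uniform(a, b, size=100_000)
print("simulated mean:", sample.mean(), " simulated var:", sample.var())
```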

Uniform PDF

• Continuous
• Finite
• Symmetric
• Constant

[Figure: plot of the Uniform PDF, a constant f(x) over the interval (a, b)]

Example – Waiting

• Suppose we know that the time a train will arrive is distributed uniformly between 5:00pm and 5:15pm

• Let X represent the number of minutes we have to wait if we arrive at 5:00pm.

• What is the expected time we will wait? 7.5 minutes
• What is the probability that we will wait longer than 10 minutes? 1/3
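A quick check of these answers, taking X ~ Uniform(0, 15) in minutes:

$$E(X) = \frac{a+b}{2} = \frac{0+15}{2} = 7.5 \text{ minutes}, \qquad P(X > 10) = \frac{15 - 10}{15 - 0} = \frac{1}{3}$$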

Infinite Waiting Times?

• Say we want to model the lifetime X of a component and we do not have an upper bound
• We want to allow for the possibility that the component will last a really long time (let X approach infinity)
• Our PDF still has to integrate to one, so it will have to asymptotically decrease to zero as X goes to infinity

• In this case, we can model X as an exponential random variable with rate parameter λ
• Can think of the exponential as the continuous analog to the geometric distribution

X ~ Exponential(λ)

• Probability Density Function

$$f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}$$

• Mean and Variance

$$\mu_X = \frac{1}{\lambda}, \qquad \sigma_X^2 = \frac{1}{\lambda^2}$$
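As a quick numerical cross-check (illustrative only; scipy and the rate λ = 0.5 are arbitrary choices, not part of the slides), note that scipy.stats.expon is parameterized by scale = 1/λ rather than by the rate itself:

```python
# A quick numerical check (illustrative; scipy is not used in the slides).
# Note: scipy.stats.expon is parameterized by scale = 1/lambda, not by the rate.
import math
from scipy import stats

lam = 0.5                                # an arbitrary rate, for illustration only
T = stats.expon(scale=1 / lam)           # frozen Exp(lambda) distribution

print("mean:    ", T.mean(), " vs 1/lambda =", 1 / lam)
print("variance:", T.var(),  " vs 1/lambda^2 =", 1 / lam**2)
print("P(T > 3):", T.sf(3),  " vs exp(-lambda*3) =", math.exp(-lam * 3))
```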

Exponential Distribution

• Continuous
• Infinite
• Support: x > 0
• Shape determined by λ

Connection to the Poisson

• Recall that the Poisson distribution models the number of events occurring within one unit of time or space, given some rate parameter λ

• If instead of counting the number of events that occur we are interested in modeling the time between two events, we can use the exponential distribution

• Formally, if events follow a Poisson process with rate parameter λ, then the waiting time T from any starting point until the next event time is distributed Exp(λ)
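A small simulation sketch of this connection (illustrative only; numpy, the seed, and the rate of 15 events per unit time are my choices): events built from Exp(λ) gaps produce counts per unit of time that behave like Poisson(λ).

```python
# A simulation sketch (illustrative only, not from the slides): events whose
# gaps are Exp(lambda) give Poisson(lambda)-like counts per unit of time.
import numpy as np

rng = np.random.default_rng(1)
lam = 15.0                                   # rate: 15 events per unit time (arbitrary choice)
n_units = 10_000                             # number of unit-length intervals to simulate

# Build event times by accumulating Exp(lambda) waiting times
gaps = rng.exponential(scale=1 / lam, size=int(lam * n_units * 1.2))
event_times = np.cumsum(gaps)
event_times = event_times[event_times < n_units]

# Count the events falling in each unit interval [k, k+1)
counts = np.bincount(event_times.astype(int), minlength=n_units)

print("mean gap:           ", gaps.mean(), " vs 1/lambda =", 1 / lam)
print("mean count per unit:", counts.mean(), " vs lambda =", lam)
print("variance of counts: ", counts.var(), " vs lambda =", lam)
```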

Center Example

http://www.statlect.com/uddpoi1.htm

Example 4.58

• A radioactive mass emits particles according to a Poisson process at a rate of 15 particles per minute. At some point, we start a stopwatch.

• What is the probability that more than 5 seconds will elapse before the next emission? P(T > 5) = 0.2865
• What is the mean waiting time until the next particle is emitted? E(T) = 4 seconds
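These answers follow from the exponential survival function P(T > t) = e^(−λt) after converting the rate to seconds, λ = 15 per minute = 0.25 per second:

$$P(T > 5) = e^{-0.25 \times 5} = e^{-1.25} \approx 0.2865, \qquad E(T) = \frac{1}{\lambda} = \frac{1}{0.25} = 4 \text{ seconds}$$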

Lack of Memory Property

• The exponential distribution does not ‘remember’ how long we’ve been waiting

• The time until the next event in a Poisson process from any starting point is exponential

• The distribution of the waiting time is the same whether the time period just started or we have already waited t time units

• We call this memorylessness or the lack of memory property

If T ~ Exp(λ), then P(T > s + t | T > s) = P(T > t) for s, t > 0
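The property follows in one line from the survival function P(T > t) = e^(−λt):

$$P(T > s + t \mid T > s) = \frac{P(T > s + t)}{P(T > s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P(T > t)$$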

Example 4.59-60 – Circuit Lifetime

• The lifetime of a particular circuit has an exponential distribution with mean 2 years.

• Find the probability that the circuit lasts longer than 3 years.

• Assume the circuit is already 4 years old. What is the probability that it will last more than 3 additional years?
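The slide leaves these as exercises; a worked sketch using the results above (with mean 2 years the rate is λ = 1/2 per year, and by memorylessness the second probability equals the first):

$$P(T > 3) = e^{-(1/2)(3)} = e^{-1.5} \approx 0.223, \qquad P(T > 4 + 3 \mid T > 4) = P(T > 3) \approx 0.223$$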

When λ is Unknown

How can we estimate it?

• If X ~ Exp(λ), then μ_X = 1/λ, so λ = 1/μ_X
• Then we can estimate λ with $\hat{\lambda} = 1/\bar{X}$

• Problem: although the sample mean $\bar{X}$ is unbiased for μ_X, $1/\bar{X}$ is not a linear function of $\bar{X}$, so our estimator is biased: $E(\hat{\lambda}) \approx \lambda + \lambda/n$

• For large samples the bias will be negligible, but for small samples we can correct for the bias with the estimator

$$\hat{\lambda}_{\text{corr}} = \frac{1}{\bar{X} + \bar{X}/n} = \frac{n}{(n+1)\bar{X}}$$

When λ is Unknown

What is the uncertainty in our estimate?

• Using the propagation of error method from Chapter 3, we can estimate it with

$$\sigma_{\hat{\lambda}} \approx \left| \frac{d}{d\bar{X}}\left(\frac{1}{\bar{X}}\right) \right| \sigma_{\bar{X}} = \frac{1}{\bar{X}^{2}} \cdot \frac{1}{\lambda \sqrt{n}} \;\overset{\text{plug in } \lambda = \hat{\lambda}}{=}\; \frac{1}{\bar{X}\sqrt{n}}$$

• Because of the bias in the estimate of λ, we will underestimate the uncertainty when we have a small sample size
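A minimal Python sketch tying this slide together (illustrative only; numpy, the true rate of 2, and the sample size of 10 are arbitrary choices): it computes the plain estimator, the bias-corrected estimator, and the propagation-of-error uncertainty for a small simulated sample.

```python
# A minimal sketch (illustrative, not from the slides): estimating lambda from a
# small exponential sample, with the bias-corrected estimator and the
# propagation-of-error uncertainty from the formulas above.
import numpy as np

rng = np.random.default_rng(2)
true_lam = 2.0
n = 10                                           # small sample, where the bias matters

x = rng.exponential(scale=1 / true_lam, size=n)
xbar = x.mean()

lam_hat = 1 / xbar                               # plain estimator, biased upward by about lambda/n
lam_corr = n / ((n + 1) * xbar)                  # bias-corrected estimator
se_lam = 1 / (xbar * np.sqrt(n))                 # uncertainty: roughly lambda_hat / sqrt(n)

print("lambda_hat:      ", lam_hat)
print("lambda_hat_corr: ", lam_corr)
print("estimated SE:    ", se_lam)
```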

Next

• Skim section 4.6 (the lognormal distribution) and the rest of section 4.8 (the gamma and Weibull distributions)

• Exam 1 (next Friday 2/28) covers material up to and including this lecture
– Practice exam handed out this Friday
– Review session on Wednesday

• In the next few lectures we will
– finish up Chapter 4 – Probability Plots and the Central Limit Theorem
– introduce the statistical language R
