
Chapter 4. Continuous Variables and Their Probability Distributions (ATTENDANCE 7)

4.6 The Gamma Probability Distribution

The continuous gamma random variable Y has density

f(y) = y^(α−1) e^(−y/β) / (β^α Γ(α)),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

where the gamma function is defined as

Γ(α) = ∫₀^∞ y^(α−1) e^(−y) dy,

and its expected value (mean), variance, and standard deviation are

µ = E(Y) = αβ,  σ² = V(Y) = αβ²,  σ = √V(Y).

One important special case of the gamma is the continuous chi–square random variable Y, where α = ν/2 and β = 2; in other words, with density

f(y) = y^((ν−2)/2) e^(−y/2) / (2^(ν/2) Γ(ν/2)),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

and its expected value (mean), variance, and standard deviation are

µ = E(Y) = ν,  σ² = V(Y) = 2ν,  σ = √(2ν).

Another important special case of the gamma is the continuous exponential random variable Y, where α = 1; in other words, with density

f(y) = (1/β) e^(−y/β),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

and its expected value (mean), variance, and standard deviation are

µ = E(Y )= β, σ2 = V (Y )= β2, σ = β.

Exercise 4.6 (The Gamma Probability Distribution)

1.

(a) Gamma function⁸, Γ(α).

⁸The gamma function is a part of the gamma density. There is no closed–form expression for the gamma function except when α is an integer. Consequently, numerical integration is required. We will mostly use the calculator to do this integration.
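As an alternative to the GAMFUNC calculator program, the gamma-function values and identities in this part can be checked in Python, where `math.gamma` computes Γ(α) directly (a sketch, not part of the original notes):

```python
import math

# Gamma function values, as the GAMFUNC calculator program would report them
g12 = math.gamma(1.2)          # Γ(1.2) ≈ 0.92
g22 = math.gamma(2.2)          # Γ(2.2) ≈ 1.11

# Recursion Γ(α) = (α − 1)Γ(α − 1) for α > 1
print(abs(g22 - 1.2 * g12))    # ≈ 0 (the recursion holds)

# For a positive integer n, Γ(n) = (n − 1)!
print(math.gamma(4))           # 6.0, i.e. 3!
```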

i. Γ(1.2) = ∫₀^∞ y^(1.2−1) e^(−y) dy = (choose one) (i) 0.92 (ii) 1.12 (iii) 2.34 (iv) 2.67.
PRGM GAMFUNC ENTER ENTER (again!) 1.2 ENTER
ii. Γ(2.2) ≈ (choose one) (i) 0.89 (ii) 1.11 (iii) 1.84 (iv) 2.27.
PRGM GAMFUNC ENTER ENTER (again!) 2.2 ENTER
iii. Notice Γ(2.2) ≈ 1.11 = (2.2 − 1)Γ(2.2 − 1) = 1.2 Γ(1.2) ≈ 1.2(0.92) ≈ 1.11. In other words, in general,

Γ(α) = (α − 1)Γ(α − 1),  α > 1.

(i) True (ii) False
iv. Γ(1) = (choose one) (i) 0 (ii) 0.5 (iii) 0.7 (iv) 1.
PRGM GAMFUNC ENTER ENTER 1 ENTER
v. Γ(2) = (2 − 1)Γ(2 − 1) = (i) 0 (ii) 0.5 (iii) 0.7 (iv) 1.
vi. Γ(3) = 2Γ(2) = (i) 1 (ii) 2! (iii) 3! (iv) 4!.
vii. Γ(4) = 3Γ(3) = (i) 1 (ii) 2! (iii) 3! (iv) 4!.
viii. In general, if n is a positive integer,

Γ(n) = (n − 1)!

(i) True (ii) False

(b) Graphs of gamma density. Consider the graphs in Figure 4.9.

[Figure: two panels of gamma density curves (1)–(6), plotted for 0 ≤ y ≤ 20 with f(y) up to 0.2; (a) increasing α, β = 3; (b) increasing α, β = 5.]

Figure 4.9: Gamma densities

i. Match gamma density, (α, β), to graph, (1) to (6).

(α, β):  (1, 3)  (2, 3)  (3, 3)  (1, 5)  (2, 5)  (3, 5)
graph:   (1)

For (α, β) = (1, 3), for example,
PRGM GAMGRPH ENTER ENTER 1 ENTER 3 ENTER 20 ENTER 0.2 ENTER

ii. As α and β grow larger, gamma density becomes (choose one)
(i) more symmetric. (ii) more skewed.
iii. As α and β grow larger, "center" (mean) of gamma density
(i) decreases. (ii) remains the same. (iii) increases.
iv. As α and β grow larger, "dispersion" (variance) of gamma density
(i) decreases. (ii) remains the same. (iii) increases.
(c) Gamma distribution: area under gamma density.
i. If (α, β) = (1, 3), P(Y < 1.3) = F(1.3) ≈ (choose one)
(i) 0.23 (ii) 0.35 (iii) 0.43 (iv) 0.43.
PRGM GAMDSTR ENTER ENTER 1 ENTER 3 ENTER 1.3 ENTER
ii. If (α, β) = (1, 3), P(Y > 2.7) = 1 − P(Y ≤ 2.7) = 1 − F(2.7) ≈
(i) 0.13 (ii) 0.32 (iii) 0.41 (iv) 0.63.
First, PRGM GAMDSTR ENTER ENTER 1 ENTER 3 ENTER 2.7 ENTER, then subtract result from 1!
iii. If (α, β) = (2, 5), P(1.4 <

then β = σ²/µ = 2/2 = 1 and also α = µ/β = 2/1 = (choose one) (i) 2 (ii) 3 (iii) 4 (iv) 5.

(b) In this case, the gamma density,

f(y) = y^(α−1) e^(−y/β) / (β^α Γ(α)),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

is given by (choose one)
(i) f(y) = y e^(−y), 0 ≤ y < ∞, and 0 elsewhere,
(ii) f(y) = y e^(−y) / Γ(3), 0 ≤ y < ∞, and 0 elsewhere,
(iii) f(y) = y² e^(−y/2) / (2² Γ(1)), 0 ≤ y < ∞, and 0 elsewhere.
(c) What is the chance of waiting at most 4.5 hours? Since (α, β) = (2, 1),
P(Y < 4.5) = F(4.5) ≈ (choose one) (i) 0.65 (ii) 0.78 (iii) 0.87 (iv) 0.94.
PRGM GAMDSTR ENTER ENTER 2 ENTER 1 ENTER 4.5 ENTER
(d) P(Y > 3.1) = 1 − P(Y ≤ 3.1) = 1 − F(3.1) ≈ (choose one)
(i) 0.18 (ii) 0.28 (iii) 0.32 (iv) 0.41.
Subtract PRGM GAMDSTR ENTER ENTER 2 ENTER 1 ENTER 3.1 ENTER from one.
(e) What is the 90th percentile waiting time; in other words, what is that time such that 90% of waiting times are less than this time? If P(Y < φ0.90) = 0.90, then φ0.90 ≈ (choose one) (i) 1.89 (ii) 2.53 (iii) 3.72 (iv) 3.89.
PRGM GAMINV ENTER ENTER 2 ENTER 1 ENTER 0.9 ENTER
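For integer α, the gamma distribution function has a closed form (the Erlang distribution), F(y) = 1 − e^(−y/β) Σ_{k=0}^{α−1} (y/β)^k / k!, which gives a way to check the GAMDSTR output without the calculator; a sketch (the helper name is illustrative):

```python
import math

def gamma_cdf_int_alpha(y, alpha, beta):
    """Gamma CDF for integer alpha, via the Erlang closed form."""
    s = sum((y / beta) ** k / math.factorial(k) for k in range(alpha))
    return 1.0 - math.exp(-y / beta) * s

# Waiting-time exercise with (alpha, beta) = (2, 1)
print(round(gamma_cdf_int_alpha(4.5, 2, 1), 2))      # P(Y < 4.5) ≈ 0.94
print(round(1 - gamma_cdf_int_alpha(3.1, 2, 1), 2))  # P(Y > 3.1) ≈ 0.18
```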

3. Chi–square distribution: waiting time to order. At McDonalds in Westville, waiting time to order (in minutes), Y, follows a chi–square distribution.

(a) Probabilities. Consider graphs in Figure 4.10.
i. If ν = 4, the probability of waiting less than 3.9 minutes is
P(Y < 3.9) = F(3.9) ≈ (choose one) (i) 0.35 (ii) 0.45 (iii) 0.58 (iv) 0.66.
2nd DISTR χ²cdf(0,3.9,4).
ii. If ν = 10, P(3.6 <

[Figure: two chi-square density panels over 0 ≤ y ≤ 10; (a) ν = 4 df, with shaded area P(Y < 3.9) = ?; (b) ν = 10 df, with shaded area P(3.6 < Y < 7.0) = ?]

Figure 4.10: Chi–square probabilities

iii. Chance of waiting time exactly 3 minutes, say, is zero, P(Y = 3) = 0.
(i) True (ii) False
(b) Percentiles. Consider graphs in Figure 4.11.

[Figure: two chi-square density panels over 0 ≤ y ≤ 10, each shading area 0.72 up to the 72nd percentile; (a) ν = 4 df; (b) ν = 10 df.]

Figure 4.11: Chi–square percentiles

i. If ν = 4 and P(Y < φ0.72) = 0.72, then φ0.72 ≈ (choose one)
(i) 3.1 (ii) 5.1 (iii) 8.3 (iv) 9.1.
PRGM CHI2INV ENTER 4 ENTER 0.72 ENTER

ii. If ν = 10 and P(Y < φ0.72) = 0.72, then φ0.72 ≈ (choose one)
(i) 2.5 (ii) 10.5 (iii) 12.1 (iv) 20.4.
PRGM CHI2INV ENTER 10 ENTER 0.72 ENTER
iii. The 32nd percentile for a chi-square with ν = 18 df is (choose one)
(i) 2.5 (ii) 10.5 (iii) 14.7 (iv) 20.4.
PRGM CHI2INV ENTER 18 ENTER 0.32 ENTER
iv. The 32nd percentile is that waiting time such that 32% of the waiting times are less than this waiting time and 68% are more than this time.
(i) True (ii) False
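For even ν the chi-square is a gamma with α = ν/2 and β = 2, so its CDF has a closed form, and a CHI2INV-style percentile can be found by bisection; a sketch (the helper names and bracket [0, 100] are illustrative assumptions):

```python
import math

def chi2_cdf_even_df(y, nu):
    """Chi-square CDF for even df: gamma with alpha = nu/2, beta = 2."""
    alpha = nu // 2
    s = sum((y / 2) ** k / math.factorial(k) for k in range(alpha))
    return 1.0 - math.exp(-y / 2) * s

def chi2_percentile(p, nu, lo=0.0, hi=100.0):
    """Invert the CDF by bisection, much as CHI2INV does numerically."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if chi2_cdf_even_df(mid, nu) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(chi2_cdf_even_df(3.9, 4), 2))   # P(Y < 3.9) ≈ 0.58 for nu = 4
print(round(chi2_percentile(0.72, 4), 1))   # 72nd percentile ≈ 5.1
```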

4. Chi–square distribution again.

(a) If ν = 3, the chi–square density,

f(y) = y^((ν−2)/2) e^(−y/2) / (2^(ν/2) Γ(ν/2)),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

is given by (choose one)
(i) f(y) = y e^(−y/2) / (2 Γ(2)), 0 ≤ y < ∞, and 0 elsewhere,
(ii) f(y) = e^(−y/2) / (2 Γ(2)), 0 ≤ y < ∞, and 0 elsewhere,
(iii) f(y) = y^(1/2) e^(−y/2) / (2^(3/2) Γ(3/2)), 0 ≤ y < ∞, and 0 elsewhere.
(b) If ν = 3, P(Y < 3.1) ≈ (choose one) (i) 0.62 (ii) 0.67 (iii) 0.72 (iv) 0.81.
2nd DISTR χ²cdf(0,3.1,3).
(c) If ν = 3, P(1.3 <

f(y) = (1/β) e^(−y/β),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere,

(a) If β = 2, the chance of waiting at most 1.1 minutes is⁹

P(Y ≤ 1.1) = F(1.1) = ∫₀^1.1 (1/2) e^(−y/2) dy = [−e^(−y/2)]₀^1.1 = 1 − e^(−(1/2)(1.1)) =
(i) 0.32 (ii) 0.42 (iii) 0.45 (iv) 0.48.
(b) If β = 1/3,

P(Y ≤ 1.1) = F(1.1) = ∫₀^1.1 3 e^(−3y) dy = [−e^(−3y)]₀^1.1 = 1 − e^(−3(1.1)) =
(i) 0.32 (ii) 0.42 (iii) 0.75 (iv) 0.96.
(c) If β = 1/5, P(Y < 1.1) = F(1.1) = 1 − e^(−5(1.1)) ≈ (circle one)
(i) 0.312 (ii) 0.432 (iii) 0.785 (iv) 0.996.
It is also possible to use: PRGM EXPDSTR ENTER 1/5 ENTER 1.1 ENTER
(d) If β = 1/3, the chance of waiting at least 0.54 minutes
P(Y > 0.54) = 1 − F(0.54) = 1 − (1 − e^(−(3)(0.54))) = e^(−(3)(0.54)) ≈
(i) 0.20 (ii) 0.22 (iii) 0.29 (iv) 0.34.
(e) If β = 1/3, P(1.13 <
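As footnote 9 notes, exponential densities integrate in closed form, F(y) = 1 − e^(−y/β), so the areas above are easy to check directly in Python (a sketch, not part of the original notes):

```python
import math

def expo_cdf(y, beta):
    """Exponential CDF: F(y) = 1 - e^(-y/beta)."""
    return 1.0 - math.exp(-y / beta)

print(round(expo_cdf(1.1, 2), 2))         # beta = 2:   P(Y <= 1.1) ≈ 0.42
print(round(expo_cdf(1.1, 1/3), 2))       # beta = 1/3: P(Y <= 1.1) ≈ 0.96
print(round(expo_cdf(1.1, 1/5), 3))       # beta = 1/5: P(Y < 1.1)  ≈ 0.996
print(round(1 - expo_cdf(0.54, 1/3), 2))  # beta = 1/3: P(Y > 0.54) ≈ 0.20
```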

P(Y > 10) = 1 − F(10) = 1 − (1 − e^(−3(10))) =
(choose one) (i) e^(−10) (ii) e^(−20) (iii) e^(−30) (iv) e^(−40).

⁹Unlike the gamma and chi-square distributions, it is fairly easy to integrate exponential densities.

(b) The chance batteries last at least 15 hours, given that they have already lasted at least 5 hours is

P(Y > 15 | Y > 5) = P(Y > 15, Y > 5) / P(Y > 5) = P(Y > 15) / P(Y > 5) = (1 − (1 − e^(−3(15)))) / (1 − (1 − e^(−3(5)))) =
(choose one) (i) e^(−10) (ii) e^(−20) (iii) e^(−30) (iv) e^(−40).
(c) In other words¹⁰,

P(Y > 15 | Y > 5) = P(Y > 10).

This is an example of the "memoryless" property of the exponential.
(i) True (ii) False
(d) Implication of memoryless property of exponential: independence. If

P(Y > s + t | Y > t) = P(Y > s),  s, t ≥ 0,

and also

P(Y > s + t | Y > t) = P(Y > s + t, Y > t) / P(Y > t) = P(Y > s + t) / P(Y > t),

then, combining the last two equations,

P(Y > s + t) / P(Y > t) = P(Y > s)  or  P(Y > s + t) = P(Y > t) P(Y > s).

But P(Y > a) = e^(−a/β), and so

e^(−(s+t)/β) = e^(−s/β) e^(−t/β)

(i) True (ii) False
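The memoryless identity can be confirmed numerically for the battery example, where β = 1/3 gives P(Y > a) = e^(−3a) (a sketch; the helper name is illustrative):

```python
import math

def surv(a, beta):
    """P(Y > a) for an exponential with mean beta."""
    return math.exp(-a / beta)

beta = 1/3
left = surv(15, beta) / surv(5, beta)   # P(Y > 15 | Y > 5)
right = surv(10, beta)                  # P(Y > 10)
print(math.isclose(left, right))        # True: the memoryless property
```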

4.7 The Beta Probability Distribution

The beta random variable Y , with parameters α> 0 and β > 0, has density

f(y) = y^(α−1) (1 − y)^(β−1) / B(α, β),  0 ≤ y ≤ 1,  and  f(y) = 0 elsewhere,

¹⁰The chance a battery lasts at least 10 hours or more, is the same as the chance a battery lasts at least 15 hours, given that it has already lasted 5 hours or more. This is kind of surprising, because it seems to imply the battery's life starts "fresh" after 5 hours, as though the battery "forgot" about the first five hours of its life.

and distribution function, called an incomplete beta function,

F(y) = ∫₀^y t^(α−1) (1 − t)^(β−1) / B(α, β) dt = I_y(α, β),

where

B(α, β) = ∫₀^1 y^(α−1) (1 − y)^(β−1) dy = Γ(α)Γ(β) / Γ(α + β),

and its expected value (mean), variance, and standard deviation are

µ = E(Y) = α/(α + β),  σ² = V(Y) = αβ / ((α + β)²(α + β + 1)),  σ = √V(Y).

Exercise 4.7 (The Beta Probability Distribution)

1.

(a) Graphs of beta density. Consider the graphs in Figure 4.12.

[Figure: two panels of beta densities (1)–(6) on 0 ≤ y ≤ 1, with f(y) up to about 3 and 5; (a) increasing α, β = 3; (b) increasing α, β = 5.]

Figure 4.12: Beta densities

i. Match beta density, (α, β), to graph (1) to (6).

(α, β):  (1, 3)  (2, 3)  (3, 3)  (1, 5)  (2, 5)  (3, 5)
graph:   (1)

For (α, β) = (1, 3), for example,
PRGM BETAGRPH ENTER ENTER 1 ENTER 3 ENTER 3 ENTER
ii. If α and β become more equal, beta density becomes (choose one)
(i) more symmetric. (ii) more skewed.
iii. If β decreases, "center" (mean) of beta density
(i) decreases. (ii) remains the same. (iii) increases.

iv. All beta densities are defined on (choose one)
(i) (0, 1) (ii) (−1, 1) (iii) (0, ∞) (iv) (−∞, 0)
v. If α = 1 and β = 1 beta is the (choose one)
(i) uniform. (ii) normal. (iii) gamma. (iv) exponential.
PRGM BETAGRPH ENTER ENTER 1 ENTER 1 ENTER 2 ENTER
(b) Beta distribution: area under beta density.
i. For (α, β) = (1, 5), P(Y < 0.2) = F(0.2) ≈ (choose one)
(i) 0.35 (ii) 0.41 (iii) 0.67 (iv) 0.77.
PRGM BETADSTR ENTER ENTER 1 ENTER 5 ENTER 0.2 ENTER
ii. For (α, β) = (2, 5), P(Y < 0.2) = F(0.2) ≈ (choose one)
(i) 0.34 (ii) 0.41 (iii) 0.67 (iv) 0.77.
PRGM BETADSTR ENTER ENTER 2 ENTER 5 ENTER 0.2 ENTER
iii. For (α, β) = (2, 5), P(0.1 ≤ Y < 0.7) = F(0.7) − F(0.1) ≈
(i) 0.65 (ii) 0.71 (iii) 0.87 (iv) 0.96.
PRGM BETADSTR ENTER ENTER 2 ENTER 5 ENTER 0.7 ENTER, subtract PRGM BETADSTR ENTER ENTER 2 ENTER 5 ENTER 0.1 ENTER
(c) Mean, variance and standard deviation of beta.
i. If (α, β) = (1, 5), µ = E(Y) = α/(α + β) = 1/(1 + 5) = (choose one)
(i) 1/5 (ii) 1/6 (iii) 1/7 (iv) 1/8.
ii. If (α, β) = (1, 5), σ² = V(Y) = αβ / ((α + β)²(α + β + 1)) = (choose one)
(i) 4/252 (ii) 5/252 (iii) 6/252 (iv) 7/252.

2. Beta distribution again: proportion of time car broken. Assume the proportion of time, Y, a car is broken is approximately a beta with the following density,

f(y) = 5(1 − y)^4,  0 < y < 1,  and  f(y) = 0 elsewhere,

f(y) = y^(α−1) (1 − y)^(β−1) / B(α, β),  0 ≤ y ≤ 1,  and  f(y) = 0 elsewhere,

which, when compared to the density in this example11,

α − 1 = 0,  β − 1 = 4,

and so α = 1 and β = (choose one) (i) 2 (ii) 3 (iii) 4 (iv) 5.

¹¹Also notice B(1, 5) = Γ(1)Γ(5)/Γ(1 + 5) = (1)4!/5! = 1/5, so 1/B(1, 5) = 5.

(b) What is the chance the car is broken at most 65% of the time? Since (α, β) = (1, 5),
P(Y < 0.65) = F(0.65) ≈ (choose one) (i) 0.65 (ii) 0.78 (iii) 0.87 (iv) 0.99.
PRGM BETADSTR ENTER ENTER 1 ENTER 5 ENTER 0.65 ENTER
(c) P(0.15 ≤ Y ≤ 0.45) = F(0.45) − F(0.15) ≈ (choose one)
(i) 0.34 (ii) 0.39 (iii) 0.45 (iv) 0.56.
PRGM BETADSTR ENTER ENTER 1 ENTER 5 ENTER 0.45 ENTER, subtract PRGM BETADSTR ENTER ENTER 1 ENTER 5 ENTER 0.15 ENTER.
(d) What is the 80th percentile proportion of broken time? If P(Y < φ0.80) = 0.80, then φ0.80 ≈ (choose one) (i) 0.13 (ii) 0.28 (iii) 0.35 (iv) 0.54.
PRGM BETAINV ENTER ENTER 1 ENTER 5 ENTER 0.8 ENTER
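For α = 1 the beta distribution function also has a closed form, F(y) = 1 − (1 − y)^β, so the BETADSTR and BETAINV results for this example can be checked directly (a sketch; the helper names are illustrative):

```python
# Beta(alpha = 1, beta = 5): f(y) = 5(1 - y)^4, so F(y) = 1 - (1 - y)^5
def beta15_cdf(y):
    return 1.0 - (1.0 - y) ** 5

def beta15_percentile(p):
    # Invert F: 1 - (1 - y)^5 = p  =>  y = 1 - (1 - p)^(1/5)
    return 1.0 - (1.0 - p) ** 0.2

print(round(beta15_cdf(0.65), 2))                     # P(Y < 0.65) ≈ 0.99
print(round(beta15_cdf(0.45) - beta15_cdf(0.15), 2))  # P(0.15 <= Y <= 0.45) ≈ 0.39
print(round(beta15_percentile(0.80), 2))              # 80th percentile ≈ 0.28
```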

4.8 Some General Comments

All of the probability distributions discussed could be used to model “real” data. A good probability model is one that closely matches the actual data.

4.9 Other Expected Values

Just as for the discrete case, moment–generating functions, m(t), are useful in calculating the moments of the distribution of any¹² continuous random variable Y. Furthermore, m(t) uniquely identifies any probability distribution. Consequently, it is possible to use either the probability distribution or its associated (and possibly easier to mathematically manipulate) moment–generating function when working with probability distributions.

• The kth moment of random variable Y taken about the origin is defined by

µ′_k = E(Y^k).

• The kth moment of random variable Y taken about its mean (or kth central moment of Y) is defined by

µ_k = E((Y − µ)^k).

¹²This is true as long as m(t) exists. The function m(t) exists as long as there is a constant b such that m(t) < ∞ for |t| ≤ b.

• In the continuous case, the moment–generating function of Y is defined by

m(t) = E(e^(tY)) = ∫_{−∞}^{∞} e^(ty) f(y) dy
     = ∫_{−∞}^{∞} (1 + ty + (ty)²/2! + ⋯) f(y) dy
     = ∫_{−∞}^{∞} f(y) dy + t ∫_{−∞}^{∞} y f(y) dy + (t²/2!) ∫_{−∞}^{∞} y² f(y) dy + ⋯
     = 1 + t µ′_1 + (t²/2!) µ′_2 + ⋯

• Function m(t) "generates" moments of a distribution by successively differentiating m(t) and evaluating the results at t = 0,

d^k m(t)/dt^k |_{t=0} = m^(k)(0) = µ′_k,  k = 1, 2, ...

Exercise 4.8 (Other Expected Values) 1. Exponential. Let Y have an exponential density given by

f(y) = (1/β) e^(−y/β),  0 ≤ y < ∞,  and  f(y) = 0 elsewhere.

(a) Determine m(t).

m(t) = E(e^(tY)) = ∫_{−∞}^{∞} e^(ty) f(y) dy = ∫₀^∞ e^(ty) (1/β) e^(−y/β) dy = (1/β) ∫₀^∞ e^(−y/β*) dy,

where β* = β/(1 − βt). However, since (1/β*) ∫₀^∞ e^(−y/β*) dy = 1, then

m(t) = (β*/β) ((1/β*) ∫₀^∞ e^(−y/β*) dy) = β*/β = (1/β) × β/(1 − βt) =

(i) 1/(1 − βt) (ii) 2/(1 − βt) (iii) 3/(1 − βt) (iv) 4/(1 − βt).
(b) Determine E(Y); that is, show E(Y) = β using m(t). First notice

E(Y) = E(Y¹) = µ′_1. So

µ′_1 = m^(1)(0) = d m(t)/dt |_{t=0} = d(1 − βt)^(−1)/dt |_{t=0} = β(1 − βt)^(−2) |_{t=0} =

(i) β (ii) 2β (iii) 3β (iv) β + 1.

(c) Determine E(Y²). First notice

E(Y²) = µ′_2. So

µ′_2 = m^(2)(0) = d² m(t)/dt² |_{t=0} = d²(1 − βt)^(−1)/dt² |_{t=0} = 2β²(1 − βt)^(−3) |_{t=0}

which equals (choose one) (i) 2β² (ii) β² + β (iii) β + 3β (iv) β³ + β.
(d) Determine V(Y); that is, show V(Y) = β².
V(Y) = E[Y²] − (E[Y])² = 2β² − β² =
(i) β² (ii) 2β² + β (iii) β² + 3β (iv) β³ + β.
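These moment computations can be checked by differentiating m(t) = 1/(1 − βt) numerically with finite differences (an illustrative sketch; the step size h and β = 2 are assumptions, not from the notes):

```python
def m(t, beta=2.0):
    """Exponential MGF, m(t) = 1/(1 - beta*t), valid for t < 1/beta."""
    return 1.0 / (1.0 - beta * t)

h = 1e-5
m1 = (m(h) - m(-h)) / (2 * h)              # m'(0)  ≈ E(Y)  = beta
m2 = (m(h) - 2 * m(0) + m(-h)) / h ** 2    # m''(0) ≈ E(Y²) = 2*beta²
print(round(m1, 3))           # ≈ 2.0  (= beta)
print(round(m2, 2))           # ≈ 8.0  (= 2*beta²)
print(round(m2 - m1 ** 2, 2)) # variance ≈ 4.0 (= beta²)
```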

2. Gamma. Assume moment–generating function for gamma Y is

m(t) = 1/(1 − βt)^α.

(a) Moment–generating function m(t) = 1/(1 − 1.5t)^2.5 is gamma where (α, β) =
(i) (1.5, 2.5) (ii) (1.5, 2.0) (iii) (2.5, 1.0) (iv) (2.5, 1.5),
and µ = αβ = (2.5)(1.5) = (i) 1.00 (ii) 2.75 (iii) 3.00 (iv) 3.75,
and σ² = αβ² = (circle one) (i) 5.083 (ii) 5.095 (iii) 5.123 (iv) 5.625,
and so P(0 ≤ Y ≤ 3) ≈ (i) 0.35 (ii) 0.40 (iii) 0.45 (iv) 0.58.
PRGM GAMDSTR ENTER ENTER 2.5 ENTER 1.5 ENTER 3 ENTER.
(b) Moment–generating function for W = −2Y − 9.
In general, m_Y(t) = E(e^(tY)) and

m_W(t) = E(e^(tW)) = E(e^(t(−2Y−9))) = E(e^((−2t)Y − 9t)) = e^(−9t) E(e^((−2t)Y)) =

(i) 5e^(−9t) m_Y(t) (ii) e^(−9t) m_Y(−2t)
(iii) e^(−9t) m_W(−2t) (iv) e^(−9t) m_Y(−2t²).
For gamma moment–generating function, m_Y(t) = 1/(1 − 1.5t)^2.5,

m_W(t) = e^(−9t) m_Y(−2t) =

(i) e^(−2t) (1/(1 − 1.5t))^2.5 (ii) e^(−9t) (1/(1 + 3t))^2.5
(iii) e^(−9t) (1/(1 + 1.5t))^2.5 (iv) e^(−9t) (1/(1 − 1.5t))^2.5.

3. Normal. Assume moment–generating function for normal Y is

m(t) = e^(µt + (1/2)t²σ²).

(a) Moment–generating function m(t) = e^(2t + (1/2)t²(2)) is normal where
µ = (circle one) (i) 1 (ii) 2 (iii) 3 (iv) 4,
and σ = √σ² ≈ (circle one) (i) 1.41 (ii) 2.41 (iii) 3.41 (iv) 4.41,
and so P(0 ≤ Y ≤ 3) ≈ (circle one) (i) 0.41 (ii) 0.68 (iii) 0.91 (iv) 0.98.
2nd DISTR normalcdf(0,3,2,√2)
(b) Moment–generating function m(t) = e^(2t + (1/2)t²) is normal where
µ = (circle one) (i) 1 (ii) 2 (iii) 3 (iv) 4,
and σ = √σ² ≈ (circle one) (i) 1 (ii) 1.41 (iii) 1.51 (iv) 2,
and so P(0 ≤ Y ≤ 3) ≈ (circle one) (i) 0.41 (ii) 0.68 (iii) 0.82 (iv) 0.98.
2nd DISTR normalcdf(0,3,2,1)
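The normalcdf areas above can be reproduced with the error function, since Φ(x) = ½(1 + erf(x/√2)); a sketch (the helper name is illustrative):

```python
import math

def normal_cdf(x, mu, sigma):
    """Normal CDF via the standard error-function relation."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# (a) m(t) = e^(2t + (1/2)t²(2)): mu = 2, sigma = sqrt(2)
p1 = normal_cdf(3, 2, math.sqrt(2)) - normal_cdf(0, 2, math.sqrt(2))
print(round(p1, 2))   # P(0 <= Y <= 3) ≈ 0.68

# (b) m(t) = e^(2t + (1/2)t²): mu = 2, sigma = 1
p2 = normal_cdf(3, 2, 1) - normal_cdf(0, 2, 1)
print(round(p2, 2))   # P(0 <= Y <= 3) ≈ 0.82
```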

4. Uniform13. Assume moment–generating function for uniform Y is

m(t) = (e^(tθ₂) − e^(tθ₁)) / (t(θ₂ − θ₁)).

(a) Moment–generating function m(t) = (e^(2t) − e^(t))/t is uniform which is defined on
(θ₁, θ₂) = (i) (0, 1) (ii) (1, 2) (iii) (−1, 1) (iv) (1, 3),
and µ = (θ₁ + θ₂)/2 = (circle one) (i) 1.0 (ii) 1.5 (iii) 2.0 (iv) 2.5,
and σ = (θ₂ − θ₁)/√12 ≈ (circle one) (i) 0.289 (ii) 0.295 (iii) 0.323 (iv) 0.433,
and P(0 ≤ Y ≤ 3) ≈ (circle one) (i) 0.25 (ii) 0.50 (iii) 0.75 (iv) 1.
Since uniform defined on (1, 2), probability must be 1.
(b) Moment–generating function for W = 5Y + 3.
In general, m_Y(t) = E(e^(tY)) and

m_W(t) = E(e^(tW)) = E(e^(t(5Y+3))) = E(e^((5t)Y + 3t)) = e^(3t) E(e^((5t)Y)) =

(i) 5e^(3t) m_Y(t) (ii) e^(3t) m_Y(5t) (iii) e^(3t) m_W(5t) (iv) e^(3t) m_Y(5t²).
For uniform moment–generating function, m_Y(t) = (e^(2t) − e^(t))/t,

m_W(t) = e^(3t) m_Y(5t) =

(i) e^(3t) (e^(10t) − e^(5t))/(5t) (ii) e^(5t) (e^(10t) − e^(5t))/(5t)
(iii) e^(3) (e^(10t) − e^(5t))/(5t) (iv) e^(5) (e^(10t) − e^(5t))/(5t).

¹³Although the beta distribution does not have a closed–form moment–generating function, a special case of the beta distribution, the uniform distribution, does have a closed–form moment–generating function.
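The transform rule m_W(t) = e^(3t) m_Y(5t) for W = 5Y + 3 can be checked numerically by computing E(e^(tW)) for Y ~ Uniform(1, 2) with a Riemann sum (an illustrative sketch; t = 0.1 and the grid size are assumptions):

```python
import math

def m_uniform(t, th1=1.0, th2=2.0):
    """Uniform(theta1, theta2) MGF: (e^(t*th2) - e^(t*th1)) / (t*(th2 - th1))."""
    return (math.exp(t * th2) - math.exp(t * th1)) / (t * (th2 - th1))

def m_w_numeric(t, n=100_000):
    """E(e^{t(5Y+3)}) for Y ~ Uniform(1, 2), by a midpoint Riemann sum."""
    total = 0.0
    for k in range(n):
        y = 1.0 + (k + 0.5) / n       # midpoint of each subinterval of (1, 2)
        total += math.exp(t * (5 * y + 3))
    return total / n

t = 0.1
closed_form = math.exp(3 * t) * m_uniform(5 * t)    # e^(3t) m_Y(5t)
print(math.isclose(m_w_numeric(t), closed_form, rel_tol=1e-6))  # True
```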

4.10 Tchebysheff’s Theorem

As was true in the discrete case, Tchebysheff's theorem states, for continuous random variable Y with finite µ and σ² and for k > 0,

P(|Y − µ| < kσ) ≥ 1 − 1/k²  or  P(|Y − µ| ≥ kσ) ≤ 1/k².

These two (equivalent) inequalities allow us to specify (very loose) lower bounds on probabilities when the distribution is not known.

Exercise 4.10 (Tchebysheff's Theorem)

1. Tchebysheff's theorem and exponential: battery lifetime. Suppose the distribution of the lifetime of camera flash batteries, Y, is exponential, with parameter β = 1/3.
(a) µ = β = (choose one) (i) 1/3 (ii) 1/4 (iii) 1/5 (iv) 1/6.
(b) σ = √β² ≈ (choose one) (i) 1/3 (ii) 1/4 (iii) 1/5 (iv) 1/6.
(c) According to Tchebysheff's theorem, the probability battery lifetime is within k = 2 standard deviations of mean battery lifetime is at least

P(|Y − µ| < kσ) ≥ 1 − 1/k² = 1 − 1/2² =

(i) 0.75 (ii) 0.85 (iii) 0.95 (iv) 0.98.
In fact, since µ = 1/3 and σ = 1/3, P(|Y − µ| < kσ) = P(µ − kσ < Y < µ + kσ)

(i) 0.56 (ii) 0.74 (iii) 0.85 (iv) 0.88.
In actual fact, since µ = 1/3 and σ = 1/3, P(|Y − µ| < kσ) = P(µ − kσ < Y < µ + kσ)
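The loose Tchebysheff bound and the exact exponential probability can be compared directly for the battery example (β = 1/3, k = 2); a sketch, not part of the original notes:

```python
import math

beta = 1/3
mu = beta       # mean of the exponential
sigma = beta    # standard deviation of the exponential
k = 2

bound = 1 - 1 / k ** 2                    # Tchebysheff lower bound
# Exact: mu - k*sigma = -1/3 < 0, so the event reduces to {Y < mu + k*sigma}
actual = 1 - math.exp(-(mu + k * sigma) / beta)
print(bound)              # 0.75
print(round(actual, 2))   # 0.95 — much larger than the loose bound
```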

2. Tchebysheff’s theorem and gamma: time to fix a car. Suppose the distribution of the time to fix a car, Y , is gamma, with parameters α = 2, β = 4. Let repair costs equal C = 25Y + 250.

(a) E(Y) = µ_Y = αβ = (choose one) (i) 6 (ii) 7 (iii) 8 (iv) 9.

(b) E(C) = µ_C = E(25Y + 250) = 25E(Y) + 250 = 25(8) + 250 = (choose one)
(i) 350 (ii) 400 (iii) 450 (iv) 500.
(c) σ_Y = √(αβ²) ≈ (choose one) (i) 5.34 (ii) 5.66 (iii) 6.54 (iv) 7.23.
(d) σ_C = √V(25Y + 250) = √(25² V(Y)) = 25√σ² = 25σ ≈ (choose one)
(i) 138.22 (ii) 139.98 (iii) 141.42 (iv) 144.33.
(e) According to Tchebysheff's theorem, the probability repair cost is within k = 2 standard deviations of mean repair cost is at least

P(|C − µ| < kσ) ≥ 1 − 1/k² = 1 − 1/2² =

(i) 0.75 (ii) 0.85 (iii) 0.95 (iv) 0.98.
In fact, since µ_C = 450 and σ_C ≈ 141.42 (equivalently, σ_C² = 20000), so

µ = αβ = 450,  σ² = αβ² = (αβ)β = (450)β = 20000

and so β_C = 20000/450 = 400/9 and α_C = 450/(400/9) = (choose one)
(i) 81/8 (ii) 82/8 (iii) 83/8 (iv) 84/8, then

P(|C − µ| < kσ) = P(µ − kσ < C < µ + kσ)
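The mean and standard deviation of the linear repair cost C = 25Y + 250 follow from E(aY + b) = aE(Y) + b and V(aY + b) = a²V(Y); a quick check for this exercise's gamma parameters (a sketch):

```python
import math

alpha, beta = 2, 4
mu_Y = alpha * beta            # E(Y) = 8
var_Y = alpha * beta ** 2      # V(Y) = 32
sigma_Y = math.sqrt(var_Y)     # ≈ 5.66

a, b = 25, 250                 # C = 25Y + 250
mu_C = a * mu_Y + b            # E(C) = 450
sigma_C = a * sigma_Y          # ≈ 141.42, since V(C) = 25² V(Y) = 20000

print(mu_C)                # 450
print(round(sigma_Y, 2))   # 5.66
print(round(sigma_C, 2))   # 141.42
```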

4.11 Expectations of Discontinuous Functions and Mixed Probability Distributions

Not covered.

4.12 Summary

CONTINUOUS    f(y)                              m(t)                               µ            σ²
Beta          y^(α−1)(1−y)^(β−1)/B(α, β)        (no closed form)                   α/(α+β)      αβ/((α+β)²(α+β+1))
Uniform       1/(θ₂ − θ₁)                       (e^(tθ₂) − e^(tθ₁))/(t(θ₂ − θ₁))   (θ₁+θ₂)/2    (θ₂−θ₁)²/12
Exponential   (1/β)e^(−y/β)                     1/(1 − βt)                         β            β²
Gamma         y^(α−1)e^(−y/β)/(β^α Γ(α))        1/(1 − βt)^α                       αβ           αβ²
Normal        (1/(√(2π)σ))e^(−(y−µ)²/2σ²)       e^(µt + σ²t²/2)                    µ            σ²