
CHAPTER 7

Continuous distributions

7.1. Basic theory

7.1.1. Definition, PDF, CDF. We start with the definition of a continuous random variable.

Definition (Continuous random variables)

A random variable $X$ is said to have a continuous distribution if there exists a non-negative function $f = f_X$ such that
\[
P(a \leqslant X \leqslant b) = \int_a^b f(x)\,dx
\]
for every $a$ and $b$. The function $f$ is called the density function for $X$ or the PDF for $X$.

More precisely, such an $X$ is said to have an absolutely continuous distribution. Note that
\[
\int_{-\infty}^{\infty} f(x)\,dx = P(-\infty < X < \infty) = 1.
\]
In particular, $P(X = a) = \int_a^a f(x)\,dx = 0$ for every $a$.

Example 7.1. Suppose we are given that $f(x) = c/x^3$ for $x \geqslant 1$ and $0$ otherwise. Since $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and
\[
\int_{-\infty}^{\infty} f(x)\,dx = c\int_1^{\infty} \frac{dx}{x^3} = \frac{c}{2},
\]
we have $c = 2$.
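As a sanity check, this normalization can be confirmed numerically. A minimal sketch in plain Python, using a midpoint Riemann sum with the integral truncated at $x = 1000$ (the omitted tail contributes only about $10^{-6}$):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# density f(x) = c/x^3 on [1, infinity) with c = 2
total = integrate(lambda x: 2.0 / x**3, 1.0, 1000.0)
# total should be close to 1
```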

PMF or PDF? Probability mass function (PMF) and (probability) density function (PDF) are two names for the same notion in the case of discrete random variables. We say PDF or simply a density function for a general random variable, and we use PMF only for discrete random variables.

Definition (Cumulative distribution function (CDF))

The distribution function of $X$ is defined as
\[
F(y) = F_X(y) := P(-\infty < X \leqslant y) = \int_{-\infty}^{y} f(x)\,dx.
\]
It is also called the cumulative distribution function (CDF) of $X$.


We can define the CDF for any random variable, not just continuous ones, by setting $F(y) := P(X \leqslant y)$. Recall that we introduced it in Definition 5.3 for discrete random variables. In that case it is not particularly useful, although it does serve to unify discrete and continuous random variables. In the continuous case, the fundamental theorem of calculus tells us, provided $f$ satisfies some conditions, that
\[
f(y) = F'(y).
\]
By analogy with the discrete case, we define the expectation of a continuous random variable.
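The relation $f(y) = F'(y)$ can be illustrated numerically. A minimal sketch (plain Python, using the density $f(x) = 2/x^3$ of Example 7.1, whose CDF is $1 - 1/x^2$, and a central finite difference for the derivative):

```python
def pdf(x):
    """Density of Example 7.1: f(x) = 2/x^3 for x >= 1, else 0."""
    return 2.0 / x**3 if x >= 1 else 0.0

def cdf(x):
    """Closed-form CDF for this density: F(x) = 1 - 1/x^2 for x >= 1."""
    return 0.0 if x <= 1 else 1.0 - 1.0 / x**2

# central finite difference approximating F'(y) at y = 2
y, h = 2.0, 1e-5
deriv = (cdf(y + h) - cdf(y - h)) / (2 * h)
# deriv should be close to pdf(2.0) = 0.25
```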

7.1.2. Expectation, discrete approximation to continuous random variables.

Definition (Expectation)

For a continuous random variable $X$ with density function $f$ we define its expectation by
\[
\mathbb{E}X = \int_{-\infty}^{\infty} x f(x)\,dx
\]
if this integral is absolutely convergent. In this case we call $X$ integrable.

Recall that this integral is absolutely convergent if
\[
\int_{-\infty}^{\infty} |x| f(x)\,dx < \infty.
\]
In the example above,
\[
\mathbb{E}X = \int_1^{\infty} x \cdot \frac{2}{x^3}\,dx = 2\int_1^{\infty} x^{-2}\,dx = 2.
\]
Later, in Example 10.1, we will see that a continuous random variable with the Cauchy distribution has infinite expectation.

Proposition 7.1 (Discrete approximation to continuous random variables)
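The value $\mathbb{E}X = 2$ can likewise be checked by numerical integration; a sketch (plain Python, midpoint rule, domain truncated at $x = 10^4$, which omits a tail of about $2 \times 10^{-4}$):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# E X = integral of x * (2/x^3) = 2/x^2 over [1, infinity), truncated
ex = integrate(lambda x: 2.0 / x**2, 1.0, 10_000.0)
# ex should be close to 2
```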

Suppose $X$ is a nonnegative continuous random variable with a finite expectation. Then there is a sequence of discrete random variables $\{X_n\}_{n=1}^{\infty}$ such that

\[
\mathbb{E}X_n \xrightarrow[n\to\infty]{} \mathbb{E}X.
\]

Proof. First observe that if a continuous random variable $X$ is nonnegative, then its density satisfies $f(x) = 0$ for $x < 0$. In particular, $F(y) = 0$ for $y \leqslant 0$, though the latter is not needed for our proof. Thus for such a random variable
\[
\mathbb{E}X = \int_0^{\infty} x f(x)\,dx.
\]
For $n \in \mathbb{N}$ we define $X_n(\omega) := k/2^n$ if $k/2^n \leqslant X(\omega) < (k+1)/2^n$, for $k \in \mathbb{N}\cup\{0\}$. This means that we are approximating $X$ from below by the largest multiple of $2^{-n}$ that is still below the value of $X$. Each $X_n$ is discrete, and $X_n$ increases to $X$ for each $\omega \in S$.

Consider the sequence $\{\mathbb{E}X_n\}_{n=1}^{\infty}$. This is an increasing sequence of positive numbers, and therefore it has a limit, possibly infinite. We want to show that it is finite and equal to $\mathbb{E}X$. We have

\begin{align*}
\mathbb{E}X_n &= \sum_{k=0}^{\infty} \frac{k}{2^n}\, P\Big(X_n = \frac{k}{2^n}\Big) = \sum_{k=0}^{\infty} \frac{k}{2^n}\, P\Big(\frac{k}{2^n} \leqslant X < \frac{k+1}{2^n}\Big)\\
&= \sum_{k=0}^{\infty} \frac{k}{2^n} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx = \sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx.
\end{align*}

If $x \in [k/2^n, (k+1)/2^n)$, then $x$ differs from $k/2^n$ by at most $1/2^n$, and therefore

\begin{align*}
0 &\leqslant \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx\\
&= \int_{k/2^n}^{(k+1)/2^n} \Big(x - \frac{k}{2^n}\Big) f(x)\,dx \leqslant \frac{1}{2^n}\int_{k/2^n}^{(k+1)/2^n} f(x)\,dx.
\end{align*}

Note that

\[
\sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx = \int_0^{\infty} x f(x)\,dx
\]

\[
\sum_{k=0}^{\infty} \frac{1}{2^n}\int_{k/2^n}^{(k+1)/2^n} f(x)\,dx = \frac{1}{2^n}\sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} f(x)\,dx = \frac{1}{2^n}\int_0^{\infty} f(x)\,dx = \frac{1}{2^n}.
\]

Therefore

\begin{align*}
0 \leqslant \mathbb{E}X - \mathbb{E}X_n &= \int_0^{\infty} x f(x)\,dx - \sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx\\
&= \sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \sum_{k=0}^{\infty} \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx\\
&= \sum_{k=0}^{\infty} \left(\int_{k/2^n}^{(k+1)/2^n} x f(x)\,dx - \int_{k/2^n}^{(k+1)/2^n} \frac{k}{2^n}\, f(x)\,dx\right)\\
&\leqslant \sum_{k=0}^{\infty} \frac{1}{2^n}\int_{k/2^n}^{(k+1)/2^n} f(x)\,dx = \frac{1}{2^n} \xrightarrow[n\to\infty]{} 0. \qquad\square
\end{align*}
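The dyadic approximation used in this proof is easy to experiment with. Below is a sketch (plain Python) for the density $f(x) = 2/x^3$ of Example 7.1, whose CDF is $F(x) = 1 - 1/x^2$; the infinite sum is truncated at $x = 2000$, which changes each $\mathbb{E}X_n$ by at most $10^{-3}$:

```python
def F(x):
    """CDF of the density f(x) = 2/x^3, x >= 1 (Example 7.1)."""
    return 0.0 if x <= 1 else 1.0 - 1.0 / x**2

def expect_Xn(n, cutoff=2000):
    """E[X_n] = sum over k of (k/2^n) * P(k/2^n <= X < (k+1)/2^n), truncated."""
    step = 2.0**-n
    total, k = 0.0, 0
    while k * step < cutoff:
        total += k * step * (F((k + 1) * step) - F(k * step))
        k += 1
    return total

approx = [expect_Xn(n) for n in range(1, 7)]
# the values increase toward E X = 2, with E X - E X_n <= 2^{-n}
```

The assertion of the proposition is visible in the numbers: each refinement of the dyadic grid raises the expectation, and the gap to $\mathbb{E}X = 2$ is bounded by $2^{-n}$ (plus the small truncation error).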

We will not prove the following, but it is an interesting exercise: if $X_m$ is any sequence of discrete random variables that increase up to $X$, then $\lim_{m\to\infty} \mathbb{E}X_m$ will have the same value as $\mathbb{E}X$. This fact is useful for showing linearity: if $X$ and $Y$ are positive random variables with finite expectations, then we can take $X_m$ discrete increasing up to $X$ and $Y_m$ discrete increasing up to $Y$. Then $X_m + Y_m$ is discrete and increases up to $X + Y$, so we have

\[
\mathbb{E}(X + Y) = \lim_{m\to\infty}\mathbb{E}(X_m + Y_m) = \lim_{m\to\infty}\mathbb{E}X_m + \lim_{m\to\infty}\mathbb{E}Y_m = \mathbb{E}X + \mathbb{E}Y.
\]
Note that we cannot easily reuse the approximations to $X$, $Y$ and $X + Y$ from the previous proof in this argument, since $X_m + Y_m$ might not be an approximation of the same kind. If $X$ is not necessarily positive, a similar result can be shown; we will not do the details. Similarly to the discrete case, we have

Proposition 7.2

Suppose $X$ is a continuous random variable with density $f_X$ and $g$ is a real-valued function. Then
\[
\mathbb{E}g(X) = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx
\]
as long as the expectation of the random variable $g(X)$ makes sense.

As in the discrete case, this allows us to define moments, and in particular the variance
\[
\operatorname{Var} X := \mathbb{E}\big[(X - \mathbb{E}X)^2\big].
\]
As an example of these calculations, let us look at the uniform distribution.

Uniform distribution

We say that a random variable $X$ has a uniform distribution on $[a, b]$ if $f_X(x) = \frac{1}{b-a}$ for $a \leqslant x \leqslant b$ and $0$ otherwise.

To calculate the expectation of $X$,
\begin{align*}
\mathbb{E}X = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_a^b x\,\frac{1}{b-a}\,dx = \frac{1}{b-a}\int_a^b x\,dx = \frac{1}{b-a}\left(\frac{b^2}{2} - \frac{a^2}{2}\right) = \frac{a+b}{2}.
\end{align*}
This is what one would expect. To calculate the variance, we first calculate
\[
\mathbb{E}X^2 = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \int_a^b \frac{x^2}{b-a}\,dx = \frac{a^2 + ab + b^2}{3}.
\]
We then do some algebra to obtain
\[
\operatorname{Var} X = \mathbb{E}X^2 - (\mathbb{E}X)^2 = \frac{(b-a)^2}{12}.
\]
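These two formulas can be verified numerically for a particular interval; a minimal sketch (plain Python, midpoint Riemann sum, using the illustrative choice $[a,b] = [2,5]$):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 2.0, 5.0                        # an arbitrary interval for illustration
density = 1.0 / (b - a)                # uniform density on [a, b]
mean = integrate(lambda x: x * density, a, b)
var = integrate(lambda x: (x - mean) ** 2 * density, a, b)
# mean should match (a + b)/2 = 3.5 and var should match (b - a)^2/12 = 0.75
```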

7.2. Further examples and applications

Example 7.2. Suppose $X$ has the following PDF:
\[
f(x) = \begin{cases} \dfrac{2}{x^3} & x \geqslant 1,\\ 0 & x < 1. \end{cases}
\]
Find the CDF of $X$, that is, find $F_X(x)$. Use the CDF to find $P(3 \leqslant X \leqslant 4)$.

Solution: we have $F_X(x) = 0$ if $x \leqslant 1$, and for $x > 1$ we compute
\[
F_X(x) = P(X \leqslant x) = \int_1^x \frac{2}{y^3}\,dy = 1 - \frac{1}{x^2}.
\]
We can use this formula to find the following probability:
\begin{align*}
P(3 \leqslant X \leqslant 4) &= P(X \leqslant 4) - P(X < 3)\\
&= F_X(4) - F_X(3) = \left(1 - \frac{1}{4^2}\right) - \left(1 - \frac{1}{3^2}\right) = \frac{7}{144}.
\end{align*}
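A short cross-check of this computation (plain Python, using the closed-form CDF just derived and exact rational arithmetic):

```python
from fractions import Fraction

def cdf(x):
    """F_X(x) = 1 - 1/x^2 for x > 1 (Example 7.2), as an exact fraction."""
    return Fraction(0) if x <= 1 else 1 - Fraction(1, x * x)

p = cdf(4) - cdf(3)
print(p)  # 7/144
```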

Example 7.3. Suppose $X$ has density
\[
f_X(x) = \begin{cases} 2x & 0 \leqslant x \leqslant 1,\\ 0 & \text{otherwise.} \end{cases}
\]
Find $\mathbb{E}X$.

Solution: we have
\[
\mathbb{E}[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_0^1 x \cdot 2x\,dx = \frac{2}{3}.
\]

Example 7.4. The density of $X$ is given by
\[
f_X(x) = \begin{cases} \frac{1}{2} & \text{if } 0 \leqslant x \leqslant 2,\\ 0 & \text{otherwise.} \end{cases}
\]
Find $\mathbb{E}\!\left[e^X\right]$.

Solution: using Proposition 7.2 with $g(x) = e^x$ we have
\[
\mathbb{E}e^X = \int_0^2 e^x \cdot \frac{1}{2}\,dx = \frac{1}{2}\left(e^2 - 1\right).
\]
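A numerical cross-check of this expectation (plain Python, midpoint Riemann sum against the closed form):

```python
import math

def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

approx = integrate(lambda x: math.exp(x) * 0.5, 0.0, 2.0)
exact = (math.exp(2) - 1) / 2
# approx and exact agree to several decimal places
```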

Example 7.5. Suppose $X$ has density
\[
f(x) = \begin{cases} 2x & 0 \leqslant x \leqslant 1,\\ 0 & \text{otherwise.} \end{cases}
\]

© Copyright 2017 Phanuel Mariano, Patricia Alonso Ruiz, Copyright 2020 Masha Gordina.

Find $\operatorname{Var}(X)$.

Solution: in Example 7.3 we found $\mathbb{E}[X] = \frac{2}{3}$. Now
\[
\mathbb{E}\left[X^2\right] = \int_0^1 x^2 \cdot 2x\,dx = 2\int_0^1 x^3\,dx = \frac{1}{2}.
\]
Thus
\[
\operatorname{Var}(X) = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{18}.
\]
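The moments in Examples 7.3 and 7.5 can be confirmed with the same kind of numerical integration; a minimal sketch (plain Python, midpoint Riemann sum):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: 2.0 * x                               # density on [0, 1]
ex = integrate(lambda x: x * pdf(x), 0.0, 1.0)        # first moment, 2/3
ex2 = integrate(lambda x: x * x * pdf(x), 0.0, 1.0)   # second moment, 1/2
var = ex2 - ex**2                                     # variance, 1/18
```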

Example 7.6. Suppose $X$ has density
\[
f(x) = \begin{cases} ax + b & 0 \leqslant x \leqslant 1,\\ 0 & \text{otherwise,} \end{cases}
\]
and that $\mathbb{E}[X^2] = \frac{1}{6}$. Find the values of $a$ and $b$.

Solution: We need to use the fact that $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and $\mathbb{E}[X^2] = \frac{1}{6}$. The first one gives us
\[
1 = \int_0^1 (ax + b)\,dx = \frac{a}{2} + b,
\]
and the second one gives us
\[
\frac{1}{6} = \int_0^1 x^2(ax + b)\,dx = \frac{a}{4} + \frac{b}{3}.
\]
Solving these equations gives us $a = -2$ and $b = 2$.
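The final step, solving the two linear equations, can be sketched in code (plain Python, Cramer's rule for the 2×2 system $\frac{a}{2} + b = 1$, $\frac{a}{4} + \frac{b}{3} = \frac{1}{6}$):

```python
def solve2(m11, m12, r1, m21, m22, r2):
    """Solve the 2x2 system (m11*a + m12*b = r1, m21*a + m22*b = r2) by Cramer's rule."""
    det = m11 * m22 - m12 * m21
    return (r1 * m22 - m12 * r2) / det, (m11 * r2 - r1 * m21) / det

# a/2 + b = 1  and  a/4 + b/3 = 1/6
a, b = solve2(1/2, 1, 1, 1/4, 1/3, 1/6)
# a is approximately -2 and b is approximately 2
```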

7.3. Exercises

Exercise 7.1. Let $X$ be a random variable with probability density function
\[
f(x) = \begin{cases} cx(5 - x) & 0 \leqslant x \leqslant 5,\\ 0 & \text{otherwise.} \end{cases}
\]

(A) What is the value of $c$?
(B) What is the cumulative distribution function of $X$? That is, find $F_X(x) = P(X \leqslant x)$.
(C) Use your answer in part (B) to find $P(2 \leqslant X \leqslant 3)$.
(D) What is $\mathbb{E}[X]$?
(E) What is $\operatorname{Var}(X)$?

Exercise 7.2. UConn students have designed the new U-Phone. They have determined that the lifetime of a U-Phone is given by the random variable $X$ (measured in hours), with probability density function
\[
f(x) = \begin{cases} \dfrac{10}{x^2} & x \geqslant 10,\\ 0 & x < 10. \end{cases}
\]

(A) Find the probability that the U-Phone will last more than 20 hours.
(B) What is the cumulative distribution function of $X$? That is, find $F_X(x) = P(X \leqslant x)$.
(C) Use part (B) to find $P(X > 35)$.

Exercise 7.3. Suppose the random variable $X$ has a density function
\[
f(x) = \begin{cases} \dfrac{2}{x^2} & x > 2,\\ 0 & x \leqslant 2. \end{cases}
\]

Compute E [X].

Exercise 7.4. An insurance company insures a large number of homes. The insured value, $X$, of a randomly selected home is assumed to follow a distribution with density function
\[
f(x) = \begin{cases} \dfrac{3}{x^4} & x > 1,\\ 0 & \text{otherwise.} \end{cases}
\]

Given that a randomly selected home is insured for at least 1.5, calculate the probability that it is insured for less than 2.

Exercise 7.5. The density function of $X$ is given by
\[
f(x) = \begin{cases} a + bx^2 & 0 \leqslant x \leqslant 1,\\ 0 & \text{otherwise.} \end{cases}
\]

If $\mathbb{E}[X] = \frac{7}{10}$, find the values of $a$ and $b$.

Exercise 7.6. Let $X$ be a random variable with density function
\[
f(x) = \begin{cases} \dfrac{1}{a - 1} & 1 < x < a,\\ 0 & \text{otherwise.} \end{cases}
\]

Suppose that $\mathbb{E}[X] = 6\operatorname{Var}(X)$. Find the value of $a$.

Exercise 7.7. Suppose you order a pizza from your favorite pizzeria at 7:00 pm, knowing that the time it takes for your pizza to be ready is uniformly distributed between 7:00 pm and 7:30 pm.

(A) What is the probability that you will have to wait longer than 10 minutes for your pizza?
(B) If at 7:15 pm the pizza has not yet arrived, what is the probability that you will have to wait at least an additional 10 minutes?

Exercise 7.8. The grade of deterioration $X$ of a machine part has a continuous distribution on the interval $(0, 10)$ with probability density function $f_X(x)$, where $f_X(x)$ is proportional to $x$ on the interval. The repair costs of this part are modeled by a random variable $Y$ given by $Y = 3X^2$. Compute the expected repair cost of the machine part.

Exercise 7.9. A bus arrives at some (random) time uniformly distributed between 10:00 and 10:20, and you arrive at the bus stop at 10:05.
(A) What is the probability that you have to wait at least 5 minutes until the bus comes?
(B) What is the probability that you have to wait at least 5 minutes, given that when you arrive today at the station the bus has not come yet (you are lucky today)?

Exercise∗ 7.1. For a continuous random variable $X$ with finite first and second moments, prove that

\[
\mathbb{E}(aX + b) = a\mathbb{E}X + b, \qquad \operatorname{Var}(aX + b) = a^2 \operatorname{Var} X
\]
for any $a, b \in \mathbb{R}$.

Exercise∗ 7.2. Let X be a continuous random variable with probability density function

\[
f_X(x) = \frac{1}{4}\, x e^{-x/2}\, \mathbf{1}_{[0,\infty)}(x),
\]
where the indicator function is defined as

\[
\mathbf{1}_{[0,\infty)}(x) = \begin{cases} 1, & 0 \leqslant x < \infty;\\ 0, & \text{otherwise.} \end{cases}
\]
Check that $f_X$ is a valid probability density function, and find $\mathbb{E}(X)$ if it exists.

Exercise∗ 7.3. Let $X$ be a continuous random variable with probability density function
\[
f_X(x) = \frac{4\ln x}{x^3}\, \mathbf{1}_{[1,\infty)}(x),
\]
where the indicator function is defined as

\[
\mathbf{1}_{[1,\infty)}(x) = \begin{cases} 1, & 1 \leqslant x < \infty;\\ 0, & \text{otherwise.} \end{cases}
\]
Check that $f_X$ is a valid probability density function, and find $\mathbb{E}(X)$ if it exists.

7.4. Selected solutions

Solution to Exercise 7.1(A): We must have $\int_{-\infty}^{\infty} f(x)\,dx = 1$, thus
\[
1 = \int_0^5 cx(5 - x)\,dx = c\left[\frac{5x^2}{2} - \frac{x^3}{3}\right]_0^5
\]
and so we must have $c = 6/125$.

Solution to Exercise 7.1(B): For $0 \leqslant x \leqslant 5$ we have
\begin{align*}
F_X(x) = P(X \leqslant x) = \int_{-\infty}^{x} f(y)\,dy = \int_0^x \frac{6}{125}\, y(5 - y)\,dy = \frac{6}{125}\left[\frac{5y^2}{2} - \frac{y^3}{3}\right]_0^x = \frac{6}{125}\left(\frac{5x^2}{2} - \frac{x^3}{3}\right).
\end{align*}

Solution to Exercise 7.1(C): We have

\begin{align*}
P(2 \leqslant X \leqslant 3) &= P(X \leqslant 3) - P(X < 2)\\
&= \frac{6}{125}\left(\frac{5\cdot 3^2}{2} - \frac{3^3}{3}\right) - \frac{6}{125}\left(\frac{5\cdot 2^2}{2} - \frac{2^3}{3}\right) = 0.296.
\end{align*}
Solution to Exercise 7.1(D): we have
\[
\mathbb{E}[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_0^5 x \cdot \frac{6}{125}\, x(5 - x)\,dx = 2.5.
\]
Solution to Exercise 7.1(E): We first need to compute
\[
\mathbb{E}\left[X^2\right] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \int_0^5 x^2 \cdot \frac{6}{125}\, x(5 - x)\,dx = 7.5.
\]
Then
\[
\operatorname{Var}(X) = \mathbb{E}\left[X^2\right] - (\mathbb{E}[X])^2 = 7.5 - (2.5)^2 = 1.25.
\]
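All parts of Exercise 7.1 can be cross-checked numerically with one helper; a sketch (plain Python, midpoint Riemann sum):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

c = 6 / 125
pdf = lambda x: c * x * (5 - x)                      # density on [0, 5]
total = integrate(pdf, 0.0, 5.0)                     # normalization, close to 1
mean = integrate(lambda x: x * pdf(x), 0.0, 5.0)     # close to 2.5
ex2 = integrate(lambda x: x * x * pdf(x), 0.0, 5.0)  # close to 7.5
var = ex2 - mean**2                                  # close to 1.25
```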

Solution to Exercise 7.2(A): We have
\[
\int_{20}^{\infty} \frac{10}{x^2}\,dx = \frac{1}{2}.
\]
Solution to Exercise 7.2(B): We have
\[
F(x) = P(X \leqslant x) = \int_{10}^{x} \frac{10}{y^2}\,dy = 1 - \frac{10}{x}
\]
for $x \geqslant 10$, and $F(x) = 0$ for $x < 10$.

Solution to Exercise 7.2(C): We have

\[
P(X > 35) = 1 - P(X < 35) = 1 - F_X(35) = 1 - \left(1 - \frac{10}{35}\right) = \frac{10}{35} = \frac{2}{7}.
\]
Solution to Exercise 7.3: $+\infty$.

Solution to Exercise 7.4: $\frac{37}{64}$.

Solution to Exercise 7.5: we need to use the fact that $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and $\mathbb{E}[X] = \frac{7}{10}$. The first one gives us
\[
1 = \int_0^1 \left(a + bx^2\right) dx = a + \frac{b}{3},
\]
and the second one gives
\[
\frac{7}{10} = \int_0^1 x\left(a + bx^2\right) dx = \frac{a}{2} + \frac{b}{4}.
\]
Solving these equations gives
\[
a = \frac{1}{5}, \qquad b = \frac{12}{5}.
\]
Solution to Exercise 7.6: Note that
\[
\mathbb{E}X = \int_1^a \frac{x}{a - 1}\,dx = \frac{1}{2}a + \frac{1}{2}.
\]
Also $\operatorname{Var}(X) = \mathbb{E}X^2 - (\mathbb{E}X)^2$, so we need
\[
\mathbb{E}X^2 = \int_1^a \frac{x^2}{a - 1}\,dx = \frac{1}{3}a^2 + \frac{1}{3}a + \frac{1}{3}.
\]
Then
\[
\operatorname{Var}(X) = \frac{1}{3}a^2 + \frac{1}{3}a + \frac{1}{3} - \left(\frac{1}{2}a + \frac{1}{2}\right)^2 = \frac{1}{12}a^2 - \frac{1}{6}a + \frac{1}{12}.
\]
Then, using $\mathbb{E}[X] = 6\operatorname{Var}(X)$, we simplify and get $\frac{1}{2}a^2 - \frac{3}{2}a = 0$, which gives us $a = 3$.

Another way to solve this problem is to note that, for the uniform distribution on $[a, b]$, the mean is $\frac{a+b}{2}$ and the variance is $\frac{(b-a)^2}{12}$. This gives us the equation $6 \cdot \frac{(a-1)^2}{12} = \frac{a+1}{2}$. Hence $(a-1)^2 = a + 1$, which implies $a = 3$.

Solution to Exercise 7.7(A): Note that $X$ is uniformly distributed over $(0, 30)$. Then
\[
P(X > 10) = \frac{2}{3}.
\]
Solution to Exercise 7.7(B): Note that $X$ is uniformly distributed over $(0, 30)$. Then
\[
P(X > 25 \mid X > 15) = \frac{P(X > 25)}{P(X > 15)} = \frac{5/30}{15/30} = \frac{1}{3}.
\]

Solution to Exercise 7.8: First of all we need to find the PDF of $X$. So far we know that
\[
f(x) = \begin{cases} \dfrac{cx}{5} & 0 \leqslant x \leqslant 10,\\ 0 & \text{otherwise.} \end{cases}
\]
Since
\[
\int_0^{10} \frac{cx}{5}\,dx = 10c,
\]
we have $c = \frac{1}{10}$. Now, applying Proposition 7.2, we get
\[
\mathbb{E}Y = \mathbb{E}\left[3X^2\right] = \int_0^{10} 3x^2 \cdot \frac{x}{50}\,dx = \int_0^{10} \frac{3}{50}\, x^3\,dx = 150.
\]
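This expected cost can also be checked numerically via Proposition 7.2; a sketch (plain Python, midpoint Riemann sum with the density $f_X(x) = x/50$):

```python
def integrate(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: x / 50                                 # f_X on [0, 10]
ey = integrate(lambda x: 3 * x**2 * pdf(x), 0.0, 10.0)
# ey should be close to 150
```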

Solution to Exercise 7.9(A): The probability that you have to wait at least 5 minutes until the bus comes is $\frac{1}{2}$. Note that with probability $\frac{1}{4}$ you have to wait less than 5 minutes, and with probability $\frac{1}{4}$ you have already missed the bus.

Solution to Exercise 7.9(B): The conditional probability is $\frac{2}{3}$.