STAT 345 - HOMEWORK 3 CHAPTER 4 - CONTINUOUS RANDOM VARIABLES

Problem One - Trapezoidal Distribution

Consider the following density function f(x):

f(x) = (x + 1)/5,  −1 ≤ x < 0
       1/5,        0 ≤ x < 4
       (5 − x)/5,  4 ≤ x ≤ 5
       0,          otherwise

a) Sketch the PDF.
b) Show that the area under the curve is equal to 1 using geometry.
c) Show that the area under the curve is equal to 1 by integrating. (You will have to split the integral into pieces.)
d) Find P(X < 3) using geometry.
e) Find P(X < 3) with integration.
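If you want to double-check parts c) and e) after working them by hand, here is a quick numerical sketch (it uses scipy, which is not required for this assignment; splitting the integral at the kinks keeps the quadrature exact):

```python
from scipy.integrate import quad

# Trapezoidal density from Problem One
def f(x):
    if -1 <= x < 0:
        return (x + 1) / 5
    if 0 <= x < 4:
        return 1 / 5
    if 4 <= x <= 5:
        return (5 - x) / 5
    return 0.0

# Total area: integrate piece by piece so quad never straddles a kink
area = sum(quad(f, a, b)[0] for a, b in [(-1, 0), (0, 4), (4, 5)])

# P(X < 3): pieces on [-1, 0] and [0, 3]
p_lt_3 = quad(f, -1, 0)[0] + quad(f, 0, 3)[0]
print(area, p_lt_3)
```

Both numbers should match what your geometry in parts b) and d) gives.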

Problem Two - Probability Density Function

Consider the following function f(x), where θ > 0:

f(x) = (c/θ²)(θ − x),  0 < x < θ
       0,              otherwise
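Before solving part a) below by hand, it may help to notice that the integral of (θ − x)/θ² over (0, θ) comes out to the same number for every θ, which is what pins down c. A numerical check of that claim (a scipy sketch, not part of the assignment):

```python
from scipy.integrate import quad

# Integrate (θ - x)/θ² over (0, θ) for several values of θ
results = []
for theta in [0.5, 1.0, 2.0, 10.0]:
    I, _ = quad(lambda x: (theta - x) / theta**2, 0, theta)
    results.append(I)
print(results)   # the same value every time, so c * (that value) = 1 fixes c
```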

a) Find the value of c so that f(x) is a valid PDF. Sketch the PDF.
b) Find the mean and standard deviation of X.
c) If θ = 2, find P(X > 1), P(X > 3/2), and P(X > 2 − √2).
d) Compute P(θ/10 < X < 3θ/5).

Problem Three - Cumulative Distribution Functions

For each of the following probability density functions, find the CDF. The median M of a distribution is defined by the property F(M) = P(X ≤ M) = 1/2. Use the CDF to find the median of each distribution.

a) (Special case of the Beta distribution)

f(x) = 1.5√x,  0 < x < 1
       0,      otherwise

b) (Special case of the Pareto distribution)

f(x) = 1/x²,  x > 1
       0,     otherwise

c) Take the derivative of your CDF from part b) and show that you get back the PDF.
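A generic numerical median finder is a handy way to verify the medians you derive from the CDFs. A scipy sketch (the helper name `median` is just for illustration; part b)'s support is unbounded, so this only checks part a)):

```python
from scipy.integrate import quad
from scipy.optimize import brentq

# Solve F(M) - 1/2 = 0 numerically, where F is built by integrating the PDF
def median(pdf, lo, hi):
    g = lambda m: quad(pdf, lo, m)[0] - 0.5
    return brentq(g, lo + 1e-12, hi)

# Part a): f(x) = 1.5*sqrt(x) on (0, 1)
M_a = median(lambda x: 1.5 * x**0.5, 0.0, 1.0)
print(M_a)   # compare with the value your CDF gives
```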


Problem Four - Expected Values

a) Let X be a Normal RV with mean µ and standard deviation σ > 0. Find E(X).
b) Let X be a Continuous Uniform RV with a = 0 and b = π. Find E(π² · sin(X)).
c) Let X be a Gamma RV with shape r = 6 and scale λ = 2. Find E(1/X). Hint: Use integration by parts, or tabular integration.
d) Let X be an Exponential RV with parameter λ > 0. Find E(a^X) where a > 0. Does E(a^X) exist for all values of a > 0? If not, give a condition on a for which the expected value exists. Hint: Similar to the example I did in class for the St. Petersburg paradox.
e) Bonus: Let X ∼ Beta(α, β) and find E(X^k) where k > 0. Hint: Manipulate the integral to obtain a new Beta PDF inside the integral.
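Monte Carlo simulation gives a cheap sanity check on expected-value calculations like part b). A numpy sketch (the sample size and seed are arbitrary, and the estimate carries simulation noise):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, np.pi, 1_000_000)   # X ~ U(0, π)
mc = np.mean(np.pi**2 * np.sin(x))     # estimates E(π² · sin(X))
print(mc)   # compare with your hand computation in part b)
```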

Problem Five - The Normal Distribution

This problem will help you get comfortable with using the Standard Normal table.

a) Assume Z ∼ N(0, 1). Find P(Z < 1.5), P(−1.5 < Z < 2.45), and P(|Z| < k) for k = 1, 2, 3.
b) Assume X ∼ N(µ = 70, σ = 3), where X represents the height in inches of a randomly selected American male.
- Find the probability that a randomly selected male is taller than legendary rapper Shaquille O'Neal (7 ft 1 in).
- Find the probability that a randomly selected male is between 5 feet and 6 feet tall.
- If exactly 25% of American males are taller than Timothy, how tall is Timothy?
c) Bonus: In atmospheric science, LogNormal distributions are often used to characterize particle size distributions. In one study, the distribution of silicone nanoparticle size (in nm) was found to be approximately lognormal with log-mean θ = 3.91 and log-sd ω = 0.47. Mathematically, Y ∼ LogN(θ = 3.91, ω = 0.47). Suppose the desired range of particle sizes is (0.02 µm, 0.1 µm). What percentage of silicone nanoparticles do you expect to fall within this range?
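After you have done the table lookups by hand, scipy's `norm` can confirm them (a sketch assuming scipy is available; `cdf` plays the role of the table, and `ppf` is the inverse lookup needed for the Timothy part):

```python
from scipy.stats import norm

# a) Standard-normal lookups
p1 = norm.cdf(1.5)                                       # P(Z < 1.5)
p2 = norm.cdf(2.45) - norm.cdf(-1.5)                     # P(-1.5 < Z < 2.45)
p_abs = [norm.cdf(k) - norm.cdf(-k) for k in (1, 2, 3)]  # P(|Z| < k)

# b) Heights: X ~ N(µ = 70, σ = 3); 7 ft 1 in = 85 in, 5 ft = 60 in, 6 ft = 72 in
p_tall = 1 - norm.cdf(85, loc=70, scale=3)
p_5to6 = norm.cdf(72, loc=70, scale=3) - norm.cdf(60, loc=70, scale=3)
timothy = norm.ppf(0.75, loc=70, scale=3)                # 25% are taller
print(p1, p2, p_abs, p_tall, p_5to6, timothy)
```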

Problem Six - Normal Approximations

a) Suppose that the number of cars that drive past Lomas on I-25 between 5pm and 6pm on Wednesdays is a Poisson random variable with mean λ = 2,100. Use the Normal approximation to the Poisson to determine the probability that fewer than 1,800 cars drive past Lomas on I-25 between 5pm and 6pm.
b) 37% of cars in New Mexico are white. Last Wednesday, 2,200 cars drove past Lomas on I-25 between 5pm and 6pm. Use the Normal approximation to the Binomial to determine the probability that more than 1,000 of them were white.
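The mechanics for both parts, including the continuity corrections, can be sketched as follows (scipy supplies the normal CDF; both answers land deep in the tails, so a hand calculation with a Z table will simply report ≈ 0):

```python
import math
from scipy.stats import norm

# a) Poisson(λ = 2100) ≈ N(2100, √2100); "fewer than 1800" → cut at 1799.5
p_a = norm.cdf(1799.5, loc=2100, scale=math.sqrt(2100))

# b) Binomial(n = 2200, p = 0.37) ≈ N(np, √(np(1-p))); "more than 1000" → 1000.5
mu = 2200 * 0.37
sd = math.sqrt(2200 * 0.37 * 0.63)
p_b = 1 - norm.cdf(1000.5, loc=mu, scale=sd)
print(p_a, p_b)
```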


Problem Seven - Probability Rules

a) Assume X and Y are independent Exponential RVs with λ = 1. Define A = {X < 2} and B = {Y > 5}. Find P(Aᶜ ∩ Bᶜ). Hint: A and B are independent events since X and Y are independent. See the DeMorgan's Law problem on the first homework if you get stuck!

b) Let Z ∼ N(0, 1). Find P(Z > −1 | Z < 2). (Use the definition of conditional probability.)

c) Consider the following random experiment. Suppose you roll a fair 4-sided die, and let N be the outcome. Let X be a continuous uniform RV such that X ∼ U(0, N). For example, if you roll the die and get N = 2, then X ∼ U(0, 2). What is P(X > 0.5)? Hint: Use the law of total probability, where the possibilities for N form the partition.

d) Consider the random experiment from c). Given that X < 0.5, what is the probability that you rolled a 4 on the die? Hint: Use Bayes' Rule; I am asking for P(N = 4 | X < 0.5).
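Simulating the two-stage experiment is a good way to check your law-of-total-probability and Bayes' Rule answers to parts c) and d). A numpy sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims = 1_000_000
N = rng.integers(1, 5, n_sims)    # fair 4-sided die: outcomes 1, 2, 3, 4
X = rng.uniform(0, N)             # given N = n, draw X ~ U(0, n)

p_gt = np.mean(X > 0.5)           # estimates P(X > 0.5)          [part c)]
p_n4 = np.mean(N[X < 0.5] == 4)   # estimates P(N = 4 | X < 0.5)  [part d)]
print(p_gt, p_n4)
```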

Problem Eight - The Exponential Distribution

Suppose X ∼ Exponential(λ).

a) Find P(5 < X < 10) in terms of λ. What is this probability when λ = 0.5?
b) Find E(X³) using integration by parts or tabular integration.
c) The Exponential distribution is skewed right. Sometimes this is called positive skew. Mathematically, the skew of a random variable with expected value µ and standard deviation σ is defined as

Sk(X) = E[((X − µ)/σ)³] = (E(X³) − 3µσ² − µ³)/σ³

Show that for an exponentially distributed random variable X, Sk(X) = +2 regardless of λ.

d) An engineering company produces a thing. The amount of time it takes the company to produce said thing follows an Exponential distribution with a mean of 60 minutes. The CEO demands that a batch of things should not take longer than 3 hours to produce. What percent of the time will the company meet this requirement?

e) The Exponential distribution has a special property called memorylessness (the Geometric distribution also has this property). Assuming that s < t, this property is defined mathematically as follows:

P (X > t|X > s) = P (X > t − s)

Prove that this property holds for X ∼ Exponential(λ). (Just find each side, and show they are the same.)
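Before writing the proof, you can convince yourself numerically that the property holds. A scipy sketch (note that scipy parametrizes the Exponential by scale = 1/λ; the λ and the (s, t) pairs here are arbitrary choices):

```python
from scipy.stats import expon

lam = 0.5
X = expon(scale=1 / lam)        # scipy uses scale = 1/λ
for s, t in [(1, 3), (2, 7), (0.5, 4)]:
    lhs = X.sf(t) / X.sf(s)     # P(X > t | X > s) = P(X > t)/P(X > s)
    rhs = X.sf(t - s)           # P(X > t - s)
    print(lhs, rhs)             # the two columns agree
```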


Figure 1. Memorylessness Property

Bonus Problem - Generating Functions

The Moment Generating Function of a random variable X is defined as:

M_X(t) = E(e^{Xt})

Inside the expected value, t can be regarded as a constant; however, M_X(t) is a function of this t. (If any of you have seen Laplace Transforms before, this may look familiar.) For instance, if X ∼ Bernoulli(p), then

M_X(t) = E(e^{Xt}) = e^{0·t} · P(X = 0) + e^{1·t} · P(X = 1) = (1 − p) + pe^t

a) Find the Moment Generating Function (MGF) of X ∼ Exponential(λ).
b) Take the first derivative of M_X and evaluate it at 0. That is, find M_X′(0). What do you get?
c) Take the second derivative of M_X and evaluate it at 0. That is, find M_X″(0). What do you get?

Hence the name: it turns out that if you take the k-th derivative of M_X and evaluate it at 0, you get M_X^(k)(0) = E(X^k). I should note at this point that this is only true if the expected value converges, which need not be the case; the MGF does not exist for some distributions. Let's try to see why this works.

d) Expand M_X(t) = E(e^{Xt}) by writing e^{Xt} as a Taylor series expansion. Now use linearity of expectation. Explain why M_X^(k)(0) = E(X^k).
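To see parts b) and c) in action without spoiling part a), take the Bernoulli MGF derived above, M_X(t) = (1 − p) + pe^t, and differentiate it numerically at 0 (a finite-difference sketch; the choices of p and step size h are arbitrary):

```python
import math

p = 0.3
M = lambda t: (1 - p) + p * math.exp(t)   # Bernoulli MGF from the example

h = 1e-4
M1 = (M(h) - M(-h)) / (2 * h)             # central difference ≈ M'(0)
M2 = (M(h) - 2 * M(0) + M(-h)) / h**2     # second difference ≈ M''(0)
print(M1, M2)   # both ≈ p: for a Bernoulli RV, E(X) = E(X²) = p
```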
