
The Moment Generating Function

Gary Schurman, MBE, CFA
October, 2011

The moment generating function $M_x(t)$ of the random variable $x$ as a function of the variable $t$ is...

$$M_x(t) = \mathbb{E}\Big[e^{tx}\Big] \tag{1}$$

Note that $e^{tx}$ can be approximated around zero using a Taylor expansion. When $x$ changes from zero to a non-zero value, the approximate value of Equation (1) at the new value of $x$ via a Taylor expansion is...

$$
\begin{aligned}
M_x(t) &= \mathbb{E}\bigg[M_0(t) + \frac{\delta M_0(t)}{\delta x}\,\delta x + \frac{1}{2}\frac{\delta^2 M_0(t)}{\delta x^2}\,\delta x^2 + \frac{1}{6}\frac{\delta^3 M_0(t)}{\delta x^3}\,\delta x^3 + \dots + \frac{1}{n!}\frac{\delta^n M_0(t)}{\delta x^n}\,\delta x^n\bigg] \\
&= \mathbb{E}\bigg[e^{t0} + te^{t0}(x-0) + \frac{1}{2}\,t^2e^{t0}(x-0)^2 + \frac{1}{6}\,t^3e^{t0}(x-0)^3 + \dots + \frac{1}{n!}\,t^ne^{t0}(x-0)^n\bigg] \\
&= \mathbb{E}\bigg[1 + tx + \frac{1}{2}\,t^2x^2 + \frac{1}{6}\,t^3x^3 + \dots + \frac{1}{n!}\,t^nx^n\bigg] \\
&= 1 + t\,\mathbb{E}\big[x\big] + \frac{1}{2}\,t^2\,\mathbb{E}\big[x^2\big] + \frac{1}{6}\,t^3\,\mathbb{E}\big[x^3\big] + \dots + \frac{1}{n!}\,t^n\,\mathbb{E}\big[x^n\big]
\end{aligned}
\tag{2}
$$

To calculate the first moment of the distribution (i.e. $M_1$, which is the expected value of $x$) we take the first derivative of Equation (2) with respect to $t$ and evaluate it at $t = 0$ because...

$$
\begin{aligned}
M_1 &= \lim_{t \to 0} \frac{\delta M_x(t)}{\delta t} \\
&= \lim_{t \to 0}\bigg[\mathbb{E}\big[x\big] + t\,\mathbb{E}\big[x^2\big] + \frac{1}{2}\,t^2\,\mathbb{E}\big[x^3\big] + \dots + \frac{1}{(n-1)!}\,t^{n-1}\,\mathbb{E}\big[x^n\big]\bigg] \\
&= \mathbb{E}\big[x\big]
\end{aligned}
\tag{3}
$$
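Equation (3) can be sanity-checked numerically: differentiate the MGF at $t = 0$ with a finite difference and compare against $\mathbb{E}[x]$ computed directly. The sketch below uses an arbitrary three-point discrete distribution chosen purely for illustration (it is not from the paper):

```python
import math

# Illustrative three-point discrete distribution (an assumption for this
# check, not taken from the paper).
xs = [1.0, 2.0, 3.0]
ps = [0.2, 0.5, 0.3]

def mgf(t):
    """M_x(t) = E[e^(tx)] computed directly from the distribution."""
    return sum(p * math.exp(t * x) for p, x in zip(ps, xs))

# First moment per Equation (3): the derivative of the MGF at t = 0,
# approximated here with a central finite difference.
h = 1e-5
m1_numeric = (mgf(h) - mgf(-h)) / (2.0 * h)
m1_direct = sum(p * x for p, x in zip(ps, xs))  # E[x] = 2.1

print(m1_numeric, m1_direct)
```

The two numbers agree to roughly six decimal places, which is the accuracy limit of the finite-difference step chosen here.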

To calculate the second moment of the distribution (i.e. M2, which is the expected value of the square of x) we take the second derivative of Equation (2) with respect to t and evaluate it at t = 0 because...

$$
\begin{aligned}
M_2 &= \lim_{t \to 0} \frac{\delta^2 M_x(t)}{\delta t^2} \\
&= \lim_{t \to 0}\bigg[\mathbb{E}\big[x^2\big] + t\,\mathbb{E}\big[x^3\big] + \dots + \frac{1}{(n-2)!}\,t^{n-2}\,\mathbb{E}\big[x^n\big]\bigg] \\
&= \mathbb{E}\big[x^2\big]
\end{aligned}
\tag{4}
$$

And so on...
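The same kind of numerical check works for Equation (4): a central second difference of the MGF at $t = 0$ should reproduce $\mathbb{E}[x^2]$. Again the three-point distribution is an arbitrary illustration, not from the paper:

```python
import math

# Illustrative discrete distribution (an assumption, not from the paper).
xs = [1.0, 2.0, 3.0]
ps = [0.2, 0.5, 0.3]

def mgf(t):
    """M_x(t) = E[e^(tx)] computed directly from the distribution."""
    return sum(p * math.exp(t * x) for p, x in zip(ps, xs))

# Second moment per Equation (4): central second difference of the MGF.
h = 1e-4
m2_numeric = (mgf(h) - 2.0 * mgf(0.0) + mgf(-h)) / h ** 2
m2_direct = sum(p * x * x for p, x in zip(ps, xs))  # E[x^2] = 4.9

print(m2_numeric, m2_direct)
```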

The Normal Distribution

The normal distribution is a distribution of continuous random variables. If the random variable $x$ is continuous, the moment generating function of $x$, where $f(x)$ is the probability density function, can be defined as...

$$M_x(t) = \int_{-\infty}^{\infty} e^{tx}\,f(x)\,\delta x \tag{5}$$

The equation for the probability density function of the normal distribution with mean $m$ and variance $v$ is...

$$f(x) = \frac{1}{\sqrt{2\pi v}}\,e^{-\frac{1}{2v}(x-m)^2} \tag{6}$$

If we combine Equations (5) and (6) from above we can write the equation for the moment generating function of the normal distribution as...

$$M_x(t) = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi v}}\,e^{-\frac{1}{2v}(x-m)^2}\,\delta x \tag{7}$$

To solve Equation (7) we will first define a new random variable $z$ to be the normalized random variable $x$. The equation for the random variable $z$ is...

$$z = \frac{x-m}{\sqrt{v}} \tag{8}$$

The random variable $x$ as a function of the new random variable $z$, which was defined in Equation (8) above, is...

$$x = m + z\sqrt{v} \tag{9}$$

The derivative of Equation (9) with respect to the random variable $z$ is...

$$\frac{\delta x}{\delta z} = \sqrt{v} \tag{10}$$

To solve Equation (7) we will next replace the variable $x$ with the variable $z$ as defined in Equations (8), (9) and (10) above. Equation (7) rewritten to be a function of the variable $z$ is...

$$
\begin{aligned}
M_x(t) &= \int_{-\infty}^{\infty} e^{t(m+z\sqrt{v})}\,\frac{1}{\sqrt{2\pi v}}\,e^{-\frac{1}{2}z^2}\,\frac{\delta x}{\delta z}\,\delta z \\
&= e^{mt}\,\frac{\sqrt{v}}{\sqrt{2\pi v}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}z^2 + z\sqrt{v}\,t}\,\delta z \\
&= e^{mt}\,\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}z^2 + z\sqrt{v}\,t}\,\delta z
\end{aligned}
\tag{11}
$$

Noting that after completing the square...

$$-\frac{1}{2}\Big(z - \sqrt{v}\,t\Big)^2 = -\frac{1}{2}\Big(z^2 - 2z\sqrt{v}\,t + vt^2\Big) = -\frac{1}{2}z^2 + z\sqrt{v}\,t - \frac{1}{2}vt^2 \tag{12}$$

We can use Equation (12) to rewrite Equation (11), which becomes...

$$
\begin{aligned}
M_x(t) &= e^{mt}\,e^{\frac{1}{2}vt^2}\,\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}(z-\sqrt{v}\,t)^2}\,\delta z \\
&= e^{mt+\frac{1}{2}vt^2}\,\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}(z-\sqrt{v}\,t)^2}\,\delta z \\
&= e^{mt+\frac{1}{2}vt^2}
\end{aligned}
\tag{13}
$$

The first derivative of Equation (13) with respect to the variable $t$ is...

$$\frac{\delta M_x(t)}{\delta t} = (m + vt)\,e^{mt+\frac{1}{2}vt^2} \tag{14}$$

The first moment of the distribution will be the limit of Equation (14) as $t$ goes to zero...

$$
\mathbb{E}\big[x\big] = \lim_{t \to 0} \frac{\delta M_x(t)}{\delta t} = (m + v(0))\,e^{m(0)+\frac{1}{2}v(0)^2} = m \tag{15}
$$

The second derivative of Equation (13) with respect to the variable $t$ is...

$$\frac{\delta^2 M_x(t)}{\delta t^2} = v\,e^{mt+\frac{1}{2}vt^2} + (m + vt)^2\,e^{mt+\frac{1}{2}vt^2} \tag{16}$$

The second moment of the distribution will be the limit of Equation (16) as $t$ goes to zero...

$$
\mathbb{E}\big[x^2\big] = \lim_{t \to 0} \frac{\delta^2 M_x(t)}{\delta t^2} = v\,e^{m(0)+\frac{1}{2}v(0)^2} + (m + v(0))^2\,e^{m(0)+\frac{1}{2}v(0)^2} = v + m^2 \tag{17}
$$
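A numerical check ties Equations (7) and (13) together: integrating Equation (7) with a simple quadrature rule should reproduce the closed form $e^{mt + \frac{1}{2}vt^2}$. The values of $m$, $v$ and $t$ below are arbitrary illustrative choices:

```python
import math

m, v, t = 0.5, 2.0, 0.3  # illustrative mean, variance and MGF argument

def integrand(x):
    """e^(tx) f(x), with f(x) the normal density of Equation (6)."""
    return (math.exp(t * x) * math.exp(-(x - m) ** 2 / (2.0 * v))
            / math.sqrt(2.0 * math.pi * v))

# Composite trapezoidal rule over a wide interval; the Gaussian tails
# beyond +/- 12 standard deviations are negligible.
lo, hi, n = m - 12.0 * math.sqrt(v), m + 12.0 * math.sqrt(v), 20000
h = (hi - lo) / n
mgf_numeric = h * (0.5 * (integrand(lo) + integrand(hi)) +
                   sum(integrand(lo + i * h) for i in range(1, n)))

mgf_closed = math.exp(m * t + 0.5 * v * t ** 2)  # Equation (13)
print(mgf_numeric, mgf_closed)
```

Because the integrand and all of its derivatives vanish at the endpoints, the trapezoidal rule is far more accurate here than its generic error bound suggests.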

The Binomial Distribution

The binomial distribution is a distribution of discrete random variables. If the random variable $k$, which is the number of successes out of $n$ trials, is discrete, then the moment generating function of $k$, where $P(k)$ is the probability mass function, can be defined as...

$$M_k(t) = \sum_{k=0}^{n} e^{tk}\,P(k) \tag{18}$$

The equation for the probability mass function of the binomial distribution with the probability of success equal to $p$ and the probability of failure equal to $1-p$ is...

$$P(k) = \frac{n!}{k!\,(n-k)!}\,p^k\,(1-p)^{n-k} \tag{19}$$

If we combine Equations (18) and (19) from above we can write the equation for the moment generating function of the binomial distribution as...

$$
\begin{aligned}
M_k(t) &= \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!}\,p^k\,(1-p)^{n-k}\,e^{tk} \\
&= \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!}\,(pe^t)^k\,(1-p)^{n-k}
\end{aligned}
\tag{20}
$$

The binomial theorem says that...

$$\sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!}\,a^k\,b^{n-k} = (a + b)^n \tag{21}$$

If we define $a = pe^t$ and $b = 1 - p$ then Equation (20) becomes...

$$M_k(t) = (pe^t + 1 - p)^n \tag{22}$$

The first derivative of Equation (22) with respect to the variable $t$ is...

$$\frac{\delta M_k(t)}{\delta t} = n\,(pe^t + 1 - p)^{n-1}\,pe^t \tag{23}$$

The first moment of the distribution will be the limit of Equation (23) as $t$ goes to zero...

$$
\mathbb{E}\big[k\big] = \lim_{t \to 0} \frac{\delta M_k(t)}{\delta t} = n\,(pe^0 + 1 - p)^{n-1}\,pe^0 = np \tag{24}
$$
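Equations (22) and (24) can be checked directly: the sum in Equation (18) should match the closed form, and the derivative at $t = 0$ should equal $np$. The values of $n$, $p$ and $t$ below are arbitrary illustrative choices:

```python
import math

n, p, t = 10, 0.3, 0.25  # illustrative trial count, success probability, MGF argument

# Equations (18)/(20): sum the MGF directly over the probability mass function.
mgf_sum = sum(math.comb(n, k) * p ** k * (1.0 - p) ** (n - k) * math.exp(t * k)
              for k in range(n + 1))

# Equation (22): the closed form.
mgf_closed = (p * math.exp(t) + 1.0 - p) ** n

# Equation (24): the first derivative evaluated at t = 0 gives np.
m1 = n * (p * math.exp(0.0) + 1.0 - p) ** (n - 1) * p * math.exp(0.0)

print(mgf_sum, mgf_closed, m1)
```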

The second derivative of Equation (22) with respect to the variable $t$ is...

$$\frac{\delta^2 M_k(t)}{\delta t^2} = n(n-1)\,(pe^t + 1 - p)^{n-2}\,pe^t\,pe^t + n\,(pe^t + 1 - p)^{n-1}\,pe^t \tag{25}$$

The second moment of the distribution will be the limit of Equation (25) as $t$ goes to zero...

$$
\begin{aligned}
\mathbb{E}\big[k^2\big] &= \lim_{t \to 0} \frac{\delta^2 M_k(t)}{\delta t^2} \\
&= n(n-1)\,(pe^0 + 1 - p)^{n-2}\,pe^0\,pe^0 + n\,(pe^0 + 1 - p)^{n-1}\,pe^0 \\
&= n(n-1)p^2 + np \\
&= np - np^2 + n^2p^2 \\
&= np(1-p) + n^2p^2
\end{aligned}
\tag{26}
$$
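As a final check on Equation (26), the second moment computed directly from the probability mass function should equal $np(1-p) + n^2p^2$. The values of $n$ and $p$ below are arbitrary illustrative choices:

```python
import math

n, p = 12, 0.4  # illustrative trial count and success probability

# E[k^2] summed directly over the binomial probability mass function (19).
m2_direct = sum(k * k * math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)
                for k in range(n + 1))

# Equation (26): np(1-p) + (np)^2 = 2.88 + 23.04 = 25.92
m2_formula = n * p * (1.0 - p) + (n * p) ** 2

print(m2_direct, m2_formula)
```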
