Uniform, Binomial, Poisson and Exponential Distributions

Discrete uniform distribution is a discrete distribution: If a random variable takes any of n possible values k_1, k_2, …, k_n that are equally probable, then it has a discrete uniform distribution. The probability of each k_i is 1/n. A simple example of the discrete uniform distribution is throwing a fair die. The possible values of k are 1, 2, 3, 4, 5, 6; and each time the die is thrown, the probability of a given score is 1/6.
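As a quick sketch of the die example, the pmf can be written out explicitly and checked: the six probabilities sum to 1, and the mean is (1 + 6)/2 = 7/2 (exact rational arithmetic via `fractions.Fraction`).

```python
from fractions import Fraction

# Discrete uniform on the die faces 1..6: each value has probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
pmf = {k: Fraction(1, len(faces)) for k in faces}

# The probabilities sum to 1, and the mean is (1+6)/2 = 7/2.
total = sum(pmf.values())
mean = sum(k * p for k, p in pmf.items())
print(total)  # 1
print(mean)   # 7/2
```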

Continuous uniform distribution is a family of probability distributions such that for each member of the family, all intervals of the same length on the distribution's support are equally probable. The support is defined by the two parameters, a and b, which are its minimum and maximum values. The distribution is often abbreviated U(a, b); e.g., U(0, 1) is a member of this family and so is U(1, 2). The probability density function of the continuous uniform distribution is

f(x) = \begin{cases} \frac{1}{b-a}, & a \le x \le b \\ 0, & x < a \text{ or } x > b \end{cases}

Then the expectation of a continuous uniform variable X on [a, b] is

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_a^b \frac{x}{b-a}\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}.

The variance is

\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \int_a^b \frac{x^2}{b-a}\,dx - \left(\frac{a+b}{2}\right)^2 = \frac{b^3 - a^3}{3(b-a)} - \left(\frac{a+b}{2}\right)^2 = \frac{(b-a)^2}{12}.
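The two uniform formulas can be sanity-checked numerically. The sketch below (with an arbitrary example interval [1, 4]) approximates the integrals for E(X) and E(X²) by a midpoint Riemann sum of the density 1/(b − a) and compares against (a + b)/2 and (b − a)²/12.

```python
# Numerically check E(X) = (a+b)/2 and Var(X) = (b-a)^2/12 for X ~ U(a, b),
# using a midpoint Riemann sum of the density f(x) = 1/(b-a) on [a, b].
a, b = 1.0, 4.0            # example interval (arbitrary choice)
n = 100_000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]

mean = sum(x * (1 / (b - a)) * dx for x in xs)
second_moment = sum(x * x * (1 / (b - a)) * dx for x in xs)
var = second_moment - mean ** 2

print(abs(mean - (a + b) / 2) < 1e-6)       # True
print(abs(var - (b - a) ** 2 / 12) < 1e-6)  # True
```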

Binomial distribution is the discrete probability distribution of the number of successes in a sequence of n independent binary (yes/no) experiments, each of which yields success with probability p. Such a success/failure experiment is also called a Bernoulli experiment or Bernoulli trial. In fact, when n = 1, the binomial distribution is a Bernoulli distribution. Let X be a random variable following a binomial distribution B(n, p); then for any integer k = 0, 1, 2, …, n,

P(X = k) = \frac{n!}{k!(n-k)!}\, p^k (1-p)^{n-k}.

The expectation of X is

E(X) = \sum_{k=0}^{n} k\, P(X = k) = \sum_{k=0}^{n} k\, \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}

= np \sum_{k=1}^{n} \frac{(n-1)!}{(k-1)!(n-k)!} p^{k-1} (1-p)^{n-k}

= np \sum_{k=0}^{n-1} \frac{(n-1)!}{k!(n-1-k)!} p^{k} (1-p)^{n-1-k}

= np \sum_{k=0}^{n-1} P(X^* = k) = np \quad (\text{where } X^* \sim B(n-1, p)).
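The identity E(X) = np can be confirmed exactly for a concrete case by summing k·P(X = k) over the binomial pmf. The sketch below uses an example choice n = 10, p = 3/10, with exact rationals so there is no rounding.

```python
from fractions import Fraction
from math import comb

# Check E(X) = sum_k k*P(X=k) = n*p exactly for an example B(n, p).
n, p = 10, Fraction(3, 10)   # example parameters (arbitrary choice)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
print(mean == n * p)   # True
```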

The variance of X is

\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \sum_{k=0}^{n} k^2\, P(X = k) - (np)^2

= \sum_{k=0}^{n} k^2 \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k} - (np)^2

= \sum_{k=1}^{n} (k-1+1) \frac{n!}{(k-1)!(n-k)!} p^k (1-p)^{n-k} - (np)^2

= np \sum_{k=1}^{n} (k-1) \frac{(n-1)!}{(k-1)!(n-k)!} p^{k-1} (1-p)^{n-k} + np \sum_{k=1}^{n} \frac{(n-1)!}{(k-1)!(n-k)!} p^{k-1} (1-p)^{n-k} - (np)^2

= np\, E(X^*) + np \sum_{k=0}^{n-1} P(X^* = k) - (np)^2 \quad (\text{where } X^* \sim B(n-1, p) \text{ and } E(X^*) = (n-1)p)

= np(n-1)p + np - (np)^2 = np(1-p).

The mean and variance of X can also be calculated in the following way. Let Y_i be the outcome of the i-th Bernoulli experiment. Then P(Y_i = 1) = p, P(Y_i = 0) = 1 - p,

E(Y_i) = 1 \times p + 0 \times (1-p) = p,

and, since Y_i^2 = Y_i,

\mathrm{Var}(Y_i) = E(Y_i^2) - (E(Y_i))^2 = E(Y_i) - (E(Y_i))^2 = p - p^2 = p(1-p).

Since X can be considered as the number of successes in n independent Bernoulli trials with success rate p, X = \sum_{i=1}^{n} Y_i. So

E(X) = \sum_{i=1}^{n} E(Y_i) = np, \quad \mathrm{Var}(X) = \sum_{i=1}^{n} \mathrm{Var}(Y_i) = np(1-p).
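Both routes give the same variance, which can be checked exactly for a concrete case: the second moment computed from the B(n, p) pmf matches the sum of the n Bernoulli variances p(1 − p). The sketch below uses an example choice n = 12, p = 1/4.

```python
from fractions import Fraction
from math import comb

# Check Var(X) = np(1-p): the exact second moment from the B(n, p) pmf
# matches the sum of the n Bernoulli variances p(1-p).
n, p = 12, Fraction(1, 4)   # example parameters (arbitrary choice)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
second = sum(k * k * pk for k, pk in enumerate(pmf))
var = second - mean**2

print(var == n * p * (1 - p))   # True
```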

Poisson distribution is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate and independently of the time since the last event. For example, if you receive 3 calls on average between 8am and 5pm each day, then the number of calls you will receive tomorrow between 8am and 5pm should follow a Poisson distribution with parameter λ = 3. This is under the assumption that the chance of receiving a call at any time point between 8am and 5pm is the same. For a random variable X ~ Poisson(λ) and any integer k = 0, 1, 2, …,

P(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}.

Then

E(X) = \sum_{k=0}^{\infty} k\, \frac{e^{-\lambda}\lambda^k}{k!} = \sum_{k=1}^{\infty} k\, \frac{e^{-\lambda}\lambda^k}{k!} = \lambda \sum_{k=1}^{\infty} \frac{e^{-\lambda}\lambda^{k-1}}{(k-1)!} = \lambda \sum_{k=0}^{\infty} \frac{e^{-\lambda}\lambda^{k}}{k!} = \lambda.

\mathrm{Var}(X) = \sum_{k=0}^{\infty} k^2\, \frac{e^{-\lambda}\lambda^k}{k!} - \lambda^2 = \sum_{k=1}^{\infty} k^2\, \frac{e^{-\lambda}\lambda^k}{k!} - \lambda^2

= \lambda \sum_{k=1}^{\infty} (k-1+1) \frac{e^{-\lambda}\lambda^{k-1}}{(k-1)!} - \lambda^2 = \lambda\,(E(X) + 1) - \lambda^2 = \lambda(\lambda + 1) - \lambda^2 = \lambda.
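Both results, E(X) = λ and Var(X) = λ, can be checked numerically by summing the Poisson pmf over enough terms; for the λ = 3 example of the phone calls, the tail beyond k = 60 is astronomically small.

```python
from math import exp, factorial

# Check E(X) = lambda and Var(X) = lambda for X ~ Poisson(lam) by summing
# the pmf P(X=k) = exp(-lam) * lam**k / k! over enough terms.
lam = 3.0
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(60)]

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k * k * pk for k, pk in enumerate(pmf)) - mean**2

print(abs(mean - lam) < 1e-9)   # True
print(abs(var - lam) < 1e-9)    # True
```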

Note that the Poisson distribution is the limit of the binomial distribution B(n, p) as n → ∞ with np → λ. Also note that np ≈ np(1 − p) when p is small (np is the binomial mean and np(1 − p) is the binomial variance), consistent with the Poisson mean and variance both equaling λ. Exponential distributions are a class of continuous probability distributions. An exponential distribution arises naturally when modeling the time between independent events that happen at a constant average rate. For example, if you receive 3 calls on average between 8am and 5pm each day, then the number of hours you wait for the first call after 8am tomorrow should follow an exponential distribution with parameter λ = 3 calls / 9 hrs = 1/3 per hour. The average time you wait for a new call since the last call is the expectation of the distribution: 1/λ = 3 hrs. The probability density function is

f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}

The variance of the distribution is 1/λ² (please prove it yourself).
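As a numerical hint toward that exercise, the sketch below integrates the exponential density with λ = 1/3 (the phone-call example) over a truncated range, chosen long enough that the tail is negligible, and checks that the mean comes out near 1/λ = 3 and the variance near 1/λ² = 9.

```python
from math import exp

# Numerically check E(X) = 1/lam and Var(X) = 1/lam^2 for the exponential
# density f(x) = lam * exp(-lam * x), via a midpoint Riemann sum on [0, T]
# (T chosen large enough that the tail beyond T is negligible).
lam = 1.0 / 3.0
T, n = 150.0, 300_000
dx = T / n

mean = 0.0
second = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    w = lam * exp(-lam * x) * dx   # probability mass of this slice
    mean += x * w
    second += x * x * w
var = second - mean**2

print(abs(mean - 1 / lam) < 1e-4)      # True
print(abs(var - 1 / lam**2) < 1e-4)    # True
```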