Appendix A  Standard Distributions

[© Springer-Verlag New York 2016. R. Bhattacharya et al., A Course in Mathematical Statistics and Large Sample Theory, Springer Texts in Statistics, DOI 10.1007/978-1-4939-4032-5]

A.1 Standard Univariate Discrete Distributions

I. The Binomial Distribution B(n, p) has the probability mass function
\[
f(x) = \binom{n}{x} p^x (1-p)^{n-x} \qquad (x = 0, 1, \dots, n).
\]
The mean is $np$ and the variance is $np(1-p)$.

II. The Negative Binomial Distribution NB(r, p) arises as the distribution of $X \equiv \{$number of failures until the $r$-th success$\}$ in independent trials each with probability of success $p$. Thus its probability mass function is
\[
f_r(x) = \binom{r+x-1}{x} p^r (1-p)^x \qquad (x = 0, 1, 2, \dots).
\]
Let $X_i$ denote the number of failures between the $(i-1)$-th and $i$-th successes ($i = 2, 3, \dots, r$), and let $X_1$ be the number of failures before the first success. Then $X_1, X_2, \dots, X_r$ are $r$ independent random variables, each having the distribution NB(1, p) with probability mass function
\[
f_1(x) = p(1-p)^x \qquad (x = 0, 1, 2, \dots).
\]
Also, $X = X_1 + X_2 + \cdots + X_r$. Hence
\[
E(X) = r\,EX_1 = rp \sum_{x=0}^{\infty} x(1-p)^x = rp(1-p) \sum_{x=1}^{\infty} x(1-p)^{x-1}
= rp(1-p) \sum_{x=1}^{\infty} \left(-\frac{d}{dp}\right)(1-p)^x
\]
\[
= -rp(1-p)\, \frac{d}{dp} \sum_{x=1}^{\infty} (1-p)^x
= -rp(1-p)\, \frac{d}{dp} \sum_{x=0}^{\infty} (1-p)^x
= -rp(1-p)\, \frac{d}{dp}\, \frac{1}{1-(1-p)}
= -rp(1-p)\, \frac{d}{dp}\, \frac{1}{p}
= \frac{r(1-p)}{p}. \tag{A.1}
\]
Also, one may calculate var(X) using (Exercise A.1)
\[
\mathrm{var}(X) = r\,\mathrm{var}(X_1). \tag{A.2}
\]

III. The Poisson Distribution P(λ) has probability mass function
\[
f(x) = e^{-\lambda}\, \frac{\lambda^x}{x!} \qquad (x = 0, 1, 2, \dots),
\]
where $\lambda > 0$ is the mean. To see this, let $X$ be a random variable with this distribution. Then
\[
E(X) = \sum_{x=0}^{\infty} x e^{-\lambda}\, \frac{\lambda^x}{x!}
= e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda^x}{(x-1)!}
= \lambda e^{-\lambda} \sum_{y=0}^{\infty} \frac{\lambda^y}{y!}
= \lambda e^{-\lambda} \cdot e^{\lambda} = \lambda \qquad (y = x-1).
\]
Also,
\[
E(X(X-1)) = \sum_{x=0}^{\infty} x(x-1) e^{-\lambda}\, \frac{\lambda^x}{x!}
= e^{-\lambda} \lambda^2 \sum_{x=2}^{\infty} \frac{\lambda^{x-2}}{(x-2)!}
= e^{-\lambda} \lambda^2 \cdot e^{\lambda} = \lambda^2,
\]
so that
\[
E(X^2) = \lambda^2 + E(X) = \lambda^2 + \lambda, \qquad
\mathrm{var}(X) = E(X^2) - (E(X))^2 = \lambda^2 + \lambda - \lambda^2 = \lambda. \tag{A.3}
\]
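The moment formulas above can be sanity-checked numerically by summing the probability mass functions directly. The sketch below is not part of the text; it truncates the infinite sums (harmless, since the terms decay geometrically), and it uses the closed form $r(1-p)/p^2$ for the negative binomial variance, which follows from (A.2) together with the geometric variance and is left to Exercise A.1 in the book.

```python
import math

def nb_pmf(x, r, p):
    # NB(r, p): P(X = x) = C(r+x-1, x) p^r (1-p)^x,  x = 0, 1, 2, ...
    return math.comb(r + x - 1, x) * (p ** r) * ((1 - p) ** x)

def pois_pmf(x, lam):
    # P(lambda) pmf, evaluated on the log scale to avoid huge factorials.
    return math.exp(x * math.log(lam) - lam - math.lgamma(x + 1))

r, p, lam = 5, 0.3, 2.7
N = 1000  # truncation point; the omitted tail is geometrically small

nb_mean = sum(x * nb_pmf(x, r, p) for x in range(N))
nb_var = sum(x * x * nb_pmf(x, r, p) for x in range(N)) - nb_mean ** 2
po_mean = sum(x * pois_pmf(x, lam) for x in range(200))
po_var = sum(x * x * pois_pmf(x, lam) for x in range(200)) - po_mean ** 2

assert abs(nb_mean - r * (1 - p) / p) < 1e-8      # (A.1): E(X) = r(1-p)/p
assert abs(nb_var - r * (1 - p) / p ** 2) < 1e-8  # via (A.2), Exercise A.1
assert abs(po_mean - lam) < 1e-8                  # E(X) = lambda
assert abs(po_var - lam) < 1e-8                   # (A.3): var(X) = lambda
```

The parameter values ($r=5$, $p=0.3$, $\lambda=2.7$) and truncation points are illustrative choices, not from the text.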
IV. The Beta-Binomial Distribution BB(α, β, n) is the marginal distribution of $X$ when the conditional distribution of $X$ given (another random variable) $Y = y$ (with values in $[0,1]$) is B(n, y), where $Y$ has the beta distribution Be(α, β) (see Sect. A.2). Hence the probability mass function of $X$ is
\[
f(x) = \binom{n}{x} \frac{1}{B(\alpha,\beta)} \int_0^1 y^x (1-y)^{n-x}\, y^{\alpha-1}(1-y)^{\beta-1}\,dy
= \binom{n}{x} \frac{1}{B(\alpha,\beta)} \int_0^1 y^{x+\alpha-1}(1-y)^{n-x+\beta-1}\,dy
\]
\[
= \binom{n}{x} \frac{B(x+\alpha,\, n-x+\beta)}{B(\alpha,\beta)}
= \binom{n}{x} \frac{\Gamma(x+\alpha)\,\Gamma(n-x+\beta)\,\Gamma(\alpha+\beta)}{\Gamma(n+\alpha+\beta)\,\Gamma(\alpha)\,\Gamma(\beta)} \qquad (x = 0, 1, \dots, n).
\]
[See below for the relation $B(\alpha,\beta) = \Gamma(\alpha)\Gamma(\beta)/\Gamma(\alpha+\beta)$.] Here, if $X \sim$ BB(α, β, n),
\[
E(X) = E\,E(X \mid Y) = E(nY) = n\,EY = \frac{n\alpha}{\alpha+\beta},
\]
\[
E(X^2) = E\,E(X^2 \mid Y) = E[\,nY(1-Y) + n^2Y^2\,]
= (n^2 - n)\,EY^2 + n\,EY
= (n^2 - n)\,\frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)} + \frac{n\alpha}{\alpha+\beta},
\]
\[
\mathrm{var}(X) = E(X^2) - (E(X))^2.
\]
(See below for the computation of the moments of the beta distribution.)

A.2 Some Absolutely Continuous Distributions

I. The Uniform Distribution U(α, β) on [α, β] has the probability density function (p.d.f.)
\[
f(x) = \frac{1}{\beta-\alpha} \quad \text{for } \alpha \le x \le \beta, \qquad = 0 \text{ elsewhere}.
\]

II. The Beta Distribution Be(α, β) has p.d.f.
\[
f(x) = \frac{1}{B(\alpha,\beta)}\, x^{\alpha-1}(1-x)^{\beta-1}, \quad 0 < x < 1, \qquad = 0 \text{ elsewhere}.
\]
Here $\alpha > 0$, $\beta > 0$, and $B(\alpha,\beta)$ is the normalizing constant (beta function)
\[
B(\alpha,\beta) = \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx.
\]
Clearly, if $X \sim$ Be(α, β), then
\[
E(X) = \frac{1}{B(\alpha,\beta)} \int_0^1 x^{\alpha}(1-x)^{\beta-1}\,dx = \frac{B(\alpha+1,\beta)}{B(\alpha,\beta)}, \qquad
E(X^2) = \frac{B(\alpha+2,\beta)}{B(\alpha,\beta)}, \;\dots,\;
E(X^k) = \frac{B(\alpha+k,\beta)}{B(\alpha,\beta)}. \tag{A.4}
\]
Recall that the gamma function $\Gamma(\alpha)$ is defined by
\[
\Gamma(\alpha) = \int_0^\infty e^{-x} x^{\alpha-1}\,dx \qquad (\alpha > 0).
\]
On integration by parts one gets
\[
\Gamma(\alpha+1) = \int_0^\infty e^{-x} x^{\alpha}\,dx
= \left[-e^{-x}x^{\alpha}\right]_0^\infty + \int_0^\infty \alpha x^{\alpha-1} e^{-x}\,dx
= 0 + \alpha\Gamma(\alpha) = \alpha\Gamma(\alpha). \tag{A.5}
\]
We now prove
\[
B(\alpha,\beta) = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)} \qquad \forall\, \alpha > 0,\ \beta > 0. \tag{A.6}
\]
Now
\[
\Gamma(\alpha)\Gamma(\beta) = \int_0^\infty e^{-x} x^{\alpha-1}\,dx \int_0^\infty e^{-y} y^{\beta-1}\,dy
= \int_0^\infty\!\!\int_0^\infty e^{-(x+y)} x^{\alpha-1} y^{\beta-1}\,dx\,dy.
\]
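Before working through the change of variables below, the identity (A.6) can be sanity-checked numerically. This sketch is not from the text: it compares a midpoint-rule approximation of the beta integral against the gamma-function ratio, using the standard library's `math.gamma`.

```python
import math

def beta_numeric(a, b, n=100_000):
    # Midpoint-rule approximation of B(a, b) = integral_0^1 x^{a-1}(1-x)^{b-1} dx.
    h = 1.0 / n
    return h * sum(((i + 0.5) * h) ** (a - 1) * (1.0 - (i + 0.5) * h) ** (b - 1)
                   for i in range(n))

# A few illustrative (alpha, beta) pairs; any positive values would do.
for a, b in [(2.5, 1.5), (3.0, 4.0), (1.0, 1.0)]:
    lhs = beta_numeric(a, b)
    rhs = math.gamma(a) * math.gamma(b) / math.gamma(a + b)  # (A.6)
    assert abs(lhs - rhs) < 1e-5
```

The midpoint rule is chosen because it avoids evaluating the (possibly unbounded) integrand at the endpoints $x=0$ and $x=1$.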
Change variables $z = x+y$, $y = y$, to get $x = z-y$, $y = y$, with Jacobian
\[
\det \begin{pmatrix} \partial x/\partial z & \partial x/\partial y \\ \partial y/\partial z & \partial y/\partial y \end{pmatrix}
= \det \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix} = 1.
\]
Then
\[
\Gamma(\alpha)\Gamma(\beta) = \int_0^\infty e^{-z} \left[ \int_0^z (z-y)^{\alpha-1} y^{\beta-1}\,dy \right] dz
= \int_0^\infty e^{-z} z^{\alpha-1} z^{\beta-1} \left[ \int_0^z \left(1 - \frac{y}{z}\right)^{\alpha-1} \left(\frac{y}{z}\right)^{\beta-1} dy \right] dz
\]
\[
= \int_0^\infty e^{-z} z^{\alpha+\beta-1} \left[ \int_0^1 (1-u)^{\alpha-1} u^{\beta-1}\,du \right] dz
\qquad \left(u = \frac{y}{z},\ du = \frac{1}{z}\,dy\right)
\]
\[
= \int_0^\infty e^{-z} z^{\alpha+\beta-1} B(\beta,\alpha)\,dz = \Gamma(\alpha+\beta)\,B(\beta,\alpha). \tag{A.7}
\]
But
\[
B(\beta,\alpha) = \int_0^1 u^{\beta-1}(1-u)^{\alpha-1}\,du = \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx = B(\alpha,\beta) \qquad (x = 1-u). \tag{A.8}
\]
Hence (A.7) yields (A.6). Using (A.4)–(A.6), one gets the $k$-th moment of a beta (Be(α, β)) random variable $X$ as
\[
E(X^k) = \frac{\Gamma(\alpha+k)\,\Gamma(\beta)\cdot\Gamma(\alpha+\beta)}{\Gamma(\alpha+\beta+k)\cdot\Gamma(\alpha)\,\Gamma(\beta)}
= \frac{(\alpha+k-1)\cdots(\alpha+1)\alpha}{(\alpha+\beta+k-1)\cdots(\alpha+\beta+1)(\alpha+\beta)}.
\]
In particular,
\[
E(X) = \frac{\alpha}{\alpha+\beta}, \qquad
E(X^2) = \frac{(\alpha+1)\alpha}{(\alpha+\beta+1)(\alpha+\beta)}, \qquad
\mathrm{var}(X) = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}.
\]

III. The Gamma Distribution G(α, β) has the p.d.f.
\[
f(x) = \frac{1}{\Gamma(\beta)\,\alpha^{\beta}}\, e^{-x/\alpha} x^{\beta-1}, \quad 0 < x < \infty, \qquad = 0 \text{ elsewhere}, \tag{A.9}
\]
where $\alpha > 0$, $\beta > 0$. Here $\alpha$ is a scale parameter, i.e., if $X \sim$ G(α, β), then $X/\alpha$ is G(1, β). Note that
\[
E\left(\frac{X}{\alpha}\right)^k = \frac{1}{\Gamma(\beta)} \int_0^\infty z^k e^{-z} z^{\beta-1}\,dz
= \frac{\Gamma(\beta+k)}{\Gamma(\beta)} = (\beta+k-1)\cdots(\beta+1)\beta,
\]
so that
\[
E(X^k) = \alpha^k(\beta+k-1)\cdots(\beta+1)\beta. \tag{A.10}
\]
Hence $EX = \alpha\beta$, $\mathrm{var}(X) = \alpha^2\beta$.

IV. The Normal Distribution N(μ, σ²) has p.d.f.
\[
f_{\mu,\sigma^2}(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/2\sigma^2}, \quad -\infty < x < \infty, \tag{A.11}
\]
where $\mu \in (-\infty,\infty)$, $\sigma^2 > 0$. The standard normal distribution N(0, 1) has p.d.f.
\[
f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}, \quad -\infty < x < \infty. \tag{A.12}
\]
To show that (A.12) (and hence (A.11), by the transformation $y = \frac{x-\mu}{\sigma}$) is a p.d.f., one needs to show that
\[
\int_0^\infty e^{-x^2/2}\,dx = \sqrt{\frac{\pi}{2}}\,. \tag{A.13}
\]
For this use the transformation $z = x^2/2$ to get
\[
\int_0^\infty e^{-x^2/2}\,dx = \int_0^\infty e^{-z}\, 2^{-\frac{1}{2}} z^{-\frac{1}{2}}\,dz = \frac{1}{\sqrt{2}}\,\Gamma\!\left(\frac{1}{2}\right). \tag{A.14}
\]
Now, by (A.7) (with $\alpha = \beta = \frac{1}{2}$),
\[
\Gamma^2\!\left(\frac{1}{2}\right) = \Gamma(1)\, B\!\left(\frac{1}{2},\frac{1}{2}\right) = B\!\left(\frac{1}{2},\frac{1}{2}\right)
\qquad \left(\text{since } \Gamma(1) = \int_0^\infty e^{-x}\,dx = \left[-e^{-x}\right]_0^\infty = 1\right)
\]
\[
= \int_0^1 x^{-1/2}(1-x)^{-1/2}\,dx = \int_0^1 z^{-1}(1-z^2)^{-1/2}\, 2z\,dz \qquad (z = x^{1/2},\ dx = 2z\,dz)
\]
\[
= 2\int_0^1 (1-z^2)^{-1/2}\,dz = 2\int_0^{\pi/2} \frac{\cos\theta\,d\theta}{\cos\theta} = 2\cdot\frac{\pi}{2} = \pi \qquad (z = \sin\theta,\ dz = \cos\theta\,d\theta).
\]
Hence
\[
\Gamma\!\left(\frac{1}{2}\right) = \sqrt{\pi}, \tag{A.15}
\]
which, when used in (A.14), yields (A.13).

If $X$ is N(μ, σ²), then $Z = \frac{X-\mu}{\sigma}$ has the p.d.f.
(A.12) (by a change of variables). Therefore,
\[
E\left(\frac{X-\mu}{\sigma}\right) = E(Z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x e^{-x^2/2}\,dx = 0
\qquad \left(\text{since } x e^{-x^2/2} \text{ is odd}\right),
\]
\[
E\left(\frac{X-\mu}{\sigma}\right)^2 = E(Z^2) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^2 e^{-x^2/2}\,dx
= \frac{2}{\sqrt{2\pi}} \int_0^{\infty} x^2 e^{-x^2/2}\,dx
\]
\[
= \sqrt{\frac{2}{\pi}} \int_0^{\infty} (2z)^{\frac{1}{2}} e^{-z}\,dz \qquad \left(z = \frac{x^2}{2},\ dz = x\,dx\right)
\]
\[
= \frac{2}{\sqrt{\pi}} \int_0^{\infty} e^{-z} z^{1/2}\,dz
= \frac{2}{\sqrt{\pi}}\,\Gamma\!\left(\frac{3}{2}\right)
= \frac{2}{\sqrt{\pi}}\cdot\frac{1}{2}\,\Gamma\!\left(\frac{1}{2}\right)
= \frac{2}{2\sqrt{\pi}}\,\sqrt{\pi} = 1.
\]
Hence
\[
E(X-\mu) = 0, \text{ or } E(X) = \mu; \qquad
E\left(\frac{X-\mu}{\sigma}\right)^2 = 1, \text{ or } \mathrm{var}(X) = \sigma^2. \tag{A.16}
\]

V. The Chi-Square Distribution $\chi_k^2$ with $k$ Degrees of Freedom is defined to be the distribution of the sum of squares of $k$ independent standard normal random variables. To derive its p.d.f., let $X_1, X_2, \dots, X_k$ be $k$ independent N(0, 1) random variables. Then define the chi-square random variable
\[
Z = X_1^2 + X_2^2 + \cdots + X_k^2,
\]
and note that, as $\Delta z \downarrow 0$,
\[
P(z < Z \le z + \Delta z)
= \int \cdots \int_{\{(x_1,\dots,x_k):\ z < \sum x_i^2 \le z+\Delta z\}} \left(\frac{1}{\sqrt{2\pi}}\right)^{k} e^{-(x_1^2+\cdots+x_k^2)/2}\,dx_1 \cdots dx_k
\]
\[
= \left[\left(\frac{1}{\sqrt{2\pi}}\right)^{k} e^{-z/2} + O(\Delta z)\right]
\times \text{volume of the annulus } \{(x_1,\dots,x_k):\ z < \textstyle\sum x_i^2 \le z+\Delta z\}
\]
\[
= \left[\left(\frac{1}{\sqrt{2\pi}}\right)^{k} e^{-z/2} + O(\Delta z)\right]
\left[c_k\left(\sqrt{z+\Delta z}\right)^{k} - c_k\left(\sqrt{z}\right)^{k}\right],
\]
writing the volume of a ball of radius $r$ in $k$ dimensions as $c_k r^k$.
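The defining property of $\chi_k^2$ can be illustrated by a quick Monte Carlo sketch (not from the text): sums of $k$ squared standard normals should have sample mean close to $k$, since $E(X_i^2) = 1$ by (A.16). The sample size, seed, and tolerance below are illustrative choices.

```python
import random

random.seed(0)

k, m = 4, 100_000
# Z = X_1^2 + ... + X_k^2 with X_i independent N(0, 1).
samples = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(k)) for _ in range(m)]

sample_mean = sum(samples) / m
# E(Z) = k * E(X_i^2) = k, using E(X_i^2) = 1 from (A.16); the standard
# error of the sample mean here is sqrt(2k/m), roughly 0.009.
assert abs(sample_mean - k) < 0.1
```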
