Mathematical Statistics, Lecture 16
Asymptotics: Consistency and the Delta Method
MIT 18.655, Dr. Kempthorne, Spring 2016

Outline
1. Asymptotics: Consistency
2. Delta Method: Approximating Moments
3. Delta Method: Approximating Distributions

Consistency

Statistical estimation problem:
- $X_1, \ldots, X_n$ iid $P_\theta$, $\theta \in \Theta$.
- $q(\theta)$: target of estimation.
- $\hat{q}(X_1, \ldots, X_n)$: estimator of $q(\theta)$.

Definition: $\hat{q}_n$ is a consistent estimator of $q(\theta)$, written $\hat{q}_n(X_1, \ldots, X_n) \xrightarrow{P_\theta} q(\theta)$, if for every $\epsilon > 0$,
  $\lim_{n \to \infty} P_\theta(|\hat{q}_n(X_1, \ldots, X_n) - q(\theta)| > \epsilon) = 0.$

Example: Consider $P_\theta$ such that
  $E[X_1 \mid \theta] = \theta$, $q(\theta) = \theta$, and $\hat{q}_n = \frac{1}{n}\sum_{i=1}^n X_i = \bar{X}_n$.
When is $\hat{q}_n$ consistent for $\theta$?

Consistency: Example

Example: consistency of the sample mean $\hat{q}_n(X_1, \ldots, X_n) = \bar{X}_n$.
- If $\mathrm{Var}(X_1 \mid \theta) = \sigma^2(\theta) < \infty$, apply Chebyshev's inequality. For any $\epsilon > 0$:
  $P_\theta(|\bar{X}_n - \theta| \geq \epsilon) \leq \frac{\mathrm{Var}(\bar{X}_n \mid \theta)}{\epsilon^2} = \frac{\sigma^2(\theta)}{n\epsilon^2} \xrightarrow{n \to \infty} 0.$
- If $\mathrm{Var}(X_1 \mid \theta) = \infty$, $\bar{X}_n$ is still consistent provided $E[|X_1| \mid \theta] < \infty$. Proof: Levy continuity theorem.

Consistency: A Stronger Definition

Definition: $\hat{q}_n$ is a uniformly consistent estimator of $q(\theta)$ if for every $\epsilon > 0$,
  $\lim_{n \to \infty} \sup_{\theta \in \Theta} P_\theta(|\hat{q}_n(X_1, \ldots, X_n) - q(\theta)| > \epsilon) = 0.$

Example: Consider the sample mean $\hat{q}_n = \bar{X}_n$, for which $E[\bar{X}_n \mid \theta] = \theta$ and $\mathrm{Var}[\bar{X}_n \mid \theta] = \sigma^2(\theta)/n$. The proof of consistency of $\hat{q}_n = \bar{X}_n$ extends to uniform consistency if
  $\sup_{\theta \in \Theta} \sigma^2(\theta) \leq M < \infty. \quad (*)$

Examples satisfying (*):
- $X_i$ i.i.d. Bernoulli($\theta$).
- $X_i$ i.i.d. Normal($\mu, \sigma^2$), where $\theta = (\mu, \sigma^2)$ and $\Theta = (-\infty, +\infty) \times [0, M]$ for finite $M < \infty$.

Consistency: The Strongest Definition

Definition: $\hat{q}_n$ is a strongly consistent estimator of $q(\theta)$ if
  $P_\theta\big(\lim_{n \to \infty} |\hat{q}_n(X_1, \ldots, X_n) - q(\theta)| \leq \epsilon\big) = 1$ for every $\epsilon > 0$,
written $\hat{q}_n \xrightarrow{a.s.} q(\theta)$ (a.s. $\equiv$ "almost surely").

Compare to: $\hat{q}_n$ is a (weakly) consistent estimator of $q(\theta)$ if for every $\epsilon > 0$,
  $\lim_{n \to \infty} P_\theta(|\hat{q}_n(X_1, \ldots, X_n) - q(\theta)| > \epsilon) = 0.$

Consistency of Plug-In Estimators

Plug-in estimators: discrete case.
- Discrete outcome space of size $K$: $\mathcal{X} = \{x_1, \ldots, x_K\}$.
- $X_1, \ldots, X_n$ iid $P_\theta$, $\theta \in \Theta$, where $\theta = (p_1, \ldots, p_K)$ and $P(X_1 = x_k \mid \theta) = p_k$, $k = 1, \ldots, K$.
- $p_k \geq 0$ for $k = 1, \ldots, K$ and $\sum_{k=1}^K p_k = 1$.
- $\Theta = \mathcal{S}_K$ (the $K$-dimensional simplex).

Define the empirical distribution $\hat{\theta}_n = (\hat{p}_1, \ldots, \hat{p}_K)$, where
  $\hat{p}_k = \frac{\sum_{i=1}^n 1(X_i = x_k)}{n} \equiv \frac{N_k}{n}.$
Note that $\hat{\theta}_n \in \mathcal{S}_K$.
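To illustrate the empirical distribution just defined, the following small simulation (an added sketch, not from the lecture; the three-category distribution is an invented example) draws $n$ iid observations and reports $\max_k |\hat{p}_k - p_k|$, which shrinks as $n$ grows, in line with the consistency results that follow.

```python
# Empirical distribution for a discrete outcome space (illustrative sketch):
# theta_hat_n = (N_1/n, ..., N_K/n) should converge to theta = (p_1, ..., p_K).
import numpy as np

rng = np.random.default_rng(0)
theta = np.array([0.5, 0.3, 0.2])            # hypothetical true probabilities (p_1, p_2, p_3)

for n in [100, 1_000, 10_000, 100_000]:
    counts = rng.multinomial(n, theta)       # category counts N_1, ..., N_K from n iid draws
    theta_hat = counts / n                   # empirical distribution, a point in the simplex S_K
    err = np.max(np.abs(theta_hat - theta))  # worst-case componentwise error
    print(f"n={n:7d}  max_k |p_hat_k - p_k| = {err:.5f}")
```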
Consistency of Plug-In Estimators (continued)

Proposition/Theorem (5.2.1). Suppose $\mathbf{X}_n = (X_1, \ldots, X_n)$ is a random sample of size $n$ from a discrete distribution with $\theta \in \mathcal{S}$. Then:
- $\hat{\theta}_n$ is uniformly consistent for $\theta \in \mathcal{S}$.
- For any continuous function $q: \mathcal{S} \to R^d$, $\hat{q} = q(\hat{\theta}_n)$ is uniformly consistent for $q(\theta)$.

Proof: For any $\epsilon > 0$, $P_\theta(|\hat{\theta}_n - \theta| \geq \epsilon) \to 0$. This follows upon noting that
  $\{\mathbf{x}_n : |\hat{\theta}_n(\mathbf{x}_n) - \theta|^2 < \epsilon^2\} \supset \bigcap_{k=1}^K \{\mathbf{x}_n : |(\hat{\theta}_n(\mathbf{x}_n) - \theta)_k|^2 < \epsilon^2/K\},$
so
  $P(\{\mathbf{x}_n : |\hat{\theta}_n(\mathbf{x}_n) - \theta|^2 \geq \epsilon^2\}) \leq \sum_{k=1}^K P(\{\mathbf{x}_n : |(\hat{\theta}_n(\mathbf{x}_n) - \theta)_k|^2 \geq \epsilon^2/K\}) \leq \sum_{k=1}^K \frac{1/(4n)}{\epsilon^2/K} = \frac{K^2}{4n\epsilon^2} \to 0.$

Proof (continued): $q(\cdot)$ continuous on the compact set $\mathcal{S}$ implies $q(\cdot)$ is uniformly continuous on $\mathcal{S}$: for every $\epsilon > 0$ there exists $\delta(\epsilon) > 0$ such that
  $|\theta_1 - \theta_0| < \delta(\epsilon) \Rightarrow |q(\theta_1) - q(\theta_0)| < \epsilon$, uniformly for all $\theta_0, \theta_1 \in \Theta$.
It follows that
  $\{\mathbf{x} : |q(\hat{\theta}_n(\mathbf{x})) - q(\theta)| < \epsilon\}^c \subseteq \{\mathbf{x} : |\hat{\theta}_n - \theta| < \delta(\epsilon)\}^c$
  $\Rightarrow P_\theta[|\hat{q}_n - q(\theta)| \geq \epsilon] \leq P_\theta[|\hat{\theta}_n - \theta| \geq \delta(\epsilon)].$
Note: uniform consistency can be shown; see B&D.

Consistency of Plug-In Estimators

Proposition 5.2.1. Suppose:
- $g = (g_1, \ldots, g_d): \mathcal{X} \to \mathcal{Y} \subset R^d$.
- $E[|g_j(X_1)| \mid \theta] < \infty$ for $j = 1, \ldots, d$ and all $\theta \in \Theta$.
- $m_j(\theta) \equiv E[g_j(X_1) \mid \theta]$ for $j = 1, \ldots, d$.
Define $q(\theta) = h(m(\theta))$, where $m(\theta) = (m_1(\theta), \ldots, m_d(\theta))$ and $h: \mathcal{Y} \to R^p$ is continuous. Then
  $\hat{q}_n = h(\bar{g}) = h\Big(\frac{1}{n}\sum_{i=1}^n g(X_i)\Big)$
is a consistent estimator of $q(\theta)$.

Proposition 5.2.1 applied to non-parametric models:
  $\mathcal{P} = \{P : E_P(|g(X_1)|) < \infty\}$ and $\nu(P) = h(E_P g(X_1))$.
Then $\nu(\hat{P}_n) \xrightarrow{P} \nu(P)$, where $\hat{P}_n$ is the empirical distribution.

Consistency of MLEs in Exponential Families

Theorem 5.2.2. Suppose:
- $\mathcal{P}$ is a canonical exponential family of rank $d$ generated by $T = (T_1(X), \ldots, T_d(X))^T$:
  $p(x \mid \eta) = h(x)\exp\{T(x)^T\eta - A(\eta)\}$
- $\mathcal{E} = \{\eta\}$ is open.
- $X_1, \ldots, X_n$ are i.i.d. $P_\eta \in \mathcal{P}$.
Then:
- $P_\eta[\text{the MLE } \hat{\eta} \text{ exists}] \xrightarrow{n \to \infty} 1$.
- $\hat{\eta}$ is consistent.

Proof:
- $\hat{\eta}(X_1, \ldots, X_n)$ exists iff $\bar{T}_n = \frac{1}{n}\sum_{i=1}^n T(X_i) \in C^o_T$, the interior of the convex support of $T$.
- If $\eta_0$ is true, then $t_0 = E[T(X_1) \mid \eta_0] \in C^o_T$ and $\dot{A}(\eta_0) = t_0$.
- By the definition of the interior of the convex support, there exists $\delta > 0$ such that $S_\delta = \{t : |t - E_{\eta_0}[T(X_1)]| < \delta\} \subset C^o_T$.
- By the WLLN, $\bar{T}_n = \frac{1}{n}\sum_{i=1}^n T(X_i) \xrightarrow{P_{\eta_0}} E_{\eta_0}[T(X_1)]$, so $P_{\eta_0}[\bar{T}_n \in C^o_T] \xrightarrow{n \to \infty} 1$.
- $\hat{\eta}$ exists if it solves $\dot{A}(\eta) = \bar{T}_n$, i.e., if $\bar{T}_n \in C^o_T$.
- The map $\eta \mapsto \dot{A}(\eta)$ is 1-to-1 and continuous on $\mathcal{E}$, so the inverse function $\dot{A}^{-1}$ is continuous, and Prop. 5.2.1 applies.
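As a concrete instance of Theorem 5.2.2 (an added sketch, not from the lecture), take the Poisson($\lambda$) family in canonical form: $T(x) = x$, $\eta = \log\lambda$, $A(\eta) = e^\eta$. The MLE solves $\dot{A}(\hat{\eta}) = e^{\hat{\eta}} = \bar{T}_n = \bar{X}_n$, so $\hat{\eta} = \log\bar{X}_n$, which exists iff $\bar{X}_n > 0$, i.e., iff $\bar{T}_n$ lies in the interior of the convex support $[0, \infty)$. The simulation below (with an illustrative small $\lambda_0$ so non-existence of the MLE is visible at small $n$) estimates $P_{\eta_0}[\hat{\eta}\ \text{exists}]$ and the error $|\hat{\eta} - \eta_0|$.

```python
# Consistency of the MLE in a canonical exponential family (illustrative sketch):
# Poisson(lambda) with T(x) = x, eta = log(lambda), A(eta) = exp(eta).
# The MLE solves Adot(eta) = exp(eta) = Tbar_n, so eta_hat = log(Xbar_n),
# which exists iff Xbar_n > 0.
import numpy as np

rng = np.random.default_rng(1)
lam0, reps = 0.05, 5_000          # illustrative: small lambda so non-existence is visible
eta0 = np.log(lam0)

for n in [10, 100, 1_000]:
    x = rng.poisson(lam0, size=(reps, n))
    xbar = x.mean(axis=1)
    exists = xbar > 0                       # Tbar_n lies in the interior (0, inf) of the convex support
    eta_hat = np.log(xbar[exists])          # MLE eta_hat = log(Xbar_n), on the event it exists
    print(f"n={n:5d}  P[MLE exists] ~ {exists.mean():.3f}   "
          f"mean |eta_hat - eta0| = {np.abs(eta_hat - eta0).mean():.3f}")
```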
Consistency of Minimum-Contrast Estimates

Minimum-contrast estimates:
- $X_1, \ldots, X_n$ iid $P_\theta$, $\theta \in \Theta \subset R^d$.
- $\rho(X, \theta): \mathcal{X} \times \Theta \to R$, a contrast function.
- $D(\theta_0, \theta) = E[\rho(X, \theta) \mid \theta_0]$: the discrepancy function; $\theta = \theta_0$ uniquely minimizes $D(\theta_0, \theta)$.

The minimum-contrast estimate $\hat{\theta}$ minimizes
  $\rho_n(\mathbf{X}, \theta) = \frac{1}{n}\sum_{i=1}^n \rho(X_i, \theta).$

Theorem 5.2.3. Suppose:
  $\sup_{\theta \in \Theta} \Big|\frac{1}{n}\sum_{i=1}^n \rho(X_i, \theta) - D(\theta_0, \theta)\Big| \xrightarrow{P_{\theta_0}} 0$
  $\inf_{|\theta - \theta_0| \geq \epsilon} D(\theta_0, \theta) > D(\theta_0, \theta_0)$, for all $\epsilon > 0$.
Then $\hat{\theta}$ is consistent.

Delta Method: Approximating Moments

Theorem 5.3.1 (applying Taylor expansions). Suppose that:
- $X_1, \ldots, X_n$ iid with outcome space $\mathcal{X} = R$.
- $E[X_1] = \mu$ and $\mathrm{Var}[X_1] = \sigma^2 < \infty$.
- $E[|X_1|^m] < \infty$.
- $h: R \to R$ is $m$-times differentiable on $R$, with $m \geq 2$.
- $h^{(j)} = \frac{d^j h(x)}{dx^j}$, $j = 1, 2, \ldots, m$.
- $\|h^{(m)}\|_\infty \equiv \sup_{x \in \mathcal{X}} |h^{(m)}(x)| \leq M < \infty$.
Then
  $E[h(\bar{X})] = h(\mu) + \sum_{j=1}^{m-1} \frac{h^{(j)}(\mu)}{j!} E[(\bar{X} - \mu)^j] + R_m,$
where $|R_m| \leq M \frac{E[|X_1|^m]}{m!} n^{-m/2}$.

Proof: Apply a Taylor expansion to $h(\bar{X})$:
  $h(\bar{X}) = h(\mu) + \sum_{j=1}^{m-1} \frac{h^{(j)}(\mu)}{j!}(\bar{X} - \mu)^j + \frac{h^{(m)}(X^*)}{m!}(\bar{X} - \mu)^m,$
where $|X^* - \mu| \leq |\bar{X} - \mu|$. Take expectations and apply Lemma 5.3.1: if $E|X_1|^j < \infty$, $j \geq 2$, then there exist constants $C_j > 0$ and $D_j > 0$ such that
  $E|\bar{X} - \mu|^j \leq C_j E|X_1|^j n^{-j/2}$
  $|E[(\bar{X} - \mu)^j]| \leq D_j E|X_1|^j n^{-(j+1)/2}$ for $j$ odd.

Applying Taylor Expansions: Corollary 5.3.1 (a).
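Returning to Theorem 5.3.1, here is a small numerical check I added (not part of the lecture): it compares a Monte Carlo estimate of $E[h(\bar{X}_n)]$ with the second-order approximation $h(\mu) + \tfrac{1}{2}h''(\mu)\sigma^2/n$. The choices $h = \sin$ (all derivatives bounded, so the $\|h^{(m)}\|_\infty$ condition holds) and $X_i \sim N(\mu, \sigma^2)$ with $\mu = 1$, $\sigma = 2$ are illustrative assumptions; for normal data $\bar{X}_n \sim N(\mu, \sigma^2/n)$ exactly, so the code samples $\bar{X}_n$ directly.

```python
# Delta method for moments (illustrative sketch): compare Monte Carlo E[h(Xbar_n)]
# with the second-order Taylor approximation h(mu) + h''(mu) * sigma^2 / (2n),
# using h(x) = sin(x), so h''(x) = -sin(x) and all derivatives are bounded by 1.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, reps = 1.0, 2.0, 200_000   # illustrative parameter choices

for n in [10, 100, 1000]:
    # For N(mu, sigma^2) data, Xbar_n ~ N(mu, sigma^2 / n) exactly.
    xbar = rng.normal(mu, sigma / np.sqrt(n), size=reps)
    mc = np.sin(xbar).mean()                               # Monte Carlo E[h(Xbar_n)]
    approx = np.sin(mu) - np.sin(mu) * sigma**2 / (2 * n)  # h(mu) + h''(mu) sigma^2/(2n)
    print(f"n={n:5d}  Monte Carlo: {mc:.4f}   delta-method approx: {approx:.4f}")
```

The error of the approximation shrinks faster than the $1/n$ correction term itself, consistent with the $n^{-m/2}$ remainder bound in the theorem.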
