M-Estimators

Outline: Identification and consistency; Score and Hessian functions; Asymptotic normality and delta method; GMM.

Gabriel V. Montes-Rojas

Set-up

Let $m(x, \theta)$ be a parametric model for $E(y \mid x)$ (we will also use it later for other conditional moments, such as $Q_\tau(y \mid x)$ in quantile regression), where $m$ is a known function of $x$ and $\theta$; $x$ is a $K$-vector of explanatory variables; and $\theta$ is a $P \times 1$ parameter vector with $\theta \in \Theta \subseteq \mathbb{R}^P$.

A model is correctly specified for the conditional mean $E(y \mid x)$ if, for some $\theta_0 \in \Theta$,
\[
E(y \mid x) = m(x, \theta_0).
\]

Let $q(w, \theta)$ be a known function of the random vector $w$ and the parameter vector $\theta \in \Theta$. In general we will have $w = (y, x)$, with typical element $w_i$ from a sample $\{w_i : i = 1, 2, \dots, N\}$.

An M-estimator of $\theta_0$ solves the problem
\[
\min_{\theta \in \Theta} N^{-1} \sum_{i=1}^{N} q(w_i, \theta),
\qquad
\hat{\theta} = \arg\min_{\theta \in \Theta} N^{-1} \sum_{i=1}^{N} q(w_i, \theta).
\]

Define the functions $\theta \mapsto M_N(\theta)$, where $M_N(\theta) \equiv N^{-1} \sum_{i=1}^{N} q(w_i, \theta)$, and $\theta \mapsto M(\theta)$, where $M(\theta) \equiv E[q(w, \theta)]$.

Note that $\hat{\theta}$ is a random variable, as it depends on the sample $\{w_i : i = 1, 2, \dots, N\}$. Sometimes we will write $\hat{\theta}_N$ to emphasize this dependence.

It is generally assumed that $\theta_0 = \arg\min_{\theta \in \Theta} E[q(w, \theta)]$. In statistical terms, $E[q(w, \theta)]$ can be seen as a loss function. Examples of loss functions are:
- quadratic, $(\cdot)^2$, which is the basis of OLS estimators;
- absolute value, $|\cdot|$, which is the basis of median regression estimators.
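To make the two loss functions concrete, here is a small Python sketch (not from the slides; the location model $m(x, \theta) = \theta$, the data, and the grid are made up for illustration). It minimises the sample objective $M_N(\theta)$ by brute-force grid search: under quadratic loss the minimiser is the sample mean, and under absolute-value loss it is the sample median.

```python
# Minimal sketch: the sample objective M_N(theta) = N^{-1} sum_i q(w_i, theta)
# for a location model m(x, theta) = theta, under the two losses above.
from statistics import mean, median

def M_N(data, theta, loss):
    """Sample objective function M_N(theta) for loss q(y, theta) = loss(y - theta)."""
    return sum(loss(y - theta) for y in data) / len(data)

def argmin_on_grid(data, loss, grid):
    """Crude M-estimator: minimise M_N over a finite grid of candidate thetas."""
    return min(grid, key=lambda t: M_N(data, t, loss))

data = [1.0, 2.0, 2.0, 3.0, 12.0]          # made-up sample with one outlier
grid = [i / 100 for i in range(0, 1301)]   # candidate thetas in [0, 13]

theta_quad = argmin_on_grid(data, lambda u: u ** 2, grid)  # -> 4.0, the sample mean
theta_abs = argmin_on_grid(data, abs, grid)                # -> 2.0, the sample median

print(theta_quad, mean(data))
print(theta_abs, median(data))
```

Note how the outlier (12.0) pulls the quadratic-loss estimator but not the absolute-loss one, which is the usual motivation for median regression.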
Identification

Identification requires that the minimisation problem has a unique solution:
\[
E[q(w, \theta_0)] < E[q(w, \theta)] \quad \text{for all } \theta \in \Theta,\ \theta \neq \theta_0.
\]

Consistency

Because $\hat{\theta}$ depends on the whole function $\theta \mapsto M_N(\theta)$, we require an appropriate "functional convergence", namely uniform convergence in probability:
\[
\max_{\theta \in \Theta} \left| N^{-1} \sum_{i=1}^{N} q(w_i, \theta) - E[q(w, \theta)] \right| \overset{p}{\to} 0.
\]
Uniform convergence is a difficult and technical issue.

Uniform Weak Law of Large Numbers

Uniform Weak Law of Large Numbers: Let $w$ be a random vector taking values in $\mathcal{W} \subset \mathbb{R}^M$, let $\Theta \subset \mathbb{R}^P$, and let $q : \mathcal{W} \times \Theta \to \mathbb{R}$ be a real-valued function. Assume that (a) $\Theta$ is compact; (b) for each $\theta \in \Theta$, $q(\cdot, \theta)$ is Borel measurable on $\mathcal{W}$; (c) for each $w \in \mathcal{W}$, $q(w, \cdot)$ is continuous on $\Theta$; and (d) $|q(w, \theta)| \le b(w)$ for all $\theta \in \Theta$, where $b$ is a nonnegative function on $\mathcal{W}$ such that $E[b(w)] < \infty$. Then
\[
\max_{\theta \in \Theta} \left| N^{-1} \sum_{i=1}^{N} q(w_i, \theta) - E[q(w, \theta)] \right| \overset{p}{\to} 0.
\]

Consistency of M-estimators: Under the assumptions required for uniform convergence in probability above, and assuming identification holds, a random vector $\hat{\theta}$ that is an M-estimator (i.e. $\hat{\theta} = \arg\min_{\theta \in \Theta} N^{-1} \sum_{i=1}^{N} q(w_i, \theta)$) satisfies $\hat{\theta} \overset{p}{\to} \theta_0$.

Suppose that $\hat{\theta} \overset{p}{\to} \theta_0$, and assume that $r(w, \theta)$ satisfies the same assumptions as $q(w, \theta)$. Then $N^{-1} \sum_{i=1}^{N} r(w_i, \hat{\theta}) \overset{p}{\to} E[r(w, \theta_0)]$.
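A simulation sketch (an illustration, not a proof; the uniform design and the seed are made up): the sample median, the M-estimator under absolute-value loss, should approach the population median $\theta_0 = 0$ as $N$ grows.

```python
# Consistency illustration: the sample median of Uniform(-1, 1) draws should
# get close to the population median theta_0 = 0 as the sample size grows.
import random
from statistics import median

random.seed(12345)  # fixed seed so the run is reproducible
theta_0 = 0.0

for N in (100, 10_000):
    sample = [random.uniform(-1.0, 1.0) for _ in range(N)]
    theta_hat = median(sample)
    print(N, abs(theta_hat - theta_0))  # deviation shrinks as N grows
```

The deviation at $N = 10{,}000$ is typically an order of magnitude smaller than at $N = 100$, consistent with the $\sqrt{N}$ rate discussed later.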
Example: OLS

In OLS models, $m(x, \beta) = E(y \mid x) = x\beta$ and $q(y, x, \beta) = [y - m(x, \beta)]^2$. For identification, write $y = x\beta_0 + u$ with $E[u \mid x] = 0$ and $E[u^2] = \sigma^2$. At the true value,
\[
E[q(y, x, \beta_0)] = E[y^2 + m(x, \beta_0)^2 - 2y\, m(x, \beta_0)]
= E[(x\beta_0)^2 + u^2 + (x\beta_0)^2 - 2(x\beta_0)^2] = \sigma^2,
\]
while for $\beta \neq \beta_0$,
\[
E[q(y, x, \beta)] = E[y^2 + m(x, \beta)^2 - 2y\, m(x, \beta)]
= E[(x\beta_0)^2 + u^2 + (x\beta)^2 - 2(x\beta_0)(x\beta)]
= \sigma^2 + E[(x\beta_0 - x\beta)^2].
\]
Then $E[q(y, x, \beta_0)] < E[q(y, x, \beta)]$ for all $\beta \in B$, $\beta \neq \beta_0$.

Score of the objective function

If $q(w, \cdot)$ is continuously differentiable on the interior of $\Theta$, then with probability approaching one $\hat{\theta}$ solves the first-order condition
\[
\sum_{i=1}^{N} s(w_i, \hat{\theta}) = 0,
\]
where
\[
s(w, \theta) = \nabla_\theta q(w, \theta)' = \left( \frac{\partial q(w, \theta)}{\partial \theta_1}, \frac{\partial q(w, \theta)}{\partial \theta_2}, \dots, \frac{\partial q(w, \theta)}{\partial \theta_P} \right)'
\]
is the score of the objective function. This condition is usually satisfied with equality (but not always; see OLS vs. quantile regression).

If $E[q(w, \theta)]$ is continuously differentiable on the interior of $\Theta$, then a condition for the minimisation is
\[
\nabla_\theta E[q(w, \theta)]\big|_{\theta = \theta_0} = 0.
\]
If the derivative and the expectation can be interchanged (Dominated Convergence Theorem), then
\[
E[\nabla_\theta q(w, \theta_0)] = E[s(w, \theta_0)] = 0.
\]
Moreover, if $\hat{\theta} \overset{p}{\to} \theta_0$, then under standard regularity conditions
\[
N^{-1} \sum_{i=1}^{N} s(w_i, \hat{\theta}) \overset{p}{\to} E[s(w, \theta_0)].
\]

An estimator defined by
\[
\hat{\theta} = \arg\min_{\theta \in \Theta} \left\| \sum_{i=1}^{N} s(w_i, \theta) \right\|_M,
\]
where $\|\cdot\|_M$ is a specified norm (see GMM), is called a Z-estimator (for zero). For this case we should assume a different identification condition: $E[s(w, \theta)] = 0$ if and only if $\theta = \theta_0$.
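A quick numerical check of the first-order condition for the OLS example (a sketch with made-up data, single regressor and no intercept): the score of $q(y, x, \beta) = (y - x\beta)^2$ is $s(y, x, \beta) = -2x(y - x\beta)$, and the closed-form estimator $\hat{\beta} = \sum_i x_i y_i / \sum_i x_i^2$ sets the sample score sum to zero.

```python
# For q(y, x, beta) = (y - x*beta)^2 the score is s = -2*x*(y - x*beta).
# The closed-form OLS estimator solves sum_i s(w_i, beta_hat) = 0 exactly.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.8]   # made-up data

beta_hat = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

score_sum = sum(-2.0 * x * (y - x * beta_hat) for x, y in zip(xs, ys))
print(beta_hat, score_sum)  # score_sum is zero up to floating-point error
```

For quantile regression the objective is not differentiable everywhere, so the analogous condition holds only approximately, as the slides note.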
Hessian of the objective function

If $q(w, \cdot)$ is twice continuously differentiable on the interior of $\Theta$, define the Hessian of the objective function,
\[
H(w, \theta) = \nabla^2_\theta q(w, \theta) = \frac{\partial^2 q(w, \theta)}{\partial \theta\, \partial \theta'}.
\]
By a mean-value expansion of the score about $\theta_0$,
\[
\sum_{i=1}^{N} s(w_i, \hat{\theta}) = \sum_{i=1}^{N} s(w_i, \theta_0) + \left( \sum_{i=1}^{N} \ddot{H}_i \right) (\hat{\theta} - \theta_0),
\]
where $\ddot{H}_i \equiv H(w_i, \tilde{\theta})$ and $\tilde{\theta}$ is a vector each element of which lies on the line segment between $\hat{\theta}$ and $\theta_0$.

Using similar arguments as for the score, if $\hat{\theta} \overset{p}{\to} \theta_0$, then by asymptotic theory
\[
N^{-1} \sum_{i=1}^{N} H(w_i, \hat{\theta}) \overset{p}{\to} E[H(w, \theta_0)].
\]
For M-estimators, if $\theta_0$ is identified then $E[H(w, \theta_0)]$ is positive definite.

Asymptotic expansions

The expansion about the true value $\theta_0$ plays a fundamental role in asymptotic analysis. Rearranging terms, we obtain the first-order asymptotic expansion of $\hat{\theta}$:
\[
\sqrt{N}(\hat{\theta} - \theta_0) = \left( N^{-1} \sum_{i=1}^{N} \ddot{H}_i \right)^{-1} \left[ -N^{-1/2} \sum_{i=1}^{N} s_i(\theta_0) \right],
\]
where $s_i(\theta_0) = s(w_i, \theta_0)$. Moreover, using $A_0 \equiv E[H(w, \theta_0)]$,
\[
\begin{aligned}
\sqrt{N}(\hat{\theta} - \theta_0)
&= A_0^{-1} \left[ -N^{-1/2} \sum_{i=1}^{N} s_i(\theta_0) \right]
+ \left( \left( N^{-1} \sum_{i=1}^{N} \ddot{H}_i \right)^{-1} - A_0^{-1} \right) \left[ -N^{-1/2} \sum_{i=1}^{N} s_i(\theta_0) \right] \\
&= A_0^{-1} \left[ -N^{-1/2} \sum_{i=1}^{N} s_i(\theta_0) \right] + o_p(1) \cdot O_p(1) \\
&= A_0^{-1} \left[ -N^{-1/2} \sum_{i=1}^{N} s_i(\theta_0) \right] + o_p(1).
\end{aligned}
\]
Define the influence function of $\hat{\theta}$ as $e(w_i, \theta_0) \equiv e_i(\theta_0) \equiv -A_0^{-1} s_i(\theta_0)$, because it measures the "influence" of the $i$-th observation on the estimator.

Asymptotic normality

Asymptotic normality: In addition to the assumptions for consistency, assume that (a) $\theta_0 \in \operatorname{int} \Theta$; (b) $s(w, \cdot)$ is continuously differentiable on the interior of $\Theta$ for all $w \in \mathcal{W}$; (c) each element of $H(w, \theta)$ is bounded in absolute value by a function $b(w)$ with $E[b(w)] < \infty$; (d) $A_0 \equiv E[H(w, \theta_0)]$ is positive definite; (e) $E[s(w, \theta_0)] = 0$; and (f) each element of $s(w, \theta_0)$ has finite second moment. Then
\[
\sqrt{N}(\hat{\theta} - \theta_0) \overset{d}{\to} N(0, A_0^{-1} B_0 A_0^{-1}),
\]
where $B_0 \equiv E[s(w, \theta_0) s(w, \theta_0)'] = \operatorname{Var}[s(w, \theta_0)]$. Thus
\[
\operatorname{Avar}(\hat{\theta}) = A_0^{-1} B_0 A_0^{-1} / N.
\]
Note the sandwich formula $A_0^{-1} B_0 A_0^{-1}$; it will appear many times later on.

Delta method

The asymptotic expansion above can also be used for functions of the parameters. Consider a function $\theta \mapsto g(\theta)$. We would like to know the distribution of $\sqrt{N}(g(\hat{\theta}) - g(\theta_0))$.
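A sketch of the sandwich estimator in the scalar OLS case (made-up data, single regressor, no intercept; sample analogues $\hat{A}$, $\hat{B}$ replace $A_0$, $B_0$), together with a delta-method variance for the hypothetical transformation $g(\beta) = \beta^2$:

```python
# Sandwich estimator Avar(beta_hat) = A^{-1} B A^{-1} / N for scalar OLS:
# here s_i = -2*x_i*u_i and H_i = 2*x_i^2, with u_i the fitted residual.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.8, 3.3, 3.7, 5.1]   # made-up data
N = len(xs)

beta_hat = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
residuals = [y - x * beta_hat for x, y in zip(xs, ys)]

A_hat = sum(2.0 * x * x for x in xs) / N                             # N^{-1} sum_i H_i
B_hat = sum((-2.0 * x * u) ** 2 for x, u in zip(xs, residuals)) / N  # N^{-1} sum_i s_i^2
avar = B_hat / (A_hat * A_hat * N)                                   # A^{-1} B A^{-1} / N

# Delta method for g(beta) = beta^2: Avar(g(beta_hat)) ~ g'(beta_hat)^2 * Avar(beta_hat).
avar_g = (2.0 * beta_hat) ** 2 * avar
print(beta_hat, avar, avar_g)
```

In the scalar case this is the familiar heteroskedasticity-robust variance for OLS; with $P > 1$ parameters the same formulas hold with $\hat{A}$ and $\hat{B}$ as $P \times P$ matrices.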
