Manuscript submitted to The Econometrics Journal, pp. 1–49.

De-Biased Machine Learning of Global and Local Parameters Using Regularized Riesz Representers

Victor Chernozhukov†, Whitney K. Newey†, and Rahul Singh†

†MIT Economics, 50 Memorial Drive, Cambridge MA 02142, USA.
E-mail: [email protected], [email protected], [email protected]

Summary

We provide adaptive inference methods, based on ℓ1 regularization, for regular (semi-parametric) and non-regular (nonparametric) linear functionals of the conditional expectation function. Examples of regular functionals include average treatment effects, policy effects, and derivatives. Examples of non-regular functionals include average treatment effects, policy effects, and derivatives conditional on a covariate subvector fixed at a point. We construct a Neyman orthogonal equation for the target parameter that is approximately invariant to small perturbations of the nuisance parameters. To achieve this property, we include the Riesz representer for the functional as an additional nuisance parameter. Our analysis yields weak “double sparsity robustness”: either the approximation to the regression or the approximation to the representer can be “completely dense” as long as the other is sufficiently “sparse”. Our main results are non-asymptotic and imply asymptotic uniform validity over large classes of models, translating into honest confidence bands for both global and local parameters.

Keywords: Neyman orthogonality, Gaussian approximation, sparsity

1. INTRODUCTION

Many statistical objects of interest can be expressed as a linear functional of a regression function (or a projection, more generally). Examples include global parameters: average treatment effects, policy effects from changing the distribution of or transporting regressors, and average directional derivatives, as well as their local versions defined by taking averages over regions of shrinking volume.