On the Robustness to Misspecification of α-Posteriors and Their Variational Approximations

Marco Avella Medina
[email protected]
Department of Statistics, Columbia University, New York, NY 10027, USA

José Luis Montiel Olea
[email protected]
Department of Economics, Columbia University, New York, NY 10027, USA

Cynthia Rush
[email protected]
Department of Statistics, Columbia University, New York, NY 10027, USA

Amilcar Velez
[email protected]
Department of Economics, Northwestern University, Evanston, IL 60208, USA

Abstract

α-posteriors and their variational approximations distort standard posterior inference by downweighting the likelihood and introducing variational approximation errors. We show that such distortions, if tuned appropriately, reduce the Kullback-Leibler (KL) divergence from the true, but perhaps infeasible, posterior distribution when there is potential parametric model misspecification. To make this point, we derive a Bernstein-von Mises theorem showing convergence in total variation distance of α-posteriors and their variational approximations to limiting Gaussian distributions. We use these distributions to evaluate the KL divergence between true and reported posteriors. We show this divergence is minimized by choosing α strictly smaller than one, assuming there is a vanishingly small probability of model misspecification. The optimized value becomes smaller as the misspecification becomes more severe.
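For concreteness, a minimal sketch of the object being tempered (the notation here is ours and may differ from the paper's): given i.i.d. observations $x_1, \dots, x_n$ with model density $p(x \mid \theta)$ and prior $\pi(\theta)$, the α-posterior raises the likelihood to the power $\alpha$,
\[
\pi_{n,\alpha}(\theta) \;\propto\; \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\alpha},
\]
so that $\alpha = 1$ recovers the standard posterior and $\alpha < 1$ downweights the likelihood relative to the prior.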