Using bagged posteriors for robust inference and model criticism
23rd October 2019, 2:30 pm – 3:30 pm
Fry Building, G.09
Standard Bayesian inference is known to be sensitive to model misspecification, leading to unreliable uncertainty quantification and poor predictive performance. However, finding generally applicable and computationally feasible methods for robust Bayesian inference under misspecification has proved difficult. An intriguing approach is to apply bagging to the Bayesian posterior (“BayesBag”); that is, to average posterior distributions conditioned on bootstrapped datasets. In this talk, I develop a comprehensive asymptotic theory of BayesBag in both the parameter inference and model selection settings. Based on our parameter inference theory (including a finite-sample extension), I propose a model–data mismatch index for model criticism using BayesBag. I also present empirical validation of our theory and methodology through simulation studies of a linear regression model. Overall, our results demonstrate that BayesBag combines the attractive modeling features of standard Bayesian inference with the distributional robustness of frequentist methods.
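To make the core idea concrete, here is a minimal sketch of BayesBag for a toy conjugate model (unknown normal mean, known observation variance). The model, prior hyperparameters, bootstrap count `B`, and draw counts are illustrative choices, not taken from the talk; the point is only the mechanics of averaging posteriors over bootstrapped datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data; in practice the model may be misspecified for it.
data = rng.normal(loc=1.0, scale=2.0, size=100)

# Toy conjugate model: unknown mean, known variance sigma2,
# with normal prior N(mu0, tau2). All values are illustrative.
sigma2, mu0, tau2 = 4.0, 0.0, 10.0

def posterior_samples(x, n_draws=1000):
    """Draw from the exact conjugate posterior of the mean given data x."""
    n = len(x)
    post_var = 1.0 / (1.0 / tau2 + n / sigma2)
    post_mean = post_var * (mu0 / tau2 + x.sum() / sigma2)
    return rng.normal(post_mean, np.sqrt(post_var), size=n_draws)

# BayesBag: average the posteriors obtained from B bootstrap resamples
# of the data. Pooling equally many draws from each bootstrapped
# posterior approximates that average (a mixture of posteriors).
B = 50
bagged = np.concatenate([
    posterior_samples(rng.choice(data, size=len(data), replace=True))
    for _ in range(B)
])

print(bagged.mean(), bagged.std())
```

Under misspecification, the pooled (bagged) posterior is typically more dispersed than any single conditional posterior, which is the source of the robustness discussed in the talk.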