Deep bootstrap for Bayesian inference

Philos Trans A Math Phys Eng Sci. 2023 May 15;381(2247):20220154. doi: 10.1098/rsta.2022.0154. Epub 2023 Mar 27.

Abstract

For a Bayesian, the task of defining the likelihood can be as perplexing as the task of defining the prior. We focus on situations in which the parameter of interest has been emancipated from the likelihood and is linked to the data directly through a loss function. We survey existing work on both Bayesian parametric inference with Gibbs posteriors and Bayesian non-parametric inference. We then highlight recent bootstrap computational approaches to approximating loss-driven posteriors. In particular, we focus on implicit bootstrap distributions defined through an underlying push-forward mapping. We investigate independent, identically distributed (iid) samplers from approximate posteriors that pass random bootstrap weights through a trained generative network. Once the deep-learning mapping has been trained, the simulation cost of such iid samplers is negligible. We compare the performance of these deep bootstrap samplers with the exact bootstrap as well as with MCMC on several examples (including support vector machines and quantile regression). We also provide theoretical insights into bootstrap posteriors by drawing upon connections to model mis-specification. This article is part of the theme issue 'Bayesian inference: challenges, perspectives, and prospects'.

Keywords: bootstrap; generative networks; likelihood-free inference.
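
To make the push-forward idea in the abstract concrete, the following is a minimal sketch on a toy weighted least-squares problem, not the authors' implementation: random bootstrap weights are pushed through the weighted-loss argmin map to create training pairs, a network is fitted to approximate that map, and fresh iid draws are then obtained by a forward pass on new weights. The data-generating step, the helper `weighted_lse`, and the network architecture are all illustrative assumptions.

```python
# Sketch of a deep bootstrap sampler for a toy weighted least-squares problem.
# Assumed, illustrative choices throughout; not the paper's implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy data from a linear model y = X @ theta_true + noise.
n, p = 200, 3
X = rng.normal(size=(n, p))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + rng.normal(scale=0.5, size=n)

def weighted_lse(w):
    """Minimiser of the weighted squared-error loss sum_i w_i * (y_i - x_i' theta)^2."""
    Xw = X * w[:, None]                       # rows of X scaled by the weights
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

def draw_weights(size):
    """Random bootstrap weights (mean-one Dirichlet draws)."""
    return rng.dirichlet(np.ones(n), size=size) * n

# Exact (weighted) bootstrap: push random weights through the argmin map
# to build training pairs (w, theta_hat(w)).
w_train = draw_weights(2000)
theta_train = np.array([weighted_lse(w) for w in w_train])

# Deep bootstrap: train a network to approximate the push-forward map w -> theta_hat(w).
net = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
net.fit(w_train, theta_train)

# iid sampling from the approximate posterior is now a cheap forward pass on new weights.
w_new = draw_weights(5000)
theta_deep = net.predict(w_new)                                   # deep bootstrap draws
theta_exact = np.array([weighted_lse(w) for w in w_new[:500]])    # exact draws, for comparison

print("deep bootstrap posterior mean :", theta_deep.mean(axis=0))
print("exact bootstrap posterior mean:", theta_exact.mean(axis=0))
```

In this sketch the one-off training cost is paid up front; each additional posterior draw then requires only a forward pass on a fresh weight vector, whereas the exact bootstrap must re-solve the weighted optimization for every draw, which is the cost saving the abstract refers to.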