Towards end-to-end likelihood-free inference with convolutional neural networks
- PMID: 30793299
- DOI: 10.1111/bmsp.12159
Abstract
Complex simulator-based models with non-standard sampling distributions require sophisticated design choices for reliable approximate parameter inference. We introduce a fast, end-to-end approach for approximate Bayesian computation (ABC) based on fully convolutional neural networks. The method enables users of ABC to simultaneously derive the posterior mean and variance of multidimensional posterior distributions directly from raw simulated data. Once trained on simulated data, the convolutional neural network is able to map real data samples of variable size to the first two posterior moments of the relevant parameters' distributions. Thus, in contrast to other machine learning approaches to ABC, our approach allows us to generate reusable models that can be applied by different researchers employing the same model. We verify the utility of our method on two common statistical models (i.e., a multivariate normal distribution and a multiple regression scenario), for which the posterior parameter distributions can be derived analytically. We then apply our method to recover the parameters of the leaky competing accumulator (LCA) model and we compare our results with those obtained using the current state-of-the-art technique, probability density approximation (PDA). Results show that our method exhibits a lower approximation error compared with other machine learning approaches to ABC. It also performs similarly to PDA in recovering the parameters of the LCA model.
Keywords: approximate Bayesian computation; convolutional network; leaky competing accumulator; likelihood-free inference; machine-learning.
© 2019 The British Psychological Society.
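To make the approach concrete, below is a minimal sketch (not the authors' code) of the kind of fully convolutional network the abstract describes: a 1D convolutional stack with global pooling maps a simulated data set of arbitrary sample size to the posterior mean and variance of each model parameter. The Keras implementation, layer sizes, heteroscedastic Gaussian loss, and toy Gaussian simulator are all illustrative assumptions rather than details taken from the paper.

```python
# Sketch of a fully convolutional network for likelihood-free posterior-moment
# estimation. All architectural and loss choices here are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

N_PARAMS = 2      # e.g., mean and standard deviation of a toy Gaussian model
N_FEATURES = 1    # dimensionality of a single observation

def build_network():
    # Variable-length input: (batch, n_observations, n_features)
    x_in = layers.Input(shape=(None, N_FEATURES))
    x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(x_in)
    x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(x)
    # Global pooling makes the output independent of the sample size
    x = layers.GlobalAveragePooling1D()(x)
    # One head per moment: posterior mean and log-variance for each parameter
    out = layers.Dense(2 * N_PARAMS)(x)
    return tf.keras.Model(x_in, out)

def gaussian_nll(theta_true, pred):
    # Heteroscedastic Gaussian negative log-likelihood (an assumed loss choice)
    mean, log_var = pred[:, :N_PARAMS], pred[:, N_PARAMS:]
    return tf.reduce_mean(
        0.5 * log_var + 0.5 * tf.square(theta_true - mean) / tf.exp(log_var)
    )

def simulate_batch(batch_size=64, n_obs=200):
    # Toy simulator: draw parameters from the prior, then simulate data sets
    mu = np.random.normal(0.0, 1.0, size=batch_size)
    sigma = np.random.uniform(0.5, 2.0, size=batch_size)
    data = np.random.normal(mu[:, None], sigma[:, None], size=(batch_size, n_obs))
    theta = np.stack([mu, sigma], axis=1).astype("float32")
    return data[..., None].astype("float32"), theta

model = build_network()
model.compile(optimizer="adam", loss=gaussian_nll)
x, theta = simulate_batch()
model.fit(x, theta, epochs=1, verbose=0)   # training uses simulated data only
```

Once trained, calling the network on an observed data set of any length returns approximate posterior means and (log-)variances for all parameters in a single forward pass, which illustrates the reusability property the abstract emphasizes.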
Similar articles
- Likelihood approximation networks (LANs) for fast inference of simulation models in cognitive neuroscience. Elife. 2021 Apr 6;10:e65074. doi: 10.7554/eLife.65074. PMID: 33821788.
- Approximate Bayesian computation (ABC) gives exact results under the assumption of model error. Stat Appl Genet Mol Biol. 2013 May 6;12(2):129-41. doi: 10.1515/sagmb-2013-0010. PMID: 23652634.
- Complex genetic admixture histories reconstructed with Approximate Bayesian Computation. Mol Ecol Resour. 2021 May;21(4):1098-1117. doi: 10.1111/1755-0998.13325. PMID: 33452723.
- On the use of kernel approximate Bayesian computation to infer population history. Genes Genet Syst. 2015;90(3):153-62. doi: 10.1266/ggs.90.153. PMID: 26510570. Review.
- A generalized, likelihood-free method for posterior estimation. Psychon Bull Rev. 2014 Apr;21(2):227-50. doi: 10.3758/s13423-013-0530-0. PMID: 24258272. Review.
Cited by
- Exploring the Potential of Variational Autoencoders for Modeling Nonlinear Relationships in Psychological Data. Behav Sci (Basel). 2024 Jun 25;14(7):527. doi: 10.3390/bs14070527. PMID: 39062350.
- Likelihood approximation networks (LANs) for fast inference of simulation models in cognitive neuroscience. Elife. 2021 Apr 6;10:e65074. doi: 10.7554/eLife.65074. PMID: 33821788.
- OutbreakFlow: Model-based Bayesian inference of disease outbreak dynamics with invertible neural networks and its application to the COVID-19 pandemics in Germany. PLoS Comput Biol. 2021 Oct 25;17(10):e1009472. doi: 10.1371/journal.pcbi.1009472. PMID: 34695111.
- Neural superstatistics for Bayesian estimation of dynamic cognitive models. Sci Rep. 2023 Aug 23;13(1):13778. doi: 10.1038/s41598-023-40278-3. PMID: 37612320.
- Improving the reliability and validity of the IAT with a dynamic model driven by similarity. Behav Res Methods. 2024 Mar;56(3):2158-2193. doi: 10.3758/s13428-023-02141-1. PMID: 37450219.