Sequential Bayesian inference with intractable likelihoods: A sequential mixture model method for posterior and likelihood estimation
dc.contributor.author | Häggström, Henrik | |
dc.contributor.department | Chalmers tekniska högskola / Institutionen för matematiska vetenskaper | sv |
dc.contributor.examiner | Schauer, Moritz | |
dc.contributor.supervisor | Picchini, Umberto | |
dc.date.accessioned | 2023-06-28T06:55:44Z | |
dc.date.available | 2023-06-28T06:55:44Z | |
dc.date.issued | 2023 | |
dc.date.submitted | 2023 | |
dc.description.abstract | In a Bayesian setting with realistic models, exact parameter inference is often impossible, typically because the likelihood function is unavailable in closed form or is otherwise intractable. Likelihood-free methods perform parameter inference in models where evaluating the likelihood is intractable but sampling data from a generative model is possible. With the expansion of machine learning, recent approaches learn the posterior distribution of the parameters through sequential updates of neural-network-based density estimators. While these methods perform well, they require a network architecture to be specified, and training the neural network can be computationally demanding and time-consuming. In this work we present a Bayesian inference method which, in place of neural networks, uses Gaussian mixtures sequentially learned through an expectation-maximization procedure. Posterior samples are then obtained via MCMC using an informative, self-tuned proposal sampler. Only the number of components in the Gaussian mixture needs to be specified to run the algorithm. We show the feasibility of this method and benchmark it against two state-of-the-art neural-network-based Bayesian methods in four simulation studies. The results show that the proposed method is competitive, and in some cases even outperforms the other methods, in terms of simulation efficiency. Additionally, it is in most cases significantly faster to run. | |
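As an illustration of the idea summarized in the abstract, the following is a minimal Python sketch, not the thesis implementation: a Gaussian mixture is fitted by expectation-maximization to joint draws of parameters and simulated data, the conditional density of data given parameters implied by that mixture serves as a surrogate likelihood, and a plain random-walk Metropolis sampler then draws posterior samples. The toy model, the three-component mixture, and the fixed proposal scale are illustrative assumptions; the thesis method additionally learns the posterior sequentially and self-tunes its MCMC proposal.

    # Hypothetical sketch: Gaussian-mixture surrogate likelihood inside Metropolis MCMC.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Toy generative model (assumption): theta ~ N(0, 2^2), x | theta ~ N(theta, 1).
    theta_train = rng.normal(0.0, 2.0, size=5000)
    x_train = theta_train + rng.normal(size=theta_train.shape)
    joint = np.column_stack([theta_train, x_train])

    # Fit a Gaussian mixture to the joint (theta, x) samples by EM;
    # only the number of components needs to be chosen.
    gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
    gmm.fit(joint)

    def surrogate_loglik(theta, x_obs):
        # log p(x_obs | theta) from the fitted mixture: condition each
        # Gaussian component on theta and re-weight the components.
        logliks, logws = [], []
        for w, mu, cov in zip(gmm.weights_, gmm.means_, gmm.covariances_):
            mu_t, mu_x = mu[0], mu[1]
            s_tt, s_tx, s_xx = cov[0, 0], cov[0, 1], cov[1, 1]
            cmean = mu_x + s_tx / s_tt * (theta - mu_t)      # conditional mean
            cvar = s_xx - s_tx ** 2 / s_tt                   # conditional variance
            logliks.append(-0.5 * ((x_obs - cmean) ** 2 / cvar + np.log(2 * np.pi * cvar)))
            # component responsibility given theta alone
            logws.append(np.log(w) - 0.5 * ((theta - mu_t) ** 2 / s_tt + np.log(2 * np.pi * s_tt)))
        logws = np.array(logws)
        logws -= np.logaddexp.reduce(logws)
        return np.logaddexp.reduce(logws + np.array(logliks))

    def log_prior(theta):
        # N(0, 2^2) prior, up to an additive constant.
        return -0.5 * theta ** 2 / 4.0

    # Random-walk Metropolis using the surrogate likelihood.
    x_obs, theta, samples = 1.5, 0.0, []
    for _ in range(5000):
        prop = theta + rng.normal(scale=0.5)
        log_acc = (surrogate_loglik(prop, x_obs) + log_prior(prop)
                   - surrogate_loglik(theta, x_obs) - log_prior(theta))
        if np.log(rng.uniform()) < log_acc:
            theta = prop
        samples.append(theta)

    print("posterior mean ~", np.mean(samples[1000:]))  # close to the exact value 1.2

For this conjugate toy model the exact posterior mean is 4/(4+1) * 1.5 = 1.2, so the surrogate-based sampler can be checked directly; in genuinely likelihood-free settings only the simulator and the fitted mixture are available.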
dc.identifier.coursecode | MVEX03 | |
dc.identifier.uri | http://hdl.handle.net/20.500.12380/306451 | |
dc.language.iso | eng | |
dc.setspec.uppsok | PhysicsChemistryMaths | |
dc.subject | Bayesian inference, simulation-based inference, likelihood-free methods, multimodal posteriors, posterior estimation, likelihood estimation, R, Python. | |
dc.title | Sequential Bayesian inference with intractable likelihoods: A sequential mixture model method for posterior and likelihood estimation | |
dc.type.degree | Examensarbete för masterexamen | sv |
dc.type.degree | Master's Thesis | en |
dc.type.uppsok | H | |
local.programme | Engineering mathematics and computational science (MPENM), MSc |