Sequential Bayesian inference with intractable likelihoods: A sequential mixture model method for posterior and likelihood estimation

dc.contributor.author: Häggström, Henrik
dc.contributor.department: Chalmers tekniska högskola / Institutionen för matematiska vetenskaper (sv)
dc.contributor.examiner: Schauer, Moritz
dc.contributor.supervisor: Picchini, Umberto
dc.date.accessioned: 2023-06-28T06:55:44Z
dc.date.available: 2023-06-28T06:55:44Z
dc.date.issued: 2023
dc.date.submitted: 2023
dc.description.abstract: In a Bayesian setting with realistic models, exact parameter inference is often impossible, typically because the likelihood function is intractable or not available in closed form. Likelihood-free methods perform parameter inference in models where evaluating the likelihood is intractable but sampling data from a generative model is possible. With the expansion of machine learning, recent approaches learn the posterior distribution of the parameters through sequential updates of neural-network-based density estimators. While these methods perform well, they require a network architecture to be specified, and training the neural network can be computationally demanding and time consuming. In this work we present a Bayesian inference method which, in place of neural networks, uses Gaussian mixtures sequentially learned through an expectation-maximization procedure. Posterior samples are then obtained via MCMC through an informative, self-tuned proposal sampler. Only the number of components in the Gaussian mixture needs to be specified to run the algorithm. We show the feasibility of this method and benchmark it against two state-of-the-art neural-network-based Bayesian methods in four simulation studies. The results show that the proposed method is competitive and in some cases even outperforms the other methods in terms of simulation efficiency. Additionally, it is in most cases significantly faster to run.
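As a minimal sketch of the core idea described in the abstract (fitting a joint Gaussian mixture by expectation-maximization to simulated parameter-summary pairs, then conditioning it on the observed data to approximate the posterior), the following Python snippet may help. The toy model, the use of scikit-learn's GaussianMixture, and all variable names are illustrative assumptions, not the thesis implementation; the actual method additionally runs sequential rounds and draws posterior samples via MCMC with a self-tuned proposal.

import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical toy problem (not from the thesis): infer the mean theta of a
# unit-variance Gaussian from the sample mean of 20 observations.
rng = np.random.default_rng(1)

def prior_sample(n):
    return rng.uniform(-5.0, 5.0, size=(n, 1))            # theta ~ U(-5, 5)

def simulate_summary(theta):
    data = rng.normal(theta, 1.0, size=(theta.shape[0], 20))
    return data.mean(axis=1, keepdims=True)                # summary statistic

x_obs = rng.normal(1.5, 1.0, size=20).mean(keepdims=True)  # "observed" summary

# 1. Simulate (theta, summary) pairs from the generative model.
theta = prior_sample(5000)
x = simulate_summary(theta)
joint = np.hstack([theta, x])

# 2. Fit a Gaussian mixture to the joint samples via EM; the number of
#    components K is the only tuning choice.
K = 3
gmm = GaussianMixture(n_components=K, covariance_type="full").fit(joint)

# 3. Condition each component on x_obs (standard Gaussian conditioning) to
#    obtain a mixture approximation of the posterior p(theta | x_obs).
d = theta.shape[1]
cond_means, cond_covs, log_w = [], [], []
for k in range(K):
    mu_t, mu_x = gmm.means_[k, :d], gmm.means_[k, d:]
    S = gmm.covariances_[k]
    S_tt, S_tx, S_xx = S[:d, :d], S[:d, d:], S[d:, d:]
    gain = S_tx @ np.linalg.inv(S_xx)
    cond_means.append(mu_t + gain @ (x_obs - mu_x))
    cond_covs.append(S_tt - gain @ S_tx.T)
    diff = x_obs - mu_x                                    # re-weight by fit to x_obs
    log_w.append(np.log(gmm.weights_[k])
                 - 0.5 * diff @ np.linalg.solve(S_xx, diff)
                 - 0.5 * np.log(np.linalg.det(2 * np.pi * S_xx)))
w = np.exp(np.asarray(log_w) - np.max(log_w))
w /= w.sum()

# 4. Sample from the conditional mixture: a cheap posterior approximation.
comp = rng.choice(K, size=2000, p=w)
post = np.array([rng.multivariate_normal(cond_means[k], cond_covs[k])
                 for k in comp])
print("approximate posterior mean:", post.mean(), "observed summary:", x_obs[0])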
dc.identifier.coursecode: MVEX03
dc.identifier.uri: http://hdl.handle.net/20.500.12380/306451
dc.language.iso: eng
dc.setspec.uppsok: PhysicsChemistryMaths
dc.subject: Bayesian inference, simulation-based inference, likelihood-free methods, multimodal posteriors, posterior estimation, likelihood estimation, R, Python
dc.title: Sequential Bayesian inference with intractable likelihoods: A sequential mixture model method for posterior and likelihood estimation
dc.type.degree: Examensarbete för masterexamen (sv)
dc.type.degree: Master's Thesis (en)
dc.type.uppsok: H
local.programme: Engineering mathematics and computational science (MPENM), MSc
Download
Original bundle (showing 1 - 1 of 1):
Master_Thesis_Henrik_Häggström_2023.pdf (Adobe Portable Document Format, 42.95 MB)