On Monte Carlo methods for intractable latent variable models
- Author
- Schmon, Sebastian, Doucet, Arnaud, and Deligiannidis, George
- Subjects
- 510, Intractable Likelihood, Latent Variable Models, Monte Carlo method, Statistics, Markov chain Monte Carlo, Sequential Monte Carlo
- Abstract
This thesis provides novel methodological and theoretical contributions to the area of Monte Carlo methods for intractable Bayesian models. Such intractability can come in various forms.

The first project considers Markov chain Monte Carlo methods for Bayesian models with intractable likelihood functions. In such cases the posterior distribution, which is proportional to the product of likelihood and prior, is intractable as well. Inference based on Markov chain Monte Carlo is nevertheless still possible: if the likelihood is replaced by a non-negative unbiased estimate, the resulting algorithm still targets the correct invariant distribution (a schematic sketch of such a pseudo-marginal sampler is given after this record). These algorithms involve a trade-off: increasing the number of samples used to construct the estimator makes the chain mix faster, but at a higher computational cost per iteration. In the paper 'Large Sample Asymptotics of the Pseudo-Marginal Algorithm' we analyse this trade-off between the number of samples and the speed of mixing under assumptions on the target distribution and on the noise of the log-likelihood estimate. We assume that the Bayesian posterior satisfies the regularity conditions of a Bernstein-von Mises theorem, i.e. that it is well approximated by a normal distribution which concentrates as the number of observations increases, and that a central limit theorem holds for the log-likelihood estimator as the number of samples grows with the number of observations. Under these conditions we derive an asymptotic characterisation of the pseudo-marginal algorithm, which we subsequently exploit as a guide for optimising such algorithms. This yields novel dimension-dependent tuning guidelines for the noise in the likelihood estimate as well as for the scaling of the proposal distribution.

Similar intractability problems occur in sequential Monte Carlo algorithms. These algorithms require a resampling step in which random variables are drawn from a discrete distribution with probabilities proportional to importance weights, and these weights can themselves be intractable for complicated models or importance distributions. In the second paper, 'Bernoulli Race Particle Filters', we present a novel resampling algorithm that enables exact sampling from such intractable discrete distributions when only non-negative unbiased estimators of the weights are available (a sketch of this resampling scheme is also given after the record). We analyse the resulting algorithm theoretically and investigate its performance on various examples.
- Published
- 2020
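
The pseudo-marginal idea described in the abstract can be illustrated with a minimal sketch; this is not code from the thesis. It shows a random-walk Metropolis-Hastings sampler in which the intractable likelihood is replaced by a non-negative unbiased estimate used on the log scale. The toy latent-variable model, the estimator `log_lik_hat`, and the tuning constants (`proposal_scale`, `N`, the number of iterations) are all assumptions chosen for illustration.

```python
import numpy as np

def pseudo_marginal_mh(log_lik_hat, log_prior, theta0, n_iters=5000,
                       proposal_scale=0.5, seed=1):
    """Random-walk Metropolis-Hastings where the intractable likelihood is
    replaced by a non-negative unbiased estimate (handled on the log scale).
    The chain still targets the exact posterior; the estimate attached to the
    current state is recycled until a proposal is accepted."""
    rng = np.random.default_rng(seed)
    theta = float(theta0)
    cur_log_lik = log_lik_hat(theta, rng)
    chain = np.empty(n_iters)
    for i in range(n_iters):
        prop = theta + proposal_scale * rng.standard_normal()
        prop_log_lik = log_lik_hat(prop, rng)
        log_alpha = (prop_log_lik + log_prior(prop)) - (cur_log_lik + log_prior(theta))
        if np.log(rng.uniform()) < log_alpha:
            theta, cur_log_lik = prop, prop_log_lik
        chain[i] = theta
    return chain

# Toy latent-variable model: y_i | z_i, theta ~ N(theta + z_i, 1), z_i ~ N(0, 1).
# Each p(y_i | theta) is estimated unbiasedly by averaging the conditional
# density over N fresh latent draws; a larger N gives a less noisy estimate
# at a higher cost per iteration, which is the trade-off studied in the thesis.
y = np.random.default_rng(0).normal(loc=1.0, scale=np.sqrt(2.0), size=50)

def log_lik_hat(theta, rng, N=30):
    z = rng.standard_normal((N, y.size))                    # latent draws per observation
    log_w = -0.5 * (y - theta - z) ** 2 - 0.5 * np.log(2 * np.pi)
    m = log_w.max(axis=0)
    log_p_hat = m + np.log(np.exp(log_w - m).mean(axis=0))  # log of unbiased estimates
    return log_p_hat.sum()

log_prior = lambda t: -0.5 * t ** 2                         # N(0, 1) prior, up to a constant
chain = pseudo_marginal_mh(log_lik_hat, log_prior, theta0=0.0)
print(chain[1000:].mean())                                  # crude posterior mean estimate
```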
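The resampling problem addressed by 'Bernoulli Race Particle Filters' can likewise be illustrated with a hedged sketch of a Bernoulli-race-style sampler: it returns an index distributed exactly in proportion to weights that are never computed, using only coins whose success probabilities are proportional to those weights. The weights `true_w`, the bounds `c`, and the noisy estimator inside `make_coin` are hypothetical placeholders; the thesis's actual algorithm and its analysis are in the paper.

```python
import numpy as np

def bernoulli_race_sample(coins, bounds, rng):
    """Exact sampling from a discrete distribution with probabilities
    proportional to unknown weights w_i, given only `coins[i](rng)`, a
    routine returning a Bernoulli(w_i / bounds[i]) draw.

    Propose an index with probability proportional to its bound, then run
    its coin; on failure, start over.  The accepted index is distributed
    exactly proportionally to the w_i."""
    bounds = np.asarray(bounds, dtype=float)
    proposal = bounds / bounds.sum()
    while True:
        i = rng.choice(len(bounds), p=proposal)
        if coins[i](rng):
            return i

# Illustration: each weight is only available through a noisy unbiased
# estimator taking values in [0, c_i]; comparing a fresh estimate with a
# Uniform(0, c_i) draw yields the required Bernoulli(w_i / c_i) coin.
true_w = np.array([0.2, 1.0, 0.5, 0.3])   # hidden weights, used only to simulate estimates
c = np.full(4, 1.5)                       # known upper bounds on the weight estimates

def make_coin(i):
    def coin(rng):
        w_hat = true_w[i] * rng.uniform(0.5, 1.5)  # unbiased estimate of w_i, bounded by c[i]
        return rng.uniform(0.0, c[i]) < w_hat
    return coin

rng = np.random.default_rng(2)
coins = [make_coin(i) for i in range(4)]
draws = np.array([bernoulli_race_sample(coins, c, rng) for _ in range(20000)])
print(np.bincount(draws, minlength=4) / draws.size)  # close to true_w / true_w.sum()
```

In a particle filter, such a routine would replace the standard resampling step when the importance weights cannot be evaluated exactly.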