Description
Estimation of the marginal likelihood, also referred to as the evidence or model evidence, is of central importance in Bayesian statistics, as it is directly associated with the evaluation of competing hypotheses or models via Bayes factors and posterior model probabilities. Consider, for instance, a setting where we have \(K\) potential models for our data \(\mathbf{y}\), denoted by \(M_1,\ldots,M_K\), with associated parameter vectors \(\boldsymbol{\theta}_1,\ldots,\boldsymbol{\theta}_K\), and our aim is to choose the “best” model or a smaller subset of more “promising” models. For such considerations we typically need to evaluate the marginal likelihood of each model under consideration, which for model \(M_{k}\) is given by
\[
f(\mathbf{y}|M_{k})=\int f(\mathbf{y}|\boldsymbol{\theta}_{k},M_{k})\,f(\boldsymbol{\theta}_{k}|M_{k})\,\mathrm{d}\boldsymbol{\theta}_{k},
\]
which is essentially the normalizing constant of the posterior \(f(\boldsymbol{\theta}_{k}|\mathbf{y},M_{k})\), obtained by integrating the likelihood function \(f(\mathbf{y}|\boldsymbol{\theta}_{k},M_{k})\) with respect to the prior density \(f(\boldsymbol{\theta}_{k}|M_{k})\).
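For reference, the evidence enters model comparison through the Bayes factor of model \(M_{k}\) against model \(M_{l}\) and through the posterior model probabilities,
\[
B_{kl}=\frac{f(\mathbf{y}|M_{k})}{f(\mathbf{y}|M_{l})},
\qquad
P(M_{k}|\mathbf{y})=\frac{f(\mathbf{y}|M_{k})\,P(M_{k})}{\sum_{j=1}^{K} f(\mathbf{y}|M_{j})\,P(M_{j})},
\]
where \(P(M_{k})\) denotes the prior probability assigned to model \(M_{k}\).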
The integration above is generally intractable outside the simple conjugate Bayesian framework. Numerical integration methods can be used as an approach to the problem, but such techniques are of limited use when sample sizes are moderate to large or when the parameter vector \(\boldsymbol{\theta}_{k}\) is high-dimensional. In addition, the simplest Monte Carlo estimate, given by
\[
\hat{f}(\mathbf{y}|M_{k})=\frac{1}{N}\sum_{i=1}^{N} f(\mathbf{y}|\boldsymbol{\theta}_{k}^{(i)},M_{k}),
\]
using draws \(\{\boldsymbol{\theta}_{k}^{(i)}:i=1,2,\ldots,N\}\) from the prior distribution, is extremely unstable when the posterior is concentrated relative to the prior.
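As a minimal illustration of this instability (a hypothetical toy setting, not part of the project specification), the sketch below estimates the evidence of a conjugate normal model by averaging the likelihood over prior draws and compares it with the exact closed-form value; all variable names and the chosen prior variance are assumptions made for the example only.

    # Minimal sketch (hypothetical toy model): naive Monte Carlo estimate of the
    # evidence for y_i ~ N(theta, 1) with a diffuse prior theta ~ N(0, tau2),
    # compared with the exact closed-form evidence.
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(1)
    n, tau2 = 50, 1e6                    # sample size and (very diffuse) prior variance
    y = rng.normal(2.0, 1.0, size=n)     # simulated data

    def log_lik(theta):
        # log-likelihood evaluated at each value in the vector theta
        return stats.norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)

    # Naive estimator: average the likelihood over N prior draws
    N = 10_000
    theta_prior = rng.normal(0.0, np.sqrt(tau2), size=N)
    log_evidence_naive = logsumexp(log_lik(theta_prior)) - np.log(N)

    # Exact log evidence: marginally y ~ N_n(0, I_n + tau2 * J_n)
    Sigma = np.eye(n) + tau2 * np.ones((n, n))
    log_evidence_exact = stats.multivariate_normal.logpdf(y, mean=np.zeros(n), cov=Sigma)

    # The posterior is far more concentrated than the prior, so only a small
    # fraction of prior draws carry appreciable likelihood and the naive
    # estimate fluctuates noticeably across seeds.
    print(log_evidence_naive, log_evidence_exact)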
To this end, much research has focused on developing efficient estimators that utilise Markov chain Monte Carlo samples from the posterior distribution of \(\boldsymbol{\theta}_k\). There are numerous such approaches, based on importance sampling, bridge sampling, candidate’s estimators, Lebesgue integration theory and the Fourier integral theorem, among others. In this project you will have the opportunity to obtain a deeper understanding of the Bayesian model comparison framework, to study and learn about some of the available model evidence estimators, and to implement them for various Bayesian models.
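As one concrete (and deliberately simple) example of an estimator built from posterior draws, the sketch below implements the harmonic mean estimator for the same hypothetical toy model as above; it is chosen purely for illustration and is not necessarily one of the estimators studied in the project, being well known to be unreliable in practice, which is part of the motivation for the more refined methods mentioned above.

    # Minimal self-contained sketch (hypothetical toy model): the harmonic mean
    # estimator of the evidence, computed from posterior draws of theta for
    # y_i ~ N(theta, 1), theta ~ N(0, tau2).
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(2)
    n, tau2 = 50, 1e6
    y = rng.normal(2.0, 1.0, size=n)

    def log_lik(theta):
        return stats.norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)

    # Conjugate posterior theta | y ~ N(m, v); in general these would be MCMC draws
    v = 1.0 / (n + 1.0 / tau2)
    m = v * y.sum()
    theta_post = rng.normal(m, np.sqrt(v), size=50_000)

    # Harmonic mean identity: 1 / f(y) = E_posterior[ 1 / f(y | theta) ], so
    # log f(y) is estimated by minus the log of the posterior average of 1/likelihood
    log_evidence_hm = -(logsumexp(-log_lik(theta_post)) - np.log(len(theta_post)))
    print(log_evidence_hm)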
Prerequisites
Bayesian Computation and Modelling III
In general, a good understanding of Bayesian statistics and good programming skills.
Resources
Feel free to email me at konstantinos.perrakis@durham.ac.uk if you have questions.