Project III (MATH3382)
Bayesian computational methods
Description
Bayesian statistics provides a coherent
and theoretically well-founded framework for the statistical
analysis of challenging modern problems. In practice, however, Bayesian
inference and prediction present computational challenges because the
required posterior and predictive distributions are intractable.
The main objective of the project is to
study computational methods required to facilitate Bayesian analysis
in modern problems. In particular, we aim to focus on
stochastic computational methods such as Markov chain Monte Carlo
(MCMC) methods. However, there is flexibility to deviate and study
other methods, provided that they fall under the umbrella of Bayesian
computation.
The accompanying demo shows the output of an MCMC
procedure while learning an intractable 2-dimensional
posterior probability density; in particular, we see the MCMC
trajectories.
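As a concrete illustration of such a procedure, here is a minimal random-walk Metropolis sketch in Python (one of the languages listed under the pre-requisites). The 2-dimensional "banana"-shaped target, the function names, and the tuning constants are illustrative assumptions, not part of the project specification.

```python
import numpy as np

def log_banana(x, b=0.1):
    # Unnormalised log-density of a 2-D "banana"-shaped target
    # (a Gaussian warped along a parabola); b controls the curvature.
    return -0.5 * (x[0]**2 / 10.0 + (x[1] + b * x[0]**2 - 10.0 * b)**2)

def random_walk_metropolis(log_target, x0, n_iter=5000, step=1.0, seed=0):
    # Plain random-walk Metropolis: propose a Gaussian step, accept or
    # reject according to the ratio of (unnormalised) target densities.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, x.size))
    log_p = log_target(x)
    accepted = 0
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)   # symmetric proposal
        log_p_prop = log_target(prop)
        if np.log(rng.uniform()) < log_p_prop - log_p:  # accept/reject
            x, log_p = prop, log_p_prop
            accepted += 1
        chain[i] = x
    return chain, accepted / n_iter

chain, acc_rate = random_walk_metropolis(log_banana, x0=[0.0, 0.0])
```

Plotting `chain[:, 0]` against `chain[:, 1]` reproduces the kind of trajectory shown in the demo.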
Objectives
We will study several aspects of Bayesian
computations. Some potential directions can be:
- Design of problem-specific MCMC methods to address particular
applications, such as regression, classification, time series,
cryptography, etc.
- Design of adaptive/automatic MCMC algorithms
- Addressing local trapping issues in MCMC
- Design of stochastic optimization algorithms
- Design of parallel MCMC algorithms that exploit parallel
computing environments
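To illustrate the adaptive/automatic MCMC direction above, the following is a minimal sketch of random-walk Metropolis with Robbins-Monro adaptation of the proposal scale towards a target acceptance rate. The diminishing-adaptation schedule is one standard recipe; the function name, target rate, and decay exponent are illustrative assumptions.

```python
import numpy as np

def adaptive_rwm(log_target, x0, n_iter=5000, target_acc=0.234, seed=1):
    # Random-walk Metropolis whose proposal scale is tuned on the fly:
    # after each step the log-scale is nudged up or down depending on
    # whether the acceptance probability was above or below the target.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    log_step = 0.0                       # log of the proposal scale
    log_p = log_target(x)
    chain = np.empty((n_iter, x.size))
    for i in range(n_iter):
        prop = x + np.exp(log_step) * rng.standard_normal(x.size)
        log_p_prop = log_target(prop)
        alpha = min(1.0, np.exp(log_p_prop - log_p))   # acceptance prob.
        if rng.uniform() < alpha:
            x, log_p = prop, log_p_prop
        chain[i] = x
        # diminishing adaptation: the adjustment vanishes as i grows,
        # which is what keeps the adaptive chain theoretically valid
        log_step += (alpha - target_acc) / (i + 1)**0.6
    return chain, np.exp(log_step)
```

Run on a simple target, the scale settles near the value that yields the requested acceptance rate, with no manual tuning.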
The design of Bayesian computational methods that overcome
such issues can be a high-impact objective in modern
statistics.
Below we see how MCMC trajectories (black line) from three
different MCMC algorithms try to learn the same
2-dimensional posterior distribution (blue shade).
This one is struggling to learn the density, because its
strange banana shape prevents it from moving around
the support.
...but this more intelligent MCMC overcomes the above
issue by moving along and across the contours. However, it still
takes quite some time to 'think' about which direction to go.
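Samplers that move "along and across the contours" are typically gradient-informed. One well-known example is the Metropolis-adjusted Langevin algorithm (MALA), sketched below under the assumption of a differentiable log-posterior; the names and step size are illustrative, and this is only one of several samplers that fit the description.

```python
import numpy as np

def mala(log_target, grad_log_target, x0, n_iter=5000, step=0.5, seed=2):
    # Metropolis-adjusted Langevin algorithm: the proposal drifts along
    # the gradient of the log-posterior (so moves tend to follow the
    # contours), then a Metropolis-Hastings correction keeps the chain
    # exactly invariant for the target.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, x.size))

    def log_q(to, frm):
        # log-density (up to a constant) of the Langevin proposal to|frm
        mu = frm + 0.5 * step**2 * grad_log_target(frm)
        return -np.sum((to - mu)**2) / (2.0 * step**2)

    log_p = log_target(x)
    for i in range(n_iter):
        prop = (x + 0.5 * step**2 * grad_log_target(x)
                + step * rng.standard_normal(x.size))
        log_p_prop = log_target(prop)
        # MH ratio includes the asymmetric proposal densities
        log_alpha = log_p_prop - log_p + log_q(x, prop) - log_q(prop, x)
        if np.log(rng.uniform()) < log_alpha:
            x, log_p = prop, log_p_prop
        chain[i] = x
    return chain
```

The gradient drift is what lets the chain slide along a curved ridge instead of blindly bouncing off it, at the price of one gradient evaluation per step.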
...but this even more intelligent MCMC is quicker, because
it performs multiple moves simultaneously by taking advantage
of parallel computing environments (essentially, it is
cheating).
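The simplest way to exploit a parallel computing environment is to run several independent chains at once, one per worker, and pool their samples. The sketch below does exactly that with the Python standard library; the target density, chain settings, and worker count are illustrative assumptions.

```python
import numpy as np
from multiprocessing import Pool

def log_target(x):
    # standard bivariate normal log-density (up to a constant),
    # standing in for a real intractable posterior
    return -0.5 * float(np.sum(x**2))

def run_chain(seed, n_iter=2000, step=1.0):
    # one independent random-walk Metropolis chain; distinct seeds keep
    # the parallel chains reproducible and uncorrelated
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    log_p = log_target(x)
    chain = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(2)
        lp = log_target(prop)
        if np.log(rng.uniform()) < lp - log_p:
            x, log_p = prop, lp
        chain[i] = x
    return chain

if __name__ == "__main__":
    # embarrassingly parallel: four independent chains on four workers
    with Pool(4) as pool:
        chains = pool.map(run_chain, range(4))
    pooled = np.concatenate(chains)   # pool the samples across chains
```

More sophisticated schemes (e.g. parallel tempering, or multiple-proposal samplers like the one in the demo) let the parallel chains interact, but the independent-chains pattern above is the natural starting point.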
Pre-requisites
- Statistical Concepts II
- Knowledge of one programming language, such as R, Python,
MATLAB, FORTRAN, C/C++
- Alternatively, you should be willing to learn one.
Co-requisites
- Nothing in particular, but it would be good to attend Bayesian
Statistics III.
References
- Andrieu, C., De Freitas, N., Doucet, A., & Jordan, M. I.
(2003). An introduction to MCMC for machine learning. Machine
Learning, 50(1-2), 5-43. [LINK]
This project will be supervised by Dr Georgios Karagiannis (Office
CM126b) in Term 1, and by Dr Louis Aslett (Office CM212) in Term 2.
For further information, feel free to contact:
Email: georgios.karagiannis@durham.ac.uk
Telephone: +44 (0) 191 334 2718
Email: louis.aslett@durham.ac.uk
Telephone: +44 (0) 191 33 43067