Machine Learning II (2008/2009)

Advanced Markov Chain Monte Carlo

This week we will have a coding weekend, where you will form teams and code up:

  • A Gibbs sampler for a mixture model;
  • Annealed importance sampling (AIS);
  • Reversible jump MCMC (RJMCMC).
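As a starting point for the first item, here is a minimal sketch of a Gibbs sampler for a 1-D Gaussian mixture with a fixed number of components K, fixed component variance, N(0, tau^2) priors on the means, and a symmetric Dirichlet prior on the weights. This is one possible setup, not the assigned work package; all parameter names are illustrative.

```python
import numpy as np

def gibbs_mixture(x, K, n_iter=200, sigma=1.0, tau=10.0, alpha=1.0, rng=None):
    """Gibbs sampler for a 1-D K-component Gaussian mixture (illustrative).

    Model (an assumption for this sketch): pi ~ Dirichlet(alpha,...,alpha),
    mu_k ~ N(0, tau^2), x_i | z_i = k ~ N(mu_k, sigma^2) with sigma fixed.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    z = rng.integers(K, size=n)          # component assignments
    mu = rng.normal(0.0, tau, size=K)    # component means
    pi = np.full(K, 1.0 / K)             # mixing weights
    samples = []
    for _ in range(n_iter):
        # 1. Resample assignments z_i given mu and pi.
        logp = (np.log(pi)[None, :]
                - 0.5 * (x[:, None] - mu[None, :]) ** 2 / sigma ** 2)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=p[i]) for i in range(n)])
        # 2. Resample each mean mu_k (conjugate normal update).
        for k in range(K):
            xk = x[z == k]
            prec = 1.0 / tau ** 2 + len(xk) / sigma ** 2
            mean = (xk.sum() / sigma ** 2) / prec
            mu[k] = rng.normal(mean, 1.0 / np.sqrt(prec))
        # 3. Resample the weights from the Dirichlet posterior.
        counts = np.bincount(z, minlength=K)
        pi = rng.dirichlet(alpha + counts)
        samples.append((mu.copy(), pi.copy()))
    return samples
```

On well-separated data the sampled means settle near the true cluster centres; note the usual label-switching caveat when summarising the posterior over (mu, pi).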

The idea is to do Bayesian model averaging. You can use AIS to estimate the evidence of the data given the model size (number of mixture components), and from this estimate the posterior distribution over models. Independently, you can use RJMCMC to draw samples from that posterior directly. Since the two approaches are quite different, if they agree on the posterior over model size there is a good chance that both implementations are correct.
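To make the AIS part concrete, here is a minimal sketch on a toy model where the evidence is known in closed form: theta ~ N(0, tau^2), a single observation x0 | theta ~ N(theta, sigma^2), so the true evidence is N(x0; 0, tau^2 + sigma^2). The annealing path interpolates f_beta(theta) ∝ prior(theta) · likelihood(theta)^beta from the prior (beta = 0) to the posterior (beta = 1); all parameter names and the number of temperatures are illustrative.

```python
import numpy as np

def ais_log_evidence(x0, sigma=1.0, tau=2.0, n_temps=100, n_chains=200,
                     n_mh=5, step=0.5, rng=None):
    """AIS estimate of log p(x0) for the toy conjugate model (illustrative).

    Anneals f_beta(theta) = N(theta; 0, tau^2) * N(x0; theta, sigma^2)^beta
    with a few Metropolis-Hastings moves at each temperature.
    """
    rng = np.random.default_rng(rng)
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    def loglik(th):
        return -0.5 * np.log(2 * np.pi * sigma ** 2) \
               - 0.5 * (x0 - th) ** 2 / sigma ** 2
    theta = rng.normal(0.0, tau, size=n_chains)  # exact samples at beta = 0
    logw = np.zeros(n_chains)                    # log importance weights
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Weight update for the temperature step, at the current states.
        logw += (b - b_prev) * loglik(theta)
        # MH moves leaving f_b invariant.
        for _ in range(n_mh):
            prop = theta + step * rng.normal(size=n_chains)
            log_acc = (b * loglik(prop) - 0.5 * prop ** 2 / tau ** 2) \
                    - (b * loglik(theta) - 0.5 * theta ** 2 / tau ** 2)
            accept = np.log(rng.uniform(size=n_chains)) < log_acc
            theta = np.where(accept, prop, theta)
    # log of the mean importance weight, computed stably (log-sum-exp).
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))
```

For the assignment you would run such an estimator once per model size K and then normalise p(x|K)p(K) over K to get the posterior over models, which is the quantity to compare against the RJMCMC histogram.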

On Friday, 1:30-3:00pm, we will split you into groups of three students and set each group up with a work package. Yee Whye and Maneesh will give introductory lectures. You will have Friday and the weekend to work on this. Both will be available to answer questions by email, and Yee Whye will be available at Gatsby on Monday and Tuesday. We will reconvene at 11am on Tuesday to discuss results and wrap up.

Slides:
[rjmcmc.pdf] [ais.pdf]

Package:
[tgz] [zip]

Reading material:

Related readings: