Advanced Probabilistic Techniques Workshop

Gatsby, August 2007
Organizers: Richard Turner and Pietro Berkes

The aim of the workshop was to cover new probabilistic techniques applicable to neuroscience, one topic a day for a week. On each day, the basic format was a two-hour seminar in the morning session, concluding with a simple worked example. The remainder of the day was spent using the technique of the day to attack a data set. The plan was to be problem oriented: for example, the task might be to use the technique to infer some unknown parameter, or to make a prediction. To encourage discussion and the transfer of skills, we split up into pairs to code up the algorithms.

Wednesday 8th August - Generalized Belief Propagation, lecturer: Richard Turner (.html)

Topics: Bayes nets, factor graphs, Belief Propagation on trees, loopy BP, free energy interpretation, generalized BP, region graph methods

Task: Implementing a Sudoku solver. Deriving a factor graph representation of the Sudoku constraints, implementing loopy BP on a simpler problem (4x4 Sudoku - Quadoku), optimizing for the full problem, generalizing the method to GBP.
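The morning's worked example can be sketched with plain sum-product belief propagation on a tiny chain-structured factor graph, where BP is exact; the Sudoku task then amounts to the same message updates run iteratively on a loopy graph. This is an illustrative sketch, not the workshop's solver code.

```python
import numpy as np

# Sum-product BP on the chain x1 -- f12 -- x2 -- f23 -- x3
# (binary variables; the graph is a tree, so BP gives exact marginals).
np.random.seed(0)
f12 = np.random.rand(2, 2)  # pairwise factor over (x1, x2)
f23 = np.random.rand(2, 2)  # pairwise factor over (x2, x3)

# Forward messages (leaf variables send uniform messages)
m_1_to_12 = np.ones(2)
m_12_to_2 = f12.T @ m_1_to_12        # sum over x1
m_2_to_23 = m_12_to_2                # x2's only other neighbour
m_23_to_3 = f23.T @ m_2_to_23        # sum over x2

# Backward messages
m_3_to_23 = np.ones(2)
m_23_to_2 = f23 @ m_3_to_23          # sum over x3
m_2_to_12 = m_23_to_2
m_12_to_1 = f12 @ m_2_to_12          # sum over x2

# Belief at x2 = normalized product of incoming messages
b2 = m_12_to_2 * m_23_to_2
b2 /= b2.sum()

# Compare against the brute-force marginal of x2
joint = f12[:, :, None] * f23[None, :, :]   # shape (x1, x2, x3)
p2 = joint.sum(axis=(0, 2))
p2 /= p2.sum()
print(np.allclose(b2, p2))   # BP is exact on trees
```

On the Sudoku factor graph the same messages are passed repeatedly until they (hopefully) converge, which is loopy BP.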


Thursday 9th August - Expectation Propagation, lecturer: Jonathan Pillow (.pdf)

Topics: Laplace approximation, Assumed Density Filtering, clutter problem, Expectation Propagation, Belief Propagation

Task: Inferring the mean of an unknown Gaussian embedded in noise (clutter problem) / Computing the posterior distribution over receptive fields of a neural encoding model / Gaussian process classification
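A minimal sketch of the clutter problem, using a single Assumed Density Filtering pass (EP would additionally revisit each observation with its cached site approximation removed). The mixture weight, clutter variance, and prior below are assumptions for illustration, not necessarily the workshop's settings.

```python
import numpy as np

# ADF for the clutter problem: x_i ~ (1-w) N(theta, 1) + w N(0, 10),
# prior theta ~ N(0, 100). After each point, the exact two-component
# posterior is moment-matched back to a single Gaussian.
rng = np.random.default_rng(0)
w, clutter_var, theta_true = 0.25, 10.0, 2.0
x = np.where(rng.random(500) < w,
             rng.normal(0.0, np.sqrt(clutter_var), 500),
             rng.normal(theta_true, 1.0, 500))

def gauss(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

m, v = 0.0, 100.0                          # Gaussian posterior over theta
for xi in x:
    # Evidence of each component under the current posterior
    z1 = (1 - w) * gauss(xi, m, v + 1.0)   # signal: theta integrated out
    z2 = w * gauss(xi, 0.0, clutter_var)   # clutter: independent of theta
    r = z1 / (z1 + z2)                     # responsibility of the signal
    # Posterior moments if the point is signal (conjugate Gaussian update)
    m1 = m + v / (v + 1.0) * (xi - m)
    v1 = v - v ** 2 / (v + 1.0)
    # Moment-match the mixture posterior to one Gaussian
    new_m = r * m1 + (1 - r) * m
    new_v = r * (v1 + m1 ** 2) + (1 - r) * (v + m ** 2) - new_m ** 2
    m, v = new_m, new_v

print(m, v)   # posterior mean should land near theta_true
```

Because ADF's answer depends on the order of the data, EP iterates these updates to a fixed point that does not.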

Friday 10th August - Energy models, deep belief networks, lecturer: Pietro Berkes

Topics: Energy models, causal generative models vs. energy models in overcomplete ICA, contrastive divergence learning, score matching, restricted Boltzmann machines, deep belief networks

Task: Implementing a deep belief network for handwritten-letter classification / generation.
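The building block of the day's task can be sketched as a single restricted Boltzmann machine trained with one step of contrastive divergence (CD-1); a deep belief network stacks several such RBMs, training each on the hidden activations of the one below. The toy six-pixel "letters" here are an assumption for illustration.

```python
import numpy as np

# RBM with CD-1 on two toy binary patterns.
rng = np.random.default_rng(0)
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)
n_vis, n_hid, lr = 6, 4, 0.1
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for _ in range(2000):
    v0 = data
    # Positive phase: hidden probabilities and samples given the data
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step (reconstruct visibles, re-infer hiddens)
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # CD-1 update: data statistics minus reconstruction statistics
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(data)
    b_v += lr * (v0 - pv1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)

# Mean-field reconstruction of the training patterns
recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
err = np.abs(recon - data).mean()
print(err)   # should be small after training
```

Running the trained RBM's Gibbs chain for a few steps generates samples; this is the "generation" half of the task.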


Monday 13th August - Particle Filtering, lecturer: Frank Wood (.ppt,.pdf)

Topics: Linear State Space Models, importance sampling, sequential importance sampling, up-sampling, down-sampling, sequential importance resampling, stratified sampling, optimal particle re-weighting

Task: Inferring the true trajectory of a ballistic projectile from noisy observations (target tracking)
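The task can be sketched as a bootstrap (sequential importance resampling) particle filter on a simple ballistic model: the state is position and velocity, gravity acts on the vertical velocity, and noisy positions are observed. The dynamics, noise levels, and initial conditions below are assumptions for illustration; the workshop's data set may have differed.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, g, T, obs_std = 0.1, 9.81, 60, 5.0

# Simulate the true trajectory: state = (x, y, vx, vy)
s = np.array([0.0, 0.0, 30.0, 40.0])
truth, obs = [], []
for _ in range(T):
    s = s + dt * np.array([s[2], s[3], 0.0, -g])
    truth.append(s.copy())
    obs.append(s[:2] + rng.normal(0, obs_std, 2))
truth, obs = np.array(truth), np.array(obs)

# Bootstrap particle filter
N = 2000
parts = np.tile([0.0, 0.0, 30.0, 40.0], (N, 1)) + rng.normal(0, 2.0, (N, 4))
est = []
for z in obs:
    # Propagate particles through the dynamics with process noise
    parts[:, 0] += dt * parts[:, 2]
    parts[:, 1] += dt * parts[:, 3]
    parts[:, 3] -= dt * g
    parts += rng.normal(0, 0.1, parts.shape)
    # Weight by the Gaussian observation likelihood, then resample
    logw = -0.5 * np.sum((parts[:, :2] - z) ** 2, axis=1) / obs_std ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    parts = parts[rng.choice(N, size=N, p=w)]
    est.append(parts[:, :2].mean(axis=0))
est = np.array(est)

rmse_filter = np.sqrt(((est - truth[:, :2]) ** 2).mean())
rmse_obs = np.sqrt(((obs - truth[:, :2]) ** 2).mean())
print(rmse_filter, rmse_obs)   # the filter should beat the raw observations
```

Multinomial resampling at every step is the simplest choice; stratified resampling (another of the day's topics) reduces the variance it introduces.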


Tuesday 14th August - Dirichlet Processes, lecturer: Yee Whye Teh (.pdf)

Topics: Exponential families, infinite mixture models, Gaussian processes, Dirichlet processes, representations of DPs: Blackwell-MacQueen Urn scheme, Chinese Restaurant Process, Stick-breaking construction

Task: Generating from a Chinese Restaurant Process and Stick-breaking construction, implementing Gibbs samplers for an infinite mixture model
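The generative half of the task can be sketched with the two DP representations side by side: the Chinese restaurant process produces a random partition of customers into tables, and the (truncated) stick-breaking construction produces the mixing weights directly. A minimal sketch with an assumed concentration alpha = 2.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 2.0

# Chinese restaurant process: customer n sits at occupied table k with
# probability n_k / (n + alpha), or at a new table with prob alpha / (n + alpha).
counts = []
for n in range(100):
    p = np.array(counts + [alpha], dtype=float)
    p /= p.sum()
    k = rng.choice(len(p), p=p)
    if k == len(counts):
        counts.append(1)        # open a new table
    else:
        counts[k] += 1          # join an existing table

# Stick-breaking: beta_k ~ Beta(1, alpha),
# pi_k = beta_k * prod_{j<k} (1 - beta_j)
K = 200
betas = rng.beta(1.0, alpha, K)
pis = betas * np.concatenate(([1.0], np.cumprod(1 - betas)[:-1]))

print(sum(counts), pis.sum())   # 100 customers seated; weights sum to ~1
```

Attaching a parameter draw from the base measure to each table (or stick) turns either representation into a draw from a DP mixture, which the Gibbs samplers in the afternoon then invert.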