
Gatsby Computational Neuroscience Unit


Probabilistic and Unsupervised Learning
Approximate Inference and Learning in Probabilistic Models
(2021)

Dates: 4 October - 15 December 2021
Lectures: Mondays and Thursdays, 11:00-13:00 (note any exceptions below)
Tutorials: Wednesdays, 9:00-11:00
Lecturers: Peter Orbanz (weeks 1-5, COMP0086) and Maneesh Sahani (weeks 6-10, COMP0085)
Coordinator (CS): Dmitry Adamskiy. Please direct all correspondence regarding coursework, extensions and administrative issues to Dmitry at d.adamskiy@ucl.ac.uk.
TAs: (CS) TBC; (Gatsby/SWC) Pierre Glaser, Kevin Huang
Location: Ground Floor Seminar Room. In light of the ongoing pandemic, only Gatsby and SWC PhD students may attend in person until further notice.
Online: via Zoom. The meeting ID will be distributed to registered students via Moodle, email and calendars. You must connect using a UCL Zoom account.
About the course

This course (offered as two successive modules to MSc students) provides an in-depth introduction to statistical modelling and unsupervised learning, together with some supervised learning techniques. It presents probabilistic approaches to modelling and their relation to coding theory and Bayesian statistics. A variety of latent variable models will be covered, including mixture models (used for clustering), dimensionality reduction methods, time series models such as hidden Markov models (used in speech recognition and bioinformatics), Gaussian process models, independent components analysis, hierarchical models, and nonlinear models. The course will present the foundations of probabilistic graphical models (e.g. Bayesian networks and Markov networks) as an overarching framework for unsupervised modelling. We will cover Markov chain Monte Carlo sampling methods and variational approximations for inference. Time permitting, students will also learn about other topics in probabilistic (or Bayesian) machine learning.
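
By way of illustration, the EM algorithm for a two-component Gaussian mixture (one of the simplest latent variable models listed above) fits in a few lines of NumPy. This sketch is ours, not course code; the synthetic data and all names in it are invented:

    import numpy as np

    # synthetic 1-D data drawn from two Gaussian clusters
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

    pi, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
    for _ in range(100):
        # E-step: posterior responsibility of component 1 for each point
        # (the shared 1/sqrt(2*pi) factor cancels in the ratio)
        p0 = (1 - pi) * np.exp(-(x - mu[0])**2 / (2 * var[0])) / np.sqrt(var[0])
        p1 = pi * np.exp(-(x - mu[1])**2 / (2 * var[1])) / np.sqrt(var[1])
        r = p1 / (p0 + p1)
        # M-step: closed-form re-estimates of weight, means and variances
        pi = r.mean()
        mu = np.array([((1 - r) * x).sum() / (1 - r).sum(),
                       (r * x).sum() / r.sum()])
        var = np.array([((1 - r) * (x - mu[0])**2).sum() / (1 - r).sum(),
                        (r * (x - mu[1])**2).sum() / r.sum()])

    print(pi, mu, var)   # expect pi near 0.6 and means near -2 and 3

Each iteration is guaranteed not to decrease the log-likelihood of the data, a property that the course derives in its treatment of EM.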

The complete course forms a component of the Gatsby PhD programme, and is mandatory for Gatsby students. The two component modules are also available to students on Machine Learning related MSc programmes. Note that MSc students may only take the second module (Approximate Inference and Learning in Probabilistic Models) if they have completed the first.

Mailing list TBC
General attendance

Unfortunately, for the moment only Gatsby/SWC PhD students and staff, and UCL students registered through Moodle, are able to attend. Students who wish to audit may self-register on Moodle with the key "CS202122". We will provide further information here if this situation changes.

Prerequisites

A good background in statistics, calculus, linear algebra, and computer science. You should thoroughly review the maths in the following cribsheet [pdf] [ps] before the start of the course. This quiz may help you check where you stand. You must know Matlab, Octave or Python/NumPy.
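
As a rough self-check (our suggestion, distinct from the quiz above), you should be able to write the log-density of a multivariate Gaussian from scratch in your chosen language. A NumPy version might look like the following; the function name and test values are invented:

    import numpy as np

    def gauss_logpdf(x, m, S):
        # log N(x; m, S) using a Cholesky factorisation S = L L^T
        d = x - m
        L = np.linalg.cholesky(S)
        z = np.linalg.solve(L, d)            # so that d^T S^{-1} d = z^T z
        return (-0.5 * (z @ z)
                - np.log(np.diag(L)).sum()   # equals -0.5 * log|S|
                - 0.5 * len(m) * np.log(2 * np.pi))

    x = np.array([1.0, 2.0])
    m = np.zeros(2)
    S = np.array([[2.0, 0.5], [0.5, 1.0]])
    print(gauss_logpdf(x, m, S))

If the linear algebra behind this feels unfamiliar, review the cribsheet before the course begins.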

Text

There is no required textbook. However, the following are excellent sources for many of the topics covered here.

David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press.
Christopher M. Bishop (2006) Pattern Recognition and Machine Learning, Springer-Verlag.
David Barber (2012) Bayesian Reasoning and Machine Learning, Cambridge University Press.
Kevin Patrick Murphy (2012) Machine Learning: a Probabilistic Perspective, MIT Press.
Daphne Koller and Nir Friedman (2009) Probabilistic Graphical Models, MIT Press. (This contains a more extensive treatment of graphical models, and is good for reference.)

Some of our work will depend on numerical computation. A very useful reference in this area is:

Gene H. Golub and Charles F. Van Loan (2013) Matrix Computations, 4th ed., Johns Hopkins University Press.
Reading material

Specific recommendations for additional reading can be found here. New material may be assembled as the course progresses.

Lectures and slides

Lecture slides and assignments will be posted here as the course progresses. Note that the content and sequence of lectures may vary from those of previous years. Also note that, following the usual Gatsby teaching pattern, there may be lectures scheduled during "reading week".
Probabilistic and Unsupervised Learning (PO)
Aggregated slides for weeks 1-5 [last modified 7 Oct 2021]
4/10: Introduction to Probabilistic Learning
7/10: contd.
11/10: Optimisation
14/10: contd.
18/10: Latent Variables and EM
21/10: Linear-Gaussian Models
25/10: Model selection
28/10: Latent chain models
1/11: contd.
4/11:
8/11: Reading week
11/11:
Approximate Inference (MS)
15/11: Graphical models / screen version
18/11: contd.
22/11: Bayes and Gaussian Processes / screen version (an illustrative GP regression sketch follows this schedule)
25/11: contd.
29/11: Factored variational approximations / screen version
2/12: contd. (additional slides: intractable models / screen version)
6/12: Expectation Propagation / screen version
9/12: Belief Propagation / screen version
13/12: Convexity and Free-Energy / screen version
16/12: Parametric variational and recognition models / screen version [to be updated]
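
As a taste of the second module's material, the standard Gaussian process regression recipe (posterior mean and variance under a squared-exponential kernel, computed via a Cholesky factorisation) fits in a short NumPy sketch. The kernel parameters, data and names below are our own illustration, not lecture code:

    import numpy as np

    def k(a, b, ell=1.0, sf=1.0):
        # squared-exponential covariance between 1-D input vectors a and b
        return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    X = np.array([-3.0, -1.0, 0.0, 2.0])   # training inputs
    y = np.sin(X)                          # training targets
    Xs = np.linspace(-4.0, 4.0, 9)         # test inputs
    sn = 0.1                               # assumed observation noise std

    K = k(X, X) + sn**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = k(Xs, X) @ alpha                            # posterior mean
    v = np.linalg.solve(L, k(X, Xs))
    var = np.diag(k(Xs, Xs)) - (v * v).sum(axis=0)     # posterior variance
    print(np.round(mean, 3))
    print(np.round(var, 3))

The two triangular solves replace an explicit matrix inverse, which is both faster and numerically safer; see the numerical-computation reference above for details.
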
Assignments
Coursework assignments account for 50% of the total mark for each module; the remaining 50% is based on the final written examination.

For those registered for the course through an MSc programme, assignments must be submitted online through Moodle. Gatsby, SWC and related students should hand in assignments by email or on paper to a Gatsby TA.

Assignments submitted late will be penalised in accordance with the usual UCL policy.

COMP0086 Formative Assignments (for the entire module)
COMP0086 Summative Assignments (for the entire module)
Additional files:
Binary digits: [binarydigits.txt, bindigit.m, bindigit.py]
Geyser data: [geyser.txt]
SSMs: [ssm_spins.txt, ssm_spins_test.txt, ssm_kalman.m, ssm_kalman.py]
Decrypting with MCMC: [message.txt, symbols.txt]
Gibbs sampling for LDA: [Matlab code, Python code]

COMP0085 Formative Assignments (for the entire module)
COMP0085 Summative Assignments (for the entire module)
Additional files:
CO2 data: [co2.txt]
Binary images and code: [images.jpg, genimages.m, Mstep.m, genimages.py, MStep.py]