Probabilistic and Unsupervised Learning
Approximate Inference and Learning in Probabilistic Models

Lecture slides and assignments will be posted here in due course.
Dates: Mondays & Thursdays, 4 October - 16 December 2010
Time: 11:00-13:00
Tutorials: Wednesdays 9:00-11:00
Lecturers: Yee Whye Teh and Maneesh Sahani
TAs: Lloyd Elliott and Loic Matthey
Location: Basement B10 Seminar Room, Alexandra House, 17 Queen Square, London WC1N 3AR [directions]
About the course

This course (offered as two successive modules to MSc students) provides an in-depth introduction to statistical modelling and unsupervised learning, along with some supervised learning techniques. It presents probabilistic approaches to modelling and their relation to coding theory and Bayesian statistics. A variety of latent variable models will be covered, including mixture models (used for clustering), dimensionality reduction methods, time series models such as hidden Markov models (used in speech recognition and bioinformatics), Gaussian process models, independent component analysis, hierarchical models, and nonlinear models. The course will present the foundations of probabilistic graphical models (e.g. Bayesian networks and Markov networks) as an overarching framework for unsupervised modelling. We will cover Markov chain Monte Carlo sampling methods and variational approximations for inference. Time permitting, students will also learn about other topics in probabilistic (or Bayesian) machine learning.
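As a taste of the latent variable models covered in the first module, the sketch below fits a two-component Gaussian mixture with the EM algorithm. It is written in Python/NumPy rather than the Matlab/Octave used in the assignments, and all function and variable names are illustrative, not part of the course materials.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """Fit a two-component 1-D Gaussian mixture to data x with EM (illustrative sketch)."""
    # Initialise: components at the data extremes, equal weights, full data variance
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each data point
        dens = (pi / np.sqrt(2 * np.pi * var)) * \
               np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances from weighted data
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Synthetic data: two well-separated clusters
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5, 1, 200), rng.normal(5, 1, 200)])
pi, mu, var = em_gmm_1d(x)
print(np.sort(mu))  # recovered component means, near the true values -5 and 5
```

The E-step and M-step alternate until the parameters stop changing; each iteration is guaranteed not to decrease the data log-likelihood, a property derived in the lectures.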

The complete course forms a component of the Gatsby PhD programme, and is mandatory for Gatsby students. The two component modules are also available to students on the UCL MSc in Machine Learning and UCL MSc in Computational Statistics and Machine Learning. The first part (GI18: Probabilistic and Unsupervised Learning) may be used to fill a core requirement in each Masters programme. The second part (GI16: Approximate Inference and Learning in Probabilistic Models) is an optional module, but is only available to students who have completed GI18.

Students, postdocs and faculty from outside the unit and these programmes are welcome to attend, but should contact the unit in advance.

Prerequisites

A good background in statistics, calculus, linear algebra, and computer science. You should thoroughly review the maths in the following cribsheet [pdf] [ps] before the start of the course. You must either know Matlab or Octave, be taking a class on Matlab/Octave, or be willing to learn it on your own. Any student or researcher at UCL meeting these requirements is welcome to attend the lectures.

Textbooks

There is no required textbook. However, the following are excellent sources for many of the topics covered here.

David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press. (also available online)
Christopher M. Bishop (2006) Pattern Recognition and Machine Learning, Springer-Verlag.
Reading material

Specific recommendations for reading can be found here.

Lecture schedule and slides
GI18
4/10: Introduction/Foundations
7/10: Foundations
11/10: Latent Variable Models
14/10: LV Contd. & The EM algorithm
18/10: EM Contd.
20/10: Time Series models
25/10: Time Series models
28/10: Graphical Models
1/11: Graphical Models
3/11: Exact Bayes
8/11: Gaussian Processes

GI16 (subject to change)
8/11: Hierarchical and Nonlinear Models
11/11: Hierarchical and Nonlinear Models
15/11: Variational Approximations
17/11: Belief Propagation
22/11: Convex Approximations
25/11: Monte Carlo Approximations
29/11: Monte Carlo Approximations Contd.
2/12: Expectation Propagation
6/12: (no lecture)
9/12: (no lecture)
13/12: Review
17/12: Review Contd.
Assignments

50% of the total mark for each module is based on coursework assignments; the other 50% is based on the final written examination.

All assignments (coursework) are to be handed in to the Gatsby Unit, not to the CS department. Please hand in all assignments at the beginning of lecture on the due date to the lecturer or to the TAs. Late assignments will be penalised. If you are unable to attend the lecture, you can also hand in assignments to Ms. Rachel Howes at the Alexandra House 4th floor reception.

Late Assignment Policy: Late assignments will be penalised at 10% per weekday late, until the answers are discussed in a review session. NO CREDIT will be given for assignments handed in after the answers are discussed in the review session.

Assignment 1        [binarydigits.txt, bindigit.m] due 18 Oct 2010
Assignment 2        [geyser.txt] due 3 Nov 2010
Assignment 3        [co2.txt] due 17 Nov 2010
Assignment 4        [images.jpg, genimages.m, Mstep.m] due 1 Dec 2010
Assignment 5        [message.txt, symbols.txt, code.tgz] due 17 Dec 2010
Assignment 6        due 17 Jan 2011

To attend: please contact the unit.