
Yee Whye Teh
Gatsby Computational Neuroscience Unit, UCL, UK
Wednesday 14 February 2007, 16:00
Seminar Room B10 (Basement)
Alexandra House, 17 Queen Square, London, WC1N 3AR
Stick-breaking Construction for the Indian Buffet Process
The Indian Buffet Process (IBP) is a recently proposed latent feature model in which each object is modelled using a potentially unbounded number of binary latent features. It has found a variety of applications, including matrix factorization, causal inference, and psychological choice modelling. However, due to the unbounded nature of the model, standard Markov chain Monte Carlo inference techniques such as Gibbs sampling are cumbersome and inefficient for IBPs. In this talk, I will reformulate the IBP using a stick-breaking construction, and show that this leads to straightforward and efficient MCMC inference for the IBP. Furthermore, we will see that there are interesting and strong connections between the stick-breaking construction for the IBP and the standard stick-breaking construction for the more well-known Dirichlet process.
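To give a flavour of the idea, here is a minimal sketch (not the talk's inference scheme) of the stick-breaking representation of the IBP: feature probabilities are generated as products of Beta(alpha, 1) variables, so they form a strictly decreasing sequence, and each object then possesses each feature independently with the corresponding probability. The truncation to a fixed number of features and all names below are illustrative assumptions.

```python
import numpy as np

def ibp_stick_breaking(alpha, num_features, num_objects, rng):
    """Truncated stick-breaking sketch of the IBP (illustrative only).

    Feature probabilities: mu_k = prod_{i<=k} nu_i with nu_i ~ Beta(alpha, 1),
    so mu_1 >= mu_2 >= ... form a decreasing sequence. Truncating at
    num_features approximates the unbounded model.
    """
    nu = rng.beta(alpha, 1.0, size=num_features)
    mu = np.cumprod(nu)  # decreasing feature probabilities
    # Each object has feature k independently with probability mu[k].
    Z = (rng.random((num_objects, num_features)) < mu).astype(int)
    return mu, Z

rng = np.random.default_rng(0)
mu, Z = ibp_stick_breaking(alpha=2.0, num_features=10, num_objects=5, rng=rng)
```

The decreasing ordering of the `mu` sequence is what makes the construction amenable to slice-sampling-style MCMC, mirroring the role the stick-breaking weights play in the Dirichlet process.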