Variational Inference for Bayesian Mixtures of Factor Analysers
Zoubin Ghahramani and Matthew Beal
Gatsby Computational Neuroscience Unit
University College London

In Advances in Neural Information Processing Systems 12, MIT Press, Cambridge, MA

Abstract

We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e. the number of factors in each factor analyser). Alternatively, it can be used to infer posterior distributions over the number of components and dimensionalities. Since all parameters are integrated out, the method is not prone to overfitting. Using a stochastic procedure for adding components, it is possible to perform the variational optimisation incrementally and to avoid local maxima. Results show that the method works very well in practice and correctly infers the number and dimensionality of nontrivial synthetic examples.
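
As a brief sketch of the quantity being optimised (the notation below is generic to variational Bayesian model selection and is assumed here, not quoted from the paper): for data Y, hidden variables X, parameters \theta, and a candidate structure m (a number of components together with their local dimensionalities), the variational approach maximises a lower bound \mathcal{F} on the log evidence,

  \ln p(Y \mid m) = \ln \int p(Y, X, \theta \mid m) \, dX \, d\theta
    \;\ge\; \int q(X, \theta) \, \ln \frac{p(Y, X, \theta \mid m)}{q(X, \theta)} \, dX \, d\theta
    \;\equiv\; \mathcal{F}(q).

Comparing (or averaging over) candidate structures by their bounds \mathcal{F} is what allows the number of components and the per-component dimensionalities to be inferred rather than fixed in advance.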

By importance sampling from the variational approximation we show how to obtain unbiased estimates of the true evidence, the exact predictive density, and the KL divergence between the variational posterior and the true posterior, not only in this model but for variational approximations in general.
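
A minimal sketch of the estimator this refers to (again with assumed, generic notation): writing q(\theta) for the variational posterior and p(Y, \theta) for the joint, samples \theta^{(i)} \sim q(\theta) give an unbiased importance sampling estimate of the evidence,

  p(Y) = \int p(Y, \theta) \, d\theta
    = \mathbb{E}_{q(\theta)}\!\left[ \frac{p(Y, \theta)}{q(\theta)} \right]
    \;\approx\; \frac{1}{N} \sum_{i=1}^{N} \frac{p(Y, \theta^{(i)})}{q(\theta^{(i)})}.

The same importance weights w_i = p(Y, \theta^{(i)}) / q(\theta^{(i)}) also yield estimates of the predictive density and of \mathrm{KL}(q \,\|\, p(\theta \mid Y)), since that divergence equals \ln p(Y) - \mathcal{F}(q).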


Download:  ps or pdf