Infinite Mixtures of Gaussian Process Experts

Carl Edward Rasmussen, Gatsby Computational Neuroscience Unit, UCL
Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, UCL

We present an extension of the Mixture of Experts (ME) model in which the individual experts are Gaussian Process (GP) regression models. Using an input-dependent adaptation of the Dirichlet Process, we implement a gating network for an infinite number of experts. Inference in this model may be done efficiently using a Markov chain Monte Carlo method based on Gibbs sampling. The model allows the effective covariance function to vary with the inputs, and may handle large datasets -- thus potentially overcoming two of the biggest hurdles with GP models. Simulations show the viability of this approach.
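To make the idea concrete, the following is a minimal toy sketch (not the paper's inference procedure) of a two-expert mixture of GPs: each expert is a standard GP regressor fit to a different region of the input space, and a hypothetical input-dependent gate (here a simple softmax over distances to assumed expert centres, standing in for the Dirichlet Process gating network) mixes their predictive means. All data, length-scales, and centres are illustrative assumptions; the paper instead infers expert assignments by Gibbs sampling.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential covariance between 1-D input arrays a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_predict(x_train, y_train, x_test, ls=1.0, noise=0.1):
    # Standard GP regression predictive mean at the test inputs.
    K = rbf(x_train, x_train, ls) + noise ** 2 * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ls)
    return Ks @ np.linalg.solve(K, y_train)

# Two experts trained on different regions (toy data, purely illustrative).
x1 = np.linspace(-3.0, 0.0, 20); y1 = np.sin(3.0 * x1)   # fast-varying region
x2 = np.linspace(0.0, 3.0, 20);  y2 = 0.5 * x2           # smooth region
x_test = np.linspace(-3.0, 3.0, 7)

# Hypothetical input-dependent gate: softmax over distance to expert centres.
centres = np.array([-1.5, 1.5])
logits = -np.abs(x_test[:, None] - centres[None, :])
gate = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Mixture predictive mean: gate-weighted combination of the experts' means.
# Each expert has its own length-scale, so the effective covariance
# function varies with the input, as described in the abstract.
mu1 = gp_predict(x1, y1, x_test, ls=0.3)
mu2 = gp_predict(x2, y2, x_test, ls=1.0)
mu = gate[:, 0] * mu1 + gate[:, 1] * mu2
print(mu.shape)  # (7,)
```

Because each expert conditions only on its own subset of the data, the per-expert covariance matrices stay small, which is the source of the scalability the abstract mentions.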

Presented at NIPS*2001, appears in Advances in Neural Information Processing Systems 14, MIT Press (2002).
