I have worked on various extensions to Gaussian processes for
regression, in collaboration with Zoubin Ghahramani and Carl Rasmussen. We
had a paper at NIPS 2003 titled Warped Gaussian Processes,
available in ps or pdf format. Matlab code is available.
My most recent work is a new method for sparse Gaussian process
regression which involves finding the locations of a basis set of
‘pseudo-inputs’ using gradient descent. This enables us to
achieve greater accuracy than other methods and also to find
hyperparameters in one smooth joint optimization. We had a paper at
NIPS 2005 titled Sparse Gaussian Processes
using Pseudo-inputs. The talk
from the conference is also available. These two documents present the
work in rather different ways, and should complement each other.
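To make the pseudo-input idea concrete, here is a minimal numpy sketch of the SPGP predictive mean with a *fixed* set of pseudo-inputs, assuming a squared-exponential kernel with unit signal variance; the function names, toy data, and kernel choice are illustrative, and the paper's key step (optimizing the pseudo-input locations and hyperparameters jointly by gradient descent on the marginal likelihood) is not shown here.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    # Squared-exponential kernel (unit signal variance) between row sets A, B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def spgp_predict(X, y, Xbar, Xstar, noise_var=0.01):
    # SPGP predictive mean: the full N x N covariance is replaced by a
    # low-rank term through M pseudo-inputs Xbar plus a diagonal correction.
    M = Xbar.shape[0]
    Kmm = rbf(Xbar, Xbar) + 1e-8 * np.eye(M)   # jitter for stability
    Knm = rbf(X, Xbar)
    Ksm = rbf(Xstar, Xbar)
    # diag of Qnn = Knm Kmm^{-1} Kmn
    Kmm_inv = np.linalg.inv(Kmm)
    q_diag = np.einsum('nm,mk,nk->n', Knm, Kmm_inv, Knm)
    # Lambda = diag(Knn - Qnn) + noise; k(x, x) = 1 for this kernel
    lam = 1.0 - q_diag + noise_var
    A = Kmm + (Knm / lam[:, None]).T @ Knm     # Kmm + Kmn Lam^{-1} Knm
    return Ksm @ np.linalg.solve(A, (Knm / lam[:, None]).T @ y)

# Toy usage: pseudo-inputs here sit on a fixed grid; in the actual method
# they are free parameters tuned together with the hyperparameters.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
Xbar = np.linspace(-3, 3, 10)[:, None]
Xstar = np.linspace(-2.5, 2.5, 5)[:, None]
mu = spgp_predict(X, y, Xbar, Xstar)
```

The cost per prediction is dominated by the M x M solve rather than an N x N one, which is what makes the approximation sparse: M can be much smaller than N once the pseudo-input locations are well placed.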
We have extended the Sparse Pseudo-input Gaussian Process (SPGP)
to include automatic supervised dimensionality reduction, allowing
high-dimensional data sets to be tackled. We have also been working on
a version that can model heteroscedastic noise. (Paper at UAI
2006; available here).