I have recently started a post-doc at Microsoft Research Cambridge. I am a former research student of Zoubin Ghahramani, and
I am interested in many aspects of machine learning and Bayesian
statistics.
Research Areas
I have worked on various extensions to Gaussian processes for
regression, in collaboration with Zoubin and Carl Rasmussen. We
had a paper at NIPS 2003 titled Warped Gaussian Processes,
available in ps or pdf format. Matlab code is available.
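The core idea of a warped GP is to learn a monotonic transformation of the observations so that a standard GP fits well in the transformed space. As a minimal sketch (not the paper's method: the warp here is a fixed log transform rather than the learned parametric warp, and the kernel hyperparameters are hand-set):

```python
import numpy as np

# Toy 1-D data with positive, skewed targets (hypothetical example).
rng = np.random.default_rng(0)
X = np.linspace(0.1, 2.0, 30)[:, None]
y = np.exp(np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(30))

def rbf(A, B, ell=0.5, sf=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d = (A[:, None, 0] - B[None, :, 0]) ** 2
    return sf**2 * np.exp(-0.5 * d / ell**2)

# Warp the observations: a fixed log warp here; the paper instead learns
# a parametric monotonic warp jointly with the GP hyperparameters.
z = np.log(y)

# Standard GP regression in the warped space.
K = rbf(X, X) + 1e-2 * np.eye(len(X))    # kernel matrix plus noise variance
Xs = np.array([[1.0]])                   # a single test input
Ks = rbf(Xs, X)
alpha = np.linalg.solve(K, z)
mean_z = Ks @ alpha                      # predictive mean in warped space
median_y = np.exp(mean_z)                # warp back: predictive median in y-space
```

Because the inverse warp is nonlinear, pushing the warped-space Gaussian back through it gives a non-Gaussian, skewed predictive distribution in the original observation space, which is the point of the model.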
I have also been looking into approximations for Bayesian
predictive distributions, as an alternative to the traditional
approach of approximating posterior distributions on parameters. We
had a paper at ICML 2005 titled Compact
Approximations to Bayesian Predictive Distributions.
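To make the distinction concrete: the Bayesian predictive integrates the likelihood over the parameter posterior, and one can try to summarise that integral compactly rather than summarising the posterior itself. A hedged sketch using conjugate Bayesian linear regression (where the exact predictive is available for comparison; the subsampling here is only an illustration, not the fitting procedure of the paper):

```python
import numpy as np

# Toy Bayesian linear regression (conjugate, so the predictive is exact).
rng = np.random.default_rng(2)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(50)

sn2, sw2 = 0.01, 1.0                     # noise and prior variances
A = X.T @ X / sn2 + np.eye(3) / sw2      # posterior precision
Sigma = np.linalg.inv(A)
mu = Sigma @ X.T @ y / sn2               # posterior mean

xs = np.array([0.3, -0.1, 0.8])          # a test input
exact_mean = xs @ mu                     # exact predictive mean

# "Compact" alternative: represent the predictive by a small mixture over
# a handful of parameter settings instead of the full posterior (the paper
# fits such compact approximations directly; this sketch just subsamples).
theta = rng.multivariate_normal(mu, Sigma, size=5)
compact_mean = (theta @ xs).mean()
```

The compact representation needs only a few parameter settings at prediction time, which is the practical payoff of approximating the predictive directly.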
My most recent work is a new method for sparse Gaussian process
regression which involves finding the locations of a basis set of
‘pseudo-inputs’ using gradient descent. This enables us to
achieve greater accuracy than other methods and also to find
hyperparameters in one smooth joint optimization. We had a paper at
NIPS 2005 titled Sparse Gaussian Processes
using Pseudo-inputs. The talk
from the conference is also available. These two documents present the
work in rather different ways, and should complement each other.
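The SPGP predictive can be sketched in a few lines: the full kernel matrix is replaced by a low-rank term through M pseudo-inputs plus a diagonal correction, so predictions cost O(NM²) rather than O(N³). A minimal version (the pseudo-inputs are left at a random initialisation here, whereas the paper optimises their locations by gradient ascent on the marginal likelihood; hyperparameters are hand-set):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 100, 10
X = np.sort(rng.uniform(-3, 3, N))[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)

def rbf(A, B, ell=1.0, sf=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d = (A[:, None, 0] - B[None, :, 0]) ** 2
    return sf**2 * np.exp(-0.5 * d / ell**2)

# Pseudo-inputs: initialised to a random subset of the data; the paper
# then optimises their locations jointly with the hyperparameters.
Xu = X[rng.choice(N, M, replace=False)]

sn2 = 0.01                               # noise variance
Kmm = rbf(Xu, Xu) + 1e-6 * np.eye(M)     # jitter for numerical stability
Knm = rbf(X, Xu)
knn = np.full(N, 1.0)                    # diag of Knn for this kernel (sf=1)

# Low-rank Nystrom diagonal plus the per-point correction that
# distinguishes SPGP from earlier sparse approximations.
Qnn_diag = np.einsum('nm,mk,nk->n', Knm, np.linalg.inv(Kmm), Knm)
Lam = np.diag(knn - Qnn_diag + sn2)

# Predictive mean at test points via an M x M system.
Xs = np.linspace(-3, 3, 50)[:, None]
Ksm = rbf(Xs, Xu)
A = Kmm + Knm.T @ np.linalg.solve(Lam, Knm)
mu = Ksm @ np.linalg.solve(A, Knm.T @ np.linalg.solve(Lam, y))
```

Because the pseudo-input locations enter the marginal likelihood smoothly, they can be treated as extra hyperparameters and optimised together with the kernel parameters, which is what makes the joint optimisation in the paper possible.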
We have extended the Sparse Pseudo-input Gaussian Process (SPGP)
to include automatic supervised dimensionality reduction, allowing
high-dimensional data sets to be tackled. We have also been working on
a version that can model heteroscedastic (input-dependent) noise. (Paper at UAI
2006; available here).
We have developed a combined local and global Gaussian process
approximation based on the framework
of Joaquin Quiñonero
Candela and Carl
Rasmussen. (Paper at AISTATS 2007;
available here).
My PhD thesis on Gaussian processes is available here. It contains some
new material not currently published elsewhere.
Competitions
I have participated in several supervised learning challenges where
the objective is to give predictions together with uncertainty
estimates.