
Gatsby Computational Neuroscience Unit, CSML, University College London

Contact: arthur.gretton@gmail.com
Gatsby Computational Neuroscience Unit
Sainsbury Wellcome Centre
25 Howland Street
London W1T 4JG UK

Phone
+44 (0)7795 291 705



Arthur Gretton

I am a Reader (Associate Professor) at the Gatsby Computational Neuroscience Unit, part of the Centre for Computational Statistics and Machine Learning at UCL. A short biography is available.

My research focuses on using kernel methods to reveal properties and relations in data. A first application is in measuring distances between probability distributions. These distances can be used to determine the strength of dependence, for example in measuring how strongly two bodies of text in different languages are related; to test for similarities between two datasets, which is useful in attribute matching for databases (that is, automatically finding which fields of two databases correspond); and to test for conditional dependence, which is useful in detecting redundant variables that carry no additional predictive information given the variables already observed. I am also working on applications of kernel methods to inference in graphical models, where the relations between variables are learned directly from training data.
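As a concrete illustration of a kernel distance between distributions, below is a minimal sketch of the (biased) empirical maximum mean discrepancy (MMD) between two samples with a Gaussian kernel. This is an illustrative example only, not code from any of the papers listed below; the bandwidth value and the toy samples are placeholder assumptions.

# Minimal sketch of an empirical MMD^2 estimate with a Gaussian kernel.
# Illustrative only; bandwidth and toy samples are arbitrary placeholders.
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)), evaluated for all pairs
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    # Biased estimate of MMD^2 between samples X ~ P and Y ~ Q:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(500, 2))   # sample from P
    Y = rng.normal(0.5, 1.0, size=(500, 2))   # sample from Q (shifted mean)
    print("MMD^2 estimate:", mmd2_biased(X, Y, sigma=1.0))

In practice (as in the test papers linked below), the kernel bandwidth is chosen by a heuristic such as the median pairwise distance, or optimized to maximize test power, and the statistic is compared against a null distribution to form a two-sample test.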

Recent news

• UAI 2017 Tutorial on representing and comparing probabilities: Slides 1 and Slides 2 (August 2017).
• Some notes on the Cramer GAN, showing that it is a generative moment matching network with a particular kernel, and describing a problem with the critic.
• Density Estimation in Infinite Dimensional Exponential Families in JMLR (July 2017).
• GP-Select: Accelerating EM Using Adaptive Subspace Preselection in Neural Computation (August 2017).
• An efficient density estimator for the infinite dimensional exponential family. Contains a comparison with a score estimator based on autoencoders. Code
• A linear-time kernel goodness-of-fit test. Gives a linear-time test for assessing the quality of a model, compared with a reference sample.
• ICML 2017 paper: linear-time kernel independence test based on covariance of analytic features, with code.
• Criticizing and training generative models using MMD: ICLR 2017 paper and code. Also presented at the adversarial learning workshop at NIPS (see below).

Older news

• Talk slides for the NIPS 2016 workshop on generative adversarial networks (more detailed slides from the Dagstuhl workshop). Adaptive MMD test paper and code; linear-time ME test paper and code.
• NIPS oral presentation: adaptive linear-time two-sample tests, with power matching quadratic-time tests: Paper and code.
• I co-chaired AISTATS 2016 with Christian Robert. It took place from 9 to 11 May 2016 in Cadiz, Spain.
• Slides online for the kernel courses at the Machine Learning Summer Schools in Tuebingen, Cadiz, and Arequipa. See the teaching page.
