Dougal J. Sutherland

I'm a postdoc at the Gatsby Computational Neuroscience Unit, University College London, working with Arthur Gretton.

The essentials: CV, ORCID, GPG key / keybase.

Before Gatsby, I did my Ph.D. at Carnegie Mellon University, working with Jeff Schneider on machine learning. See also: various code on GitHub, my Cross Validated / Stack Overflow profiles, and my Swarthmore page for older stuff from undergrad.

Publications and selected talks are listed below.

Publications

Below, ** denotes equal contribution. Also available as a .bib file, and most of these are on Google Scholar.


Preprints

On gradient regularizers for MMD GANs. Michael Arbel, Dougal J. Sutherland, Mikołaj Bińkowski, and Arthur Gretton. Preprint 2018.

Journal and Low-Acceptance-Rate Conference Papers

Demystifying MMD GANs. Mikołaj Bińkowski**, Dougal J. Sutherland**, Michael Arbel, and Arthur Gretton. International Conference on Learning Representations (ICLR) 2018.
Efficient and principled score estimation with Nyström kernel exponential families. Dougal J. Sutherland**, Heiko Strathmann**, Michael Arbel, and Arthur Gretton. Artificial Intelligence and Statistics (AISTATS) 2018. Selected for oral presentation.
Bayesian Approaches to Distribution Regression. Ho Chung Leon Law**, Dougal J. Sutherland**, Dino Sejdinovic, and Seth Flaxman. Artificial Intelligence and Statistics (AISTATS) 2018.
Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy. Dougal J. Sutherland, Hsiao-Yu Tung, Heiko Strathmann, Soumyajit De, Aaditya Ramdas, Alex Smola, and Arthur Gretton. International Conference on Learning Representations (ICLR) 2017.
Dynamical Mass Measurements of Contaminated Galaxy Clusters Using Machine Learning. Michelle Ntampaka, Hy Trac, Dougal J. Sutherland, Sebastian Fromenteau, Barnabás Póczos, and Jeff Schneider. The Astrophysical Journal (ApJ) 831, 2, 135. 2016.
Linear-time Learning on Distributions with Approximate Kernel Embeddings. Dougal J. Sutherland**, Junier B. Oliva**, Barnabás Póczos, and Jeff Schneider. AAAI Conference on Artificial Intelligence (AAAI) 2016.
On the Error of Random Fourier Features. Dougal J. Sutherland and Jeff Schneider. Uncertainty in Artificial Intelligence (UAI) 2015. Chapter 3 / Section 4.1 of my thesis supersedes this paper, fixing a few errors in constants and providing more results.
Active Pointillistic Pattern Search. Yifei Ma**, Dougal J. Sutherland**, Roman Garnett, and Jeff Schneider. Artificial Intelligence and Statistics (AISTATS) 2015.
A Machine Learning Approach for Dynamical Mass Measurements of Galaxy Clusters. Michelle Ntampaka, Hy Trac, Dougal J. Sutherland, Nicholas Battaglia, Barnabás Póczos, and Jeff Schneider. The Astrophysical Journal (ApJ) 803, 2, 50. 2015.
Active learning and search on low-rank matrices. Dougal J. Sutherland, Barnabás Póczos, and Jeff Schneider. Knowledge Discovery and Data Mining (KDD) 2013. Selected for oral presentation.
Nonparametric kernel estimators for image classification. Barnabás Póczos, Liang Xiong, Dougal J. Sutherland, and Jeff Schneider. Computer Vision and Pattern Recognition (CVPR) 2012.
Managing User Requests with the Grand Unified Task System (GUTS). Andrew Stromme, Dougal J. Sutherland, Alexander Burka, Benjamin Lipton, Nicholas Felt, Rebecca Roelofs, Daniel-Elia Feist-Alexandrov, Steve Dini, and Allen Welkie. Large Installation System Administration (LISA) 2012. Work done as part of the Swarthmore College Computer Society.

Dissertations

Scalable, Flexible, and Active Learning on Distributions. Dougal J. Sutherland. Committee: Jeff Schneider, Barnabás Póczos, Maria-Florina Balcan, and Arthur Gretton. Computer Science Department, Carnegie Mellon University. Ph.D. thesis, 2016.
Integrating Human Knowledge into a Relational Learning System. Dougal J. Sutherland. Computer Science Department, Swarthmore College. B.A. thesis, 2011.

Technical Reports, Posters, etc.

Bayesian Approaches to Distribution Regression. Ho Chung Leon Law**, Dougal J. Sutherland**, Dino Sejdinovic, and Seth Flaxman. Learning on Distributions, Functions, Graphs and Groups, NIPS 2017. Selected for oral presentation.
Fixing an error in Caponnetto and de Vito (2007). Dougal J. Sutherland. Technical report 2017.
Understanding the 2016 US Presidential Election using ecological inference and distribution regression with census microdata. Seth Flaxman, Dougal J. Sutherland, Yu-Xiang Wang, and Yee Whye Teh. Technical report 2016.
List Mode Regression for Low Count Detection. Jay Jin, Kyle Miller, Dougal J. Sutherland, Simon Labov, Karl Nelson, and Artur Dubrawski. IEEE Nuclear Science Symposium (IEEE NSS/MIC) 2016.
Deep Mean Maps. Junier B. Oliva**, Dougal J. Sutherland**, Barnabás Póczos, and Jeff Schneider. Technical report 2015.
Linear-time Learning on Distributions with Approximate Kernel Embeddings. Dougal J. Sutherland**, Junier B. Oliva**, Barnabás Póczos, and Jeff Schneider. Feature Extraction: Modern Questions and Challenges, NIPS 2015.
Active Pointillistic Pattern Search. Yifei Ma**, Dougal J. Sutherland**, Roman Garnett, and Jeff Schneider. Bayesian Optimization (BayesOpt), NIPS 2014.
Kernels on Sample Sets via Nonparametric Divergence Estimates. Dougal J. Sutherland, Liang Xiong, Barnabás Póczos, and Jeff Schneider. Technical report 2012.
Finding Representative Objects with Sparse Modeling. Junier B. Oliva, Dougal J. Sutherland, and Yifei Ma. CMU 10-725 Optimization course project 2012. Best poster award.
Grounding Conceptual Knowledge with Spatio-Temporal Multi-Dimensional Relational Framework Trees. Matthew Bodenhamer, Thomas Palmer, Dougal J. Sutherland, and Andrew H. Fagg. Technical report 2012.

Selected talks

Better GANs by using the MMD. June 2018. Facebook AI Research New York. Related papers: Demystifying MMD GANs, On gradient regularizers for MMD GANs.
Efficiently Estimating Densities and Scores with Kernel Exponential Families. June 2018. Gatsby Tri-Center Meeting. Related papers: Efficient and principled score estimation with Nyström kernel exponential families.
Better GANs by using the MMD. June 2018. Machine Learning reading group, Google New York. Related papers: Demystifying MMD GANs, On gradient regularizers for MMD GANs.
Better GANs by using the MMD. June 2018. Machine Learning reading group, Columbia University. Related papers: Demystifying MMD GANs, On gradient regularizers for MMD GANs. No slides were actually used at the talk because of a projector mishap, but they would have been the same as the Google talk.
Advances in GANs based on the MMD. May 2018. Machine Learning Seminar, University of Sheffield. Related papers: Demystifying MMD GANs, On gradient regularizers for MMD GANs.
Efficient and principled score estimation with kernel exponential families. December 2017. Approximating high dimensional functions, Alan Turing Institute. Related papers: Efficient and principled score estimation with Nyström kernel exponential families.
Efficient and principled score estimation with kernel exponential families. December 2017. Computational Statistics and Machine Learning seminar, University College London. Related papers: Efficient and principled score estimation with Nyström kernel exponential families.
Evaluating and Training Implicit Generative Models with Two-Sample Tests. August 2017. Implicit Models, ICML. Related papers: Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy.
Two-Sample Tests, Integral Probability Metrics, and GAN Objectives. April 2017. The Theory of Generative Adversarial Networks, DALI. Related papers: Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy.
Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy. February 2017. Computational Statistics and Machine Learning seminar, Oxford University. Related papers: Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy.