
Gatsby Computational Neuroscience Unit CSML University College London

Contact: arthur.gretton@gmail.com
Gatsby Computational Neuroscience Unit
Sainsbury Wellcome Centre
25 Howland Street
London W1T 4JG UK

Phone
+44 (0)7795 291 705


workshops (2016)

  • NIPS 2016 workshop: Adaptive and Scalable Nonparametric Methods in Machine Learning
    Large amounts of high-dimensional data are routinely acquired in scientific fields ranging from biology, genomics and the health sciences to astronomy and economics, thanks to improvements in engineering and data-acquisition techniques. Nonparametric methods allow better modelling of the complex systems underlying data-generating processes than the linear and parametric models traditionally used. From a statistical point of view, scientists now have enough data to reliably fit nonparametric models. From a computational point of view, however, nonparametric methods often do not scale well to big data problems.
    The aim of this workshop is to bring together practitioners, who are interested in developing and applying nonparametric methods in their domains, and theoreticians, who are interested in providing sound methodology. We hope to communicate advances in the development of computational tools for fitting nonparametric models, and to discuss the challenges that currently prevent the application of nonparametric methods to big data problems.
    Web page
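    The scaling issue raised above can be made concrete with a classic nonparametric estimator, Nadaraya-Watson kernel regression, whose naive cost grows with both the training-set and query-set sizes. A minimal NumPy sketch (illustrative only; not material from the workshop):

    ```python
    import numpy as np

    def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.3):
        """Nadaraya-Watson kernel regression: a locally weighted average of y_train.

        Cost is O(len(x_query) * len(x_train)), which is what limits
        naive nonparametric methods on big data.
        """
        # Gaussian weights between every query point and every training point
        w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
        return (w @ y_train) / w.sum(axis=1)

    # Usage: recover a smooth function from noisy samples
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0.0, 2 * np.pi, 200))
    y = np.sin(x) + 0.1 * rng.standard_normal(200)
    xq = np.linspace(0.5, 5.5, 50)
    yq = nadaraya_watson(xq, x, y)   # close to sin(xq) away from the boundaries
    ```

    The quadratic query cost is exactly the kind of bottleneck that motivates the scalable approximations discussed at the workshop.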

workshops (2014)

  • NIPS 2014 Workshop, Modern Nonparametrics 3: Automating the Learning Pipeline
    In theoretical machine learning, much work has focused on the proper tuning of the optimization procedures used to minimize (penalized) empirical risks. In particular, effort has gone into automatically setting important tuning parameters such as ‘learning rates’ and ‘step sizes’. A main aim of this workshop is to cover the approaches proposed so far towards automating the learning pipeline, and the implications of large dataset sizes and high dimensionality for these tuning strategies.
    Web page

  • UCL-Duke Workshop on Sensing and Analysis of High-Dimensional Data
    This workshop, the first European counterpart of the biannual Duke University SAHD Workshop, aims to bring together world-class researchers from mathematics, statistics, computer science and engineering who work at the intersection of computational statistics, machine learning, signal processing, and information and learning theory, with the goal of advancing the sensing, analysis and processing of high-dimensional data.
    Web page

workshops (2013)

  • NIPS 2013 Workshop on Modern Nonparametric Methods in Machine Learning
    Modern data acquisition routinely produces massive and complex datasets: data from high-throughput genomic experiments, climate data from worldwide data centers, robotic control data collected over time in adversarial settings, user-behavior data from social networks, user preferences on online markets, and so forth. Modern pattern recognition problems arising in such disciplines are characterized by large data sizes, a large number of observed variables, and increased pattern complexity. Nonparametric methods, which can handle such complex patterns, are therefore ever more relevant for modern data analysis. The aim of this workshop is to bring together both theoretical and applied researchers to discuss these challenges.
    Web page

workshops (2012)

  • NIPS 2012 Workshop on the Confluence between Kernel Methods and Graphical Models
    This workshop addresses two main research questions: first, how may kernel methods be used to address difficult learning problems for graphical models, such as inference for multi-modal continuous distributions on many variables, and dealing with non-conjugate priors? And second, how might kernel methods be advanced by bringing in concepts from graphical models, for instance by incorporating sophisticated conditional independence structures, latent variables, and prior information?
    Web page

  • NIPS 2012 Workshop on Modern Nonparametric Methods in Machine Learning
    Statistical analysis of big, high-dimensional data has become common in many scientific fields, ranging from biology, genomics and the health sciences to astronomy, economics and machine learning. The aim of this workshop is to bring together practitioners, who work on specialized applications, and theoreticians, who are interested in providing sound methodology. We hope to advertise recent successes of nonparametric methods on a number of large-scale, high-dimensional problems, and to dispel the common belief that nonparametric methods are unsuitable for the challenges arising from big data.
    Web page

  • ICML 2012 Workshop on RKHS and kernel-based methods: theoretical topics and recent advances
    The workshop brings together researchers in probability theory, mathematics, and machine learning who work on RKHS methods. The goals of the workshop are threefold: first, to provide an accessible review and synthesis of classical results in RKHS theory from the point of view of functional analysis, probability theory, and numerical analysis. Second, to cover recent advances in RKHS theory relevant to machine learning (for instance, operator-valued RKHSs, kernels on time series, and kernel embeddings of conditional probabilities). Third, to provide a forum for open problems, to elucidate misconceptions that sometimes occur in the literature, and to discuss technical challenges.
    Web page
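    One of the topics listed above, kernel embeddings of probability distributions, can be illustrated with the maximum mean discrepancy (MMD): the RKHS distance between the mean embeddings of two samples. A minimal NumPy sketch of the biased estimator with a Gaussian kernel (function names are mine, not from the workshop):

    ```python
    import numpy as np

    def gaussian_kernel(X, Y, sigma=1.0):
        """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
        d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
        return np.exp(-d2 / (2.0 * sigma**2))

    def mmd2_biased(X, Y, sigma=1.0):
        """Biased estimate of squared MMD: squared RKHS distance between
        the empirical mean embeddings of the two samples (always >= 0)."""
        Kxx = gaussian_kernel(X, X, sigma)
        Kyy = gaussian_kernel(Y, Y, sigma)
        Kxy = gaussian_kernel(X, Y, sigma)
        return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

    # Usage: samples from the same distribution give a small MMD,
    # samples from a shifted distribution give a larger one
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 2))
    Y = rng.standard_normal((500, 2))
    Z = rng.standard_normal((500, 2)) + 2.0
    print(mmd2_biased(X, Y) < mmd2_biased(X, Z))  # True
    ```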

workshops (2010)

  • NIPS 2010 Workshop on Low-rank Methods for Large-scale Machine Learning
    A growing body of research has been devoted to developing randomized methods that efficiently and accurately generate low-rank approximations to large matrices. This workshop aims to survey recent developments, with an emphasis on usefulness for practical large-scale machine learning problems. Questions to be addressed include: What are the state-of-the-art approximation techniques? How does the heterogeneity of data affect the randomization aspects of these algorithms? Which methods are appropriate for which machine learning tasks? How do these methods work in practice for large-scale machine learning tasks? What is the trade-off between numerical precision and time/space efficiency on the one hand, and performance (e.g., classification or clustering accuracy) on the other?
    Web page
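    As an illustration of the randomized approximation techniques in question, here is a minimal randomized SVD in NumPy, in the spirit of random-projection range finders (a sketch under my own naming, not code from the workshop):

    ```python
    import numpy as np

    def randomized_low_rank(A, k, oversample=10, seed=0):
        """Rank-k approximation of A via a Gaussian sketch.

        Projects A onto a random (k + oversample)-dimensional subspace,
        then computes an exact SVD of the resulting small matrix.
        """
        rng = np.random.default_rng(seed)
        n = A.shape[1]
        Omega = rng.standard_normal((n, k + oversample))   # random test matrix
        Q, _ = np.linalg.qr(A @ Omega)                     # orthonormal basis for the sketched range
        B = Q.T @ A                                        # small (k+oversample) x n matrix
        U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
        U = Q @ U_small
        return U[:, :k], s[:k], Vt[:k, :]

    # Usage: an exactly rank-5 matrix is recovered to near machine precision
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
    U, s, Vt = randomized_low_rank(A, k=5)
    rel_err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
    ```

    The point of the randomization is that only one pass of matrix products against a thin random matrix is needed, rather than a full SVD of A.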

workshops (2009)

  • NIPS 2009 Workshop on Temporal Segmentation
    Data with temporal (or sequential) structure arise in several applications, such as speaker diarization, human action segmentation, network intrusion detection, DNA copy number analysis, and neuron activity modelling, to name a few. The purpose of this workshop is to bring together experts working on temporal segmentation from the statistics, machine learning, and signal processing communities, to address a broad range of applications from robotics to neuroscience, and to define the current and future challenges in the field.
    Videolectures
    Web page

  • NIPS 2009 Workshop on Large-Scale Machine Learning: Parallelism and Massive Datasets
    Prior NIPS workshops have focused on the topic of scaling machine learning, which remains an important developing area. We introduce a new perspective by focusing on how large-scale machine learning algorithms should be informed by future parallel architectures. By bringing together experts in computer architecture, parallel algorithms, scientific computing, and machine learning, we will develop a concrete research agenda for large-scale learning on parallel architectures.
    Videolectures
    Web page

workshops (2004-2008)

  • NIPS 2008 Workshop on Kernel Learning: Automatic Selection of Optimal Kernels
    Videolectures
    Web page

  • NIPS 2007 Workshop on Representations and Inference on Probability Distributions
    Videolectures

  • NIPS 2005 Workshop on Kernel Methods and Structured Domains
    Videolectures

  • NIPS 2004 Workshop on Learning with Structured Outputs
