**Yoshua Bengio**

(http://www.iro.umontreal.ca/~bengioy/)

Monday 2nd July 2012

**Time:** *11am*


B10 Seminar Room, Basement,

Alexandra House, 17 Queen Square, London, WC1N 3AR

__Sampling from Auto-Encoders performing Manifold Learning and Implicit Density Estimation__

Recent work suggests that some auto-encoder variants do a good job of capturing the local manifold structure of the unknown data-generating density, through the opposing forces of reconstruction error and a contractive regularizer. This talk discusses a novel mathematical understanding of this phenomenon and helps define better-justified sampling algorithms for deep learning based on auto-encoder variants. We consider an MCMC in which each step samples from a Gaussian whose mean and covariance matrix depend on the previous state, and which defines, through its asymptotic distribution, a target density. First, we show that good choices (in the sense of consistency) for these mean and covariance functions are the local expected value and local covariance under that target density. Then we show that an auto-encoder with a contractive penalty captures estimators of these local moments in its reconstruction function and its Jacobian. We also show a connection between the local mean and local covariance on the one hand, and the density gradient and density Hessian (with respect to the input) on the other, which makes sense in the context of manifold learning. A contribution of this work is thus a novel alternative to maximum-likelihood density estimation, which we call local moment matching. This also justifies a successful sampling algorithm for the Contractive Auto-Encoder and the Denoising Auto-Encoder, and we show samples obtained on two datasets.
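The sampling procedure described above can be sketched in a few lines: each MCMC step draws from a Gaussian centred at the reconstruction of the previous state. This is a minimal illustrative sketch, not the talk's actual algorithm; the `reconstruct` function below is a hypothetical stand-in for a trained auto-encoder's reconstruction function (here it simply contracts points toward the unit circle, a toy 1-D manifold in 2-D), and an isotropic covariance `sigma**2 * I` replaces the local covariance that the abstract says is estimated from the reconstruction function's Jacobian.

```python
import numpy as np

def sample_from_autoencoder(reconstruct, x0, sigma=0.1, n_steps=1000, rng=None):
    """MCMC sketch: each step samples from a Gaussian whose mean is the
    reconstruction of the previous state. An isotropic covariance
    (sigma**2 * I) stands in for the Jacobian-based local covariance."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    chain = [x]
    for _ in range(n_steps):
        x = reconstruct(x) + sigma * rng.standard_normal(x.shape)
        chain.append(x)
    return np.stack(chain)

# Hypothetical reconstruction function: pulls points halfway toward the
# unit circle, mimicking how a contractive/denoising auto-encoder maps
# inputs toward the data manifold.
def reconstruct(x):
    r = np.linalg.norm(x)
    return x * (0.5 + 0.5 / r) if r > 0 else x

chain = sample_from_autoencoder(reconstruct, x0=[2.0, 0.0],
                                sigma=0.05, n_steps=500, rng=0)
radii = np.linalg.norm(chain, axis=1)
print(chain.shape)                    # chain of 501 two-dimensional states
print(radii[-100:].mean())            # late samples hover near radius 1
```

Because the reconstruction contracts toward the circle while the Gaussian noise pushes away from it, the chain's stationary samples concentrate in a thin band around the manifold, which is the qualitative behaviour the abstract describes.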