Machine Learning II (2007/2008)

6/20 The Hilbert space learning framework

This lecture introduces the Hilbert space approach to learning, which encompasses popular algorithms such as the Support Vector Machine (SVM). We begin with the general principle of regularized risk minimization over function spaces, and find that statistical and computational considerations lead naturally to Hilbert space algorithms such as the SVM.
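As a minimal illustration of the regularized risk minimization principle (a sketch, not taken from the lecture notes): the SVM objective combines the empirical hinge-loss risk with a squared-norm regularizer, and even a plain subgradient descent on this objective recovers a working linear classifier. The data, step sizes, and function names below are illustrative choices.

```python
import numpy as np

# Regularized risk minimization for a linear classifier with hinge loss
# (the linear SVM objective), optimized by subgradient descent:
#   min_w  (lambda/2)||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>)

def train_linear_svm(X, y, lam=0.1, steps=500, lr=0.1):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = y * (X @ w)
        active = margins < 1  # points that violate the margin contribute a subgradient
        grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        w -= lr * grad
    return w

# Toy data: two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

On separable data like this, the learned hyperplane classifies the training set essentially perfectly; the regularizer keeps the weight vector small rather than letting it grow without bound.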

Concepts covered: The statistical learning model. Empirical risk and regularized risk. Reproducing kernel Hilbert spaces (RKHS). The representer theorem. The support vector machine and other Hilbert space learning algorithms. Solving Hilbert space learning problems in the dual. Loss functions: zero-one loss, squared error loss, hinge loss, multi-class loss functions. Surrogate loss functions. Kernels: linear, polynomial, Gaussian RBF, convolution, string kernels, graph kernels, diffusion kernels, etc. Regularization operators corresponding to specific kernels. Input/output space learning. [notes]
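Several of these concepts combine in one compact example (a sketch under illustrative parameter choices, not code from the notes): by the representer theorem, the minimizer of the regularized squared-error risk in the RKHS of a kernel k has the form f(x) = Σ_i α_i k(x_i, x), and the coefficients α solve a linear system in the dual, here with a Gaussian RBF kernel.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix: k(a, b) = exp(-gamma ||a - b||^2)
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=1.0):
    # Representer theorem: f = sum_i alpha_i k(x_i, .);
    # alpha solves the dual system (K + n*lambda*I) alpha = y.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def kernel_ridge_predict(alpha, X_train, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a noisy sine curve with squared-error loss in the RBF-kernel RKHS
rng = np.random.default_rng(1)
X = rng.uniform(0, 2 * np.pi, (50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
alpha = kernel_ridge_fit(X, y)
X_test = np.linspace(0, 2 * np.pi, 100)[:, None]
pred = kernel_ridge_predict(alpha, X, X_test)
err = np.mean((pred - np.sin(X_test[:, 0])) ** 2)
```

Note that the dual formulation only ever touches the data through the kernel matrix K, which is why swapping in a string, graph, or diffusion kernel leaves the algorithm unchanged.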

  • B. Schölkopf and A. Smola: Learning with Kernels. MIT Press, 2002. [amazon]