Workshop on Mathematical Foundations of Learning Theory
Regularization of Kernel Methods by Decreasing the Bandwidth of the Gaussian Kernel
Jean-Philippe Vert (École des Mines)
June 2, 2006
We consider learning algorithms that minimize an empirical risk regularized by the norm in the reproducing kernel Hilbert space of the Gaussian kernel. The conditions on the loss function that ensure Bayes consistency of such methods have recently been studied in the regime where the regularization term vanishes asymptotically as the sample size increases. Here we study a different situation, in which the regularization term does not vanish but the bandwidth of the Gaussian kernel instead decreases with the sample size. We make explicit the asymptotic limit of the function selected by the algorithm, give conditions on the loss function that ensure Bayes consistency, and provide non-asymptotic learning bounds in this setting. In particular, we deduce the consistency of the one-class support vector machine algorithm as a density level set estimator.
(Joint work with Régis Vert.)
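The density level set estimation mentioned above can be illustrated with a small sketch. Everything below is a hypothetical illustration, not the authors' code: it fits a one-class SVM with a Gaussian bandwidth that shrinks with the sample size while the regularization stays fixed; the bandwidth schedule `n**(-1/(d+4))` is an assumed illustrative choice, not the rate from the talk.

```python
# Hypothetical sketch: one-class SVM as a density level set estimator,
# with a Gaussian bandwidth that decreases with the sample size n
# while the regularization (here scikit-learn's nu) is held fixed.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

def level_set_estimate(X, nu=0.1):
    """Fit a one-class SVM whose Gaussian bandwidth shrinks with n.

    The schedule sigma_n = n**(-1/(d+4)) is an illustrative choice,
    not the rate established in the talk.
    """
    n, d = X.shape
    sigma = n ** (-1.0 / (d + 4))        # bandwidth decreases as n grows
    gamma = 1.0 / (2.0 * sigma ** 2)     # scikit-learn's RBF parametrization
    clf = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu)
    clf.fit(X)
    return clf

# Sample from a 2-D Gaussian; points predicted +1 lie inside the
# estimated level set of the density.
X = rng.standard_normal((500, 2))
clf = level_set_estimate(X)
inside = (clf.predict(X) == 1).mean()    # most training mass is inside
```

With `nu=0.1`, at most roughly a 0.1 fraction of the training points are flagged as lying outside the estimated level set, so `inside` is close to 0.9 here.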
Jean-Philippe Vert (École des Mines)
Centre for Computational Biology. Since 2008, Senior Researcher and Adjunct Director, Department of Cancer Computational Genomics, Bioinformatics, Biostatistics and Epidemiology, Institut Curie/Mines ParisTech/INSERM.