
Total stability of kernel methods

Authors:
Dao-Hong Xiang
Andreas Christmann
Ding-Xuan Zhou
Source:
Neurocomputing, 289:101-118
Publication Year:
2018
Publisher:
Elsevier BV

Abstract

Regularized empirical risk minimization using kernels and their corresponding reproducing kernel Hilbert spaces (RKHSs) plays an important role in machine learning. However, the kernel actually used often depends on one or a few hyperparameters, or it is data dependent in a much more complicated manner; examples are Gaussian RBF kernels, kernel learning, and the hierarchical Gaussian kernels recently proposed for deep learning. The kernel actually used is therefore often computed by a grid search or in an iterative manner and can usually only be considered an approximation to the “ideal” or “optimal” kernel. This paper gives conditions under which classical kernel-based methods, built on a convex Lipschitz loss function and a bounded, smooth kernel, are stable when the probability measure P, the regularization parameter λ, and the kernel K all change slightly and simultaneously. Similar results are given for pairwise learning. The topic of this paper is therefore somewhat more general than in classical robust statistics, where usually only the influence of small perturbations of the probability measure P on the estimated function is considered.
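To make the setting concrete, the following is a minimal LaTeX sketch of the regularized risk minimizer and of the kind of simultaneous-perturbation bound the abstract refers to. The notation (L, H_K, f_{P,λ,K}) and the exact shape of the bound are standard choices made here for illustration and are not quoted from the paper; the paper states its own norms, metrics, and constants.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch only: the notation and the shape of the bound are illustrative assumptions.
The regularized risk minimizer over the RKHS $H_K$ of a kernel $K$ is
\begin{equation*}
  f_{P,\lambda,K}
  = \operatorname*{arg\,min}_{f \in H_K}
    \mathbb{E}_{(X,Y)\sim P}\, L\bigl(X, Y, f(X)\bigr) + \lambda\,\|f\|_{H_K}^{2},
\end{equation*}
where $L$ is a convex, Lipschitz continuous loss and $\lambda > 0$ is the
regularization parameter. Total stability asks, roughly, for a bound of the form
\begin{equation*}
  \bigl\| f_{P,\lambda,K} - f_{\tilde P,\tilde\lambda,\tilde K} \bigr\|_{\infty}
  \le c\Bigl( d(P,\tilde P) + |\lambda - \tilde\lambda| + \|K - \tilde K\|_{\infty} \Bigr),
\end{equation*}
so that the estimated function changes only slightly when the distribution $P$,
the parameter $\lambda$, and the kernel $K$ are all perturbed simultaneously;
here $d(\cdot,\cdot)$ denotes a suitable metric on probability measures.
\end{document}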

Details

ISSN:
0925-2312
Volume:
289
Database:
OpenAIRE
Journal:
Neurocomputing
Accession number:
edsair.doi...........9910e741aa4d17dcc8424810b82dd63a