
A Linear Incremental Nyström Method for Online Kernel Learning

Authors:
Xiao Zhang
Shan Xu
Shizhong Liao
Source:
ICPR
Publication Year:
2018
Publisher:
IEEE, 2018.

Abstract

Although the incremental Nyström method has been used for kernel approximation, it is unsuitable for online kernel learning because of its cubic time complexity and its lack of theoretical guarantees. In this paper, we propose a novel incremental Nyström method that runs in linear time with respect to the sampling size at each round and enjoys a sublinear regret bound for online kernel learning. We construct the intersection matrix using a ridge leverage score estimator, compute the rank-$k$ approximation of the intersection matrix incrementally via incremental singular value decomposition, and recompute the generalized inverse matrix periodically. When applying the proposed incremental Nyström method to online kernel learning, we approximate the kernel matrix using the updated generalized inverse matrix at each round, and derive an explicit feature mapping from the singular value decomposition of the approximated kernel matrix, yielding a linear classifier for online kernel learning at each round. Theoretically, we prove that our incremental Nyström method achieves a $(1+\epsilon)$ relative-error bound for kernel matrix approximation, enjoys a sublinear regret bound for online kernel learning under the online gradient descent method, and reduces the time complexity of the generalized inverse computation from $O(m^{3})$ to $O(mk)$ at each round, where $m$ is the sampling size and $k$ is the truncated rank. Experimental results show that the proposed incremental Nyström method is accurate and efficient for kernel matrix approximation and well suited to online kernel learning.
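
To make the pipeline in the abstract concrete, here is a minimal Python sketch of a rank-$k$ Nyström explicit feature map with ridge-leverage-score landmark sampling, driving an online-gradient-descent linear classifier with the hinge loss. It is not the authors' implementation: all names, the RBF kernel, the toy data, and the hyperparameters are illustrative assumptions, and batch computations (an exact leverage-score formula and a batch SVD) stand in for the paper's incremental estimator, incremental SVD, and periodic generalized-inverse updates.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix k(x, y) = exp(-gamma * ||x - y||^2) between rows of X and Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def ridge_leverage_scores(K, lam):
    # Ridge leverage scores l_i = [K (K + lam I)^{-1}]_{ii}; computed exactly here,
    # whereas the paper maintains an incremental estimator of these scores.
    n = K.shape[0]
    return np.diag(K @ np.linalg.inv(K + lam * np.eye(n)))

def nystrom_feature_map(landmarks, k, gamma=1.0):
    # Rank-k Nystrom explicit feature map phi(x) = S_k^{-1/2} U_k^T k(L, x), so that
    # phi(x) . phi(y) approximates k(x, y) through the generalized inverse of the
    # rank-k intersection matrix W_k.
    W = rbf_kernel(landmarks, landmarks, gamma)        # intersection matrix
    U, s, _ = np.linalg.svd(W)                         # batch SVD in place of the incremental SVD
    M = U[:, :k] / np.sqrt(np.maximum(s[:k], 1e-12))   # m x k mapping matrix
    return lambda x: rbf_kernel(x[None, :], landmarks, gamma)[0] @ M

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])                   # toy binary labels

m, k, lam, eta = 50, 20, 1e-2, 0.1
K = rbf_kernel(X, X)                                   # batch kernel matrix, for illustration only
p = ridge_leverage_scores(K, lam)
p /= p.sum()
landmarks = X[rng.choice(len(X), size=m, replace=False, p=p)]
phi = nystrom_feature_map(landmarks, k)

# Online gradient descent with the hinge loss on the explicit features: each round
# needs one kernel vector against the m landmarks plus an m x k product, i.e. a
# linear model in the k-dimensional approximate feature space.
w, mistakes = np.zeros(k), 0
for x_t, y_t in zip(X, y):
    z = phi(x_t)
    if y_t * (w @ z) <= 0:
        mistakes += 1
    if y_t * (w @ z) < 1:
        w += eta * y_t * z                             # hinge-loss subgradient step
print(f"online mistakes: {mistakes}/{len(X)}")

In the paper's setting the landmark set, the truncated SVD, and the generalized inverse are maintained incrementally as examples arrive; the batch sketch above only illustrates the roles of the leverage scores, the rank-$k$ truncation, and the resulting linear learner.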

Details

Database:
OpenAIRE
Journal:
2018 24th International Conference on Pattern Recognition (ICPR)
Accession number:
edsair.doi...........ae5b5cc0444b2c93dfde3d82dcfdd20e