
Kernel reconstruction learning.

Authors :
Wu, Yun
Xiong, Shifeng
Source :
Neurocomputing. Feb 2023, Vol. 522, p1-10. 10p.
Publication Year :
2023

Abstract

• We propose a class of kernel interpolation-based methods, called kernel reconstruction learning, for solving machine learning problems.
• We prove a reconstruction representer theorem, which indicates that conventional kernel methods can be viewed as special cases of kernel reconstruction learning.
• We propose the kernel reconstruction vector machine, kernel reconstruction logistic regression, and kernel reconstruction density estimation methods, and show that they outperform popular kernel methods.

This paper proposes a class of kernel interpolation-based methods, called kernel reconstruction learning, for solving machine learning problems. Kernel reconstruction learning uses kernel interpolators to reconstruct the unknown functions that need to be estimated in the problem from their estimated values at selected knots. It can be applied to any learning problem that involves function estimation. We prove a reconstruction representer theorem, which indicates that conventional kernel methods, including kernel ridge regression, the kernel support vector machine, and kernel logistic regression, can be viewed as special cases of kernel reconstruction learning. Furthermore, kernel reconstruction learning provides new algorithms for large datasets. The kernel reconstruction vector machine, kernel reconstruction logistic regression, and kernel reconstruction density estimation are discussed in detail. With appropriate implementations, they are shown to have higher prediction/estimation accuracy and/or lower computational cost than popular kernel methods.
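As a rough illustration of the idea described in the abstract (reconstructing an unknown function from its estimated values at a small set of knots via a kernel interpolator), the following NumPy sketch fits a regression function in that style. It is a minimal, Nystrom-style sketch assuming a Gaussian kernel and a ridge-type least-squares fit of the knot values; the names gaussian_kernel, fit_knot_values, and predict, the jitter term, and all parameter settings are illustrative assumptions, not the authors' implementation.

    # Minimal sketch: kernel-interpolation-based regression with selected knots.
    # Hypothetical, Nystrom-style variant; not the paper's exact estimator.
    import numpy as np

    def gaussian_kernel(A, B, gamma=1.0):
        """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    def fit_knot_values(X, y, Z, lam=1e-3, gamma=1.0):
        """Estimate function values c at knots Z so that the kernel
        interpolator f(x) = k(x, Z) K(Z, Z)^{-1} c fits (X, y) in a
        ridge-regularized least-squares sense."""
        Kzz = gaussian_kernel(Z, Z, gamma) + 1e-8 * np.eye(len(Z))  # jitter for stability
        Kxz = gaussian_kernel(X, Z, gamma)
        W = Kxz @ np.linalg.solve(Kzz, np.eye(len(Z)))  # interpolation weights
        c = np.linalg.solve(W.T @ W + lam * np.eye(len(Z)), W.T @ y)
        return c

    def predict(Xnew, Z, c, gamma=1.0):
        """Evaluate the reconstructed function at new inputs."""
        Kzz = gaussian_kernel(Z, Z, gamma) + 1e-8 * np.eye(len(Z))
        Kxz = gaussian_kernel(Xnew, Z, gamma)
        return Kxz @ np.linalg.solve(Kzz, c)

    # Usage: reconstruct a noisy sine curve from m = 15 knots and n = 200 samples.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    Z = np.linspace(-3, 3, 15).reshape(-1, 1)   # selected knots
    c = fit_knot_values(X, y, Z, lam=1e-2, gamma=2.0)
    print(predict(np.array([[0.5]]), Z, c, gamma=2.0))  # roughly sin(0.5)

Because the fit reduces to solving m x m systems over the m knots rather than an n x n system over all n samples, this kind of reconstruction is plausibly where the computational savings for large datasets, mentioned in the abstract, come from.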

Details

Language :
English
ISSN :
0925-2312
Volume :
522
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
161080087
Full Text :
https://doi.org/10.1016/j.neucom.2022.12.015