
Optimality of regularized least squares ranking with imperfect kernels.

Authors :
He, Fangchao
Zeng, Yu
Zheng, Lie
Wu, Qiang
Source :
Information Sciences. Apr. 2022, Vol. 589, pp. 564-579. 16 pp.
Publication Year :
2022

Abstract

Ranking is a central machine learning task arising from supervised learning and has many important applications in information retrieval. Many ranking approaches have been designed and extensively studied. In this paper we study two kernel ranking algorithms, regularized least squares ranking and its bias-corrected version, which learn scoring functions from reproducing kernel Hilbert spaces (RKHSs). We say a kernel is perfect for ranking if an optimal scoring function lies in the associated RKHS; it is imperfect if an optimal scoring function can only be approximated by functions in the RKHS. The two regularized kernel ranking algorithms have been shown to be theoretically justified and empirically effective when the kernel is chosen to be perfect. In practice, however, kernels are usually selected via a cross-validation process, and more often than not the chosen kernel is imperfect. We develop a novel leave-two-out analysis technique to evaluate the generalization performance of the two regularized ranking algorithms in this situation. We show that they remain effective and can achieve the capacity-independent optimal convergence rates. [ABSTRACT FROM AUTHOR]
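For orientation, the display below sketches the standard formulation of kernel regularized least squares ranking found in the literature; it is not taken from the article itself, and the notation (sample z, kernel K, RKHS H_K, regularization parameter lambda) is assumed rather than quoted from the paper.

% A standard (assumed) formulation of regularized least squares ranking:
% given a sample z = {(x_i, y_i)}_{i=1}^n and an RKHS H_K with kernel K,
% the scoring function minimizes the empirical pairwise least squares risk
% plus an RKHS-norm penalty with regularization parameter lambda > 0.
\[
  f_{z,\lambda} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
  \; \frac{1}{n(n-1)} \sum_{i \neq j}
  \bigl( (y_i - y_j) - \bigl(f(x_i) - f(x_j)\bigr) \bigr)^2
  \;+\; \lambda \, \| f \|_K^2 .
\]

Because the loss depends on f only through its values at the sample points and the penalty is the RKHS norm, a representer-theorem argument gives a minimizer that is a finite kernel expansion over the sample, which is what makes such estimators computable; whether the kernel is perfect or imperfect, in the sense defined in the abstract, determines how well this minimizer can approach an optimal scoring function.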

Details

Language :
English
ISSN :
0020-0255
Volume :
589
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
155090874
Full Text :
https://doi.org/10.1016/j.ins.2021.12.087