
Towards Optimal Sobolev Norm Rates for the Vector-Valued Regularized Least-Squares Algorithm

Authors:
Li, Zhu
Meunier, Dimitri
Mollenhauer, Mattes
Gretton, Arthur
Publication Year:
2023

Abstract

We present the first optimal rates for infinite-dimensional vector-valued ridge regression on a continuous scale of norms that interpolate between $L_2$ and the hypothesis space, which we consider as a vector-valued reproducing kernel Hilbert space. These rates allow us to treat the misspecified case, in which the true regression function is not contained in the hypothesis space. We combine standard assumptions on the capacity of the hypothesis space with a novel tensor product construction of vector-valued interpolation spaces in order to characterize the smoothness of the regression function. Our upper bound not only attains the same rate as real-valued kernel ridge regression, but also removes the assumption that the target regression function is bounded. For the lower bound, we reduce the problem to the scalar setting using a projection argument. We show that these rates are optimal in most cases and independent of the dimension of the output space. We illustrate our results for the special case of vector-valued Sobolev spaces.

Comment: Published JMLR version. arXiv admin note: text overlap with arXiv:2208.01711
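The estimator the abstract studies, vector-valued regularized least-squares (kernel ridge regression), has a simple closed form when the output space is finite-dimensional and the operator-valued kernel is a scalar kernel times the identity: one solves a single regularized linear system shared across all output coordinates. The sketch below is a hypothetical finite-dimensional illustration under those simplifying assumptions (Gaussian kernel, 2-dimensional output), not the paper's infinite-dimensional setting; all function names and parameter choices are ours.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def krr_fit(X, Y, lam=1e-6, sigma=1.0):
    # Closed-form ridge solution: alpha solves (K + n*lam*I) alpha = Y.
    # Each column of Y is one output coordinate; with a kernel of the
    # form k(x, x') * Identity, the same system serves every coordinate.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), Y)

def krr_predict(X_train, alpha, X_test, sigma=1.0):
    # Prediction f(x) = sum_i k(x, x_i) alpha_i (vector-valued).
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
Y = np.hstack([np.sin(np.pi * X), np.cos(np.pi * X)])  # 2-dim output

alpha = krr_fit(X, Y, lam=1e-6)
Y_hat = krr_predict(X, alpha, X)
train_err = np.max(np.abs(Y_hat - Y))
print("max training error:", train_err)
```

In practice the regularization parameter `lam` governs the bias-variance trade-off that the paper's rates quantify; the misspecified regimes it analyzes concern targets outside the RKHS, which this interpolating toy example does not exhibit.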

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.07186
Document Type:
Working Paper