
Data-Dependent Generalization Bounds for Multi-Class Classification.

Authors :
Lei, Yunwen
Dogan, Urun
Zhou, Ding-Xuan
Kloft, Marius
Source :
IEEE Transactions on Information Theory. May 2019, Vol. 65, Issue 5, p2995-3021. 27p.
Publication Year :
2019

Abstract

In this paper, we study data-dependent generalization error bounds that exhibit a mild dependency on the number of classes, making them suitable for multi-class learning with a large number of label classes. The bounds generally hold for empirical multi-class risk minimization algorithms using an arbitrary norm as the regularizer. Key to our analysis are new structural results for multi-class Gaussian complexities and empirical $\ell_\infty$-norm covering numbers, which exploit the Lipschitz continuity of the loss function with respect to the $\ell_2$- and $\ell_\infty$-norms, respectively. We establish data-dependent error bounds in terms of the complexities of a linear function class defined on a finite set induced by the training examples, for which we show tight lower and upper bounds. We apply the results to several prominent multi-class learning machines and show a tighter dependency on the number of classes than the state of the art. For instance, for the multi-class support vector machine of Crammer and Singer (2002), we obtain a data-dependent bound with a logarithmic dependency, a significant improvement over the previous square-root dependency. Experimental results are reported to verify the effectiveness of our theoretical findings.
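For readers unfamiliar with the learning machine the abstract highlights, the following is a minimal sketch (not taken from the paper) of the standard Crammer–Singer multi-class hinge loss, $\ell(W; x, y) = \max\bigl(0,\, 1 + \max_{y' \neq y} \langle w_{y'}, x\rangle - \langle w_y, x\rangle\bigr)$, whose empirical risk the cited SVM minimizes; the variable names `W`, `x`, `y` are illustrative choices.

```python
def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def cs_hinge_loss(W, x, y):
    """Crammer-Singer multi-class hinge loss (illustrative sketch).

    W : list of per-class weight vectors, one per label class.
    x : feature vector.
    y : index of the correct class.
    """
    scores = [dot(w, x) for w in W]
    # Strongest competing class score among the incorrect labels.
    runner_up = max(s for k, s in enumerate(scores) if k != y)
    # Loss is zero only when the correct class wins by margin >= 1.
    return max(0.0, 1.0 + runner_up - scores[y])

# Example: three classes in two dimensions.
W = [[2.0, 0.0], [1.0, 0.0], [0.0, 0.0]]
x = [1.0, 0.0]
print(cs_hinge_loss(W, x, 0))  # correct class wins by margin 1 -> 0.0
print(cs_hinge_loss(W, x, 1))  # margin violated -> 2.0
```

The bound discussed in the abstract concerns how the generalization error of minimizing this risk scales with the number of weight vectors in `W`, i.e., the number of classes.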

Details

Language :
English
ISSN :
0018-9448
Volume :
65
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
136101311
Full Text :
https://doi.org/10.1109/TIT.2019.2893916