
Comparison theorems on large-margin learning.

Authors :
Benabid, Amina
Fan, Jun
Xiang, Dao-Hong
Source :
International Journal of Wavelets, Multiresolution & Information Processing. Sep 2021, Vol. 19, Issue 5, p1-18. 18p.
Publication Year :
2021

Abstract

This paper studies the binary classification problem associated with a family of Lipschitz convex loss functions called large-margin unified machines (LUMs), which offers a natural bridge between distribution-based likelihood approaches and margin-based approaches. LUMs can overcome the so-called data piling issue of support vector machines in the high-dimension, low-sample-size setting, yet their theoretical analysis from the perspective of learning theory is still lacking. In this paper, we establish new comparison theorems for all LUM loss functions, which play a key role in the error analysis of large-margin learning algorithms. Based on the obtained comparison theorems, we further derive learning rates for regularized LUM schemes associated with varying Gaussian kernels, which may be of independent interest. [ABSTRACT FROM AUTHOR]
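For orientation only (the paper itself is not open in this record), the scheme described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes one common parametrization of the LUM loss from the large-margin unified machines literature, phi(u) = 1 - u for u < c/(1+c) and phi(u) = (1/(1+c)) * (a / ((1+c)u - c + a))^a otherwise, with a > 0 and c >= 0, and it fits a Tikhonov-regularized Gaussian-kernel classifier by plain gradient descent on the representer expansion f(x) = sum_j alpha_j K(x, x_j). A comparison theorem of the kind mentioned in the abstract would then bound the excess misclassification risk of sgn(f) by a power of the excess phi-risk of f; the exact constants and exponents are the paper's contribution and are not reproduced here. All function names and hyperparameter values below are illustrative.

```python
# Minimal sketch (not the authors' code): a LUM loss in one common
# parametrization and a regularized Gaussian-kernel classifier fitted
# by gradient descent over the kernel expansion coefficients.
import numpy as np

def lum_loss(u, a=1.0, c=1.0):
    """LUM loss phi(u): hinge-like (1 - u) for small margins,
    smoothly decaying tail for margins u >= c/(1+c)."""
    u = np.asarray(u, dtype=float)
    thresh = c / (1.0 + c)
    # dummy positive base where the tail branch is not selected,
    # to avoid spurious warnings inside np.where
    base = np.where(u < thresh, a, (1.0 + c) * u - c + a)
    tail = (1.0 / (1.0 + c)) * (a / base) ** a
    return np.where(u < thresh, 1.0 - u, tail)

def lum_grad(u, a=1.0, c=1.0):
    """Derivative of the LUM loss with respect to the margin u."""
    u = np.asarray(u, dtype=float)
    thresh = c / (1.0 + c)
    base = np.where(u < thresh, a, (1.0 + c) * u - c + a)
    tail = -(a / base) ** (a + 1.0)
    return np.where(u < thresh, -1.0, tail)

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularized_lum(X, y, sigma=1.0, lam=1e-2, a=1.0, c=1.0,
                        lr=0.1, n_iter=500):
    """Minimize (1/n) sum_i phi(y_i f(x_i)) + lam ||f||_K^2
    over f(x) = sum_j alpha_j K(x, x_j), by gradient descent on alpha."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        margins = y * (K @ alpha)
        # chain rule: empirical-risk term plus gradient of the RKHS penalty
        grad = K @ (y * lum_grad(margins, a, c)) / n + 2.0 * lam * (K @ alpha)
        alpha -= lr * grad
    return alpha

# Toy usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))
alpha = fit_regularized_lum(X, y)
pred = np.sign(gaussian_kernel(X, X) @ alpha)
print("training accuracy:", (pred == y).mean())
```

In this parametrization the two branches meet continuously (with matching derivative -1) at u = c/(1+c); letting c grow recovers a hinge-type (hard-classification) loss, while small c gives a smoother, likelihood-flavored loss, which is the "bridge" the abstract refers to. The varying-Gaussian-kernel analysis in the paper concerns how the bandwidth sigma is allowed to change with the sample size; the fixed sigma above is purely for illustration.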

Details

Language :
English
ISSN :
0219-6913
Volume :
19
Issue :
5
Database :
Academic Search Index
Journal :
International Journal of Wavelets, Multiresolution & Information Processing
Publication Type :
Academic Journal
Accession number :
153048959
Full Text :
https://doi.org/10.1142/S0219691321500156