
QMCM: Minimizing Vapnik's bound on the VC dimension.

Authors :
Jayadeva
Soman, Sumit
Pant, Himanshu
Sharma, Mayank
Source :
Neurocomputing. Jul 2020, Vol. 399, p. 352-360. 9p.
Publication Year :
2020

Abstract

The recently proposed Minimal Complexity Machine (MCM) learns a hyperplane classifier by minimizing a bound on the Vapnik-Chervonenkis (VC) dimension. Both the linear and kernel versions of the MCM minimize this bound by solving a linear programming problem. This paper proposes a new quadratic programming formulation, termed the Quadratic MCM (QMCM), that minimizes a tighter bound on the VC dimension. We present two variants of the QMCM that differ in the norm of the error vector being minimized. We also explore a scalable variant of the QMCM for large datasets using Stochastic Gradient Descent (SGD), and present the use of the QMCM as a viable feature selection method, in view of the sparse nature of the models it learns. We compare the performance of the QMCM variants with LIBLINEAR, a linear Support Vector Machine (SVM) library, as well as against Pegasos and the linear MCM on large datasets, along with sequential feature selection methods and ReliefF. Our results validate the superiority of the QMCM in terms of statistically significant improvements on benchmark datasets from the UCI Machine Learning Repository.
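
To illustrate the kind of SGD-based training the abstract mentions for the scalable variant, the following minimal Python sketch trains a hyperplane classifier by stochastic gradient descent on a simplified surrogate objective: a quadratic margin term (standing in for the quadratic VC-dimension bound) plus a hinge penalty on classification errors. The function name qmcm_sgd_sketch, the surrogate objective, and the parameters C and lr are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def qmcm_sgd_sketch(X, y, C=1.0, lr=0.01, epochs=50, seed=0):
        # Hypothetical sketch, NOT the paper's formulation: per sample,
        # minimize 0.5 * m^2 (quadratic margin term, a stand-in for the
        # VC-dimension bound) + C * max(0, 1 - m), where
        # m = y_i * (w . x_i + b) and labels y_i are in {-1, +1}.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(epochs):
            for i in rng.permutation(n):
                m = y[i] * (X[i] @ w + b)          # signed functional margin
                gw = m * y[i] * X[i]               # gradient of 0.5 * m^2
                gb = m * y[i]
                if m < 1.0:                        # subgradient of hinge term
                    gw -= C * y[i] * X[i]
                    gb -= C * y[i]
                w -= lr * gw
                b -= lr * gb
        return w, b

An L1 penalty on w (not shown here) is one standard way to promote the sparse weight vectors that the abstract connects to feature selection: features whose learned weights are zero can be discarded.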

Details

Language :
English
ISSN :
0925-2312
Volume :
399
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
143326617
Full Text :
https://doi.org/10.1016/j.neucom.2020.01.062