
A generalized Gilbert's algorithm for approximating general SVM classifiers

Authors :
Liu, Zhenbing
Liu, JianGuo
Chen, Zhong
Source :
Neurocomputing. Dec 2009, Vol. 73, Issue 1-3, p219-224. 6p.
Publication Year :
2009

Abstract

Geometric methods provide an intuitive and theoretically solid viewpoint for the solution of many optimization problems in the fields of pattern recognition and machine learning. The support vector machine (SVM) classification is a typical optimization task that has achieved excellent generalization performance in a wide variety of applications. In this paper, the notion of "scaled convex hull" (SCH) is presented, through which nonseparable SVM classification problems can be approximately transformed into separable ones: by a suitable choice of the reduction factor, the initially overlapping SCHs (each generated by the training patterns of one class) can be shrunk until they become separable, and the maximal margin classifier between them can then be trained as an approximation of the standard nonseparable SVM. As a practical application of the SCH framework, the popular Gilbert's algorithm is generalized to solve general (linear and nonlinear, separable and nonseparable) SVM classification problems both accurately and efficiently. The experiments show that the proposed method may achieve better performance than the state-of-the-art methods, an improved sequential minimal optimization and Gilbert's algorithm based on the reduced convex hull (RCH), in terms of the number of kernel evaluations and the execution time. [Copyright © Elsevier]
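The geometric idea the abstract describes can be illustrated with a short sketch. This is not the authors' implementation: the function names, the centroid-based scaling map, and the brute-force Minkowski-difference construction in the usage example are illustrative assumptions. The SCH shrinks each class hull toward its centroid by a reduction factor λ, and Gilbert's algorithm then approximates the minimum-norm point of a convex hull, which for the difference of the two class hulls yields the (approximate) maximum-margin direction:

```python
import numpy as np

def scaled_hull_points(X, lam):
    """Map training points into the scaled convex hull (SCH):
    x' = lam * x + (1 - lam) * centroid, shrinking the hull toward
    its centroid by the reduction factor lam in (0, 1]."""
    m = X.mean(axis=0)
    return lam * X + (1 - lam) * m

def gilbert_min_norm(P, iters=200, tol=1e-12):
    """Gilbert's algorithm (sketch): approximate the minimum-norm
    point of conv(P) by repeated line searches toward support points."""
    w = P[0].astype(float).copy()
    for _ in range(iters):
        # Support point of conv(P) in direction -w: the vertex
        # minimizing <p, w> over the generating points.
        p = P[np.argmin(P @ w)]
        d = p - w
        denom = d @ d
        if denom < tol:  # no further progress possible
            break
        # Closest point to the origin on the segment [w, p].
        t = np.clip(-(w @ d) / denom, 0.0, 1.0)
        w = w + t * d
    return w
```

As a toy usage, the minimum distance between two class hulls equals the minimum norm over the set of pairwise differences {x_i - y_j} (built explicitly here for illustration; a practical solver would query support points instead of enumerating all O(n·m) differences, and would apply `scaled_hull_points` to each class first when the hulls overlap):

```python
A = np.array([[0., 0.], [1., 0.], [0., 1.]])   # class +1
B = np.array([[3., 0.], [4., 0.], [3., 1.]])   # class -1
D = (A[:, None, :] - B[None, :, :]).reshape(-1, 2)
w = gilbert_min_norm(D)   # w points along the max-margin normal
```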

Details

Language :
English
ISSN :
0925-2312
Volume :
73
Issue :
1-3
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
45217119
Full Text :
https://doi.org/10.1016/j.neucom.2009.09.005