Improved multiclass feature selection via list combination.

Authors :
Izetta, Javier
Verdes, Pablo F.
Granitto, Pablo M.
Source :
Expert Systems with Applications. Dec 2017, Vol. 88, p205-216. 12p.
Publication Year :
2017

Abstract

Feature selection is a crucial machine learning technique aimed at reducing the dimensionality of the input space. By discarding useless or redundant variables, it not only improves model performance but also facilitates interpretability. The well-known Support Vector Machines–Recursive Feature Elimination (SVM-RFE) algorithm provides good performance with moderate computational effort, in particular for wide datasets. When using SVM-RFE on a multiclass classification problem, the usual strategy is to decompose it into a series of binary ones and to generate an importance statistic for each feature on each binary problem. These importances are then averaged over the set of binary problems to synthesize a single value for feature ranking. In some cases, however, this procedure can lead to poor selections. In this paper we discuss six new strategies, based on list combination, designed to yield improved selections starting from the importances given by the binary problems. We evaluate them on artificial and real-world datasets, using both One–Vs–One (OVO) and One–Vs–All (OVA) strategies. Our results suggest that the OVO decomposition is most effective for feature selection on multiclass problems. We also find that in most situations the new K-First strategy can find better subsets of features than the traditional weight-average approach. [ABSTRACT FROM AUTHOR]
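
For readers unfamiliar with the baseline the paper improves upon, the sketch below illustrates the traditional weight-average strategy for multiclass SVM-RFE under a One–Vs–One decomposition, as summarized in the abstract. It is an assumed reconstruction using scikit-learn, not the authors' implementation; the function name ovo_svm_rfe, the single-feature elimination step, and the LinearSVC settings are illustrative choices.

```python
# Minimal sketch (assumption): multiclass SVM-RFE with OVO decomposition and
# the traditional weight-average ranking described in the abstract.
from itertools import combinations

import numpy as np
from sklearn.svm import LinearSVC


def ovo_svm_rfe(X, y, n_features_to_keep=10):
    """Average |w| over all binary OVO sub-problems, then recursively
    eliminate the lowest-ranked remaining feature."""
    classes = np.unique(y)
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_features_to_keep:
        importances = []
        for a, b in combinations(classes, 2):
            mask = np.isin(y, [a, b])            # samples of the binary sub-problem
            clf = LinearSVC(C=1.0, max_iter=5000).fit(X[mask][:, remaining], y[mask])
            importances.append(np.abs(clf.coef_).ravel())  # one importance per feature
        # Traditional strategy: average each feature's importance over the
        # set of binary problems to synthesize a single ranking value.
        avg_importance = np.mean(importances, axis=0)
        worst = int(np.argmin(avg_importance))
        del remaining[worst]                      # drop the least important feature
    return remaining
```

The list-combination strategies proposed in the paper (e.g. K-First) replace the averaging step above with rules that operate on the per-problem rankings rather than on averaged weights.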

Details

Language :
English
ISSN :
0957-4174
Volume :
88
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
124473743
Full Text :
https://doi.org/10.1016/j.eswa.2017.06.043