Optimization of Distributions Differences for Classification.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems. Feb 2019, Vol. 30 Issue 2, p511-523. 13p.
- Publication Year :
- 2019
Abstract
- In this paper, we introduce a new classification algorithm called the optimization of distribution differences (ODD). The algorithm aims to find a transformation from the feature space to a new space where the instances in the same class are as close as possible to one another, whereas the gravity centers of these classes are as far as possible from one another. This aim is formulated as a multiobjective optimization problem that is solved by a hybrid of an evolutionary strategy and the quasi-Newton method. The choice of the transformation function is flexible and could be any continuous space function. We experiment with a linear and a nonlinear transformation in this paper. We show that the algorithm can outperform eight other classification methods, namely naive Bayes, support vector machines, linear discriminant analysis, multilayer perceptrons, decision trees, and $k$-nearest neighbors, as well as two recently proposed classification methods, on 12 standard classification data sets. Our results show that the method is less sensitive to an imbalanced number of instances compared with these methods. We also show that ODD maintains its performance better than the other classification methods on these data sets and hence offers better generalization ability. [ABSTRACT FROM AUTHOR]
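The objective the abstract describes, pulling same-class instances toward one another while pushing class gravity centers apart, can be sketched as a single scalarized loss over a linear transformation, refined with a quasi-Newton step. This is a minimal illustration under stated assumptions, not the paper's actual formulation: the equal-weight scalarization of the two objectives, the toy data, and the use of SciPy's BFGS (in place of the paper's evolutionary-strategy/quasi-Newton hybrid) are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def odd_objective(w_flat, X, y, d_out):
    """Scalarized sketch of the ODD idea: minimize within-class spread
    minus between-class center separation in the transformed space."""
    W = w_flat.reshape(X.shape[1], d_out)
    Z = X @ W  # linear transformation to the new space
    classes = np.unique(y)
    centers = np.array([Z[y == c].mean(axis=0) for c in classes])
    # within-class compactness: squared distance to each class's own center
    within = np.mean([np.sum((Z[y == c] - centers[i]) ** 2)
                      for i, c in enumerate(classes)])
    # between-class separation: sum of pairwise gravity-center distances
    between = sum(np.linalg.norm(centers[i] - centers[j])
                  for i in range(len(classes))
                  for j in range(i + 1, len(classes)))
    # equal trade-off weight is an assumption; the paper treats this
    # as a multiobjective problem rather than a fixed scalarization
    return within - between

# toy two-class data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 3)),
               rng.normal(3.0, 1.0, (20, 3))])
y = np.array([0] * 20 + [1] * 20)

# quasi-Newton (BFGS) refinement from a random start; the paper's hybrid
# would seed this local step with an evolutionary-strategy candidate
w0 = rng.normal(size=3 * 2)
res = minimize(odd_objective, w0, args=(X, y, 2), method="BFGS")
```

After optimization, `res.x.reshape(3, 2)` is the learned linear map; classification could then proceed in the transformed space, e.g. by nearest gravity center.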
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 30
- Issue :
- 2
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 134278837
- Full Text :
- https://doi.org/10.1109/TNNLS.2018.2844723