Optimal linear combination of neural network classifiers based on the minimum classification error criterion.
- Source :
- Systems & Computers in Japan; 8/1/2000, Vol. 31 Issue 9, p39-48, 10p
- Publication Year :
- 2000
-
Abstract
- Focusing on classification problems, this paper presents a new method for linearly combining multiple neural network classifiers based on statistical pattern recognition theory. In our approach, several neural networks are first selected, each of which performs best for a particular class in terms of minimizing classification errors. These networks are then linearly combined to form a classifier that exploits the strengths of the individual classifiers, avoids their weaknesses, and improves upon each of them. In this approach, the minimum classification error (MCE) criterion is used to estimate the optimal linear weights. Because this formulation incorporates the classification decision rule into the cost function, it yields combination weights better suited to the classification objective. Experimental results on artificial and real data sets show that the proposed method constructs a combined classifier that outperforms the best single classifier in terms of overall classification error on test data. © 2000 Scripta Technica, Syst Comp Jpn, 31(9): 39–48, 2000 [ABSTRACT FROM AUTHOR]
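- As a rough illustration of the approach summarized above, the sketch below is not taken from the paper: the function names, the sigmoid smoothing of the error count, and the finite-difference optimizer are all illustrative assumptions. It linearly combines per-class scores from several base classifiers and tunes the combination weights by descending a smoothed classification-error loss in the spirit of the MCE criterion, in which the decision rule (compare the true-class score against the best competing class) is built into the cost function.

```python
# Minimal sketch (not the authors' code) of an MCE-trained linear combination
# of classifier scores. Assumes each base classifier outputs per-class scores.
import numpy as np

def combine(scores, weights):
    """Linearly combine per-classifier class scores.
    scores: (n_samples, n_classifiers, n_classes); weights: (n_classifiers,)."""
    return np.einsum('nkc,k->nc', scores, weights)

def mce_loss(weights, scores, labels, gamma=2.0):
    """Smoothed classification-error loss: the misclassification measure
    d = (best competing class score) - (true-class score) is passed through
    a sigmoid, giving a differentiable surrogate for the 0/1 error."""
    g = combine(scores, weights)                     # (n_samples, n_classes)
    n = len(labels)
    g_true = g[np.arange(n), labels]
    g_comp = g.copy()
    g_comp[np.arange(n), labels] = -np.inf           # mask out the true class
    d = g_comp.max(axis=1) - g_true                  # misclassification measure
    return 1.0 / (1.0 + np.exp(-gamma * d))          # per-sample smoothed error

def fit_weights(scores, labels, lr=0.5, iters=300, eps=1e-4):
    """Estimate combination weights by finite-difference gradient descent
    on the mean MCE loss (a simple stand-in for the paper's optimizer)."""
    k = scores.shape[1]
    w = np.full(k, 1.0 / k)                          # start from equal weights
    for _ in range(iters):
        base = mce_loss(w, scores, labels).mean()
        grad = np.zeros(k)
        for i in range(k):
            w_eps = w.copy()
            w_eps[i] += eps
            grad[i] = (mce_loss(w_eps, scores, labels).mean() - base) / eps
        w -= lr * grad
    return w

# Toy usage with two hypothetical base classifiers, 4 samples, 3 classes.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 2, 3))
labels = np.array([0, 1, 2, 1])
w = fit_weights(scores, labels)
pred = combine(scores, w).argmax(axis=1)
```

- The smoothed loss stands in for the raw error count so the weights can be tuned by gradient descent; the paper's actual misclassification measure and optimization procedure may differ from this simplified version.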
Details
- Language :
- English
- ISSN :
- 0882-1666
- Volume :
- 31
- Issue :
- 9
- Database :
- Supplemental Index
- Journal :
- Systems & Computers in Japan
- Publication Type :
- Academic Journal
- Accession number :
- 13380319
- Full Text :
- https://doi.org/10.1002/1520-684X(200008)31:9<39::AID-SCJ5>3.0.CO;2-O