
Boosting random subspace method

Authors :
García-Pedrajas, Nicolás
Ortiz-Boyer, Domingo
Source :
Neural Networks. Nov2008, Vol. 21 Issue 9, p1344-1362. 19p.
Publication Year :
2008

Abstract

In this paper we propose a boosting approach to the random subspace method (RSM) that improves performance and avoids some of RSM's major drawbacks. RSM is a successful classification method, but the random selection of inputs, the source of its success, can also be a major problem: for some problems, several of the selected subspaces lack the discriminant ability to separate the classes, and the resulting poor classifiers harm the performance of the ensemble. Boosting RSM would be a natural way to improve it; however, naively applying the two methods together yields results worse than either method alone. In this work we propose a new way of combining RSM and boosting. Instead of drawing random subspaces, we search for subspaces that minimize the weighted classification error given by the boosting algorithm, and the new classifier added to the ensemble is trained on the subspace found. An additional advantage of the proposed methodology is that it can be used with any classifier, including those, such as nearest neighbor classifiers, that cannot easily use boosting. Compared with standard AdaBoost and RSM on a large set of 45 problems from the UCI Machine Learning Repository, the proposed approach shows improved performance. A further study of the effect of noise on the labels of the training instances shows that the less aggressive versions of the proposed methodology are more robust than AdaBoost in the presence of noise. [Copyright © Elsevier]
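The core loop the abstract describes can be sketched in a few lines: at each boosting round, instead of drawing one random subspace, candidate subspaces are scored by the weighted training error and the best one is kept, then the usual AdaBoost weight update is applied. The sketch below is a minimal illustration under simplifying assumptions, not the paper's method: the "search" is reduced to best-of-several random candidates (the paper uses a more elaborate optimization), the base learner is a leave-one-out 1-NN classifier, and all names (`fit_boosted_subspaces`, `loo_weighted_error`, the toy data) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_predict(X_ref, y_ref, X_query):
    """1-nearest-neighbor prediction of X_query against a reference set."""
    d = ((X_query[:, None, :] - X_ref[None, :, :]) ** 2).sum(axis=-1)
    return y_ref[d.argmin(axis=1)]

def loo_weighted_error(X_sub, y, w):
    """Weighted leave-one-out 1-NN error on the training set."""
    d = ((X_sub[:, None, :] - X_sub[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)            # a point may not be its own neighbor
    pred = y[d.argmin(axis=1)]
    return w[pred != y].sum(), pred

def fit_boosted_subspaces(X, y, n_rounds=5, n_candidates=20, k=2):
    n, n_feat = X.shape
    w = np.full(n, 1.0 / n)                # AdaBoost instance weights
    ensemble = []
    for _ in range(n_rounds):
        # Subspace "search", simplified here to best-of-random-candidates:
        best = None
        for _ in range(n_candidates):
            feats = rng.choice(n_feat, size=k, replace=False)
            err, pred = loo_weighted_error(X[:, feats], y, w)
            if best is None or err < best[0]:
                best = (err, feats, pred)
        err, feats, pred = best
        err = np.clip(err, 1e-10, 0.4999)  # keep alpha finite and positive
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((alpha, feats))
        w = w * np.exp(alpha * (pred != y))  # AdaBoost-style reweighting
        w /= w.sum()
    return ensemble

def predict(ensemble, X_train, y_train, X_test):
    """Weighted vote of the subspace classifiers."""
    classes = np.unique(y_train)
    votes = np.zeros((len(X_test), len(classes)))
    for alpha, feats in ensemble:
        pred = nn_predict(X_train[:, feats], y_train, X_test[:, feats])
        votes += alpha * (pred[:, None] == classes[None, :])
    return classes[votes.argmax(axis=1)]

# Toy data: features 0 and 1 separate the classes; the other six are noise.
n_per = 40
X0 = rng.normal(0.0, 1.0, (n_per, 8)); X0[:, :2] -= 3.0
X1 = rng.normal(0.0, 1.0, (n_per, 8)); X1[:, :2] += 3.0
X = np.vstack([X0, X1]); y = np.array([0] * n_per + [1] * n_per)
train = np.arange(0, 2 * n_per, 2); test = np.arange(1, 2 * n_per, 2)

ens = fit_boosted_subspaces(X[train], y[train])
acc = (predict(ens, X[train], y[train], X[test]) == y[test]).mean()
```

Because the candidate scoring uses the current boosting weights, rounds that follow a misclassification concentrate the search on subspaces that separate the hard instances, which is the abstract's proposed remedy for uninformative random subspaces.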

Details

Language :
English
ISSN :
0893-6080
Volume :
21
Issue :
9
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
34980075
Full Text :
https://doi.org/10.1016/j.neunet.2007.12.046