
RotBoost: A technique for combining Rotation Forest and AdaBoost

Authors :
Zhang, Chun-Xia
Zhang, Jiang-She
Source :
Pattern Recognition Letters. Jul2008, Vol. 29 Issue 10, p1524-1536. 13p.
Publication Year :
2008

Abstract

This paper presents a novel ensemble classifier generation technique, RotBoost, which is constructed by combining Rotation Forest and AdaBoost. Experiments conducted with 36 real-world data sets from the UCI repository, in which a classification tree is adopted as the base learning algorithm, demonstrate that RotBoost generates ensemble classifiers with significantly lower prediction error than either Rotation Forest or AdaBoost more often than the reverse. RotBoost is also found to perform much better than Bagging and MultiBoost. By employing the bias and variance decomposition of error to gain more insight into the considered classification methods, RotBoost is seen to simultaneously reduce both the bias and variance terms of a single tree, and the reduction it achieves is much greater than that of the other ensemble methods, which leads RotBoost to perform best among the considered classification procedures. Furthermore, RotBoost has a potential advantage over AdaBoost in being suited to parallel execution. [Copyright © Elsevier]
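The combination the abstract describes can be sketched in code. Below is a minimal, hedged illustration assuming the commonly described structure of the two ingredients: each of S ensemble members first builds a Rotation-Forest-style transformation (PCA fitted on random disjoint feature subsets, with the rotated features concatenated), then trains an AdaBoost sub-ensemble of T classification trees on the rotated data; final predictions are a majority vote over the S boosted sub-ensembles. The class name `RotBoostSketch` and the parameter choices are illustrative, not taken from the paper, and details such as bootstrap sampling before PCA are omitted for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier


class RotBoostSketch:
    """Illustrative sketch: Rotation-Forest-style rotations + AdaBoost members."""

    def __init__(self, S=5, T=10, K=2, random_state=0):
        self.S = S  # number of rotation/boosting members
        self.T = T  # AdaBoost rounds per member
        self.K = K  # number of feature subsets per rotation
        self.rng = np.random.default_rng(random_state)
        self.members = []  # (feature_subsets, fitted_PCAs, boosted_model)

    def _rotate(self, X, subsets, pcas):
        # Project each feature subset with its fitted PCA, then concatenate.
        return np.hstack([p.transform(X[:, s]) for s, p in zip(subsets, pcas)])

    def fit(self, X, y):
        n_features = X.shape[1]
        for _ in range(self.S):
            # Randomly split the features into K disjoint subsets.
            perm = self.rng.permutation(n_features)
            subsets = np.array_split(perm, self.K)
            pcas = [PCA().fit(X[:, s]) for s in subsets]
            Xr = self._rotate(X, subsets, pcas)
            # AdaBoost with tree base learners on the rotated data
            # (sklearn's default base estimator is a decision stump).
            model = AdaBoostClassifier(n_estimators=self.T,
                                       random_state=0).fit(Xr, y)
            self.members.append((subsets, pcas, model))
        return self

    def predict(self, X):
        # Majority vote over the S boosted sub-ensembles.
        votes = np.stack([m.predict(self._rotate(X, s, p)).astype(int)
                          for s, p, m in self.members])
        return np.apply_along_axis(
            lambda col: np.bincount(col).argmax(), 0, votes)
```

Because each of the S members is trained independently (only AdaBoost's inner T rounds are sequential), the members can be fitted in parallel, which is the parallelism advantage over plain AdaBoost noted in the abstract.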

Details

Language :
English
ISSN :
0167-8655
Volume :
29
Issue :
10
Database :
Academic Search Index
Journal :
Pattern Recognition Letters
Publication Type :
Academic Journal
Accession number :
32492732
Full Text :
https://doi.org/10.1016/j.patrec.2008.03.006