
Exploiting Universum data in AdaBoost using gradient descent.

Authors :
Xu, Jingsong
Wu, Qiang
Zhang, Jian
Tang, Zhenmin
Source :
Image & Vision Computing. Aug 2014, Vol. 32, Issue 8, p550-557. 8p.
Publication Year :
2014

Abstract

Recently, Universum data, which does not belong to any class of the training data, has been applied to train better classifiers. In this paper, we address a novel boosting algorithm, called UAdaBoost, that can improve the classification performance of AdaBoost with Universum data. UAdaBoost chooses a function by minimizing the loss over both labeled data and Universum data. The cost function is minimized by a greedy, stagewise, functional gradient procedure, so each training stage of UAdaBoost is fast and efficient. Standard AdaBoost weights labeled samples during training iterations, while UAdaBoost gives an explicit weighting scheme for Universum samples as well. In addition, this paper describes practical conditions for the effectiveness of Universum learning, based on an analysis of the distribution of ensemble predictions over training samples. Experiments on handwritten digit classification and gender classification problems are presented. As the experimental results show, the proposed method can achieve superior performance over standard AdaBoost when proper Universum data is selected. [Copyright © Elsevier]
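The abstract describes the mechanism only at a high level, so the following is a minimal sketch of how a Universum-aware AdaBoost round could be realized as a stagewise functional gradient procedure. It assumes an exponential loss exp(-y F(x)) on labeled points and a symmetric loss exp(F(x)) + exp(-F(x)) on Universum points, which is minimized when the ensemble output F is zero, i.e., near the decision boundary; the paper's actual loss and weighting scheme may differ. The function name universum_adaboost and all parameters are illustrative, not taken from the paper.

```python
# A hedged sketch of Universum-augmented AdaBoost, NOT the paper's exact
# UAdaBoost. Assumed losses: exp(-y F(x)) on labeled points and
# exp(F(x)) + exp(-F(x)) on Universum points (minimized at F = 0).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def universum_adaboost(X, y, X_univ, n_rounds=50):
    """Boost decision stumps on labeled data (y in {-1, +1}) plus
    Universum points X_univ that belong to neither class."""
    # Represent each Universum point as two opposite-labeled copies;
    # under exponential weighting this reproduces the symmetric
    # exp(F) + exp(-F) Universum loss assumed above.
    X_aug = np.vstack([X, X_univ, X_univ])
    y_aug = np.concatenate([y, np.ones(len(X_univ)), -np.ones(len(X_univ))])
    F = np.zeros(len(X_aug))            # current ensemble output
    learners, alphas = [], []
    for _ in range(n_rounds):
        w = np.exp(-y_aug * F)          # functional-gradient weights
        w /= w.sum()
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X_aug, y_aug, sample_weight=w)
        pred = stump.predict(X_aug)
        err = np.clip(w[pred != y_aug].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * pred               # greedy, stagewise additive update
        learners.append(stump)
        alphas.append(alpha)
    def predict(X_new):
        scores = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(scores)
    return predict

# Hypothetical usage with synthetic data, just to show the call shape.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
X_univ = rng.normal(size=(50, 5))       # stand-in Universum set
predict = universum_adaboost(X, y, X_univ, n_rounds=20)
print(predict(X[:5]))
```

The two opposite-labeled copies of each Universum point are one simple way to obtain an explicit Universum weighting: under w = exp(-y F), the copy that the current ensemble contradicts receives the larger weight, which drives F toward zero on Universum points while the labeled weights behave exactly as in standard AdaBoost.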

Details

Language :
English
ISSN :
0262-8856
Volume :
32
Issue :
8
Database :
Academic Search Index
Journal :
Image & Vision Computing
Publication Type :
Academic Journal
Accession Number :
96660310
Full Text :
https://doi.org/10.1016/j.imavis.2014.04.009