
Improving Deep Forest by Screening.

Authors :
Pang, Ming
Ting, Kai Ming
Zhao, Peng
Zhou, Zhi-Hua
Source :
IEEE Transactions on Knowledge & Data Engineering; Sep2022, Vol. 34 Issue 9, p4298-4312, 15p
Publication Year :
2022

Abstract

Most studies of deep learning are based on neural network models, in which many layers of parameterized, differentiable nonlinear modules are trained by backpropagation. Recently, it has been shown that deep learning can also be realized with non-differentiable modules and without backpropagation training; this approach is called deep forest. We identify that deep forest has high time cost and memory requirement, which has inhibited its use on large-scale datasets. In this paper, we propose a simple and effective approach with three main strategies for efficient learning of deep forest. First, it substantially reduces the number of instances that need to be processed by redirecting instances with high predictive confidence straight to the final level for prediction, bypassing all intermediate levels. Second, many non-informative features are screened out, and only the informative ones are used for learning at each level. Third, an unsupervised feature transformation procedure is proposed to replace the supervised multi-grained scanning procedure. Our theoretical analysis supports the proposed approach in varying the model complexity from low to high as the number of levels increases in deep forest. Experiments show that our approach achieves highly competitive predictive performance while reducing time cost and memory requirement by one to two orders of magnitude. [ABSTRACT FROM AUTHOR]
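The first strategy described in the abstract, confidence-based instance screening, can be sketched in a few lines. This is a minimal illustration in our own notation, not the authors' implementation: it assumes each cascade level produces a per-instance class-probability vector, and routes instances whose maximum probability meets a (hypothetical) threshold straight to final prediction, passing only the rest to the next level.

```python
# Hypothetical sketch of confidence screening in a deep-forest cascade.
# Function name and threshold value are our own, not from the paper's code.

def screen_by_confidence(prob_vectors, threshold=0.9):
    """Split instance indices into (confident, remaining) by max class probability.

    Confident instances would skip the remaining cascade levels and be
    predicted immediately; the rest continue to the next level.
    """
    confident, remaining = [], []
    for idx, probs in enumerate(prob_vectors):
        if max(probs) >= threshold:
            confident.append(idx)
        else:
            remaining.append(idx)
    return confident, remaining

# Toy per-instance class-probability estimates from one cascade level.
probs = [
    [0.97, 0.03],   # high confidence -> predicted now
    [0.55, 0.45],   # ambiguous -> passed to the next level
    [0.10, 0.90],   # high confidence -> predicted now
]
done, passed_on = screen_by_confidence(probs, threshold=0.9)
print(done, passed_on)  # [0, 2] [1]
```

Because later levels train only on the `passed_on` subset, both training-set size and memory footprint shrink level by level, which is the source of the efficiency gains the abstract reports.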

Details

Language :
English
ISSN :
10414347
Volume :
34
Issue :
9
Database :
Complementary Index
Journal :
IEEE Transactions on Knowledge & Data Engineering
Publication Type :
Academic Journal
Accession number :
158405973
Full Text :
https://doi.org/10.1109/TKDE.2020.3038799