
A Novel Pruning Algorithm for Smoothing Feedforward Neural Networks Based on Group Lasso Method.

Authors :
Wang, Jian
Xu, Chen
Yang, Xifeng
Zurada, Jacek M.
Source :
IEEE Transactions on Neural Networks & Learning Systems. May 2018, Vol. 29 Issue 5, p2012-2024. 13p.
Publication Year :
2018

Abstract

In this paper, we propose four new variants of the backpropagation algorithm to improve the generalization ability of feedforward neural networks. The basic idea of these methods stems from the Group Lasso concept, which addresses the variable selection problem at the group level. Directly employing the Group Lasso penalty during network training has two main drawbacks: numerical oscillations and the theoretical difficulty of computing the gradient at the origin. To overcome these obstacles, smoothing functions are introduced to approximate the Group Lasso penalty. Numerical experiments on classification and regression problems demonstrate that the proposed algorithms outperform three classical penalization methods (Weight Decay, Weight Elimination, and Approximate Smoother) in both generalization and pruning efficiency. In addition, detailed simulations on a specific data set compare the proposed algorithm with several other common pruning strategies and verify its advantages. The pruning ability of the proposed strategy is further investigated in detail on a relatively large data set, MNIST, under various smoothing approximation cases. [ABSTRACT FROM AUTHOR]
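To illustrate the idea described in the abstract, the following is a minimal sketch of a Group Lasso penalty and one common way to smooth it. The abstract does not specify the paper's exact smoothing functions; the form sqrt(||w||^2 + eps^2) used below is a standard choice (an assumption here, not the authors' stated method) that removes the non-differentiability of the group norm ||w|| at the origin.

```python
import numpy as np

def group_lasso_penalty(weight_groups):
    # Plain Group Lasso: sum of Euclidean norms over weight groups
    # (e.g., all weights entering one hidden neuron form a group).
    # The gradient is undefined wherever a group is exactly zero.
    return sum(np.linalg.norm(w) for w in weight_groups)

def smoothed_group_lasso_penalty(weight_groups, eps=1e-4):
    # Smoothed variant (illustrative choice): sqrt(||w||^2 + eps^2)
    # is differentiable everywhere and tends to ||w|| as eps -> 0.
    return sum(np.sqrt(np.dot(w, w) + eps**2) for w in weight_groups)

def smoothed_group_grad(w, eps=1e-4):
    # Gradient of one group's smoothed term: w / sqrt(||w||^2 + eps^2).
    # Well-defined even at w = 0, unlike the unsmoothed w / ||w||.
    return w / np.sqrt(np.dot(w, w) + eps**2)
```

Adding such a penalty term to the training loss drives whole groups of weights toward zero together, which is what enables pruning entire neurons rather than individual connections.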

Details

Language :
English
ISSN :
2162-237X
Volume :
29
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
129265834
Full Text :
https://doi.org/10.1109/TNNLS.2017.2748585