1. Smooth Group L1/2 Regularization for Pruning Convolutional Neural Networks
- Authors
- Yuan Bao, Zhaobin Liu, Zhongxuan Luo, and Sibo Yang
- Subjects
convolutional neural network (CNN), fully connected layer, smooth group L1/2 regularization, sparsity, Mathematics, QA1-939
- Abstract
In this paper, a novel smooth group L1/2 (SGL1/2) regularization method is proposed for pruning hidden nodes of the fully connected layer in convolutional neural networks. Usually, the number of nodes and weights is chosen by experience, and the convolution filter is symmetric in the convolutional neural network. The main contribution of SGL1/2 is that it drives the weights toward 0 at the group level; a hidden node can therefore be pruned when all of its corresponding weights are close to 0. Furthermore, because the regularizer is a smooth function, a feasibility analysis of the new method is carried out under some reasonable assumptions. The numerical results demonstrate the superiority of the SGL1/2 method with respect to sparsity, without damaging the classification performance.
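The group-level penalty described in the abstract can be sketched as follows. This is an illustrative NumPy reconstruction, not the paper's exact formulation: the standard group L1/2 penalty sums the square roots of the per-group weight norms, and since the square root is non-differentiable at zero, the sketch below replaces it near the origin with a quadratic whose value and first derivative match `sqrt` at a threshold `eps` (the paper's specific smoothing function, threshold, and group definition are assumptions here).

```python
import numpy as np

def smooth_sqrt(z, eps=1e-2):
    # Exact sqrt(z) for z >= eps; below eps, a quadratic b*z**2 + c whose
    # value and first derivative match sqrt at z = eps, giving a smooth
    # (differentiable) penalty at z = 0. Illustrative smoothing only.
    b = 1.0 / (4.0 * eps ** 1.5)
    c = 0.75 * np.sqrt(eps)
    z = np.asarray(z, dtype=float)
    return np.where(z >= eps, np.sqrt(z), b * z ** 2 + c)

def sgl12_penalty(W, eps=1e-2):
    # Each row of W collects the weights attached to one hidden node
    # (one group); the penalty is the sum of smoothed square roots of
    # the group L2 norms, pushing whole groups toward zero together.
    group_norms = np.linalg.norm(W, axis=1)
    return float(np.sum(smooth_sqrt(group_norms, eps)))

def prunable_nodes(W, tol=1e-3):
    # Nodes whose entire weight group has collapsed toward zero can be
    # pruned without changing the network's output appreciably.
    return np.where(np.linalg.norm(W, axis=1) < tol)[0]
```

In training, this penalty (scaled by a regularization coefficient) would be added to the classification loss so that gradient descent shrinks unimportant groups; after training, `prunable_nodes` identifies hidden nodes safe to remove.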
- Published
- 2022