
Feature flow regularization: Improving structured sparsity in deep neural networks.

Authors:
Wu, Yue
Lan, Yuan
Zhang, Luchan
Xiang, Yang
Source:
Neural Networks. Apr 2023, Vol. 161, p598-613. 16p.
Publication Year:
2023

Abstract

Pruning is a model compression method that removes redundant parameters and accelerates the inference speed of deep neural networks (DNNs) while maintaining accuracy. Most available pruning methods impose various conditions on parameters or features directly. In this paper, we propose a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting the features of adjacent hidden layers, namely the feature flow. We propose feature flow regularization (FFR), which penalizes the length and the total absolute curvature of these trajectories, implicitly increasing the structured sparsity of the parameters. The principle behind FFR is that short and straight trajectories lead to an efficient network that avoids redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to or even better than those of state-of-the-art methods. [ABSTRACT FROM AUTHOR]
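The abstract does not give the exact formulation, but the core idea (penalizing the discrete length and total absolute curvature of the trajectory traced by features across adjacent hidden layers) can be illustrated with a short sketch. The code below is an assumption-laden illustration, not the paper's implementation: it assumes equal-width hidden layers so adjacent-layer features can be subtracted directly, approximates trajectory length by first differences and curvature by second differences, and uses hypothetical names such as `ffr_penalty`, `lambda_len`, and `lambda_curv`.

```python
# Minimal sketch of a feature-flow-style regularizer (illustrative only;
# the paper's exact formulation may differ).

import torch
import torch.nn as nn


class MLPWithFeatures(nn.Module):
    """Simple MLP that also returns the hidden features of every layer."""

    def __init__(self, dim_in=784, dim_hidden=256, num_hidden=4, dim_out=10):
        super().__init__()
        self.input = nn.Linear(dim_in, dim_hidden)
        self.hidden = nn.ModuleList(
            nn.Linear(dim_hidden, dim_hidden) for _ in range(num_hidden)
        )
        self.out = nn.Linear(dim_hidden, dim_out)

    def forward(self, x):
        feats = [torch.relu(self.input(x))]
        for layer in self.hidden:
            feats.append(torch.relu(layer(feats[-1])))
        return self.out(feats[-1]), feats


def ffr_penalty(feats, lambda_len=1e-4, lambda_curv=1e-4):
    """Penalize the discrete length and curvature of the feature trajectory."""
    # First differences between adjacent-layer features: trajectory segments.
    diffs = [feats[i + 1] - feats[i] for i in range(len(feats) - 1)]
    length = sum(d.norm(dim=1).mean() for d in diffs)
    # Second differences measure how much the trajectory bends between segments.
    curvature = sum(
        (diffs[i + 1] - diffs[i]).norm(dim=1).mean()
        for i in range(len(diffs) - 1)
    )
    return lambda_len * length + lambda_curv * curvature


# Usage: add the penalty to the task loss during training.
model = MLPWithFeatures()
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
logits, feats = model(x)
loss = nn.functional.cross_entropy(logits, y) + ffr_penalty(feats)
loss.backward()
```

In this reading, driving the penalty down encourages features to change little (short segments) and to change smoothly (straight segments) from layer to layer, which is consistent with the abstract's claim that short, straight trajectories correspond to networks with fewer redundant parameters.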

Details

Language:
English
ISSN:
0893-6080
Volume:
161
Database:
Academic Search Index
Journal:
Neural Networks
Publication Type:
Academic Journal
Accession Number:
162504149
Full Text:
https://doi.org/10.1016/j.neunet.2023.02.013