
A CNN channel pruning low-bit framework using weight quantization with sparse group lasso regularization.

Authors :
Long, Xin
Zeng, Xiangrong
Liu, Yan
Xiao, Huaxin
Zhang, Maojun
Ben, Zongcheng
Source :
Journal of Intelligent & Fuzzy Systems. 2020, Vol. 39 Issue 1, p221-232. 12p.
Publication Year :
2020

Abstract

The deployment of large-scale Convolutional Neural Networks (CNNs) on power-limited devices is hindered by their high computation and storage costs. In this paper, we propose a novel framework for CNNs that simultaneously achieves channel pruning and low-bit quantization by combining weight quantization with Sparse Group Lasso (SGL) regularization. We model this framework as a discretely constrained problem and solve it with the Alternating Direction Method of Multipliers (ADMM). Unlike previous approaches, the proposed method reduces not only model size but also computational operations. In the experimental section, we evaluate the proposed framework on the CIFAR datasets with several popular models, such as VGG-7/16/19 and ResNet-18/34/50, demonstrating that the proposed method obtains low-bit networks and dramatically reduces redundant channels with only a slight loss in inference accuracy. Furthermore, we visualize and analyze the weight tensors, which reveals their compact group-sparsity structure. [ABSTRACT FROM AUTHOR]
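The abstract describes a splitting of the problem into a task loss with an SGL penalty and a discrete (low-bit) constraint handled by ADMM. The sketch below is a minimal, generic illustration of that style of update, not the authors' implementation: the ternary codebook, step sizes, penalty weights, and the random stand-in gradient are all assumptions chosen only to make the example runnable.

```python
# Minimal NumPy sketch of combining channel-wise group sparsity with low-bit
# weight quantization via an ADMM-style split  min f(W) + g(Q)  s.t. W = Q.
# Illustrative only; hyperparameters and the codebook are assumed values.
import numpy as np

def group_l2_norms(W):
    """Per-output-channel L2 norms of a conv weight tensor W with shape
    (out_channels, in_channels, kH, kW)."""
    return np.sqrt((W ** 2).reshape(W.shape[0], -1).sum(axis=1))

def prox_sparse_group_lasso(W, lam_group, lam_l1, step):
    """Proximal step for a sparse group lasso penalty: element-wise
    soft-thresholding followed by shrinkage of whole output-channel groups."""
    V = np.sign(W) * np.maximum(np.abs(W) - step * lam_l1, 0.0)
    norms = group_l2_norms(V)
    scale = np.maximum(1.0 - step * lam_group / np.maximum(norms, 1e-12), 0.0)
    return V * scale[:, None, None, None]

def project_low_bit(W, levels):
    """Project each weight onto the nearest value in a fixed low-bit codebook."""
    levels = np.asarray(levels)
    idx = np.abs(W[..., None] - levels).argmin(axis=-1)
    return levels[idx]

def admm_prune_quantize_step(W, Q, U, grad, lr, rho, lam_group, lam_l1, levels):
    """One ADMM-style iteration: W carries the task loss plus the SGL penalty,
    Q enforces the discrete constraint, U is the (scaled) dual variable.
    `grad` is the task-loss gradient w.r.t. W from one training step."""
    # W-update: gradient step on loss + augmented-Lagrangian term, then SGL prox.
    W = W - lr * (grad + rho * (W - Q + U))
    W = prox_sparse_group_lasso(W, lam_group, lam_l1, lr)
    # Q-update: projection onto the discrete low-bit set.
    Q = project_low_bit(W + U, levels)
    # Dual update.
    U = U + W - Q
    return W, Q, U

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(0.0, 0.1, size=(8, 4, 3, 3))    # toy conv layer
    Q, U = W.copy(), np.zeros_like(W)
    for _ in range(50):
        grad = rng.normal(0.0, 0.01, size=W.shape)  # stand-in for a real gradient
        W, Q, U = admm_prune_quantize_step(
            W, Q, U, grad, lr=0.05, rho=0.1,
            lam_group=0.5, lam_l1=0.01,
            levels=[-0.1, 0.0, 0.1])                # assumed ternary codebook
    pruned = int((group_l2_norms(W) < 1e-3).sum())
    print(f"channels driven to zero: {pruned}/{W.shape[0]}")
```

In this kind of scheme, the group term zeroes out entire output channels (enabling channel pruning), while the Q-projection keeps a quantized copy that the dual variable gradually pulls W toward.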

Details

Language :
English
ISSN :
1064-1246
Volume :
39
Issue :
1
Database :
Academic Search Index
Journal :
Journal of Intelligent & Fuzzy Systems
Publication Type :
Academic Journal
Accession number :
144656359
Full Text :
https://doi.org/10.3233/JIFS-191014