
Kde-Entropy: preserve efficient filter.

Authors :
Wen, Long
Yang, Yanjiao
Liu, Bingyao
Zhang, Fengshun
Source :
Signal, Image & Video Processing; Feb 2024, Vol. 18, Issue 1, p579-587, 9p
Publication Year :
2024

Abstract

With the rapid development of convolutional neural networks (CNNs), higher accuracy is usually accompanied by huge numbers of parameters and calculations. On embedded devices with limited computing resources, such large network models are often difficult to deploy. Previous research has shown that CNNs contain a large number of invalid filters, and removing these filters has little effect on network accuracy. CNNs can therefore be compressed by pruning these redundant filters while maintaining accuracy. Model pruning is one of the main model compression techniques. In existing pruning work, filters with small ℓ1-norm values are removed by calculating the ℓ1-norm of each filter, but some studies have pointed out that the ℓ1-norm is not always an effective criterion. This paper proposes a kernel density estimation-based entropy algorithm to prune filters with low information content. We experimentally demonstrate the effectiveness of the algorithm. Pruning the VGG-16 model on the CIFAR-10 dataset achieves an accuracy of 93.36% even after removing 91.97% of the original parameter count and 64.87% of the FLOPs. After pruning 75.84% of the network parameters and 52.35% of the FLOPs of the ResNet-110 model on the CIFAR-100 dataset, the accuracy remains almost the same. [ABSTRACT FROM AUTHOR]
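The abstract only names the criterion, so the following is a minimal sketch (Python with NumPy/SciPy, not the authors' code) of what a KDE-based entropy score per filter could look like, with the lowest-scoring filters treated as pruning candidates. The function names, the Gaussian kernel, and the 256-point discretization grid are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical sketch: score convolution filters by a KDE-based entropy
# estimate and rank them for pruning. Assumes only that "low information
# content" is measured as low entropy of the filter's weight distribution.
import numpy as np
from scipy.stats import gaussian_kde


def kde_entropy(filter_weights: np.ndarray, n_points: int = 256) -> float:
    """Entropy of a filter's weight values under a Gaussian KDE."""
    w = filter_weights.ravel()
    kde = gaussian_kde(w)                        # fit a kernel density estimate
    xs = np.linspace(w.min(), w.max(), n_points) # evaluation grid (assumed)
    p = kde(xs)
    p = p / p.sum()                              # normalize to a discrete distribution
    return float(-np.sum(p * np.log(p + 1e-12))) # Shannon entropy of the estimate


def rank_filters(conv_weight: np.ndarray) -> np.ndarray:
    """conv_weight has shape (out_channels, in_channels, k, k).
    Returns filter indices sorted from lowest to highest entropy,
    i.e. the first entries are the first candidates to prune."""
    scores = np.array([kde_entropy(conv_weight[i]) for i in range(conv_weight.shape[0])])
    return np.argsort(scores)


if __name__ == "__main__":
    # Example: rank 64 randomly initialized 3x3 filters of a conv layer.
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(64, 32, 3, 3))
    order = rank_filters(weights)
    print("Filters to prune first:", order[:8])
```

In contrast to the ℓ1-norm criterion, which only reflects the magnitude of the weights, an entropy score of this kind reflects how spread out the weight distribution is; the paper's reported results (e.g., 91.97% parameter reduction on VGG-16/CIFAR-10 at 93.36% accuracy) come from its own variant of this idea, not from this sketch.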

Details

Language :
English
ISSN :
1863-1703
Volume :
18
Issue :
1
Database :
Complementary Index
Journal :
Signal, Image & Video Processing
Publication Type :
Academic Journal
Accession number :
175023546
Full Text :
https://doi.org/10.1007/s11760-023-02763-0