1. DDEP: Evolutionary pruning using distilled dataset.
- Author
- Wang, Xingwang; Sun, Yafeng; Chen, Xinyue; Xu, Haixiao
- Subjects
- *CONVOLUTIONAL neural networks; *EVOLUTIONARY algorithms; *CONSTRAINED optimization; *MEMETICS
- Abstract
- Network pruning has been a hot topic in recent years, and many popular pruning methods rely on network design expertise. However, the pruning process usually involves manual intervention and can be difficult for users who lack prior knowledge. Automatic pruning with evolutionary algorithms shows great promise, but it must address the challenges of time-consuming model evaluations and a large search space. Dataset distillation is a technique that compresses the original dataset to decrease the cost of fine-tuning models. In this paper, we explore whether a distilled dataset can play a role similar to that of the real dataset in network pruning, and propose an evolutionary pruning framework that uses a distilled dataset. Specifically, the network pruning pipeline is carried out on the distilled dataset to significantly reduce the model evaluation cost, and the number of filters in each convolutional layer is encoded directly to narrow the search space. In addition, a tailored evolutionary algorithm is proposed that casts the search for the most suitable pruned network as a constrained optimization problem. Experiments conducted on VGG16, VGG19, ResNet56, and ResNet110 demonstrate that the proposed method reduces FLOPs by at least 41.56% and achieves competitive results with little compromise in accuracy.
  • A distilled dataset is applied to network pruning for the first time.
  • A constrained evolutionary algorithm with a double-balanced multi-branch is proposed.
  • Deep convolutional neural network pruning is performed using the constrained evolutionary algorithm.
  [ABSTRACT FROM AUTHOR]
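The filter-count encoding and constrained evolutionary search that the abstract describes can be illustrated with a minimal sketch. Everything below is an assumption for illustration only, not the authors' implementation: the layer widths, the FLOPs proxy, and the stand-in fitness (which would, in the paper's framework, come from evaluating the pruned model on the distilled dataset).

```python
import random

# Hypothetical per-layer filter counts of an unpruned VGG-style backbone
# (illustrative only; not the configuration used in the paper).
BASE_FILTERS = [64, 64, 128, 128, 256, 256, 512, 512]
FLOPS_BUDGET = 0.58  # keep at most ~58% of FLOPs, i.e. >= 42% reduction

def flops_ratio(filters):
    """Crude proxy: conv FLOPs scale with products of adjacent layer widths."""
    base = sum(a * b for a, b in zip(BASE_FILTERS, BASE_FILTERS[1:]))
    cur = sum(a * b for a, b in zip(filters, filters[1:]))
    return cur / base

def fitness(filters):
    """Stand-in for accuracy measured on the distilled dataset.
    The FLOPs budget is treated as a hard constraint (death penalty)."""
    if flops_ratio(filters) > FLOPS_BUDGET:
        return -1.0  # infeasible individual
    return sum(filters) / sum(BASE_FILTERS)  # toy reward: retained capacity

def mutate(filters, rate=0.3):
    """Resample each gene (a layer's filter count) within [1, base width]."""
    child = list(filters)
    for i, base in enumerate(BASE_FILTERS):
        if random.random() < rate:
            child[i] = random.randint(1, base)
    return child

def evolve(pop_size=20, generations=50, seed=0):
    """Simple (mu + lambda)-style loop: keep the best half, refill by mutation."""
    random.seed(seed)
    pop = [mutate(BASE_FILTERS, rate=1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

best = evolve()
print("pruned filter counts:", best, "FLOPs ratio:", round(flops_ratio(best), 3))
```

In this direct encoding, each chromosome is just the list of per-layer filter counts, so the search space is far smaller than encoding a keep/drop bit per individual filter; the constraint handling shown here is a plain death penalty rather than the paper's tailored constrained algorithm.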
- Published
- 2024