
Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers

Authors :
Liao, Zhu
Hezbri, Nour
Quétu, Victor
Nguyen, Van-Tam
Tartaglione, Enzo
Publication Year :
2024

Abstract

Today, deep neural networks are widely used since they can handle a variety of complex tasks. Their generality makes them very powerful tools in modern technology. However, deep neural networks are often overparameterized, and deploying these large models consumes substantial computational resources. In this paper, we introduce a method called Till the Layers Collapse (TLC), which compresses deep neural networks through the lenses of batch normalization layers. By reducing the depth of these networks, our method decreases their computational requirements and overall latency. We validate our method on popular models such as Swin-T, MobileNet-V2, and RoBERTa, across both image classification and natural language processing (NLP) tasks.

Comment: Accepted at AAAI 2025
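The abstract does not spell out how the batch normalization layers are used to decide which layers to collapse. A minimal sketch of the general idea, assuming the BatchNorm scale parameters (gamma) are read as a rough per-layer importance proxy, could look as follows in PyTorch. The function names, the threshold value, and the use of MobileNet-V2 are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2

def bn_importance_scores(model: nn.Module) -> dict:
    # Assumption: mean |gamma| of each BatchNorm layer acts as a crude importance proxy.
    scores = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            scores[name] = module.weight.detach().abs().mean().item()
    return scores

def collapse_candidates(scores: dict, threshold: float = 0.05) -> list:
    # Layers whose proxy score falls below the threshold are flagged as candidates
    # for removal, i.e., for reducing the network's depth.
    return [name for name, score in scores.items() if score < threshold]

if __name__ == "__main__":
    model = mobilenet_v2(weights=None)  # illustrative model choice; pretrained weights omitted
    scores = bn_importance_scores(model)
    candidates = collapse_candidates(scores)
    print(f"{len(candidates)} of {len(scores)} BN layers flagged as collapse candidates")

In practice, flagged layers would still need to be removed from the architecture and the model fine-tuned to recover accuracy; the paper itself should be consulted for the actual TLC criterion and procedure.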

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2412.15077
Document Type :
Working Paper