
Adaptive Computation Modules: Granular Conditional Computation For Efficient Inference

Authors :
Wójcik, Bartosz
Devoto, Alessio
Pustelnik, Karol
Minervini, Pasquale
Scardapane, Simone
Publication Year :
2023

Abstract

While transformer models have been highly successful, they are computationally inefficient. We observe that, for each layer, the full width of the layer may be needed only for a small subset of tokens inside a batch, and that the "effective" width needed to process a token can vary from layer to layer. Motivated by this observation, we introduce the Adaptive Computation Module (ACM), a generic module that dynamically adapts its computational load to match the estimated difficulty of the input on a per-token basis. An ACM consists of a sequence of learners that progressively refine the output of their preceding counterparts. An additional gating mechanism determines the optimal number of learners to execute for each token. We also propose a distillation technique to replace any pre-trained model with an "ACMized" variant. Our evaluation of transformer models in computer vision and speech recognition demonstrates that substituting layers with ACMs significantly reduces inference costs without degrading downstream accuracy across a wide range of user-defined computational budgets.
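To make the architecture described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch of an ACM-style block: a stack of small learners that progressively refine a token representation, plus a per-token gate that chooses how many learners to run. This is not the authors' implementation; the class name, the residual-refinement form of each learner, and the argmax-based gate are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn


class AdaptiveComputationModule(nn.Module):
    """Illustrative ACM-style block (a sketch, not the paper's code).

    A sequence of small "learners" progressively refines a token
    representation; a gate estimates, per token, how many learners
    to execute.
    """

    def __init__(self, dim: int, num_learners: int = 4, hidden: int = 64):
        super().__init__()
        # Each learner is a small MLP that adds a refinement to the
        # accumulated output of the learners before it (an assumption
        # about the exact form, made here for illustration).
        self.learners = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(num_learners)
        )
        # The gate predicts, per token, how many learners are needed.
        self.gate = nn.Linear(dim, num_learners)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        # Hard per-token depth in [1, num_learners]; a differentiable
        # relaxation (e.g. Gumbel-softmax) would be needed for training.
        depth = self.gate(x).argmax(dim=-1) + 1          # (batch, tokens)
        out = torch.zeros_like(x)
        for i, learner in enumerate(self.learners):
            active = (depth > i).unsqueeze(-1)           # tokens still refining
            out = out + active * learner(x + out)        # progressive refinement
        return out


# Usage sketch: drop-in replacement for a transformer sub-layer.
if __name__ == "__main__":
    acm = AdaptiveComputationModule(dim=256)
    tokens = torch.randn(2, 128, 256)                    # (batch, tokens, dim)
    print(acm(tokens).shape)                             # torch.Size([2, 128, 256])
```

In this reading, inference cost scales with the per-token depth chosen by the gate: tokens the gate deems easy exit after one or two learners, while harder tokens use the full stack, which is how the module adapts compute to input difficulty.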

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2312.10193
Document Type :
Working Paper