USM RNN-T model weights binarization
- Publication Year :
- 2024
-
Abstract
- Large-scale universal speech models (USM) are already used in production. However, as model size grows, so does the serving cost. Because the serving cost of large models is dominated by model size, model size reduction is an important research topic. In this work we focus on model size reduction using weights-only quantization. We present weights binarization of the USM Recurrent Neural Network Transducer (RNN-T) and show that its model size can be reduced by 15.9x at the cost of a word error rate (WER) increase of only 1.9% relative to the float32 model, which makes it attractive for practical applications.
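
The core idea of weights binarization can be sketched as follows: each float32 weight is reduced to a single sign bit, plus a small number of float scales kept per row so the dequantized matrix approximates the original. This is a minimal illustrative sketch in the spirit of XNOR-style binarization; the paper's exact scheme (grouping, scale granularity, handling of non-binarized layers) may differ, and `binarize_weights`/`dequantize` are hypothetical helper names.

```python
import numpy as np

def binarize_weights(w: np.ndarray):
    """Binarize a weight matrix row-wise.

    Each float32 row is replaced by sign bits plus one float scale,
    chosen as the mean absolute value of the row (a common closed-form
    choice that minimizes the L2 reconstruction error for +/-1 codes).
    """
    scale = np.abs(w).mean(axis=1, keepdims=True)          # one float per row
    signs = np.where(w >= 0, 1.0, -1.0).astype(np.float32)  # 1 bit per weight
    return signs, scale

def dequantize(signs: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """Reconstruct an approximate float32 matrix from signs and scales."""
    return signs * scale

# Example: binarize a random weight matrix and inspect the approximation.
w = np.random.randn(4, 8).astype(np.float32)
signs, scale = binarize_weights(w)
w_hat = dequantize(signs, scale)
```

Storing 1 bit per weight instead of 32 bits gives an upper bound of 32x compression; the per-row scales and any layers kept in higher precision reduce this in practice, which is consistent with the ~15.9x figure reported in the abstract.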
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2406.02887
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.21437/Interspeech.2024-47