
TorchScale: Transformers at Scale

Authors:
Ma, Shuming
Wang, Hongyu
Huang, Shaohan
Wang, Wenhui
Chi, Zewen
Dong, Li
Benhaim, Alon
Patra, Barun
Chaudhary, Vishrav
Song, Xia
Wei, Furu
Publication Year:
2022

Abstract

Large Transformers have achieved state-of-the-art performance across many tasks. Most open-source libraries on scaling Transformers focus on improving training or inference with better parallelization. In this work, we present TorchScale, an open-source toolkit that allows researchers and developers to scale up Transformers efficiently and effectively. TorchScale provides implementations of several modeling techniques that can improve modeling generality and capability, as well as training stability and efficiency. Experimental results on language modeling and neural machine translation demonstrate that TorchScale can successfully scale Transformers to different sizes without tears. The library is available at https://aka.ms/torchscale.

Comment: Work in progress
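The abstract describes TorchScale as a PyTorch toolkit for building scalable Transformer architectures. The snippet below is a minimal sketch of how such a toolkit is typically used, following the quick-start pattern from the project's repository; the module paths (torchscale.architecture.config, torchscale.architecture.encoder) and the EncoderConfig/Encoder names are taken from that repository and are assumptions here, not verified against any particular release.

    # Minimal sketch: build a Transformer encoder with TorchScale.
    # Module paths and class names follow the repository quick start
    # and may differ in your installed version (assumption).
    from torchscale.architecture.config import EncoderConfig
    from torchscale.architecture.encoder import Encoder

    # Configure a small encoder; most hyperparameters have defaults.
    config = EncoderConfig(vocab_size=64000)
    model = Encoder(config)
    print(model)

In this pattern, architectural choices (depth, width, normalization, and the stability/efficiency techniques mentioned in the abstract) are expressed through the config object rather than by editing model code, which is what makes scaling to different sizes straightforward.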

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1381584728
Document Type:
Electronic Resource