
On the Representation Collapse of Sparse Mixture of Experts

Authors:
Chi, Zewen
Dong, Li
Huang, Shaohan
Dai, Damai
Ma, Shuming
Patra, Barun
Singhal, Saksham
Bajaj, Payal
Song, Xia
Mao, Xian-Ling
Huang, Heyan
Wei, Furu
Publication Year:
2022

Abstract

Sparse mixture of experts provides larger model capacity while keeping the computational overhead constant. It employs a routing mechanism that distributes input tokens to the best-matched experts according to their hidden representations. However, learning such a routing mechanism encourages token clustering around expert centroids, implying a trend toward representation collapse. In this work, we propose to estimate the routing scores between tokens and experts on a low-dimensional hypersphere. We conduct extensive experiments on cross-lingual language model pre-training and fine-tuning on downstream tasks. Experimental results across seven multilingual benchmarks show that our method achieves consistent gains. We also present a comprehensive analysis of the representation and routing behaviors of our models. Our method alleviates the representation collapse issue and achieves more consistent routing than the baseline mixture-of-experts methods.

Comment: NeurIPS 2022
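The routing idea in the abstract can be sketched in a few lines: project each token representation into a low-dimensional space, place both the projected tokens and learnable expert embeddings on the unit hypersphere via L2 normalization, and score token-expert pairs by temperature-scaled cosine similarity. The function below is a minimal illustrative sketch, not the paper's implementation; the projection matrix, expert embeddings, and temperature value are hypothetical placeholders.

```python
import numpy as np

def hypersphere_routing_scores(hidden, expert_emb, proj, tau=0.07):
    """Sketch of routing scores estimated on a low-dimensional hypersphere.

    hidden:     (n_tokens, d_model)  token hidden representations
    expert_emb: (n_experts, d_low)   learnable expert embeddings (assumed)
    proj:       (d_model, d_low)     learned projection to the routing space
    tau:        temperature scaling the cosine similarities (assumed value)
    """
    # Project tokens into the low-dimensional routing space.
    z = hidden @ proj                                        # (n_tokens, d_low)
    # L2-normalize so tokens and experts both lie on the unit hypersphere.
    z = z / np.linalg.norm(z, axis=-1, keepdims=True)
    e = expert_emb / np.linalg.norm(expert_emb, axis=-1, keepdims=True)
    # Routing scores are temperature-scaled cosine similarities.
    scores = (z @ e.T) / tau                                 # (n_tokens, n_experts)
    # Dispatch each token to its best-matched expert (top-1 routing).
    return scores, scores.argmax(axis=-1)
```

Because the scores are cosine similarities, they are bounded in magnitude by 1/tau regardless of the scale of the hidden states, which is one way normalization can discourage the clustering of token representations around expert centroids.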

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2204.09179
Document Type:
Working Paper