
David helps Goliath: Inference-Time Collaboration Between Small Specialized and Large General Diffusion LMs

Authors: Han, Xiaochuang; Kumar, Sachin; Tsvetkov, Yulia; Ghazvininejad, Marjan
Publication Year: 2023

Abstract

Diffusion-based language models are emerging as a promising alternative to autoregressive LMs: they approach the competence of autoregressive LMs while offering nuanced controllability at inference time. While autoregressive LMs have benefited immensely from scaling and instruction-based learning, existing studies of diffusion LMs have been conducted at a smaller scale. Starting with the recently proposed diffusion model SSD-LM, we first explore methods to scale it from 0.4B to 13B parameters, proposing techniques to improve its training and inference efficiency and to finetune the model to follow instructions. Armed with this more powerful, general-purpose diffusion LM, we introduce the primary contribution of this work, SSD-2: an approach for easily ensembling, at inference time, a large general-purpose diffusion LM with smaller but specialized and contextualized diffusion LMs. We show that SSD-2 facilitates novel ensembles with 100x smaller models that can be customized and deployed by individual users. We find that, compared to autoregressive models, the collaboration between diffusion LMs is more effective, leading to higher-quality model responses due to their ability to dynamically incorporate bi-directional contexts.
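To make the central idea concrete, the sketch below illustrates inference-time logit ensembling between a large general-purpose diffusion LM and a small specialized one. It is a minimal, hypothetical simplification rather than the paper's actual algorithm or API: toy_model, ensemble_denoise, the linear noise schedule, and the equal mixing weight are all illustrative assumptions. The recoverable intuition from the abstract is that at each denoising step both models predict over the entire block (bi-directional context), their predictions are mixed, and the mixture is re-noised for the next step.

import torch

VOCAB, BLOCK, STEPS = 100, 8, 20

def toy_model(x_t, t):
    # Stand-in for a diffusion LM: maps noisy token logits at step t
    # to predicted clean logits over the vocabulary. A real SSD-LM-style
    # model would be a transformer diffusing over vocabulary simplexes.
    return x_t * (1.0 + 1.0 / (t + 1))

def ensemble_denoise(large_lm, small_lm, w_small=0.5, steps=STEPS):
    # Start from pure noise in logit space for a block of tokens.
    x_t = torch.randn(BLOCK, VOCAB)
    for t in reversed(range(steps)):
        # Each model proposes clean logits for the whole block; because
        # the context is bi-directional, both models see all positions.
        logits = (1 - w_small) * large_lm(x_t, t) + w_small * small_lm(x_t, t)
        # Re-noise the mixed prediction for the next (less noisy) step.
        noise_scale = t / steps
        x_t = logits + noise_scale * torch.randn_like(logits)
    # Decode the final block by taking the most likely token per position.
    return logits.argmax(dim=-1)

tokens = ensemble_denoise(toy_model, toy_model)
print(tokens)

In this toy setup the small model contributes half the weight at every step; the paper's point is that such a collaboration lets a 100x smaller specialized model steer a large general-purpose one throughout the denoising process, rather than only at the end as in autoregressive ensembling.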

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2305.14771
Document Type: Working Paper