
Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation

Authors :
Zhou, Yuhang
Zhu, Jing
Xu, Paiheng
Liu, Xiaoyu
Wang, Xiyao
Koutra, Danai
Ai, Wei
Huang, Furong
Publication Year :
2024

Abstract

Large language models (LLMs) have significantly advanced various natural language processing tasks, but deploying them remains computationally expensive. Knowledge distillation (KD) is a promising solution, enabling the transfer of capabilities from larger teacher LLMs to more compact student models. In particular, sequence-level KD, which distills rationale-based reasoning processes rather than merely final outcomes, shows great potential for enhancing students' reasoning capabilities. However, current methods struggle with sequence-level KD under long-tailed data distributions, which hurts generalization on sparsely represented domains. We introduce the Multi-Stage Balanced Distillation (BalDistill) framework, which iteratively balances the training data within a fixed computational budget. By dynamically selecting representative head-domain examples and synthesizing tail-domain examples, BalDistill achieves state-of-the-art performance across diverse long-tailed datasets, enhancing both the efficiency and efficacy of the distilled models.

Comment: preprint
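To make the balancing idea described in the abstract concrete, below is a minimal Python sketch of one possible multi-stage loop that fills a per-domain quota from real examples for head domains and from teacher-synthesized examples for tail domains, under a fixed per-stage budget. All names (select_representative, synthesize_with_teacher, finetune_student) and parameters are hypothetical placeholders for illustration; they are not the authors' released interface, and the actual selection and synthesis criteria in BalDistill may differ.

# Hypothetical sketch of a multi-stage balanced distillation loop.
# All function names and parameters are illustrative placeholders,
# not the paper's actual implementation.
import random
from typing import Callable, Dict, List


def balanced_distill(
    pool: Dict[str, List[str]],            # domain -> available (unlabeled) examples
    per_domain_quota: int,                  # target examples per domain per stage
    budget_per_stage: int,                  # fixed budget of teacher-annotated examples per stage
    num_stages: int,
    select_representative: Callable[[List[str], int], List[str]],
    synthesize_with_teacher: Callable[[str, int], List[str]],
    finetune_student: Callable[[List[str]], None],
) -> List[str]:
    """Iteratively assemble a domain-balanced training set under a fixed budget."""
    training_set: List[str] = []
    for _ in range(num_stages):
        stage_batch: List[str] = []
        for domain, examples in pool.items():
            if len(examples) >= per_domain_quota:
                # Head domain: pick a representative subset of real examples.
                chosen = select_representative(examples, per_domain_quota)
            else:
                # Tail domain: keep what exists and synthesize the shortfall
                # with the teacher LLM.
                chosen = list(examples)
                chosen += synthesize_with_teacher(domain, per_domain_quota - len(chosen))
            stage_batch.extend(chosen)
        # Enforce the fixed per-stage budget for teacher-annotated rationales.
        random.shuffle(stage_batch)
        training_set.extend(stage_batch[:budget_per_stage])
        # Re-train the student on the growing, balanced training set.
        finetune_student(training_set)
    return training_set


# Toy usage with stub callables (purely illustrative).
pool = {"head_domain": [f"q{i}" for i in range(100)], "tail_domain": ["q_rare"]}
balanced_distill(
    pool,
    per_domain_quota=5,
    budget_per_stage=8,
    num_stages=3,
    select_representative=lambda xs, k: random.sample(xs, k),
    synthesize_with_teacher=lambda d, k: [f"synthetic_{d}_{i}" for i in range(k)],
    finetune_student=lambda data: None,
)

The point of the sketch is only the control flow: head domains are subsampled while tail domains are padded with synthetic examples, so each stage spends the same budget but yields a more balanced training mix for the student.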

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.13114
Document Type :
Working Paper