
MICL: Improving In-Context Learning through Multiple-Label Words in Demonstration

Authors :
Zhu, Zixiao
Feng, Zijian
Zhou, Hanzhang
Qian, Junlang
Mao, Kezhi
Publication Year :
2024

Abstract

In-context learning (ICL) enables large language models (LLMs) to perform new tasks by using sample-label pairs as demonstrations. However, variations in demonstrations can lead to significantly different performance. Current research mainly focuses on selecting demonstration samples, presuming the class name to be the label word when creating sample-label pairs. However, the choice of label words is crucial for ICL performance. Moreover, we observe that using a single class name in a demonstration may not yield optimal results, while using multiple label words in one sample-label pair can enhance ICL performance. In this paper, we propose a comprehensive approach that organizes both samples and labels in demonstrations based on LLMs' output space distribution. This approach uses multiple label words in one sample-label pair to enhance label instruction. Evaluation results on seven classification datasets show that this demonstration organization method, which incorporates multiple label words to provide diverse label information, improves ICL performance.

Comment: 19 pages, 11 figures
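To make the idea concrete, the sketch below builds an ICL prompt in which each demonstration pairs a sample with several label words rather than a single class name. This is an illustrative assumption of the general technique, not the paper's exact implementation; the `build_icl_prompt` function, the "Review"/"Sentiment" template, and the specific label words are hypothetical.

```python
# Illustrative sketch (assumed, not the paper's exact method): constructing an
# in-context learning prompt where each sample-label pair carries multiple
# label words (e.g. "positive / great / wonderful") instead of one class name.
def build_icl_prompt(demos, query, label_words):
    """demos: list of (text, class_id) pairs.
    label_words: dict mapping class_id -> list of label words."""
    lines = []
    for text, cls in demos:
        # Join several label words into one label slot for this demonstration.
        labels = " / ".join(label_words[cls])
        lines.append(f"Review: {text}\nSentiment: {labels}")
    # The query is left unlabeled for the LLM to complete.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

demo_set = [("A wonderful, heartfelt film.", "pos"),
            ("Dull and far too long.", "neg")]
words = {"pos": ["positive", "great", "wonderful"],
         "neg": ["negative", "bad", "boring"]}
prompt = build_icl_prompt(demo_set, "An instant classic.", words)
print(prompt)
```

In a real pipeline the prompt would be sent to an LLM, whose next-token output is then mapped back to a class; here the point is only how multiple label words enrich each demonstration's label slot.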

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.10908
Document Type :
Working Paper