
A Survey on In-context Learning

Authors :
Dong, Qingxiu
Li, Lei
Dai, Damai
Zheng, Ce
Ma, Jingyuan
Li, Rui
Xia, Heming
Xu, Jingjing
Wu, Zhiyong
Chang, Baobao
Sun, Xu
Sui, Zhifang
Publication Year :
2022

Abstract

With the increasing capabilities of large language models (LLMs), in-context learning (ICL) has emerged as a new paradigm for natural language processing (NLP), where LLMs make predictions based on contexts augmented with a few examples. It has been a significant trend to explore ICL to evaluate and extrapolate the ability of LLMs. In this paper, we aim to survey and summarize the progress and challenges of ICL. We first present a formal definition of ICL and clarify its correlation to related studies. Then, we organize and discuss advanced techniques, including training strategies, prompt designing strategies, and related analysis. Additionally, we explore various ICL application scenarios, such as data engineering and knowledge updating. Finally, we address the challenges of ICL and suggest potential directions for further research. We hope that our work can encourage more research on uncovering how ICL works and improving ICL.

Comment: Papers collected until 2024/06/01
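To make the paradigm described in the abstract concrete, the following is a minimal sketch of how a few-shot ICL prompt might be assembled for a frozen LLM. The sentiment task, the demonstrations, and the prompt template are illustrative assumptions, not taken from the survey itself.

```python
# Illustrative sketch of few-shot in-context learning (ICL):
# the model receives a context of input-label demonstrations followed by
# a test query, and predicts by continuing the text without any parameter updates.
# Task, demonstrations, and template below are assumptions for illustration only.

demonstrations = [
    ("The movie was a delight from start to finish.", "positive"),
    ("I regret paying for this meal.", "negative"),
    ("An absolute masterpiece of storytelling.", "positive"),
]

def build_icl_prompt(demos, query):
    """Concatenate demonstrations and append the unlabeled test query."""
    blocks = [f"Review: {x}\nSentiment: {y}" for x, y in demos]
    blocks.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(blocks)

prompt = build_icl_prompt(
    demonstrations,
    "The plot dragged and the acting was flat.",
)
print(prompt)
# This prompt string would then be passed to an LLM, which is expected to
# emit the label for the final review as its continuation.
```

Design choices such as demonstration selection, ordering, and formatting fall under the prompt designing strategies that the survey organizes and discusses.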

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2301.00234
Document Type :
Working Paper