
In-Context Learning Functions with Varying Number of Minima

Authors:
Oniani, David
Wang, Yanshan
Publication Year:
2023

Abstract

Large Language Models (LLMs) have proven effective at In-Context Learning (ICL), an ability that allows them to build predictors from labeled examples. Few studies have explored the interplay between ICL and specific properties of the functions it attempts to approximate. In our study, we use a formal framework to explore ICL and propose a new task of approximating functions with a varying number of minima. We implement a method for producing functions whose minima lie at given inputs. We find that increasing the number of minima degrades ICL performance. At the same time, our evaluation shows that ICL outperforms a 2-layer Neural Network (2NN) model. Furthermore, ICL learns faster than 2NN in all settings. We validate these findings through a set of few-shot experiments across various hyperparameter configurations.
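The abstract mentions a method for producing functions with given inputs as minima. The paper's exact construction is not reproduced here, but one standard way to do this is f(x) = ∏ᵢ (x − mᵢ)², which is nonnegative and attains its global minimum value of 0 exactly at each prescribed point mᵢ. A minimal sketch, with the function name and the product form being illustrative assumptions rather than the authors' method:

```python
def function_with_minima(minima):
    """Build f(x) = prod_i (x - m_i)**2, a nonnegative function
    whose global minima (value 0) sit exactly at the given points.

    Hypothetical helper for illustration -- not the construction
    used in the paper."""
    def f(x):
        prod = 1.0
        for m in minima:
            prod *= (x - m) ** 2
        return prod
    return f

# Example: a function with minima at -1.0, 0.5, and 2.0
f = function_with_minima([-1.0, 0.5, 2.0])
print(f(-1.0), f(0.5), f(2.0))  # each evaluates to 0.0
print(f(0.0))                   # 1.0 * 0.25 * 4.0 = 1.0
```

Since the product is a square at every factor, every prescribed point is a true minimizer, and varying the length of `minima` directly varies the number of minima the learner must approximate.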

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.12538
Document Type:
Working Paper