
Uncovering mesa-optimization algorithms in Transformers

Authors:
von Oswald, Johannes
Schlegel, Maximilian
Meulemans, Alexander
Kobayashi, Seijin
Niklasson, Eyvind
Zucchet, Nicolas
Scherrer, Nino
Miller, Nolan
Sandler, Mark
Agüera y Arcas, Blaise
Vladymyrov, Max
Pascanu, Razvan
Sacramento, João
Publication Year:
2023

Abstract

Some autoregressive models exhibit in-context learning capabilities: being able to learn as an input sequence is processed, without undergoing any parameter changes, and without being explicitly trained to do so. The origins of this phenomenon are still poorly understood. Here we analyze a series of Transformer models trained to perform synthetic sequence prediction tasks, and discover that standard next-token prediction error minimization gives rise to a subsidiary learning algorithm that adjusts the model as new inputs are revealed. We show that this process corresponds to gradient-based optimization of a principled objective function, which leads to strong generalization performance on unseen sequences. Our findings explain in-context learning as a product of autoregressive loss minimization and inform the design of new optimization-based Transformer layers.
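
The "subsidiary learning algorithm" described above is what the paper terms a mesa-optimizer: the trained Transformer's forward pass itself carries out optimization steps on an objective built from the in-context data. As a minimal sketch of that general idea (not the authors' construction; the toy linear-regression task, the single linear self-attention readout, and all variable names below are illustrative assumptions), the following shows how an attention readout can coincide with one gradient-descent step on an in-context least-squares objective:

```python
import numpy as np

# Minimal sketch (not the paper's code): a toy in-context linear-regression
# task where a single linear self-attention readout coincides with one
# gradient-descent step on the in-context least-squares objective
#   L(W) = sum_t ||W x_t - y_t||^2.
# All names (d, n, lr, W_true, ...) are illustrative assumptions.

rng = np.random.default_rng(0)
d, n, lr = 3, 32, 0.05                      # feature dim, context length, mesa learning rate

W_true = rng.normal(size=(d, d))            # ground-truth map generating the toy sequence
X = rng.normal(size=(n, d))                 # in-context inputs x_t (rows)
Y = X @ W_true.T                            # in-context targets y_t (rows)
x_query = rng.normal(size=d)                # query token whose target we predict

# Explicit mesa-optimization step: one gradient-descent step on L(W) from W = 0.
grad_at_zero = -2.0 * Y.T @ X               # dL/dW evaluated at W = 0
W_one_step = -lr * grad_at_zero             # W after a single GD step
pred_gd = W_one_step @ x_query

# Linear self-attention readout with keys k_t = x_t, values v_t = y_t,
# query q = x_query: sum_t (q . k_t) v_t, scaled by 2 * lr.
pred_attn = 2.0 * lr * sum((x_query @ X[t]) * Y[t] for t in range(n))

print(np.allclose(pred_gd, pred_attn))      # True: the two computations agree
```

The equivalence in this sketch holds because the gradient of the least-squares loss at W = 0 is a sum of target-input outer products, which is exactly what a linear attention readout accumulates over the context.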

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2309.05858
Document Type:
Working Paper