
Disentangling Feature Structure: A Mathematically Provable Two-Stage Training Dynamics in Transformers

Authors :
Gong, Zixuan
Teng, Jiaye
Liu, Yong
Publication Year :
2025

Abstract

Transformers may exhibit two-stage training dynamics during real-world training. For instance, when training GPT-2 on the Counterfact dataset, the answers progress from syntactically incorrect to syntactically correct to semantically correct. However, existing theoretical analyses hardly account for this two-stage phenomenon. In this paper, we theoretically demonstrate how such two-stage training dynamics occur in transformers. Specifically, we analyze the dynamics of transformers using feature learning techniques under in-context learning regimes, based on a disentangled two-type feature structure. Such disentangled feature structures are common in practice: natural languages contain syntax and semantics, and proteins contain primary and secondary structures. To the best of our knowledge, this is the first rigorous result regarding a two-stage optimization process in transformers. Additionally, a corollary indicates that such a two-stage process is closely related to the spectral properties of the attention weights, which accords well with empirical findings.
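The corollary relating stage-wise progress to the spectrum of the attention weights can be probed empirically. Below is a minimal, hypothetical sketch (not the paper's construction): a single softmax-attention layer is trained on a synthetic in-context regression task whose inputs concatenate two feature blocks with unequal signal strength, and the top singular values of the combined query-key matrix W_Q W_K^T are logged over training. The data model, dimensions, loss, and optimizer below are illustrative assumptions only.

```python
import torch

torch.manual_seed(0)
d_a, d_b = 4, 4                     # sizes of the two disentangled feature blocks (assumed)
d = d_a + d_b
tok = d + 1                         # token = [features, label]
ctx, n_batch = 16, 128              # context length and batch size (assumed)

W_q = torch.nn.Parameter(0.01 * torch.randn(tok, tok))
W_k = torch.nn.Parameter(0.01 * torch.randn(tok, tok))
opt = torch.optim.Adam([W_q, W_k], lr=1e-2)

def sample_batch():
    # Per-sequence regression weights: block A carries a stronger signal than
    # block B, an illustrative stand-in for two feature types of different difficulty.
    w = torch.randn(n_batch, d) * torch.cat([torch.ones(d_a), 0.3 * torch.ones(d_b)])
    x = torch.randn(n_batch, ctx + 1, d)
    y = torch.einsum("bld,bd->bl", x, w)
    tokens = torch.cat([x, y.unsqueeze(-1)], dim=-1)
    tokens[:, -1, -1] = 0.0          # mask the query token's label
    return tokens, y[:, -1]

for step in range(3001):
    tokens, target = sample_batch()
    q = tokens[:, -1:, :] @ W_q      # query token
    k = tokens[:, :-1, :] @ W_k      # context tokens
    attn = torch.softmax(q @ k.transpose(1, 2) / tok ** 0.5, dim=-1)
    # Prediction: attention-weighted average of the context labels.
    pred = (attn @ tokens[:, :-1, -1:]).squeeze(-1).squeeze(-1)
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    if step % 500 == 0:
        sv = torch.linalg.svdvals((W_q @ W_k.T).detach())
        print(f"step {step:5d}  loss {loss.item():.3f}  top-2 singular values {sv[:2].tolist()}")
```

Under this kind of setup, one would look for the leading singular values growing at different times as the loss drops in stages; this is a diagnostic sketch in the spirit of the paper's corollary, not its actual experiment or theoretical model.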

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2502.20681
Document Type :
Working Paper