
On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding

Authors:
Xu, Kevin
Sato, Issei
Publication Year:
2024

Abstract

Looped Transformers offer advantages in parameter efficiency and Turing completeness. However, their expressive power for function approximation, and in particular their approximation rate, remains underexplored. In this paper, we establish approximation rates for Looped Transformers by defining a modulus of continuity for sequence-to-sequence functions. The analysis reveals a limitation specific to the looped architecture, which prompts us to incorporate scaling parameters for each loop, conditioned on timestep encoding. Experimental results demonstrate that increasing the number of loops improves performance, with further gains achieved through the timestep encoding architecture.
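The abstract's central analytical tool is a modulus of continuity for sequence-to-sequence functions. As a rough sketch of the standard form such a definition takes (the paper's exact choice of norm and domain may differ), for a target function f over sequences of N tokens in R^d:

```latex
% Hypothetical sketch: modulus of continuity for a sequence-to-sequence
% function f, in its standard form. The paper's exact norm and domain
% may differ.
\[
  \omega_f(\delta)
  \;=\;
  \sup_{\substack{X, Y \in \mathcal{X} \\ \|X - Y\| \le \delta}}
  \bigl\| f(X) - f(Y) \bigr\|,
  \qquad \delta > 0,
  \quad \mathcal{X} \subset \mathbb{R}^{N \times d}.
\]
```

An approximation rate then bounds the error of a Looped Transformer with a given number of loops in terms of how quickly $\omega_f(\delta)$ vanishes as $\delta \to 0$.

The proposed enhancement conditions each loop iteration on a timestep encoding via per-loop scaling parameters. The sketch below is a minimal, hypothetical PyTorch reading of that idea; the class name, the residual form of the update, and the use of one learned scaling vector per loop are assumptions, not the authors' exact parameterization.

```python
import torch
import torch.nn as nn

class TimestepScaledLoopedTransformer(nn.Module):
    """Minimal sketch: one shared Transformer block applied n_loops times,
    with a learned per-timestep scaling vector modulating each iteration.
    Hypothetical reading of the paper's idea, not the reference code.
    """

    def __init__(self, d_model: int, n_heads: int, n_loops: int):
        super().__init__()
        # Weights are shared across all loop iterations (parameter efficiency).
        self.block = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        # One scaling vector per loop iteration, indexed by timestep t.
        self.gamma = nn.Parameter(torch.ones(n_loops, d_model))
        self.n_loops = n_loops

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        for t in range(self.n_loops):
            # Scale the shared block's update by the timestep-specific gamma.
            x = x + self.gamma[t] * (self.block(x) - x)
        return x

# Usage: 8 loop iterations over a toy batch.
model = TimestepScaledLoopedTransformer(d_model=64, n_heads=4, n_loops=8)
out = model(torch.randn(2, 10, 64))  # shape: (2, 10, 64)
```

Without the gamma parameters, every iteration applies an identical map; the per-timestep scaling lets the looped model vary its update across iterations at negligible parameter cost.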

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2410.01405
Document Type:
Working Paper