
FORA: Fast-Forward Caching in Diffusion Transformer Acceleration

Authors:
Selvaraju, Pratheba
Ding, Tianyu
Chen, Tianyi
Zharkov, Ilya
Liang, Luming
Publication Year:
2024

Abstract

Diffusion transformers (DiT) have become the de facto choice for generating high-quality images and videos, largely due to their scalability, which enables the construction of larger models for enhanced performance. However, the increased size of these models leads to higher inference costs, making them less attractive for real-time applications. We present Fast-FORward CAching (FORA), a simple yet effective approach designed to accelerate DiT by exploiting the repetitive nature of the diffusion process. FORA implements a caching mechanism that stores and reuses intermediate outputs from the attention and MLP layers across denoising steps, thereby reducing computational overhead. This approach does not require model retraining and seamlessly integrates with existing transformer-based diffusion models. Experiments show that FORA can speed up diffusion transformers several times over while only minimally affecting performance metrics such as the Inception Score (IS) and FID. By enabling faster processing with minimal trade-offs in quality, FORA represents a significant advancement in deploying diffusion transformers for real-time applications. Code will be made publicly available at: https://github.com/prathebaselva/FORA.
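
The abstract describes the mechanism only at a high level. The following is a minimal, illustrative PyTorch sketch of the caching idea: attention and MLP contributions are recomputed on some denoising steps and reused on the steps in between. The class name CachedDiTBlock, the cache_interval parameter, and the per-block structure are assumptions made for illustration, not the paper's actual implementation.

    import torch
    import torch.nn as nn

    class CachedDiTBlock(nn.Module):
        """Illustrative wrapper that caches attention/MLP outputs across steps."""

        def __init__(self, attn: nn.Module, mlp: nn.Module, cache_interval: int = 3):
            super().__init__()
            self.attn = attn                      # attention sub-layer of a DiT block
            self.mlp = mlp                        # MLP sub-layer of a DiT block
            self.cache_interval = cache_interval  # recompute every N denoising steps (assumed knob)
            self._attn_out = None                 # cached attention contribution
            self._mlp_out = None                  # cached MLP contribution

        def forward(self, x: torch.Tensor, step: int) -> torch.Tensor:
            if self._attn_out is None or step % self.cache_interval == 0:
                # Recompute and store the intermediate outputs on caching steps.
                self._attn_out = self.attn(x)
                x = x + self._attn_out
                self._mlp_out = self.mlp(x)
                x = x + self._mlp_out
            else:
                # Reuse the cached contributions, skipping the expensive sub-layers.
                x = x + self._attn_out + self._mlp_out
            return x

Because the cached tensors are simply reused rather than recomputed, no retraining is needed; the same wrapper could in principle be applied to each block of an existing DiT at inference time.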

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2407.01425
Document Type:
Working Paper