
Post-training Quantization for Text-to-Image Diffusion Models with Progressive Calibration and Activation Relaxing

Authors: Tang, Siao; Wang, Xin; Chen, Hong; Guan, Chaoyu; Wu, Zewen; Tang, Yansong; Zhu, Wenwu
Publication Year: 2023

Abstract

High computational overhead is a major obstacle for diffusion models. Recent studies have leveraged post-training quantization (PTQ) to compress diffusion models. However, most of them focus only on unconditional models, leaving the quantization of widely used pretrained text-to-image models, e.g., Stable Diffusion, largely unexplored. In this paper, we propose PCR (Progressive Calibration and Relaxing), a novel post-training quantization method for text-to-image diffusion models, which consists of a progressive calibration strategy that accounts for the accumulated quantization error across timesteps and an activation relaxing strategy that improves performance at negligible cost. Additionally, we demonstrate that the previous metrics for text-to-image diffusion model quantization are inaccurate due to a distribution gap. To tackle this problem, we propose QDiffBench, a novel benchmark that uses data from the same domain for more accurate evaluation. QDiffBench also considers the generalization performance of the quantized model outside the calibration dataset. Extensive experiments on Stable Diffusion and Stable Diffusion XL demonstrate the superiority of our method and benchmark. Moreover, we are the first to achieve quantization for Stable Diffusion XL while maintaining performance.

Comment: Accepted by ECCV 2024
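The abstract only describes progressive calibration at a high level. The sketch below is a hypothetical illustration of the general idea, not the authors' implementation: activation ranges are calibrated one timestep at a time, and the sampling trajectory is advanced with the quantized model's own predictions so that later timesteps are calibrated under the quantization error accumulated by earlier ones. The diffusers-style UNet/scheduler interface, the chosen layer names, and the simple per-tensor min/max observer are assumptions for illustration.

import torch

class MinMaxObserver:
    """Tracks the activation range of one layer at one timestep (assumed observer)."""
    def __init__(self):
        self.lo, self.hi = float("inf"), float("-inf")

    def update(self, x):
        self.lo = min(self.lo, x.min().item())
        self.hi = max(self.hi, x.max().item())

    def scale(self, n_bits=8):
        # Symmetric uniform quantization scale derived from the observed range.
        return max(abs(self.lo), abs(self.hi)) / (2 ** (n_bits - 1) - 1)

@torch.no_grad()
def progressive_calibration(q_unet, scheduler, text_embeds, latents, layer_names):
    """Calibrate activation ranges timestep by timestep; the trajectory is advanced
    with the (simulated-)quantized UNet so later steps see accumulated error."""
    per_step_scales = {}
    modules = dict(q_unet.named_modules())
    x = latents
    for t in scheduler.timesteps:
        observers = {name: MinMaxObserver() for name in layer_names}
        hooks = [modules[name].register_forward_hook(
                     lambda m, inp, out, n=name: observers[n].update(out))
                 for name in layer_names]
        # Forward pass on the current latents, which already carry the error
        # introduced by the quantized earlier timesteps.
        noise_pred = q_unet(x, t, encoder_hidden_states=text_embeds).sample
        for h in hooks:
            h.remove()
        per_step_scales[int(t)] = {n: o.scale() for n, o in observers.items()}
        # Advance the trajectory with the quantized model's prediction.
        x = scheduler.step(noise_pred, t, x).prev_sample
    return per_step_scales

The activation relaxing strategy mentioned in the abstract is not detailed there, so it is omitted from this sketch.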

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2311.06322
Document Type: Working Paper