
Improving Inference for Neural Image Compression

Authors :
Yang, Yibo
Bamler, Robert
Mandt, Stephan
Publication Year :
2020

Abstract

We consider the problem of lossy image compression with deep latent variable models. State-of-the-art methods build on hierarchical variational autoencoders (VAEs) and learn inference networks to predict a compressible latent representation of each data point. Drawing on the variational inference perspective on compression, we identify three approximation gaps which limit performance in the conventional approach: an amortization gap, a discretization gap, and a marginalization gap. We propose remedies for each of these three limitations based on ideas related to iterative inference, stochastic annealing for discrete optimization, and bits-back coding, resulting in the first application of bits-back coding to lossy compression. In our experiments, which include extensive baseline comparisons and ablation studies, we achieve new state-of-the-art performance on lossy image compression using an established VAE architecture, by changing only the inference method.

Comments :
9 pages plus a detailed supplement with additional results; various typos corrected. Camera-ready version of the paper at NeurIPS 2020.
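The amortization gap named in the abstract arises because a single inference network cannot produce the optimal latent for every image; iterative inference closes part of it by refining the encoder's initial guess per-image via gradient descent on the rate-distortion objective. The sketch below illustrates that idea only: the linear "decoder", the L2 rate proxy, and all parameter values are toy assumptions, not the paper's hierarchical VAE or its entropy model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed "decoder": a linear map from a 4-d latent to a 16-d "image"
# (illustrative stand-in for the paper's learned VAE decoder).
D = rng.normal(size=(16, 4))
x = rng.normal(size=16)   # data point to compress
lam = 0.1                 # rate-distortion trade-off weight (assumed value)

def loss(z):
    # distortion + a simple L2 rate proxy (stand-in for the learned prior)
    return np.sum((x - D @ z) ** 2) + lam * np.sum(z ** 2)

def grad(z):
    return -2.0 * D.T @ (x - D @ z) + 2.0 * lam * z

# Amortized guess: zeros here; in practice, the inference network's output.
z = np.zeros(4)
before = loss(z)

# Iterative inference: refine z for this specific image by gradient descent,
# closing (part of) the amortization gap.
for _ in range(200):
    z -= 0.01 * grad(z)
after = loss(z)

# Latents must be discretized for entropy coding; naive rounding reopens a
# (discretization) gap, which the paper targets with stochastic annealing.
z_rounded = np.round(z)
print(f"loss: amortized {before:.3f} -> refined {after:.3f} "
      f"-> rounded {loss(z_rounded):.3f}")
```

Running the sketch shows the refined loss dropping below the amortized starting point, and rounding pushing it back up slightly, which is exactly the trade-off between the amortization and discretization gaps described in the abstract.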

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....82e8d158ca9c81a2e10b979dbcb9377c