
Universally Quantized Neural Compression

Authors:
Agustsson, Eirikur
Theis, Lucas
Publication Year:
2020

Abstract

A popular approach to learning encoders for lossy compression is to use additive uniform noise during training as a differentiable approximation to test-time quantization. We demonstrate that a uniform noise channel can also be implemented at test time using universal quantization (Ziv, 1985). This allows us to eliminate the mismatch between training and test phases while maintaining a completely differentiable loss function. Implementing the uniform noise channel is a special case of the more general problem of communicating a sample, which we prove is computationally hard if we do not make assumptions about its distribution. However, the uniform special case is efficient as well as easy to implement and thus of great interest from a practical point of view. Finally, we show that quantization can be obtained as a limiting case of a soft quantizer applied to the uniform noise channel, bridging compression with and without quantization.

Comment: Authors contributed equally
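The core mechanism the abstract describes, subtractive dithered (universal) quantization, is easy to simulate. The minimal sketch below, in NumPy, rounds the latents plus a shared uniform dither and subtracts the dither again, then checks numerically that the resulting error matches the additive uniform noise used during training. It also includes one common parameterization of a soft rounding function that approaches hard rounding as its sharpness parameter grows, illustrating the limiting-case claim; the latents `y`, the shared-seed dither, and this particular soft-rounding form are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def universal_quantize(y, u):
    """Universal quantization (Ziv, 1985): the sender rounds y + u to the
    nearest integer, the receiver subtracts the shared dither u. The
    reconstruction error is Uniform(-1/2, 1/2), independent of y."""
    return np.round(y + u) - u

def soft_round(y, alpha):
    """One common soft-quantizer parameterization (an assumption, for
    illustration): differentiable for finite alpha, and approaching
    hard rounding as alpha -> infinity."""
    m = np.floor(y) + 0.5          # midpoint of the containing bin
    r = y - m                      # offset within the bin, in (-1/2, 1/2)
    return m + 0.5 * np.tanh(alpha * r) / np.tanh(alpha / 2)

# Hypothetical latents standing in for an encoder's output.
y = rng.normal(size=100_000)

# Dither shared between sender and receiver (e.g. via a common seed).
u = rng.uniform(-0.5, 0.5, size=y.shape)

# Test-time channel: universal quantization.
err_uq = universal_quantize(y, u) - y

# Training-time channel: additive uniform noise.
err_noise = rng.uniform(-0.5, 0.5, size=y.shape)

# Both errors are Uniform(-1/2, 1/2): mean ~0, variance ~1/12 (~0.0833).
print(np.mean(err_uq), np.var(err_uq))
print(np.mean(err_noise), np.var(err_noise))

# Soft rounding sharpens toward hard rounding as alpha grows.
print(soft_round(np.array([0.2, 0.8]), alpha=1.0))    # gently biased toward bin centers
print(soft_round(np.array([0.2, 0.8]), alpha=100.0))  # ~[0., 1.], i.e. np.round
```

Because the test-time error distribution now matches the training noise exactly, the train/test mismatch the abstract refers to disappears while the loss stays differentiable.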

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2006.09952
Document Type:
Working Paper