Entropy Optimized Deep Feature Compression

Authors :
Xiaoran Cao
Ziwei Wei
Yun He
Benben Niu
Source :
IEEE Signal Processing Letters. 28:324-328
Publication Year :
2021
Publisher :
Institute of Electrical and Electronics Engineers (IEEE)

Abstract

This letter focuses on the compression of deep features. With the rapid growth of deep feature data produced by CNN-based analysis and processing tasks, the demand for efficient compression continues to increase. Product quantization (PQ) is widely used for compact feature representation: in the quantization process, feature vectors are mapped to fixed-length codes based on a pre-trained codebook. However, PQ is not specifically designed for data compression, and its fixed-length codes are not well suited to further compression such as entropy coding. In this letter, we propose an entropy-optimized compression scheme for deep features. By introducing entropy into the loss function during quantization training, the quantization and entropy coding modules are jointly optimized to minimize the total coding cost. We evaluate the proposed method on retrieval tasks. Compared with fixed-length coding, the proposed scheme can be combined with PQ and its extensions and consistently achieves better compression performance.
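The abstract does not include the authors' training objective, so the following is only a minimal sketch of the general idea: a k-means-style product quantizer whose assignment step trades distortion against an estimated code length, so that frequently used codes become cheaper, as in a joint rate-distortion objective. The function name, the entropy_weight parameter, and the specific weighting scheme are illustrative assumptions, not the paper's implementation.

import numpy as np

def train_pq_codebooks(features, num_subvectors=8, codebook_size=256,
                       entropy_weight=0.1, iters=20, rng=None):
    """Toy product quantization with an entropy penalty (hypothetical sketch).

    Each feature vector is split into `num_subvectors` sub-vectors, and one
    codebook is learned per sub-space. The assignment step adds a rate term
    (-log2 of the empirical code probability, scaled by `entropy_weight`)
    to the squared-error distortion, mimicking the kind of entropy-aware
    objective described in the abstract.
    """
    rng = rng or np.random.default_rng(0)
    n, d = features.shape
    sub_dim = d // num_subvectors
    codebooks, codes = [], []
    for m in range(num_subvectors):
        sub = features[:, m * sub_dim:(m + 1) * sub_dim]
        # Initialize centers from random samples; start with uniform code use.
        centers = sub[rng.choice(n, codebook_size, replace=False)]
        probs = np.full(codebook_size, 1.0 / codebook_size)
        for _ in range(iters):
            # Squared distortion to every center, shape (n, codebook_size).
            dist = ((sub[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            # Rate term: estimated code length in bits per center.
            rate = -np.log2(probs + 1e-12)
            assign = np.argmin(dist + entropy_weight * rate, axis=1)
            # Update centers and the empirical code distribution.
            for k in range(codebook_size):
                mask = assign == k
                if mask.any():
                    centers[k] = sub[mask].mean(0)
            counts = np.bincount(assign, minlength=codebook_size)
            probs = counts / counts.sum()
        codebooks.append(centers)
        codes.append(assign)
    return codebooks, np.stack(codes, axis=1)

# Example usage on synthetic features (shapes only; not real CNN features):
feats = np.random.default_rng(1).normal(size=(1000, 64)).astype(np.float32)
cb, pq_codes = train_pq_codebooks(feats, num_subvectors=8, codebook_size=16)

With entropy_weight = 0 this reduces to plain PQ training; a larger weight skews assignments toward a lower-entropy (more compressible) code distribution at the cost of some distortion.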

Details

ISSN :
1558-2361 (electronic) and 1070-9908 (print)
Volume :
28
Database :
OpenAIRE
Journal :
IEEE Signal Processing Letters
Accession number :
edsair.doi...........375f6c74570d78ea349cacd0530d6da0
Full Text :
https://doi.org/10.1109/lsp.2021.3052097