1. Generalization bottleneck in deep metric learning.
- Authors
- Hu, Zhanxuan; Wu, Danyang; Nie, Feiping; and Wang, Rong
- Subjects
- *DEEP learning, *GENERALIZATION, *NONLINEAR functions, *VECTOR spaces
- Abstract
Deep metric learning aims to learn a non-linear function that maps raw data to a discriminative lower-dimensional embedding space, where semantically similar samples have higher similarity than dissimilar ones. Most existing approaches process each raw sample in two steps: first mapping it to a higher-dimensional feature space via a fixed backbone, then mapping that feature space to a lower-dimensional embedding space via a linear layer. This paradigm, however, inevitably leads to a Generalization Bottleneck (GB) problem. Specifically, GB refers to the limitation that the generalization capacity of the lower-dimensional embedding space is inferior to that of the higher-dimensional feature space at test time. To mitigate the capacity gap between the feature space and the embedding space, we propose a fully-learnable module, dubbed Relational Knowledge Preserving (RKP), that improves the generalization capacity of the lower-dimensional embedding space by transferring the mutual similarity of instances. The proposed RKP module can be integrated into a general deep metric learning approach, and experiments conducted on different benchmarks show that it significantly improves the performance of the original model. [ABSTRACT FROM AUTHOR]
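The abstract does not give the RKP formulation, but the idea of "transferring the mutual similarity of instances" from the higher-dimensional feature space to the lower-dimensional embedding space can be sketched as a loss that matches the two spaces' pairwise similarity structure. This is a minimal, illustrative sketch; the function names, the cosine-similarity choice, and the random linear projection standing in for the learnable embedding layer are all assumptions, not the authors' actual module.

```python
import numpy as np

def cosine_similarity_matrix(X):
    # Row-normalize each sample, then take pairwise cosine similarities.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def relational_preserving_loss(features, embeddings):
    # Mean squared gap between the pairwise-similarity structure of the
    # (high-dim) feature space and the (low-dim) embedding space.
    S_feat = cosine_similarity_matrix(features)
    S_emb = cosine_similarity_matrix(embeddings)
    return float(np.mean((S_feat - S_emb) ** 2))

# Toy batch: backbone output (high-dim) and a random linear projection
# standing in for the learnable embedding layer (illustrative only).
rng = np.random.default_rng(0)
features = rng.normal(size=(8, 128))   # fixed-backbone feature space
W = rng.normal(size=(128, 16))         # linear embedding layer
embeddings = features @ W              # lower-dimensional embedding space

loss = relational_preserving_loss(features, embeddings)
```

Minimizing such a loss during training pushes the embedding space to preserve the relational (instance-to-instance similarity) structure of the feature space, which is one plausible reading of how a module like RKP could narrow the capacity gap the abstract describes.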
- Published
- 2021