
Deep metric learning assisted by intra-variance in a semi-supervised view of learning.

Authors :
Liu, Pingping
Liu, Zetong
Lang, Yijun
Liu, Shihao
Zhou, Qiuzhan
Li, Qingliang
Source :
Engineering Applications of Artificial Intelligence. May 2024, Vol. 131.
Publication Year :
2024

Abstract

Deep metric learning aims to construct an embedding space where samples belonging to the same class are closely grouped together, while samples from different classes are widely separated. Many existing deep metric learning methods focus on maximizing the difference between inter-class features, obtaining semantically related information by increasing the distance between samples of different classes in the embedding space. However, by compressing all positive samples together and creating large margins between different classes, these methods inadvertently disrupt the local structure between similar samples. Disregarding the intra-class variance present in this local structure yields an embedding space that generalizes poorly to unseen classes. Consequently, the network tends to overfit the training set and performs poorly on the test set, potentially causing performance to crash during evaluation. Taking this into account, this paper introduces a self-supervised generative assisted ranking framework that offers a semi-supervised perspective on learning intra-class variance within traditional supervised deep metric learning. Specifically, the method synthesizes samples from positive examples at various intensities and diversities, aiming to simulate the complex variations between similar samples. An intra-class ranking loss function is then devised based on the principles of self-supervised learning. This loss constrains the ordering of the synthesized samples according to their generation intensity, enabling the network to capture subtle intra-class differences and maintain the intra-class distribution. This approach yields a more realistic embedding space that preserves both the global and local structures of the samples.

• We propose a generative boundary that improves the accuracy of generation.
• The generative boundary can balance inter-class variance and intra-class variance.
• We propose a ranking loss to learn the intra-class variance.

[ABSTRACT FROM AUTHOR]
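The record does not give the paper's exact loss formulation, so the sketch below is only one plausible reading of an intensity-ordered intra-class ranking constraint, written in PyTorch. The function name, the margin value, and the use of cosine distance are illustrative assumptions, not the authors' implementation: synthesized positives are assumed to be ordered by ascending generation intensity, and weakly perturbed samples are pushed to stay closer to the anchor than strongly perturbed ones.

import torch
import torch.nn.functional as F

def intra_class_ranking_loss(anchor, synth, margin=0.05):
    # Hypothetical sketch; names and margin are illustrative.
    # anchor: (D,) embedding of the original positive sample.
    # synth:  (K, D) embeddings of K synthesized samples, ordered by
    #         ascending generation intensity.
    # Cosine distance of each synthesized sample to the anchor.
    dists = 1.0 - F.cosine_similarity(synth, anchor.unsqueeze(0), dim=1)
    # Penalize ordering violations so that d_1 + margin <= d_2 <= ...,
    # i.e. lower-intensity samples stay closer to the anchor.
    violations = F.relu(dists[:-1] - dists[1:] + margin)
    return violations.mean()

# Toy usage: one 8-dim anchor and 4 synthesized variants.
anchor = F.normalize(torch.randn(8), dim=0)
synth = F.normalize(torch.randn(4, 8), dim=1)
print(intra_class_ranking_loss(anchor, synth))

A hinge over consecutive intensity levels is the simplest way to express a ranking constraint; a listwise or pairwise-over-all-levels variant would encode the same ordering idea.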

Details

Language :
English
ISSN :
0952-1976
Volume :
131
Database :
Academic Search Index
Journal :
Engineering Applications of Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
176501716
Full Text :
https://doi.org/10.1016/j.engappai.2024.107885