
MES-Loss: Mutually equidistant separation metric learning loss function.

Authors :
Boutaleb, Yasser
Soladie, Catherine
Duong, Nam-Duong
Kacete, Amine
Royan, Jérôme
Seguier, Renaud
Source :
Pattern Recognition Letters. Aug 2023, Vol. 172, p58-64. 7p.
Publication Year :
2023

Abstract

• A new deep metric learning loss function that forces the neural network to learn a highly discriminative feature space.
• An automated penalty-based loss function (APML) that ensures effective intra-class distance minimization.
• A complementary loss function (MES) that explicitly imposes a strong constraint to achieve optimal inter-class separation.

Deep metric learning has attracted much attention in recent years due to its extensive applications, such as clustering and image retrieval. Building on the success of deep learning (DL), many deep metric learning (DML) methods have been proposed. Neural networks (NNs) use DML loss functions to learn a mapping that projects samples into a highly discriminative low-dimensional feature space, making it easier to measure similarities between pairs of samples in that manifold. Most existing methods try to boost the discriminative power of the NN by enhancing intra-class compactness in the high-level feature space, but they do not explicitly impose constraints that improve inter-class separation. This paper proposes a new composite DML loss function that, in addition to enforcing intra-class compactness, explicitly imposes constraints that drive the best inter-class separation by distributing the class centers so that they are mutually equidistant. The proposed DML loss function achieved state-of-the-art results for clustering and image-retrieval tasks on two real-world data sets. [ABSTRACT FROM AUTHOR]
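To make the idea concrete, the core intuition — compact classes whose centers are mutually equidistant — can be sketched as a toy loss. This is a hypothetical illustration, not the paper's APML/MES formulation: the function name, the use of the variance of pairwise center distances as the equidistance penalty, and the equal weighting of the two terms are all assumptions made for this sketch.

```python
import math
from collections import defaultdict

def mes_style_loss(embeddings, labels):
    """Illustrative composite loss (hypothetical sketch, not the paper's
    exact formulation): intra-class compactness plus a penalty that is
    zero only when all class centers are mutually equidistant."""
    # Group embedding vectors by class label.
    groups = defaultdict(list)
    for x, y in zip(embeddings, labels):
        groups[y].append(x)
    # Class centers: per-dimension mean of each group.
    centers = {c: [sum(col) / len(pts) for col in zip(*pts)]
               for c, pts in groups.items()}
    # Intra-class compactness: mean squared distance to own class center.
    compact = sum(sum((xi - ci) ** 2 for xi, ci in zip(x, centers[y]))
                  for x, y in zip(embeddings, labels)) / len(embeddings)
    # Pairwise Euclidean distances between class centers.
    cs = list(centers.values())
    dists = [math.dist(cs[i], cs[j])
             for i in range(len(cs)) for j in range(i + 1, len(cs))]
    if len(dists) < 2:           # fewer than 3 classes: no equidistance term
        return compact
    # Equidistance penalty: variance of the pairwise center distances,
    # which vanishes exactly when the centers are mutually equidistant.
    mean_d = sum(dists) / len(dists)
    equi = sum((d - mean_d) ** 2 for d in dists) / len(dists)
    return compact + equi
```

For three classes placed at the vertices of an equilateral triangle, both terms vanish and the loss is zero; collinear centers with unequal spacing incur a positive equidistance penalty. In practice such a loss would be computed on mini-batch embeddings and minimized jointly with the network weights.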

Subjects

Subjects :
*DEEP learning
*IMAGE retrieval

Details

Language :
English
ISSN :
0167-8655
Volume :
172
Database :
Academic Search Index
Journal :
Pattern Recognition Letters
Publication Type :
Academic Journal
Accession number :
169814874
Full Text :
https://doi.org/10.1016/j.patrec.2023.06.005