
Contrastive Unsupervised Representation Learning With Optimize-Selected Training Samples.

Authors :
Cheng Y
Zhang Z
Li X
Wang S
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2024 Jul 12; Vol. PP. Date of Electronic Publication: 2024 Jul 12.
Publication Year :
2024
Publication Status :
Ahead of Print

Abstract

Contrastive unsupervised representation learning (CURL) is a technique that learns feature representations from unlabeled data. It has been applied widely and successfully to unsupervised feature learning, where the construction of positive and negative pairs determines which data samples the model treats as similar or dissimilar. Although CURL has achieved empirical success in recent years, there is still room to improve the pair-generation process, which involves tasks such as combining and re-filtering samples or applying transformations to positive/negative pairs; we refer to this as the sample selection process. In this article, we introduce an optimized pair-data sample selection method for CURL. The method efficiently ensures that sampled pairs are class-consistent: in particular, that the two samples of a dissimilar (negative) pair do not belong to the same class. We provide a theoretical analysis of the method's error probability to explain why it improves learning performance. Furthermore, we extend the proof to a PAC-Bayes generalization analysis, showing that our method tightens the bounds given in previous literature. Numerical experiments on text and image datasets show that the method achieves competitive accuracy with good generalization bounds.
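To make the sample selection idea concrete, the following is a minimal, self-contained Python/NumPy sketch. It is not the authors' optimization procedure, which is not reproduced in this record; it only illustrates the general pattern the abstract describes, namely filtering candidate negative pairs so that the two samples are unlikely to share a class. The proxy-labeling step (pseudo_label, a crude k-means), the margin-based loss, and all parameter values are illustrative assumptions.

    # Sketch of pair sample selection for contrastive learning (assumed
    # setup, not the paper's method): filter candidate negative pairs so
    # the two samples are unlikely to come from the same latent class.
    import numpy as np

    rng = np.random.default_rng(0)

    def pseudo_label(x, n_clusters=4, n_iter=20):
        """Crude k-means giving proxy classes for unlabeled data
        (assumption: the selection step has some class proxy)."""
        centers = x[rng.choice(len(x), n_clusters, replace=False)]
        for _ in range(n_iter):
            d = ((x[:, None] - centers[None]) ** 2).sum(-1)
            labels = np.argmin(d, axis=1)
            for k in range(n_clusters):
                if np.any(labels == k):
                    centers[k] = x[labels == k].mean(axis=0)
        return labels

    def select_pairs(x, labels, n_pairs=256):
        """Keep only negative pairs whose proxy classes differ, so a
        'dissimilar' pair rarely collides with a shared latent class."""
        i = rng.integers(0, len(x), n_pairs)
        j = rng.integers(0, len(x), n_pairs)
        keep = labels[i] != labels[j]  # drop likely same-class negatives
        return i[keep], j[keep]

    def contrastive_loss(z, i, j, pos_i, pos_j, margin=1.0):
        """Margin loss: pull positives together, push the selected
        negatives at least `margin` apart."""
        d_pos = np.linalg.norm(z[pos_i] - z[pos_j], axis=1)
        d_neg = np.linalg.norm(z[i] - z[j], axis=1)
        return d_pos.mean() + np.maximum(0.0, margin - d_neg).mean()

    # Toy data: two Gaussian blobs act as latent classes; positives are
    # (degenerate) identical-index pairs, standing in for augmentations.
    x = np.concatenate([rng.normal(0, 1, (100, 8)),
                        rng.normal(5, 1, (100, 8))])
    labels = pseudo_label(x)
    i, j = select_pairs(x, labels)
    pos_i = rng.integers(0, len(x), 256)
    pos_j = pos_i
    print("loss:", contrastive_loss(x, i, j, pos_i, pos_j))

In this toy run, filtering by proxy label removes the same-class "negative" pairs that would otherwise push same-class points apart, which is the failure mode (class collision) that the error-probability analysis in the abstract targets.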

Details

Language :
English
ISSN :
2162-2388
Volume :
PP
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession Number :
38995710
Full Text :
https://doi.org/10.1109/TNNLS.2024.3424331