Multi-Augmentation-Based Contrastive Learning for Semi-Supervised Learning.

Authors :
Wang, Jie
Yang, Jie
He, Jiafan
Peng, Dongliang
Source :
Algorithms; Mar2024, Vol. 17 Issue 3, p91, 20p
Publication Year :
2024

Abstract

Semi-supervised learning has proven effective at exploiting unlabeled samples to mitigate the problem of limited labeled data. Traditional semi-supervised methods generate pseudo-labels for unlabeled samples and train the classifier on both labeled and pseudo-labeled samples. However, in data-scarce scenarios, reliance on labeled samples to build the initial classifier can degrade performance. Methods based on consistency regularization have shown promising results by encouraging consistent outputs for different semantic variants of the same sample obtained through diverse augmentation techniques. Existing methods, however, typically use only weak and strong augmentation variants, which limits the information they can extract. Therefore, a multi-augmentation contrastive semi-supervised learning method (MAC-SSL) is proposed. MAC-SSL introduces a moderate augmentation and combines the outputs of moderately and weakly augmented unlabeled images to generate pseudo-labels. A cross-entropy loss enforces consistency between the outputs of strongly augmented images and these pseudo-labels. Furthermore, MixUp is adopted to blend outputs from labeled and unlabeled images, enhancing consistency between re-augmented outputs and the new pseudo-labels. Extensive experiments on multiple datasets with varying numbers of labeled samples show that the proposed method achieves state-of-the-art accuracy, and ablation studies further investigate each component's significance. [ABSTRACT FROM AUTHOR]
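The pseudo-labeling and MixUp steps described in the abstract can be sketched as follows. This is an illustrative NumPy outline, not the authors' implementation: the function names, the sharpening temperature `T`, and the Beta parameter `alpha` are assumptions borrowed from common consistency-regularization practice (e.g. MixMatch-style pipelines), and the classifier itself is abstracted away as raw logits.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sharpen(p, T=0.5):
    # Temperature sharpening, commonly applied to pseudo-labels
    # to reduce their entropy (T is an assumed hyperparameter).
    p = p ** (1.0 / T)
    return p / p.sum(axis=-1, keepdims=True)

def pseudo_label(logits_weak, logits_moderate, T=0.5):
    # Combine predictions from weakly and moderately augmented views
    # of the same unlabeled image, then sharpen the average.
    p = 0.5 * (softmax(logits_weak) + softmax(logits_moderate))
    return sharpen(p, T)

def consistency_loss(p_target, logits_strong):
    # Cross-entropy between the pseudo-label and the output
    # for the strongly augmented view of the same image.
    q = softmax(logits_strong)
    return -np.mean(np.sum(p_target * np.log(q + 1e-12), axis=-1))

def mixup(x1, y1, x2, y2, alpha=0.75, rng=None):
    # MixUp: convex combination of two inputs and their (pseudo-)labels,
    # here used to blend labeled and unlabeled samples.
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    lam = max(lam, 1.0 - lam)  # keep the mix closer to the first sample
    return lam * x1 + (1.0 - lam) * x2, lam * y1 + (1.0 - lam) * y2
```

In a full training loop, `pseudo_label` would feed `consistency_loss` for the unlabeled batch, while `mixup` would blend labeled and unlabeled pairs before a second forward pass; those surrounding details are specific to the paper and are not reproduced here.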

Subjects

Subjects :
SUPERVISED learning
DATA mining

Details

Language :
English
ISSN :
1999-4893
Volume :
17
Issue :
3
Database :
Complementary Index
Journal :
Algorithms
Publication Type :
Academic Journal
Accession number :
176272493
Full Text :
https://doi.org/10.3390/a17030091