
Revisiting consistency for semi-supervised semantic segmentation

Authors :
Grubišić, Ivan
Oršić, Marin
Šegvić, Siniša
Source :
Sensors. 2023; 23(2):940
Publication Year :
2021

Abstract

Semi-supervised learning is an attractive technique in practical deployments of deep models since it relaxes the dependence on labeled data. It is especially important in the scope of dense prediction because pixel-level annotation requires significant effort. This paper considers semi-supervised algorithms that enforce consistent predictions over perturbed unlabeled inputs. We study the advantages of perturbing only one of the two model instances and preventing the backward pass through the unperturbed instance. We also propose a competitive perturbation model as a composition of geometric warp and photometric jittering. We experiment with efficient models due to their importance for real-time and low-power applications. Our experiments show clear advantages of (1) one-way consistency, (2) perturbing only the student branch, and (3) strong photometric and geometric perturbations. Our perturbation model outperforms recent work, and most of the contribution comes from the photometric component. Experiments with additional data from the large coarsely annotated subset of Cityscapes suggest that semi-supervised training can outperform supervised training with the coarse labels.

Comment: The source code is available at https://github.com/Ivan1248/semisup-seg-efficient

Details

Database :
arXiv
Journal :
Sensors. 2023; 23(2):940
Publication Type :
Report
Accession number :
edsarx.2106.07075
Document Type :
Working Paper
Full Text :
https://doi.org/10.3390/s23020940