Stacked co-training for semi-supervised multi-label learning.
- Source :
- Information Sciences, Aug 2024, Vol. 677.
- Publication Year :
- 2024
Abstract
- Because annotation is difficult, multi-label learning often has only a small amount of labeled data, supplemented by a large amount of unlabeled data. To address this issue, many algorithms extend existing semi-supervised strategies from the single-label setting to multi-label applications, but fail to effectively account for the characteristics of semi-supervised multi-label learning. In this paper, a novel method named SCTML (Stacked Co-Training for Multi-Label learning) is proposed for semi-supervised multi-label learning. Through a two-layer stacking framework, SCTML learns label correlations in both the base learners and the meta learner, and effectively incorporates the co-training, clustering, and manifold assumptions of semi-supervised learning. Extensive experiments demonstrate that this combination of multiple semi-supervised learning strategies effectively addresses the semi-supervised multi-label learning problem. [ABSTRACT FROM AUTHOR]
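To illustrate the general idea behind the abstract, the following is a minimal, hypothetical sketch of two-view co-training with a stacked meta step for multi-label data. The nearest-centroid base learners, the confidence threshold, and the averaging meta learner are stand-ins chosen for brevity; this is not the authors' SCTML implementation.

```python
# Illustrative sketch: two feature views, per-label centroid base learners,
# co-training-style pseudo-labeling of unlabeled points, and a simple
# averaging "meta" layer. All names and thresholds are hypothetical.

def centroids(xs, ys, n_labels):
    """Per-label mean of the (1-D) view values of positive examples."""
    cents = []
    for k in range(n_labels):
        pos = [x for x, y in zip(xs, ys) if y[k] == 1]
        cents.append(sum(pos) / len(pos) if pos else None)
    return cents

def score(cents, x):
    """Higher score = closer to the label's positive centroid."""
    return [0.0 if c is None else 1.0 / (1.0 + abs(x - c)) for c in cents]

def co_train(view_a, view_b, labels, unl_a, unl_b, n_labels,
             rounds=3, thr=0.8):
    xa, xb, ys = list(view_a), list(view_b), list(labels)
    for _ in range(rounds):
        ca = centroids(xa, ys, n_labels)
        keep_a, keep_b = [], []
        for ua, ub in zip(unl_a, unl_b):
            sa = score(ca, ua)
            if max(sa) >= thr:
                # View A is confident: pseudo-label and add to both views.
                pseudo = [1 if s >= thr else 0 for s in sa]
                xa.append(ua); xb.append(ub); ys.append(pseudo)
            else:
                keep_a.append(ua); keep_b.append(ub)
        unl_a, unl_b = keep_a, keep_b
    ca = centroids(xa, ys, n_labels)
    cb = centroids(xb, ys, n_labels)

    def predict(a, b):
        # "Meta" layer: average the two base learners' label scores.
        return [(s1 + s2) / 2
                for s1, s2 in zip(score(ca, a), score(cb, b))]
    return predict
```

In a full co-training scheme each view would pseudo-label for the other, and the meta learner would itself be trained on the stacked base outputs; both are simplified here to keep the example short.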
- Subjects :
- *SUPERVISED learning
*LEARNING strategies
Details
- Language :
- English
- ISSN :
- 0020-0255
- Volume :
- 677
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 177926290
- Full Text :
- https://doi.org/10.1016/j.ins.2024.120906