Active semi-supervised learning with multiple complementary information.
- Author
- Park, Sung Ho and Kim, Seoung Bum
- Subjects
- *LABELS, *DRUG labeling, *ACQUISITION of data, *MACHINE learning, *LEAST squares, *REGRESSION analysis, *DIMENSION reduction (Statistics)
- Abstract
Highlights:
• We propose an active semi-supervised learning algorithm with multiple criteria.
• The criteria are representativeness, diversity, and variance reduction of a model.
• We use clustering information to develop the representativeness and diversity criteria.
• The proposed algorithm helps avoid the selection of undesirable samples.

Abstract: In many practical machine learning problems, acquiring labeled data is expensive and time-consuming. To reduce this labeling cost, active learning has been introduced in many scientific fields. This study considers active learning of a regression model in the context of optimal experimental design. Classical optimal experimental design approaches are based on the least-squares errors of labeled samples. Recently, several active learning approaches that exploit both labeled and unlabeled data have been developed based on Laplacian regularized regression models with a single selection criterion. However, these approaches are susceptible to selecting undesirable samples when the number of initially labeled samples is small. To address this susceptibility, the study proposes an active learning method that considers multiple complementary criteria: sample representativeness, diversity information, and variance reduction of the Laplacian regularization model. Specifically, novel density and diversity criteria based on a clustering algorithm identify samples that are representative of their distributions while minimizing their redundancy. Experiments on synthetic and benchmark data compared the proposed method with existing methods, and the results demonstrate that the proposed active learning algorithm outperforms its existing counterparts. [ABSTRACT FROM AUTHOR]
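The abstract describes ranking unlabeled samples by complementary criteria such as representativeness (density) and diversity (distance from already-labeled samples). The following is a minimal illustrative sketch of that idea, not the authors' exact algorithm: the scoring function, the multiplicative combination of the two criteria, and the `select_next` helper are all assumptions made for illustration, and the paper's clustering-based criteria and variance-reduction term are omitted.

```python
# Hedged sketch of multi-criteria active-learning sample selection.
# Assumptions (not from the paper): representativeness is measured as the
# inverse of a point's mean distance to all other points, diversity as the
# minimum distance to any labeled point, and the two are combined by product.
import math

def euclid(a, b):
    """Euclidean distance between two equal-length tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_next(points, labeled_idx):
    """Return the index of the unlabeled point with the best combined score.

    representativeness: 1 / (mean distance to all other points) -- dense
    regions score higher; diversity: min distance to any labeled point --
    points far from what is already labeled score higher.
    """
    best, best_score = None, -float("inf")
    for i, p in enumerate(points):
        if i in labeled_idx:
            continue
        mean_d = sum(euclid(p, q) for j, q in enumerate(points)
                     if j != i) / (len(points) - 1)
        rep = 1.0 / (mean_d + 1e-12)
        div = min(euclid(p, points[j]) for j in labeled_idx)
        score = rep * div  # multiplicative combination (an assumption)
        if score > best_score:
            best, best_score = i, score
    return best

# Toy demo: two clusters; only the left cluster has a labeled point,
# so the diversity term steers selection toward the right cluster.
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
print(select_next(pts, {0}))
```

With one label in the left cluster, the diversity term makes any point in the distant right cluster (indices 3, 4, 5) the winner, which mirrors the abstract's motivation: a single criterion applied with few initial labels can pick redundant samples, while combined criteria avoid them.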
- Published
- 2019