Domain adaptive twin support vector machine learning using privileged information.

Authors :
Li, Yanmeng
Sun, Huaijiang
Yan, Wenzhu
Source :
Neurocomputing. Jan 2022, Vol. 469, p13-27. 15p.
Publication Year :
2022

Abstract

In computer vision and machine learning, domain adaptation has been studied extensively; the main challenge is how to transform existing classifiers into an effective adaptive classifier that exploits the latent information in a new data source, whose distribution typically differs from that of the original data source. The Adaptive Support Vector Machine (A-SVM) is an effective strategy proposed for the domain adaptation problem. However, the resulting optimization task in A-SVM, minimizing a convex quadratic function, cannot effectively minimize the distance between the source and target domains, and it typically has high computational complexity. To handle these problems, in this paper we extend A-SVM by determining a pair of nonparallel up- and down-bound functions obtained by solving two smaller-sized quadratic programming problems (QPPs), which achieves a faster learning speed. Notably, our method yields two nonparallel separating hyperplanes that exploit latent discriminant information based on the SVM classification mechanism, which naturally enhances classification performance. We name this method Adaptive Twin Support Vector Machine learning (A-TSVM). Moreover, we consider the high-level paradigm of learning using privileged information (LUPI) to learn an induced model that further constrains the solution in the target space; the resulting model is named domain Adaptive Twin Support Vector Machine learning Using Privileged Information (A-TSVM+). Finally, a series of comparative experiments against many other methods is performed on three datasets. The results indicate that the proposed method not only greatly improves classification accuracy but also saves computing time. [ABSTRACT FROM AUTHOR]
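To make the twin-SVM idea of two nonparallel hyperplanes concrete, here is a minimal sketch that is not taken from the paper: instead of the two QPPs the abstract describes, it uses the closed-form least-squares twin SVM (LSTSVM) variant, in which each hyperplane is fit close to one class and pushed away from the other, and a point is assigned to the class whose hyperplane lies nearer. All function names, parameter values, and the toy data below are illustrative assumptions.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Least-squares twin SVM sketch (not the paper's A-TSVM/A-TSVM+).
    A: (m1, d) samples of class +1; B: (m2, d) samples of class -1.
    Returns two nonparallel hyperplanes (w1, b1), (w2, b2)."""
    m1, m2 = A.shape[0], B.shape[0]
    H = np.hstack([A, np.ones((m1, 1))])   # augmented matrix [A  e1]
    G = np.hstack([B, np.ones((m2, 1))])   # augmented matrix [B  e2]
    I = reg * np.eye(H.shape[1])           # small ridge term for stability
    # Hyperplane 1 (close to class +1, at distance ~1 from class -1):
    # z1 = -(G'G + (1/c1) H'H)^{-1} G' e2
    z1 = -np.linalg.solve(G.T @ G + (1.0 / c1) * H.T @ H + I,
                          G.T @ np.ones(m2))
    # Hyperplane 2 (close to class -1, at distance ~1 from class +1):
    # z2 = (H'H + (1/c2) G'G)^{-1} H' e1
    z2 = np.linalg.solve(H.T @ H + (1.0 / c2) * G.T @ G + I,
                         H.T @ np.ones(m1))
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def lstsvm_predict(X, plane1, plane2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Toy illustration on two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
A = rng.normal(loc=[2.0, 2.0], scale=0.4, size=(60, 2))    # class +1
B = rng.normal(loc=[-2.0, -2.0], scale=0.4, size=(60, 2))  # class -1
p1, p2 = lstsvm_fit(A, B)
X = np.vstack([A, B])
y = np.concatenate([np.ones(60), -np.ones(60)])
acc = np.mean(lstsvm_predict(X, p1, p2) == y)
```

Because each hyperplane is obtained from a system involving only one class's constraints, the two subproblems are smaller than a single SVM QP over all samples, which is the source of the speed-up the abstract attributes to the twin formulation.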

Details

Language :
English
ISSN :
0925-2312
Volume :
469
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
153825651
Full Text :
https://doi.org/10.1016/j.neucom.2021.10.069