Cross-Domain Scene Classification by Integrating Multiple Incomplete Sources.
- Authors
- Gong, Tengfei; Zheng, Xiangtao; Lu, Xiaoqiang
- Subjects
- CLASSIFICATION; FEATURE extraction; REMOTE sensing
- Abstract
Cross-domain scene classification identifies scene categories by transferring knowledge from a labeled data set (source domain) to an unlabeled data set (target domain), where the source data and the target data are sampled from different distributions. Many domain adaptation methods have been used to reduce the distribution shift across domains, and most existing methods assume that the source domain shares the same categories as the target domain. However, it is usually hard to find a single source domain that covers all categories in the target domain, so some works exploit multiple incomplete source domains to cover the target domain. In this setting, the categories of each source domain are a subset of the target-domain categories, and the target domain contains “unknown” categories for each source domain. The existence of unknown categories makes conventional domain adaptation unsuitable, because known and unknown categories should be treated separately. Therefore, this article proposes a separation mechanism that separates the known and unknown categories. First, multiple source classifiers trained on the multiple source domains are used to coarsely separate the known and unknown categories in the target domain: target images with high similarity to source images are selected as known-category candidates, and target images with low similarity are selected as unknown-category candidates. Then, a binary classifier trained on the selected images finely separates all target-domain images. Finally, only the known categories are used in the cross-domain alignment and classification, and target images are labeled by integrating the hypotheses of the multiple source classifiers over the known categories. Experiments on three cross-domain data sets demonstrate the effectiveness of the proposed method.
- Published
- 2021
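As a rough illustration of the separation mechanism summarized in the abstract, the Python sketch below shows the three steps in miniature: coarse selection of known/unknown candidates from the maximum confidence of the source classifiers, a binary classifier for fine separation, and a simple ensemble that integrates the source hypotheses over the known categories. This is not the authors' code: the thresholds `tau_known`/`tau_unknown`, the use of scikit-learn's `LogisticRegression`, and the `source_label_maps` argument (mapping each source's classes into the shared target label space) are assumptions made for illustration, and the cross-domain alignment step is omitted.

```python
# Illustrative sketch of multi-source known/unknown separation and
# hypothesis integration (assumptions noted in the text above).
import numpy as np
from sklearn.linear_model import LogisticRegression


def coarse_separation(source_clfs, X_target, tau_known=0.9, tau_unknown=0.3):
    """Coarse step: score each target image by its highest class probability
    across all source classifiers; very confident images become 'known'
    candidates, very unconfident ones become 'unknown' candidates."""
    # (n_sources, n_target): max softmax probability per sample, per source
    conf = np.stack([clf.predict_proba(X_target).max(axis=1)
                     for clf in source_clfs])
    similarity = conf.max(axis=0)                  # best agreement over sources
    known_idx = np.where(similarity >= tau_known)[0]
    unknown_idx = np.where(similarity <= tau_unknown)[0]
    return known_idx, unknown_idx


def fine_separation(X_target, known_idx, unknown_idx):
    """Fine step: train a binary known-vs-unknown classifier on the coarsely
    selected images and apply it to every target image."""
    X_sel = np.vstack([X_target[known_idx], X_target[unknown_idx]])
    y_sel = np.concatenate([np.ones(len(known_idx)), np.zeros(len(unknown_idx))])
    separator = LogisticRegression(max_iter=1000).fit(X_sel, y_sel)
    return separator.predict(X_target).astype(bool)   # True = known category


def integrate_hypotheses(source_clfs, source_label_maps, X_known, n_classes):
    """Integration step: accumulate each source classifier's probabilities in
    the shared target label space and take the argmax (a simple ensemble;
    the paper's exact integration rule may differ)."""
    votes = np.zeros((len(X_known), n_classes))
    for clf, label_map in zip(source_clfs, source_label_maps):
        proba = clf.predict_proba(X_known)         # (n, n_source_classes)
        for col, target_class in enumerate(label_map):
            votes[:, target_class] += proba[:, col]
    return votes.argmax(axis=1)
```

In practice each entry of `source_clfs` would be a deep network trained on one incomplete source domain; the sketch only captures the control flow of coarse separation, fine separation, and integration.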