
Multisource Remote Sensing Data Classification With Graph Fusion Network.

Authors :
Du, Xingqian
Zheng, Xiangtao
Lu, Xiaoqiang
Doudkin, Alexander A.
Source :
IEEE Transactions on Geoscience & Remote Sensing. Dec2021, Vol. 59 Issue 12, p10062-10072. 11p.
Publication Year :
2021

Abstract

Land cover classification has long been an important task in remote sensing. With the development of various sensor technologies, carrying out classification with multisource remote sensing (MSRS) data has shown an advantage over using a single type of data. Hyperspectral images (HSIs) represent the spectral properties of land cover and are widely used for land cover understanding. Light detection and ranging (LiDAR) images contain altitude information of the ground, which is particularly helpful for urban scene analysis. Current HSI and LiDAR fusion methods perform feature extraction and feature fusion separately, and thus cannot fully exploit the correlation between data sources. To make full use of this correlation, an unsupervised feature extraction-fusion network for HSI and LiDAR, which utilizes feature fusion to guide the feature extraction procedure, is proposed in this article. More specifically, the network takes multisource data as input and directly outputs the unified fused feature. A multimodal graph is constructed for feature fusion, and graph-based loss functions, including a Laplacian loss and a t-distributed stochastic neighbor embedding (t-SNE) loss, are utilized to constrain the feature extraction network. Experimental results on several data sets demonstrate that the proposed network achieves better classification performance than several state-of-the-art methods. [ABSTRACT FROM AUTHOR]
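The Laplacian loss referred to in the abstract is, in the standard graph-embedding formulation, tr(F^T L F), which equals half the sum of edge-weighted squared distances between embedded samples. Below is a minimal NumPy sketch of that formulation, assuming a precomputed multimodal affinity matrix; the function name, `weights`, and `features` are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def laplacian_loss(features, weights):
    """Graph Laplacian loss tr(F^T L F), with L = D - W.

    features: (n_samples, dim) fused feature matrix F.
    weights:  (n_samples, n_samples) symmetric affinity matrix W
              of the multimodal graph (hypothetical input here).

    Penalizes pairs of samples that are strongly connected in the
    graph but far apart in the fused feature space, since
    tr(F^T L F) = 0.5 * sum_ij W_ij * ||f_i - f_j||^2.
    """
    degree = np.diag(weights.sum(axis=1))   # D
    laplacian = degree - weights            # L = D - W
    return float(np.trace(features.T @ laplacian @ features))
```

For two samples connected with unit weight, identical features give zero loss, while differing features give a positive penalty, which is the behavior that lets the loss pull graph neighbors together during training.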

Details

Language :
English
ISSN :
0196-2892
Volume :
59
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Geoscience & Remote Sensing
Publication Type :
Academic Journal
Accession number :
153854144
Full Text :
https://doi.org/10.1109/TGRS.2020.3047130