Dual-Branch Fusion of Convolutional Neural Network and Graph Convolutional Network for PolSAR Image Classification.

Authors :
Radman, Ali
Mahdianpari, Masoud
Brisco, Brian
Salehi, Bahram
Mohammadimanesh, Fariba
Source :
Remote Sensing. Jan 2023, Vol. 15, Issue 1, p75. 19p.
Publication Year :
2023

Abstract

Polarimetric synthetic aperture radar (PolSAR) images contain useful information that can support extensive land cover interpretation and a variety of output products. In contrast to optical imagery, extracting beneficial features from PolSAR data poses several challenges. Deep learning (DL) methods can provide solutions to these PolSAR feature extraction challenges. Convolutional neural networks (CNNs) and graph convolutional networks (GCNs) can derive PolSAR image characteristics by using convolutional kernels to capture neighborhood (local) information and graphs to capture long-range similarities. A novel dual-branch fusion of a CNN and a mini-GCN is proposed in this study for PolSAR image classification. To fully utilize the PolSAR image capacity, different spatial-based and polarimetric-based features are incorporated into the CNN and mini-GCN branches of the proposed model. The performance of the proposed method is verified by comparing its classification results to multiple state-of-the-art approaches on the airborne synthetic aperture radar (AIRSAR) datasets of Flevoland and San Francisco. The proposed approach showed 1.3% and 2.7% improvements in overall accuracy over conventional methods on these AIRSAR datasets and outperformed its single-branch counterparts by 0.73% and 1.82%. Analyses of the Flevoland data further indicated the effectiveness of the dual-branch model across varied training sampling ratios, reaching a promising overall accuracy of 99.9% with a 10% sampling ratio. [ABSTRACT FROM AUTHOR]
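
To make the dual-branch idea concrete, below is a minimal, hypothetical PyTorch sketch of a CNN + GCN fusion classifier. The layer sizes, feature dimensions, the concatenation-based fusion, and the class name DualBranchCNNGCN are illustrative assumptions, not the authors' exact architecture; the paper's mini-GCN and feature sets are only summarized in the abstract.

```python
# Hypothetical sketch of a dual-branch CNN + GCN fusion classifier (not the
# authors' exact model). One branch sees spatial patches, the other sees
# per-sample polarimetric node features propagated over a similarity graph.
import torch
import torch.nn as nn


class DualBranchCNNGCN(nn.Module):
    def __init__(self, in_channels, node_feat_dim, hidden_dim, num_classes):
        super().__init__()
        # CNN branch: convolutional kernels capture neighborhood (local) info.
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, hidden_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden_dim, hidden_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # one feature vector per patch
        )
        # GCN branch: long-range similarities enter via the adjacency matrix.
        self.gc1 = nn.Linear(node_feat_dim, hidden_dim)
        self.gc2 = nn.Linear(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, patches, node_feats, adj_norm):
        # patches:    (N, C, H, W) spatial patches, one per sample
        # node_feats: (N, F) polarimetric features, one graph node per sample
        # adj_norm:   (N, N) symmetrically normalized adjacency, D^-1/2 A D^-1/2
        h_cnn = self.cnn(patches).flatten(1)                 # (N, hidden_dim)
        h_gcn = torch.relu(adj_norm @ self.gc1(node_feats))  # propagate + transform
        h_gcn = torch.relu(adj_norm @ self.gc2(h_gcn))
        fused = torch.cat([h_cnn, h_gcn], dim=1)             # dual-branch fusion
        return self.classifier(fused)


# Toy usage: 8 samples, 9-channel PolSAR patches, 6 polarimetric node features.
x = torch.randn(8, 9, 15, 15)
f = torch.randn(8, 6)
a = torch.eye(8)  # placeholder adjacency; a real graph encodes pixel similarities
model = DualBranchCNNGCN(in_channels=9, node_feat_dim=6, hidden_dim=32, num_classes=4)
logits = model(x, f, a)
print(logits.shape)  # torch.Size([8, 4])
```

Concatenating the two branch embeddings before the classifier is one plausible fusion choice; the key point the sketch illustrates is that each branch receives its own input representation (spatial patches vs. graph node features), matching the abstract's description of spatial-based and polarimetric-based inputs.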

Details

Language :
English
ISSN :
2072-4292
Volume :
15
Issue :
1
Database :
Academic Search Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
161182892
Full Text :
https://doi.org/10.3390/rs15010075