
MGCET: MLP-mixer and Graph Convolutional Enhanced Transformer for Hyperspectral Image Classification.

Authors :
Al-qaness, Mohammed A. A.
Wu, Guoyong
AL-Alimi, Dalal
Source :
Remote Sensing; Aug2024, Vol. 16 Issue 16, p2892, 26p
Publication Year :
2024

Abstract

The vision transformer (ViT) has demonstrated performance comparable to that of convolutional neural networks (CNNs) in the hyperspectral image classification domain. This is achieved by transforming images into sequence data and mining global spectral-spatial information to establish long-range dependencies. Nevertheless, both the ViT and CNNs have their own limitations. For instance, a CNN is constrained by the extent of its receptive field, which prevents it from fully exploiting global spatial-spectral features. Conversely, the ViT is prone to excessive distraction during the feature extraction process. To overcome the insufficient feature extraction caused by relying on a single paradigm, this paper proposes an MLP-mixer and graph convolutional enhanced transformer (MGCET), whose network consists of a spatial-spectral extraction block (SSEB), an MLP-mixer, and a graph convolutional enhanced transformer (GCET). First, spatial-spectral features are extracted by the SSEB, and then local spatial-spectral features are fused with global spatial-spectral features by the MLP-mixer. Finally, graph convolution is embedded in multi-head self-attention (MHSA) to mine spatial relationships and similarity between pixels, which further improves the modeling capability of the model. Comparative experiments were conducted on four different HSI datasets. The MGCET algorithm achieved overall accuracies (OAs) of 95.45%, 97.57%, 98.05%, and 98.52% on these datasets. [ABSTRACT FROM AUTHOR]
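The core idea of embedding graph convolution in multi-head self-attention can be sketched as follows. This is not the authors' code: the fusion rule (simple addition per head), the adjacency construction (clipped cosine similarity between pixel embeddings), and the identity Q/K/V projections are all simplifying assumptions for illustration only.

```python
# Hedged sketch of graph-convolution-enhanced MHSA on a pixel sequence.
# All design details here (adjacency, fusion, projections) are assumptions,
# not the published MGCET architecture.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_conv(x, adj):
    # One GCN propagation step: D^{-1/2} (A + I) D^{-1/2} X
    a = adj + np.eye(adj.shape[0])
    d = a.sum(axis=1)
    a_norm = a / np.sqrt(np.outer(d, d))
    return a_norm @ x

def gc_mhsa(x, num_heads=4):
    # x: (n_pixels, dim) token sequence from the spectral-spatial extractor
    n, dim = x.shape
    hd = dim // num_heads
    # Adjacency from pairwise cosine similarity between pixel embeddings
    # (clipped to nonnegative so the degree normalization stays valid)
    xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    adj = np.clip(xn @ xn.T, 0.0, None)
    out = np.zeros_like(x)
    for h in range(num_heads):
        q = k = v = x[:, h * hd:(h + 1) * hd]  # identity projections for brevity
        attn = softmax(q @ k.T / np.sqrt(hd))
        # Fuse the attention output with the graph-convolution branch
        out[:, h * hd:(h + 1) * hd] = attn @ v + graph_conv(v, adj)
    return out
```

In this toy form, each head mixes a global attention term (every pixel attends to every pixel) with a graph term that aggregates features only from spectrally similar neighbors, which is one plausible way to realize "mining spatial relationships and similarity between pixels" inside MHSA.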

Details

Language :
English
ISSN :
2072-4292
Volume :
16
Issue :
16
Database :
Complementary Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
179355205
Full Text :
https://doi.org/10.3390/rs16162892