
Improved U-Net Remote Sensing Classification Algorithm Based on Multi-Feature Fusion Perception

Authors :
Chuan Yan
Xiangsuo Fan
Jinlong Fan
Nayi Wang
Source :
Remote Sensing, Volume 14, Issue 5, Pages: 1118
Publication Year :
2022
Publisher :
Multidisciplinary Digital Publishing Institute, 2022.

Abstract

The selection and representation of classification features in remote sensing images play a crucial role in image classification accuracy. To effectively improve classification accuracy, an improved U-Net framework based on multi-feature fusion perception is proposed in this paper. The framework adds a channel attention module to the original U-Net (CAM-UNet) and cascades the shallow features with the deep semantic features, replaces the classification layer of the original U-Net with a support vector machine, and finally uses a majority-voting game theory algorithm to fuse the multi-feature classification results into the final classification. Taking the forest distribution in Xingbin District, Laibin City, Guangxi Zhuang Autonomous Region as the research object and using Landsat 8 multispectral remote sensing images, the study combines spectral, spatial, and high-level semantic features to overcome the loss of spatial resolution, and hence of classification accuracy, that occurs as the network deepens. The experimental results showed that the improved algorithm improves classification accuracy: compared with the original U-Net, the overall segmentation accuracy increased from 90.50% to 92.82%, and the segmentation accuracy of forestland increased from 95.66% to 97.16%. The forest cover results obtained by the proposed algorithm can serve as input data for regional ecological models, supporting the development of accurate, real-time vegetation growth change models.
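The abstract does not give the internal design of the channel attention module or of the shallow/deep cascade, so the following is only a minimal PyTorch sketch under the assumption of a squeeze-and-excitation style channel attention applied to the shallow skip features before they are concatenated with the upsampled deep semantic features; the module names (ChannelAttention, AttentionSkipFusion) and hyperparameters are illustrative, not taken from the paper.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style attention (assumed design): global average
    # pooling followed by a two-layer bottleneck that rescales each channel.
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights  # reweight the channels of the feature map

class AttentionSkipFusion(nn.Module):
    # Cascade a shallow encoder feature with an upsampled deep decoder feature:
    # attention on the shallow map, then channel-wise concatenation.
    def __init__(self, shallow_channels: int, deep_channels: int):
        super().__init__()
        self.attn = ChannelAttention(shallow_channels)
        self.up = nn.ConvTranspose2d(deep_channels, deep_channels,
                                     kernel_size=2, stride=2)

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        deep_up = self.up(deep)                      # restore spatial resolution
        shallow_w = self.attn(shallow)               # emphasise informative channels
        return torch.cat([shallow_w, deep_up], dim=1)  # cascaded multi-feature map

if __name__ == "__main__":
    shallow = torch.randn(2, 64, 128, 128)   # shallow spatial/spectral features
    deep = torch.randn(2, 128, 64, 64)       # deep semantic features
    fused = AttentionSkipFusion(64, 128)(shallow, deep)
    print(fused.shape)  # torch.Size([2, 192, 128, 128])

In the paper's pipeline, feature maps fused this way would then be classified by a support vector machine rather than the usual softmax layer, with a majority vote over the per-feature classification results producing the final map; those stages are not sketched here.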

Details

Language :
English
ISSN :
2072-4292
Database :
OpenAIRE
Journal :
Remote Sensing
Accession number :
edsair.doi.dedup.....1006b0263dfa82820604326f4288aa4c
Full Text :
https://doi.org/10.3390/rs14051118