A Multi-Sensor Fusion Framework Based on Coupled Residual Convolutional Neural Networks
- Author
- Li, Hao; Ghamisi, Pedram; Rasti, Behnood; Wu, Zhaoyan; Shapiro, Aurelie; Schultz, Michael; Zipf, Alexander
- Subjects
- CONVOLUTIONAL neural networks; MULTISENSOR data fusion; DEEP learning; REMOTE sensing; BLOCK designs; FEATURE extraction
- Abstract
Multi-sensor remote sensing image classification has been considerably improved by deep learning feature extraction and classification networks. In this paper, we propose a novel multi-sensor fusion framework for the fusion of diverse remote sensing data sources. The novelty of this paper is grounded in three important design innovations: (1) a unique adaptation of coupled residual networks to multi-sensor data classification; (2) a smart auxiliary-training strategy that adjusts the loss function to handle classification with limited samples; and (3) a unique design of the residual blocks that reduces computational complexity while preserving the discriminative characteristics of multi-sensor features. The proposed classification framework is evaluated on three different remote sensing datasets: the urban Houston University datasets (Houston 2013 and the training portion of Houston 2018) and the rural Trento dataset. The proposed framework achieves high overall accuracies of 93.57%, 81.20%, and 98.81% on Houston 2013, the training portion of Houston 2018, and Trento, respectively. Additionally, the experimental results demonstrate considerable improvements in classification accuracy over existing state-of-the-art methods. [ABSTRACT FROM AUTHOR]
- Published
- 2020
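The "coupled" idea described in the abstract (sensor branches tied together through shared residual weights) can be illustrated with a minimal NumPy sketch. This is a hypothetical illustration of weight sharing between two sensor branches with a feature-level concatenation, not the authors' actual architecture; all names, shapes, and the fusion step are assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    # A simple residual block: two linear maps plus a skip connection.
    return relu(x + relu(x @ w1) @ w2)

rng = np.random.default_rng(0)
d = 8
# Shared ("coupled") weights: the same residual block is applied to
# both sensor branches, pushing them toward a common feature space.
w1 = rng.standard_normal((d, d)) * 0.1
w2 = rng.standard_normal((d, d)) * 0.1

hsi = rng.standard_normal((4, d))    # hypothetical hyperspectral features
lidar = rng.standard_normal((4, d))  # hypothetical LiDAR features

f_hsi = residual_block(hsi, w1, w2)
f_lidar = residual_block(lidar, w1, w2)

# Feature-level fusion by concatenation (one common choice; the paper's
# exact fusion scheme may differ).
fused = np.concatenate([f_hsi, f_lidar], axis=1)
print(fused.shape)  # (4, 16)
```

Because both branches reuse `w1` and `w2`, gradients from either sensor would update the same parameters during training, which is one way coupling can compensate for limited labeled samples.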