
Object-Enhanced YOLO Networks for Synthetic Aperture Radar Ship Detection.

Authors :
Wu, Kun
Zhang, Zhijian
Chen, Zeyu
Liu, Guohua
Source :
Remote Sensing. Mar 2024, Vol. 16, Issue 6, p1001. 20p.
Publication Year :
2024

Abstract

Synthetic aperture radar (SAR) enables precise object localization and imaging, which has propelled the rapid development of algorithms for maritime ship identification and detection. However, most current deep-learning-based algorithms tend to increase network depth to improve detection accuracy, which can cause effective target features to be lost. In response to this challenge, this paper proposes an object-enhanced network, OE-YOLO, designed specifically for SAR ship detection. First, the original image is passed through an improved CFAR detector whose output is supplied as an additional input channel, giving the network extra information for localizing and extracting objects. Second, the Coordinate Attention (CA) mechanism is introduced into the backbone of YOLOv7-tiny to improve the model's ability to capture spatial and positional information in the image, alleviating the loss of small-object positions. Third, to enhance the model's detection of multi-scale objects, the neck of the original model is optimized to integrate the Asymptotic Feature Fusion (AFF) network. Finally, the proposed model is evaluated on publicly available SAR image datasets, including the SAR-Ship-Dataset and the HRSID dataset. Compared with the baseline YOLOv7-tiny, OE-YOLO achieves superior performance with a lower parameter count, and against other commonly used deep-learning-based detection methods it delivers the best performance and more accurate detection results.
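The abstract does not give the authors' CFAR variant, but the classical baseline that "improved CFAR" methods build on is cell-averaging CFAR (CA-CFAR): each cell is compared against a noise estimate averaged from nearby training cells, skipping a guard band around the cell under test. The sketch below is a minimal 1-D illustration of that idea; the function name and all parameters (`guard`, `train`, `scale`) are illustrative assumptions, not the paper's implementation, which operates on 2-D SAR imagery and feeds its detection map to the network as an extra channel.

```python
# Minimal 1-D cell-averaging CFAR (CA-CFAR) sketch.
# Illustrative only -- not the authors' improved detector.

def ca_cfar(signal, guard=2, train=4, scale=3.0):
    """Return a 0/1 detection mask the same length as `signal`.

    For each cell under test, the local noise level is estimated by
    averaging `train` cells on each side, skipping `guard` cells
    adjacent to the test cell; a detection is declared when the cell
    exceeds `scale` times that estimate.
    """
    n = len(signal)
    mask = [0] * n
    half = guard + train
    for i in range(half, n - half):
        # Training cells on both sides of the guard band.
        left = signal[i - half:i - guard]
        right = signal[i + guard + 1:i + half + 1]
        noise = sum(left + right) / (2 * train)
        if signal[i] > scale * noise:
            mask[i] = 1
    return mask

# Toy example: flat clutter with one strong return at index 15.
sig = [1.0] * 30
sig[15] = 20.0
print(ca_cfar(sig)[15])  # prints 1: the strong cell is flagged
```

In the paper's pipeline, the analogous 2-D detection map would be stacked with the original image so the network receives an explicit object-presence cue alongside raw intensities.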

Details

Language :
English
ISSN :
2072-4292
Volume :
16
Issue :
6
Database :
Academic Search Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
176366572
Full Text :
https://doi.org/10.3390/rs16061001