Fast and accurate sea-land segmentation of SAR imagery based on multi-source data registration.
- Authors
- Ling Zhao, Yasheng Zhang, Wu Xue, and Dong Lin
- Subjects
- *OPTICAL remote sensing, *SYNTHETIC aperture radar, *SYNTHETIC apertures, *SPECKLE interference, *VECTOR data, *OPTICAL images
- Abstract
For accurate and efficient ship detection in Synthetic Aperture Radar (SAR) imagery, sea-land segmentation is key in complex port scenes. Despite the importance of this task, current methods are still severely limited by factors such as high levels of image speckle noise and complicated backscattering characteristics. To overcome these issues, we propose a sea-land segmentation method for SAR imagery based on multi-source data registration. Specifically, high-resolution optical remote sensing images and high-precision coastline vector data are integrated with SAR images for accurate sea-land segmentation. First, the Pyramid Level-by-level Guided-Histogram of the Orientated Phase Congruency method is developed for the registration of SAR and optical images, bringing the two types of images into the same spatial reference. Following this, the high-precision coastline vector data collected from the optical images are transferred to the SAR image, realizing accurate sea-land segmentation. Finally, aiming at ship detection tasks, two quantitative evaluation indicators for sea-land segmentation are proposed: (1) the Absolute Segmentation Accuracy; and (2) the Ship Mis-segmentation Index. The accuracy and efficiency of the proposed method are tested and verified on Gaofen-3 SAR images of Yokosuka Port, Japan. Experimental results show that the proposed method achieves an Absolute Segmentation Accuracy of 3.15 pixels, a Ship Mis-segmentation Index of 9.23%, and an average processing time of 82.92 s, which reveals that the proposed method outperforms current popular approaches in terms of accuracy and efficiency. [ABSTRACT FROM AUTHOR]
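The core idea described above, transferring a coastline vector to a co-registered raster to obtain a sea-land mask, can be illustrated with a minimal sketch. This is not the paper's implementation: the polygon, grid size, ray-casting rasterizer, and the mis-segmentation-style ratio at the end are all illustrative assumptions, and the ratio is not the paper's actual Ship Mis-segmentation Index definition.

```python
import numpy as np

def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test: is point (x, y) inside polygon `poly`?"""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # Count crossings of a horizontal ray extending to the right of (x, y).
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def rasterize_land_mask(poly, height, width):
    """Rasterize a land polygon (in pixel coordinates) into a boolean sea-land mask."""
    mask = np.zeros((height, width), dtype=bool)
    for r in range(height):
        for c in range(width):
            mask[r, c] = point_in_polygon(c + 0.5, r + 0.5, poly)  # pixel centers
    return mask

# Toy coastline: land occupies the left half of a 10x10 image (hypothetical).
land_polygon = [(0, 0), (5, 0), (5, 10), (0, 10)]
mask = rasterize_land_mask(land_polygon, 10, 10)

# Hypothetical mis-segmentation-style ratio: fraction of detected "ship"
# pixels that the mask labels as land (one of three detections falls on land).
ship_pixels = [(2, 7), (3, 7), (2, 4)]
ratio = sum(mask[r, c] for r, c in ship_pixels) / len(ship_pixels)
```

In practice the polygon would come from the registered coastline vector data, and the mask would be applied to the SAR image before running the ship detector, so that land clutter is excluded from detection.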
- Published
- 2022