DADNet: text detection of arbitrary shapes from drone perspective based on boundary adaptation
- Authors
Jun Liu, Jianxun Zhang, Ting Tang, and Shengyuan Wu
- Subjects
Computer vision, Attention mechanism, Arbitrary shape text detection, Transformer, Drone, Electronic computers. Computer science (QA75.5-76.95), Information technology (T58.5-58.64)
- Abstract
The rapid development of drone technology has made drones an essential tool for acquiring aerial information. Detecting and localizing text through drones greatly enhances their understanding of the environment, enabling tasks of practical importance such as community commercial planning and autonomous navigation in intelligent environments. However, the unique perspective and complex environments of drone photography pose challenges for text detection, including diverse text shapes, large scale variations, and background interference, which traditional methods handle poorly. To address these issues, we propose a boundary-adaptation-based text detection method for drone imagery. We first conduct an in-depth analysis of text characteristics from a drone's perspective. Using ResNet50 as the backbone network, we introduce a Hybrid Text Attention Mechanism into the backbone to enhance the perception of text regions during feature extraction. We further propose a Spatial Feature Fusion Module that adaptively fuses text features at different scales, improving the model's adaptability. In addition, we introduce a text detail transformer by incorporating a local feature extractor into the transformer of the boundary iterative optimization module, enabling precise optimization and localization of text boundaries while reducing interference from complex backgrounds and eliminating the need for complex post-processing. Extensive experiments on challenging text detection datasets and drone-based text detection datasets validate the high robustness and state-of-the-art performance of our method, laying a solid foundation for practical applications.
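The abstract does not give implementation details for the Spatial Feature Fusion Module. As an illustration only, adaptive fusion of multi-scale features is often realized as a learned weighted blend of feature maps resized to a common resolution; the sketch below uses NumPy, nearest-neighbour upsampling, and softmax-normalized scalar weights — all of these choices (function names, the weighting scheme, the upsampling method) are assumptions for exposition, not the authors' design.

```python
import numpy as np

def nearest_upsample(fmap, out_h, out_w):
    """Nearest-neighbour upsample an (H, W, C) feature map to (out_h, out_w, C)."""
    h, w, _ = fmap.shape
    rows = np.arange(out_h) * h // out_h  # source row index for each output row
    cols = np.arange(out_w) * w // out_w  # source column index for each output column
    return fmap[rows][:, cols]

def fuse_multiscale(feature_maps, logits):
    """Blend feature maps from different scales with softmax-normalized weights.

    feature_maps: list of (H_i, W_i, C) arrays at varying resolutions.
    logits: one scalar per map (in a trained model these would be learned).
    """
    out_h = max(f.shape[0] for f in feature_maps)
    out_w = max(f.shape[1] for f in feature_maps)
    upsampled = [nearest_upsample(f, out_h, out_w) for f in feature_maps]
    w = np.exp(logits - np.max(logits))  # numerically stable softmax
    w = w / w.sum()
    return sum(wi * fi for wi, fi in zip(w, upsampled))

# Toy usage: fuse a fine map of ones with a coarse map of zeros.
fine = np.ones((4, 4, 8))
coarse = np.zeros((2, 2, 8))
fused = fuse_multiscale([fine, coarse], np.array([0.0, 0.0]))
```

With equal logits the softmax weights are uniform, so the fused map is the plain mean of the upsampled inputs; in a real network the logits would be predicted per location or per channel rather than as scalars.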
- Published
- 2024