
FARNet: Fragmented affinity reasoning network of text instances for arbitrary shape text detection.

Authors :
Chen, Honghui
Chen, Pingping
Qiu, Yuhang
Ling, Nam
Source :
IET Image Processing (Wiley-Blackwell). May2023, Vol. 17 Issue 6, p1959-1977. 19p.
Publication Year :
2023

Abstract

Arbitrary shape text detection is a challenging task in scene text recognition. Driven by deep learning and large‐scale datasets, detection methods based on connected components (CCs) have gained increasing popularity. However, problems remain with unclear separation of text instances and incorrect component links. Thus, in this paper, the authors propose a novel component connection method, the Fragmented Affinity Reasoning Network of Text Instances (FARNet), for arbitrary shape text detection. The network consists of a Weighted Feature Fusion Pyramid Network (WFFPN), a Text Fragments Subgraph (TFS), and a Dense Graph Attention Network (DGAT), and can be trained end‐to‐end. The WFFPN generates text fragments, while the TFS and DGAT jointly construct an affinity reasoning network. Since adjoining boundaries between text instances may blend them into a single instance, the core idea is to use the WFFPN to divide each text instance into a series of rectangular fragments; the affinity reasoning network then infers the affinity between fragments and links them to rebuild the text instances. Extensive experiments on seven challenging datasets (ICDAR2015, MSRA‐TD500, Total‐Text, CTW‐1500, ICDAR2019 MLT, ICDAR2019 ArT, and DAST‐1500) demonstrate that the proposed text detector achieves state‐of‐the‐art performance on both polygon and quadrilateral datasets. The code is available at https://github.com/giganticpower/FARNet. [ABSTRACT FROM AUTHOR]
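The final linking step described in the abstract — grouping fragments whose predicted affinity exceeds a threshold back into whole text instances — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `rebuild_instances`, the threshold value, and the use of union-find are all assumptions for clarity; FARNet's actual grouping logic lives in the linked repository.

```python
# Hypothetical sketch (not FARNet's code): given N text fragments and
# pairwise affinity scores from a reasoning network, link fragment pairs
# whose affinity clears a threshold, then group linked fragments into
# text instances using union-find.

def rebuild_instances(num_fragments, affinities, threshold=0.5):
    """affinities: dict mapping (i, j) fragment-index pairs to scores in [0, 1]."""
    parent = list(range(num_fragments))

    def find(x):
        # Find the root of x's group, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    # Link every fragment pair with sufficiently high affinity.
    for (i, j), score in affinities.items():
        if score >= threshold:
            union(i, j)

    # Collect fragments by their group root: each group is one text instance.
    groups = {}
    for i in range(num_fragments):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

For example, with four fragments where pairs (0, 1) and (2, 3) have high affinity but (1, 2) does not, the sketch recovers two separate text instances rather than blending them into one.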

Details

Language :
English
ISSN :
17519659
Volume :
17
Issue :
6
Database :
Academic Search Index
Journal :
IET Image Processing (Wiley-Blackwell)
Publication Type :
Academic Journal
Accession number :
163412663
Full Text :
https://doi.org/10.1049/ipr2.12769