
Learning global dependencies based on hierarchical full connection for brain tumor segmentation.

Authors :
Cai, Jianping
He, Zhe
Zheng, Zengwei
Xu, Qingsheng
Hu, Chi
Huo, Meimei
Source :
Computer Methods & Programs in Biomedicine. Jun 2022, Vol. 221.
Publication Year :
2022

Abstract

• We propose a light-weight module, H-FC, which has a good ability to learn global dependencies.
• H-FC can easily substitute for attention modules that lack the ability to learn global dependencies.
• H-FC can be applied to high-resolution low-level feature maps.

Because the appearance, shape and location of brain tumors vary greatly among patients, brain tumor segmentation (BTS) is extremely challenging. Recently, many studies have used attention mechanisms to address this problem; these can be roughly divided into two categories: convolution-based spatial attention (with or without channel attention) and self-attention. Due to the limitations of convolution operations, convolution-based spatial attention cannot learn global dependencies well, resulting in poor performance in BTS. A simple improvement is to replace it directly with self-attention, which has an excellent ability to learn global dependencies. However, because self-attention is not friendly to GPU memory, this simple substitution prevents the new attention mechanism from being applied to high-resolution low-level feature maps, which contain considerable geometric information and are also important for improving the performance of attention mechanisms in BTS. In this paper, we propose a hierarchical fully connected module, named H-FC, to learn global dependencies. H-FC learns local dependencies at different feature map scales through hierarchically arranged fully connected layers, and then combines these local dependencies as approximations of the global dependencies. H-FC requires very little GPU memory and can easily replace convolution-based spatial attention modules, such as Attention Gate and the SAM in CBAM, to improve the performance of attention mechanisms in BTS. Extensive comparative experiments show that H-FC outperforms Attention Gate and the SAM in CBAM, which lack the ability to learn global dependencies, in BTS, with improvements in most metrics and a larger improvement in Hausdorff Distance. Comparing the computation and parameter counts of the model before and after adding H-FC proves that H-FC is light-weight. We illustrate the effectiveness of H-FC through experiments on the BraTS2020 dataset, mainly exploring the influence of the region size and the number of steps on its performance. We also confirm that the global dependencies of low-level feature maps are important to BTS, and we show that H-FC is light-weight through a time and space complexity analysis and the experimental results. [ABSTRACT FROM AUTHOR]
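
The abstract describes the core mechanism: fully connected layers act within local regions of the feature map at several scales (steps), and the per-step outputs are combined to approximate global dependencies. Below is a minimal, hypothetical PyTorch sketch of that idea; the region partitioning, the weight sharing across channels, the residual combination, and the name HFCSketch are all illustrative assumptions, not the paper's actual design.

# Minimal sketch of the hierarchical fully connected (H-FC) idea from the
# abstract. Region partitioning, channel-shared weights, and the residual
# combination are assumptions for illustration only.
import torch
import torch.nn as nn

class HFCSketch(nn.Module):
    """Approximate global dependencies by applying a fully connected layer
    within local regions of the feature map at several scales (steps)."""

    def __init__(self, channels, region_sizes=(4, 8)):
        super().__init__()
        self.region_sizes = region_sizes
        # One FC per step, acting over the flattened pixels of one region.
        self.fcs = nn.ModuleList(nn.Linear(r * r, r * r) for r in region_sizes)
        self.norm = nn.BatchNorm2d(channels)

    def forward(self, x):
        b, c, h, w = x.shape
        out = x
        for r, fc in zip(self.region_sizes, self.fcs):
            assert h % r == 0 and w % r == 0, "H and W must be divisible by r"
            # Partition into non-overlapping r x r regions:
            # (B, C, H/r, r, W/r, r) -> (B*C*nH*nW, r*r)
            t = out.view(b, c, h // r, r, w // r, r)
            t = t.permute(0, 1, 2, 4, 3, 5).reshape(-1, r * r)
            # Learn dependencies among all pixels inside each region.
            t = fc(t)
            # Restore the original spatial layout.
            t = t.view(b, c, h // r, w // r, r, r)
            t = t.permute(0, 1, 2, 4, 3, 5).reshape(b, c, h, w)
            out = out + t  # combine local dependencies across steps
        return self.norm(out)

if __name__ == "__main__":
    x = torch.randn(2, 16, 32, 32)  # e.g. a high-resolution low-level feature map
    y = HFCSketch(16, region_sizes=(4, 8))(x)
    print(y.shape)  # torch.Size([2, 16, 32, 32])

Even this rough sketch shows why such a scheme is memory-friendly: each step's FC acts on r*r pixels per region rather than relating all H*W pixels at once as self-attention does, which is consistent with the abstract's claim that H-FC can be applied to high-resolution low-level feature maps.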

Details

Language :
English
ISSN :
0169-2607
Volume :
221
Database :
Academic Search Index
Journal :
Computer Methods & Programs in Biomedicine
Publication Type :
Academic Journal
Accession number :
157542134
Full Text :
https://doi.org/10.1016/j.cmpb.2022.106925