SATr: Slice Attention with Transformer for Universal Lesion Detection
- Publication Year : 2022
- Publisher : arXiv, 2022.
Abstract
- Universal Lesion Detection (ULD) in computed tomography plays an essential role in computer-aided diagnosis. Promising ULD results have been reported by multi-slice-input detection approaches which model 3D context from multiple adjacent CT slices, but such methods still experience difficulty in obtaining a global representation among different slices and within each individual slice since they only use convolution-based fusion operations. In this paper, we propose a novel Slice Attention Transformer (SATr) block which can be easily plugged into convolution-based ULD backbones to form hybrid network structures. Such newly formed hybrid backbones can better model long-distance feature dependency via the cascaded self-attention modules in the Transformer block while still holding a strong power of modeling local features with the convolutional operations in the original backbone. Experiments with five state-of-the-art methods show that the proposed SATr block can provide an almost free boost to lesion detection accuracy without extra hyperparameters or special network designs.
- Comment : 11 pages, 3 figures
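The abstract's core idea is a self-attention block, with a residual connection, that mixes features globally across adjacent CT slices while leaving the feature shape unchanged, so it can be dropped into an existing convolutional backbone. The following is a minimal, hedged NumPy sketch of that idea only; the actual SATr block (its projections, normalization, and cascading) is specified in the paper, not here, and the function and weight names below are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slice_self_attention(feats, w_q, w_k, w_v):
    """Illustrative single-head self-attention over slice-level features.

    feats: (n_tokens, d) feature vectors, e.g. one pooled vector per
    adjacent CT slice. Attention lets every slice attend to every other
    slice (a global, long-distance dependency that pure convolutional
    fusion struggles to model). The residual connection preserves the
    input shape, so the block can be "plugged in" without altering the
    surrounding backbone.
    """
    q, k, v = feats @ w_q, feats @ w_k, feats @ w_v
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)  # (n, n) mixing weights
    return feats + attn @ v  # residual: output shape == input shape

rng = np.random.default_rng(0)
n_slices, d = 3, 8                                  # e.g. 3 adjacent slices, 8-dim features
feats = rng.standard_normal((n_slices, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
out = slice_self_attention(feats, w_q, w_k, w_v)
print(out.shape)  # (3, 8): same shape as the input, ready to feed back into the backbone
```

Because the output shape matches the input, a block like this can be interleaved with convolutional stages, which is what makes the hybrid-backbone design an "almost free" addition.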
- Subjects :
- Artificial Intelligence (cs.AI)
Computer Vision and Pattern Recognition (cs.CV)
Image and Video Processing (eess.IV)
FOS: Computer and information sciences
FOS: Electrical engineering, electronic engineering, information engineering
Details
- Database : OpenAIRE
- Accession number : edsair.doi.dedup.....29427ad716355a6b6a6edfdf7fa2eaa3
- Full Text : https://doi.org/10.48550/arxiv.2203.07373