
GID: Global information distillation for medical semantic segmentation.

Authors :
Ye, Yong-Sen
Chen, Min-Rong
Zou, Hao-Li
Yang, Bai-Bing
Zeng, Guo-Qiang
Source :
Neurocomputing. Sep 2022, Vol. 503, p248-258. 11p.
Publication Year :
2022

Abstract

In this work, we consider transferring global information from a Transformer to a Convolutional Neural Network (CNN) for medical semantic segmentation tasks. Previous network models for medical semantic segmentation often struggle to model global information or carry oversized parameter counts. Here, to design a compact network with both global and local information, we distill the global-information modeling capability of the Transformer into a CNN and successfully apply it to medical semantic segmentation tasks; we call this approach Global Information Distillation. In addition, we propose the following two contributions to improve the effectiveness of distillation: i) an Information Transfer Module, which uses a convolutional layer to prevent over-regularization and a Transformer layer to transfer global information; ii) a Shrinking Result-Pixel distillation method for better transferring the teacher's soft targets. The effectiveness of our knowledge distillation approach is demonstrated by experiments on multi-organ and cardiac segmentation tasks. [ABSTRACT FROM AUTHOR]
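The abstract describes distilling a teacher's soft targets (per-pixel class probabilities) into a student segmentation network. The paper's exact Shrinking Result-Pixel formulation is not given here, so the following is only a minimal sketch of the standard ingredient such result-level distillation builds on: a temperature-scaled, pixel-wise KL-divergence loss between teacher and student logits. The function names and the temperature parameter `T` are illustrative, not the authors' API.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over one pixel's class logits.
    z = [l / T for l in logits]
    m = max(z)  # subtract max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

def pixel_kd_loss(student_logits, teacher_logits, T=2.0):
    """Mean KL(teacher || student) over pixels, scaled by T^2.

    student_logits / teacher_logits: list of per-pixel lists of
    class logits, e.g. [[s_c0, s_c1, ...], ...] for each pixel.
    This is a generic soft-target distillation loss, not the
    paper's specific Shrinking Result-Pixel variant.
    """
    total = 0.0
    for s_l, t_l in zip(student_logits, teacher_logits):
        p_t = softmax(t_l, T)
        p_s = softmax(s_l, T)
        total += sum(pt * (math.log(pt) - math.log(ps))
                     for pt, ps in zip(p_t, p_s) if pt > 0.0)
    # T^2 keeps gradient magnitudes comparable across temperatures
    # (the usual convention from Hinton-style distillation).
    return total / len(student_logits) * T * T
```

In practice this term would be added to the ordinary segmentation loss (e.g. cross-entropy against ground-truth masks), with the distillation weight and temperature tuned on a validation set.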

Details

Language :
English
ISSN :
0925-2312
Volume :
503
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
158185142
Full Text :
https://doi.org/10.1016/j.neucom.2022.06.065