
PDBL: Improving Histopathological Tissue Classification With Plug-and-Play Pyramidal Deep-Broad Learning.

Authors :
Lin, Jiatai
Han, Guoqiang
Pan, Xipeng
Liu, Zaiyi
Chen, Hao
Li, Danyi
Jia, Xiping
Shi, Zhenwei
Wang, Zhizhen
Cui, Yanfen
Li, Haiming
Liang, Changhong
Liang, Li
Wang, Ying
Han, Chu
Source :
IEEE Transactions on Medical Imaging; Sep 2022, Vol. 41, Issue 9, p2252-2262, 11p
Publication Year :
2022

Abstract

Histopathological tissue classification is a simpler way to achieve semantic segmentation for whole slide images, which can alleviate the requirement of pixel-level dense annotations. Existing works mostly leverage popular CNN classification backbones in computer vision to achieve histopathological tissue classification. In this paper, we propose a super-lightweight plug-and-play module, named Pyramidal Deep-Broad Learning (PDBL), for any well-trained classification backbone to improve the classification performance without a re-training burden. For each patch, we construct a multi-resolution image pyramid to obtain the pyramidal contextual information. For each level in the pyramid, we extract the multi-scale deep-broad features by our proposed Deep-Broad block (DB-block). We equip PDBL in three popular classification backbones, ShuffleNetV2, EfficientNetb0, and ResNet50, to evaluate the effectiveness and efficiency of our proposed module on two datasets (Kather Multiclass Dataset and the LC25000 Dataset). Experimental results demonstrate that the proposed PDBL can steadily improve the tissue-level classification performance for any CNN backbone, especially for lightweight models given a small amount of training samples (less than 10%). It greatly saves computational resources and annotation effort. The source code is available at: https://github.com/linjiatai/PDBL. [ABSTRACT FROM AUTHOR]
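To illustrate the general idea described in the abstract (a frozen, pretrained backbone extracting features from a multi-resolution patch pyramid, followed by a lightweight head fitted without backpropagation), the sketch below is a hypothetical approximation, not the authors' PDBL/DB-block implementation, which is available at the linked repository. The function names (`pyramid_features`, `fit_linear_head`), the pyramid scales, and the closed-form ridge-regression head are all assumptions made for illustration.

```python
# Hypothetical sketch: frozen pretrained backbone + multi-resolution pyramid
# features + a closed-form linear head (no retraining of the backbone).
import torch
import torch.nn.functional as F
from torchvision import models

def pyramid_features(patch, backbone, scales=(1.0, 0.5, 0.25)):
    """Concatenate backbone features extracted at several pyramid levels.

    `patch` is a (1, 3, H, W) tensor; `scales` are illustrative pyramid levels.
    """
    feats = []
    with torch.no_grad():
        for s in scales:
            x = patch if s == 1.0 else F.interpolate(
                patch, scale_factor=s, mode="bilinear", align_corners=False)
            feats.append(backbone(x))          # (1, C) global feature per level
    return torch.cat(feats, dim=1)             # (1, C * len(scales))

# Frozen ImageNet-pretrained backbone with its classifier head removed.
backbone = models.shufflenet_v2_x1_0(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()
backbone.eval()

def fit_linear_head(features, labels, num_classes, lam=1e-3):
    """Fit a linear head in closed form (ridge regression), no backprop.

    W = (A^T A + lam * I)^{-1} A^T Y, with A the stacked pyramid features
    and Y the one-hot labels.
    """
    A = features                                    # (N, D)
    Y = F.one_hot(labels, num_classes).float()      # (N, K)
    D = A.shape[1]
    W = torch.linalg.solve(A.T @ A + lam * torch.eye(D), A.T @ Y)
    return W                                        # (D, K)
```

Under these assumptions, inference would simply stack `pyramid_features` for each test patch into a matrix `A_test` and predict `A_test @ W`, taking the argmax over classes; the backbone itself is never fine-tuned, which is the sense in which the module avoids a re-training burden.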

Details

Language :
English
ISSN :
0278-0062
Volume :
41
Issue :
9
Database :
Complementary Index
Journal :
IEEE Transactions on Medical Imaging
Publication Type :
Academic Journal
Accession number :
158869981
Full Text :
https://doi.org/10.1109/TMI.2022.3161787