
Human-machine Interactive Tissue Prototype Learning for Label-efficient Histopathology Image Segmentation

Authors :
Pan, Wentao
Yan, Jiangpeng
Chen, Hanbo
Yang, Jiawei
Xu, Zhe
Li, Xiu
Yao, Jianhua
Publication Year :
2022
Publisher :
arXiv, 2022.

Abstract

Deep neural networks have recently advanced histopathology image segmentation, but they usually require abundant annotated data. However, given the gigapixel scale of whole slide images and pathologists' heavy daily workload, obtaining pixel-level labels for supervised learning is often infeasible in clinical practice. Alternatively, weakly-supervised segmentation methods have been explored with less laborious image-level labels, but their performance is unsatisfactory due to the lack of dense supervision. Inspired by the recent success of self-supervised learning methods, we present a label-efficient tissue prototype dictionary building pipeline and propose to use the obtained prototypes to guide histopathology image segmentation. In particular, taking advantage of self-supervised contrastive learning, an encoder is trained to project unlabeled histopathology image patches into a discriminative embedding space, where the patches are clustered and tissue prototypes are identified through efficient visual examination by pathologists. The encoder then maps images into the embedding space and generates pixel-level pseudo tissue masks by querying the tissue prototype dictionary. Finally, the pseudo masks are used to train a segmentation network with dense supervision for better performance. Experiments on two public datasets demonstrate that our human-machine interactive tissue prototype learning method achieves segmentation performance comparable to fully-supervised baselines with a much lower annotation burden and outperforms other weakly-supervised methods. Code will be available upon publication.

Comment: IPMI2023 camera ready
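To make the pipeline described in the abstract concrete, the sketch below illustrates the prototype-dictionary idea under stated assumptions: it is not the authors' code, and the cluster count, tissue-type names, and cosine-similarity querying are illustrative choices rather than details taken from the paper. Unlabeled patch embeddings (here random stand-ins for features from a contrastively trained encoder) are clustered, a pathologist-curated subset of cluster centers forms the prototype dictionary, and new embeddings are pseudo-labeled by nearest-prototype lookup.

```python
# Minimal sketch of tissue prototype dictionary building and querying.
# Assumptions (not from the paper): k-means with 8 clusters, 3 kept tissue
# types, cosine-similarity nearest-prototype assignment.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)

# Stand-in for embeddings produced by a self-supervised contrastive encoder.
unlabeled_embeddings = rng.normal(size=(1000, 128)).astype(np.float32)

# Step 1: cluster the embedding space; each center is a candidate prototype.
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(
    normalize(unlabeled_embeddings)
)
candidate_prototypes = normalize(kmeans.cluster_centers_)

# Step 2: a pathologist visually inspects sample patches per cluster and keeps
# only the clusters that correspond to clear tissue types (hypothetical picks).
prototype_dictionary = {
    "tumor": candidate_prototypes[0],
    "stroma": candidate_prototypes[3],
    "necrosis": candidate_prototypes[5],
}
names = list(prototype_dictionary)
proto_matrix = np.stack([prototype_dictionary[n] for n in names])  # (K, 128)

# Step 3: pseudo-label query embeddings (e.g. patch or pixel features of a new
# slide) by cosine similarity to the dictionary; these pseudo masks can then
# supervise a dense segmentation network.
query_embeddings = normalize(rng.normal(size=(16, 128)).astype(np.float32))
similarity = query_embeddings @ proto_matrix.T          # (N, K) cosine scores
pseudo_labels = [names[i] for i in similarity.argmax(axis=1)]
print(pseudo_labels[:5])
```

The key design point the abstract emphasizes is that human effort is spent only on cluster-level visual inspection (Step 2), while dense pixel-level supervision for the segmentation network comes for free from the prototype queries in Step 3.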

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....77b23df6dfeee960024ad5f663c64a63
Full Text :
https://doi.org/10.48550/arxiv.2211.14491