
Distilling foundation models for robust and efficient models in digital pathology

Authors :
Filiot, Alexandre
Dop, Nicolas
Tchita, Oussama
Riou, Auriane
Dubois, Rémy
Peeters, Thomas
Valter, Daria
Scalbert, Marin
Saillard, Charlie
Robin, Geneviève
Olivier, Antoine
Publication Year :
2025

Abstract

In recent years, the advent of foundation models (FMs) for digital pathology has relied heavily on scaling the pre-training datasets and the model size, yielding large and powerful models. While this has improved performance on diverse downstream tasks, it has also increased computational cost and inference time. In this work, we explore the distillation of a large foundation model into a smaller one, reducing the number of parameters by several orders of magnitude. Leveraging distillation techniques, our distilled model, H0-mini, achieves nearly comparable performance to large FMs at a significantly reduced inference cost. It is evaluated on several public benchmarks, achieving 3rd place on the HEST benchmark and 5th place on the EVA benchmark. Additionally, a robustness analysis conducted on the PLISM dataset demonstrates that our distilled model reaches excellent robustness to variations in staining and scanning conditions, significantly outperforming other state-of-the-art models. This opens new perspectives for designing lightweight and robust models for digital pathology, without compromising on performance.

Comment: Preprint
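To make the distillation idea concrete, below is a minimal PyTorch sketch of feature-level knowledge distillation of the kind commonly used to compress vision foundation models. The names (teacher, student, projector) and the cosine-similarity objective are illustrative assumptions for this sketch, not the exact recipe described in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DistillationLoss(nn.Module):
    """Align student tile embeddings with those of a frozen teacher.

    A linear projector maps the (smaller) student feature space into the
    teacher's feature space; the loss is the mean cosine distance between
    projected student embeddings and teacher embeddings.
    """

    def __init__(self, student_dim: int, teacher_dim: int):
        super().__init__()
        self.projector = nn.Linear(student_dim, teacher_dim)

    def forward(self, student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
        projected = self.projector(student_feats)
        return 1.0 - F.cosine_similarity(projected, teacher_feats, dim=-1).mean()


# Illustrative usage (teacher is frozen; only the student and projector are trained):
# teacher_feats = teacher(tiles).detach()
# loss = distill_loss(student(tiles), teacher_feats)
# loss.backward()
```

In such a setup, only the student (and the projector) receive gradients, so the large teacher is run once per batch in inference mode while the compact student learns to reproduce its representations.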

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2501.16239
Document Type :
Working Paper