
L-MAE: Longitudinal masked auto-encoder with time and severity-aware encoding for diabetic retinopathy progression prediction

Authors :
Zeghlache, Rachid
Conze, Pierre-Henri
Daho, Mostafa El Habib
Li, Yihao
Rezaei, Alireza
Boité, Hugo Le
Tadayoni, Ramin
Massin, Pascal
Cochener, Béatrice
Brahim, Ikram
Quellec, Gwenolé
Lamard, Mathieu
Publication Year :
2024

Abstract

Pre-training strategies based on self-supervised learning (SSL) have proven to be effective pretext tasks for many downstream tasks in computer vision. Due to the significant disparity between medical and natural images, the application of typical SSL is not straightforward in medical imaging. Additionally, those pretext tasks often lack context, which is critical for computer-aided clinical decision support. In this paper, we developed a longitudinal masked auto-encoder (MAE) based on the well-known Transformer-based MAE. In particular, we explored the importance of time-aware position embedding as well as disease progression-aware masking. Taking into account the actual time elapsed between examinations, rather than just their order, offers the benefit of capturing temporal changes and trends. The masking strategy, for its part, evolves during follow-up to better capture pathological changes, ensuring a more accurate assessment of disease progression. Using OPHDIAT, a large follow-up screening dataset targeting diabetic retinopathy (DR), we evaluated the pre-trained weights on a longitudinal task: predicting the severity label of the next visit within 3 years based on the patient's past time-series examinations. Our results demonstrated the relevance of both time-aware position embedding and masking strategies based on disease progression knowledge. Compared to popular baseline models and standard longitudinal Transformers, these simple yet effective extensions significantly enhance the predictive ability of deep classification models.
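To illustrate the time-aware position embedding idea the abstract describes, the sketch below indexes a standard sinusoidal embedding by the actual time between visits (e.g., days since the first examination) rather than by the visit's ordinal position. This is a minimal, hedged reconstruction of the concept only; the function name, the choice of days as the time unit, and the 10000-base frequency schedule are assumptions, and the paper's exact formulation may differ.

```python
import math

def time_aware_position_embedding(times, dim):
    """Sinusoidal positional embedding keyed on elapsed time.

    times: elapsed time of each visit (e.g., days since first exam).
    dim:   embedding dimension (must be even here for simplicity).

    Unlike index-based embeddings, two visits 90 vs. 700 days apart
    receive very different encodings even if they are consecutive.
    """
    embeddings = []
    for t in times:
        emb = []
        for i in range(dim // 2):
            # Geometric frequency schedule, as in the original Transformer.
            freq = 1.0 / (10000 ** (2 * i / dim))
            emb.append(math.sin(t * freq))
            emb.append(math.cos(t * freq))
        embeddings.append(emb)
    return embeddings

# Irregular follow-up intervals: an index-based embedding would treat
# these four visits as equally spaced, losing the temporal gaps.
visit_days = [0, 90, 400, 1100]
emb = time_aware_position_embedding(visit_days, dim=8)
```

In practice such an embedding would be added to the patch tokens of each visit before the Transformer encoder, letting attention weights reflect the true follow-up intervals between examinations.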

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438539936
Document Type :
Electronic Resource