
MERIT: Multi-view Evidential learning for Reliable and Interpretable liver fibrosis sTaging

Authors:
Liu, Yuanye
Gao, Zheyao
Shi, Nannan
Wu, Fuping
Shi, Yuxin
Chen, Qingchao
Zhuang, Xiahai
Publication Year:
2024

Abstract

Accurate staging of liver fibrosis from magnetic resonance imaging (MRI) is crucial in clinical practice. While conventional methods often focus on a specific sub-region, multi-view learning captures more information by analyzing multiple patches simultaneously. However, previous multi-view approaches typically cannot quantify uncertainty by design, and they generally integrate features from different views in a black-box fashion, compromising both the reliability and the interpretability of the resulting models. In this work, we propose a new multi-view method based on evidential learning, referred to as MERIT, which tackles these two challenges in a unified framework. MERIT enables uncertainty quantification of its predictions to enhance reliability, and employs a logic-based combination rule to improve interpretability. Specifically, MERIT models the prediction from each sub-view as an opinion with quantified uncertainty, under the guidance of subjective logic theory. Furthermore, a distribution-aware base rate is introduced to enhance performance, particularly in scenarios involving class distribution shifts. Finally, MERIT adopts a feature-specific combination rule to fuse multi-view predictions explicitly, thereby enhancing interpretability. Results showcase the effectiveness of the proposed MERIT, highlighting its reliability and offering both ad-hoc and post-hoc interpretability. They also illustrate that MERIT can elucidate the significance of each view in the decision-making process for liver fibrosis staging.

Comment: Submitted to Medical Image Analysis
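The subjective-logic machinery the abstract refers to can be illustrated with a minimal sketch. The helper names below are hypothetical, the base rate is uniform, and the fusion step uses the standard reduced Dempster's rule from subjective logic; MERIT's distribution-aware base rate and feature-specific combination rule are more involved than this generic version.

```python
import numpy as np

def opinion_from_evidence(evidence, base_rate=None):
    """Form a subjective-logic opinion from non-negative per-class
    evidence (shape (K,)), as in evidential deep learning.
    Returns (belief, uncertainty, expected_probabilities)."""
    K = len(evidence)
    if base_rate is None:
        base_rate = np.full(K, 1.0 / K)   # uniform base rate (assumption)
    W = float(K)                          # prior weight (non-informative Dirichlet)
    S = evidence.sum() + W                # Dirichlet strength
    belief = evidence / S                 # per-class belief masses
    uncertainty = W / S                   # overall uncertainty mass
    prob = belief + base_rate * uncertainty  # expected class probabilities
    return belief, uncertainty, prob

def fuse_opinions(b1, u1, b2, u2):
    """Combine two opinions over the same frame with the reduced
    Dempster's rule: agreeing beliefs reinforce each other, and
    conflicting mass is renormalized away."""
    conflict = np.sum(np.outer(b1, b2)) - np.sum(b1 * b2)  # mass on disagreeing classes
    denom = 1.0 - conflict
    b = (b1 * b2 + b1 * u2 + b2 * u1) / denom
    u = (u1 * u2) / denom
    return b, u
```

Fusing two opinions that agree lowers the combined uncertainty, which is the behavior that lets a multi-view model report how confident the aggregated prediction is and how much each view contributed.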

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1438553851
Document Type:
Electronic Resource