
Attention-map augmentation for hypercomplex breast cancer classification.

Authors :
Lopez, Eleonora
Betello, Filippo
Carmignani, Federico
Grassucci, Eleonora
Comminiello, Danilo
Source :
Pattern Recognition Letters. Jun 2024, Vol. 182, pp. 140-146. 7 p.
Publication Year :
2024

Abstract

Breast cancer is the most widespread neoplasm among women, and early detection of this disease is critical. Deep learning techniques have become of great interest for improving diagnostic performance. However, distinguishing between malignant and benign masses in whole mammograms poses a challenge, as they appear nearly identical to an untrained eye and the region of interest (ROI) constitutes only a small fraction of the entire image. In this paper, we propose a framework, parameterized hypercomplex attention maps (PHAM), to overcome these problems. Specifically, we deploy an augmentation step based on computing attention maps. The attention maps are then used to condition the classification step by constructing a multi-dimensional input composed of the original breast cancer image and the corresponding attention map. In this step, a parameterized hypercomplex neural network (PHNN) is employed to perform breast cancer classification. The framework offers two main advantages. First, attention maps provide critical information regarding the ROI and allow the neural model to concentrate on it. Second, the hypercomplex architecture can model local relations between input dimensions thanks to hypercomplex algebra rules, thus properly exploiting the information provided by the attention map. We demonstrate the efficacy of the proposed framework on both mammography and histopathology images, surpassing attention-based state-of-the-art networks and the real-valued counterpart of our approach. The code of our work is available at https://github.com/ispamm/AttentionBCS. [ABSTRACT FROM AUTHOR]

Highlights:
• Deep learning enhances breast cancer diagnosis.
• Mammogram mass discrimination is challenging.
• Attention maps can highlight the small ROI.
• Attention-map augmentation can condition a hypercomplex network to improve performance.
• Hypercomplex algebra exploits the additional information provided by the attention map.
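
To make the described pipeline concrete, below is a minimal, hypothetical PyTorch sketch of the idea in the abstract: an attention map is stacked channel-wise with the grayscale image, and the resulting two-dimensional input is processed by parameterized hypercomplex (PHM-style) convolutions whose kernels are built as sums of Kronecker products. All names here (PHConv2d, PHAMClassifier) and the architectural details are illustrative assumptions, not the authors' implementation; the official code is available at the linked repository.

import torch
import torch.nn as nn
import torch.nn.functional as F


class PHConv2d(nn.Module):
    # Parameterized hypercomplex convolution (PHM-style): the kernel is assembled
    # as a sum of Kronecker products W = sum_i A_i (x) F_i, sharing parameters
    # across the n input dimensions (here n = 2: image + attention map).
    def __init__(self, n, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        assert in_ch % n == 0 and out_ch % n == 0
        self.n, self.out_ch, self.in_ch, self.padding = n, out_ch, in_ch, padding
        self.A = nn.Parameter(torch.randn(n, n, n))
        self.filters = nn.Parameter(
            0.02 * torch.randn(n, out_ch // n, in_ch // n, kernel_size, kernel_size)
        )

    def forward(self, x):
        # Assemble the full (out_ch, in_ch, k, k) kernel from the Kronecker factors.
        W = torch.einsum("iab,icdkl->acbdkl", self.A, self.filters)
        W = W.reshape(self.out_ch, self.in_ch, *self.filters.shape[-2:])
        return F.conv2d(x, W, padding=self.padding)


class PHAMClassifier(nn.Module):
    # Toy classifier conditioned on an attention map: the grayscale image and its
    # attention map are stacked channel-wise and processed by PH convolutions.
    def __init__(self, n=2, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            PHConv2d(n, n, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            PHConv2d(n, 32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, image, attention_map):
        x = torch.cat([image, attention_map], dim=1)  # (B, 2, H, W)
        return self.head(self.features(x).flatten(1))


if __name__ == "__main__":
    image = torch.rand(4, 1, 224, 224)   # batch of grayscale mammograms
    attn = torch.rand(4, 1, 224, 224)    # attention maps from the augmentation step
    print(PHAMClassifier()(image, attn).shape)  # torch.Size([4, 2])

In this sketch the attention map is taken as a given second input; in the paper it is produced during a separate augmentation step and then conditions the hypercomplex classifier.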

Details

Language :
English
ISSN :
0167-8655
Volume :
182
Database :
Academic Search Index
Journal :
Pattern Recognition Letters
Publication Type :
Academic Journal
Accession number :
177147731
Full Text :
https://doi.org/10.1016/j.patrec.2024.04.014