FAMF-Net: Feature Alignment Mutual Attention Fusion with Region Awareness for Breast Cancer Diagnosis via Imbalanced Data.
- Source :
- IEEE transactions on medical imaging [IEEE Trans Med Imaging] 2024 Nov 05; Vol. PP. Date of Electronic Publication: 2024 Nov 05.
- Publication Year :
- 2024
- Publisher :
- Ahead of Print
Abstract
- Automatic and accurate classification of breast cancer in multimodal ultrasound images is crucial for improving patients' diagnosis and treatment outcomes and for saving medical resources. Methodologically, the fusion of multimodal ultrasound images often encounters challenges such as misalignment, limited use of complementary information, poor interpretability in feature fusion, and imbalanced sample categories. To address these problems, we propose a feature alignment mutual attention fusion method (FAMF-Net), which consists of a region awareness alignment (RAA) block, a mutual attention fusion (MAF) block, and a reinforcement learning-based dynamic optimization strategy (RDO). Specifically, RAA achieves region awareness through class activation mapping and performs a translation transformation to align features. MAF employs a mutual attention mechanism for interactive feature fusion, mining edge and color features separately from B-mode and shear wave elastography images; this enhances the complementarity of the features and improves interpretability. Finally, RDO treats the distribution of samples and the prediction probabilities during training as the state of a reinforcement learning agent and dynamically optimizes the weights of the loss function, thereby addressing class imbalance. Experimental results on our clinically obtained dataset demonstrate the effectiveness of the proposed method. Our code will be available at: https://github.com/Magnety/Multi_modal_Image.
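To make the mutual attention idea concrete, below is a minimal sketch of a cross-attention fusion block of the kind the abstract describes, where features from each modality attend to the other. This is an illustrative assumption, not the authors' implementation (see their repository for the actual code): the class name `MutualAttentionFusion`, the use of `nn.MultiheadAttention`, the dimensions, and the concatenate-then-project fusion step are all hypothetical choices.

```python
# A minimal sketch of mutual (cross-) attention fusion between two modality
# branches, assuming PyTorch feature maps. All names and design choices here
# are assumptions for illustration, not the FAMF-Net implementation.
import torch
import torch.nn as nn

class MutualAttentionFusion(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        # Each modality's features attend to the other's (mutual attention).
        self.b_to_e = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.e_to_b = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.fuse = nn.Linear(2 * dim, dim)  # fuse the two attended streams

    def forward(self, feat_b: torch.Tensor, feat_e: torch.Tensor) -> torch.Tensor:
        # feat_b, feat_e: (batch, channels, H, W) maps from the B-mode and
        # elastography branches; flatten spatial dims into token sequences.
        B, C, H, W = feat_b.shape
        tok_b = feat_b.flatten(2).transpose(1, 2)  # (B, H*W, C)
        tok_e = feat_e.flatten(2).transpose(1, 2)
        # B-mode queries attend to elastography keys/values, and vice versa.
        att_b, _ = self.b_to_e(tok_b, tok_e, tok_e)
        att_e, _ = self.e_to_b(tok_e, tok_b, tok_b)
        fused = self.fuse(torch.cat([att_b, att_e], dim=-1))  # (B, H*W, C)
        return fused.transpose(1, 2).reshape(B, C, H, W)

# Example usage with hypothetical 256-channel feature maps:
# fusion = MutualAttentionFusion(dim=256)
# out = fusion(b_mode_feats, swe_feats)  # same shape as the inputs
```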
Details
- Language :
- English
- ISSN :
- 1558-254X
- Volume :
- PP
- Database :
- MEDLINE
- Journal :
- IEEE transactions on medical imaging
- Publication Type :
- Academic Journal
- Accession number :
- 39499601
- Full Text :
- https://doi.org/10.1109/TMI.2024.3485612