
Segmentation information with attention integration for classification of breast tumor in ultrasound image.

Authors :
Luo, Yaozhong
Huang, Qinghua
Li, Xuelong
Source :
Pattern Recognition, Apr 2022, Vol. 124.
Publication Year :
2022

Abstract

• A novel segmentation-to-classification framework is proposed for breast US diagnosis.
• The segmentation network is trained to obtain segmentation-enhanced images, and features of the segmentation-enhanced images and the original images are extracted in parallel for classification.
• An attention-based method is proposed to aggregate the features of the two parallel networks, enhancing features useful for classification in a data-driven manner.
• Experimental results show the advantages of the proposed method.

Breast cancer is one of the most common forms of cancer among women worldwide. The development of computer-aided diagnosis (CAD) technology based on ultrasound imaging to support the diagnosis of breast lesions has attracted the attention of researchers, and deep learning is a popular and effective approach. However, most deep learning based CAD methods neglect the relationship between two vision tasks: tumor region segmentation and classification. In this paper, taking into account prior medical knowledge, we propose a novel segmentation-to-classification scheme that adds segmentation-based attention (SBA) information to a deep convolutional neural network (DCNN) for breast tumor classification. A segmentation network is first trained to generate tumor segmentation-enhanced images. Two parallel networks then extract features from the original images and the segmentation-enhanced images, and a channel attention based feature aggregation network automatically integrates the features extracted by the two feature networks to improve the recognition of malignant tumors in breast ultrasound images. To validate our method, experiments were conducted on breast ultrasound datasets, and the classification results were compared with those obtained by eleven existing approaches.
The experimental results show that the proposed method achieves the highest Accuracy (90.78%), Sensitivity (91.18%), Specificity (90.44%), F1-score (91.46%), and AUC (0.9549). [ABSTRACT FROM AUTHOR]
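The abstract describes fusing features from two parallel branches (original image and segmentation-enhanced image) with channel attention. The paper's exact aggregation network is not reproduced here; the sketch below illustrates the general idea with a squeeze-and-excitation style channel gate in NumPy. The function name, random (untrained) weights, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def channel_attention_fuse(feat_orig, feat_seg, reduction=4, seed=0):
    """Fuse features from the original-image branch and the
    segmentation-enhanced branch via channel attention.
    feat_orig, feat_seg: arrays of shape (C, H, W).
    Illustrative sketch only: the MLP weights are random, not trained.
    """
    rng = np.random.default_rng(seed)
    # Concatenate the two branches along the channel axis -> (2C, H, W)
    fused = np.concatenate([feat_orig, feat_seg], axis=0)
    c = fused.shape[0]
    # Squeeze: global average pooling over spatial dims -> (2C,)
    desc = fused.mean(axis=(1, 2))
    # Excitation: bottleneck MLP (hypothetical weights) with sigmoid gating
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ desc, 0.0)          # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid, in (0, 1)
    # Reweight each channel by its attention score
    return fused * gate[:, None, None]

# Toy usage: 8-channel features per branch on a 4x4 spatial grid
feats = channel_attention_fuse(np.ones((8, 4, 4)), np.zeros((8, 4, 4)))
print(feats.shape)  # (16, 4, 4)
```

Because the gate is applied per channel, the network can learn (in the trained setting) to emphasize whichever branch carries more discriminative information for a given input.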

Details

Language :
English
ISSN :
00313203
Volume :
124
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
155491481
Full Text :
https://doi.org/10.1016/j.patcog.2021.108427