
Automatic Polyp Segmentation in Colonoscopy Images Using a Modified Deep Convolutional Encoder-Decoder Architecture.

Authors :
Eu CY
Tang TB
Lin CH
Lee LH
Lu CK
Source :
Sensors (Basel, Switzerland) [Sensors (Basel)] 2021 Aug 20; Vol. 21 (16). Date of Electronic Publication: 2021 Aug 20.
Publication Year :
2021

Abstract

Colorectal cancer has become the third most commonly diagnosed form of cancer and has the second-highest fatality rate among cancers worldwide. Currently, optical colonoscopy is the preferred tool for diagnosing polyps and preventing colorectal cancer. However, colon screening is time-consuming and highly operator-dependent. In view of this, a computer-aided diagnosis (CAD) method needs to be developed for the automatic segmentation of polyps in colonoscopy images. This paper proposes a modified SegNet based on the Visual Geometry Group-19 (VGG-19) architecture, a convolutional neural network, as a CAD method for polyp segmentation. The modifications include skip connections, 5 × 5 convolutional filters, and the concatenation of four dilated convolutions applied in parallel. The CVC-ClinicDB, CVC-ColonDB, and ETIS-LaribPolypDB databases were used to evaluate the model, and it was found that our proposed polyp segmentation model achieved an accuracy, sensitivity, specificity, precision, mean intersection over union, and Dice coefficient of 96.06%, 94.55%, 97.56%, 97.48%, 92.3%, and 95.99%, respectively. These results indicate that our model performs as well as or better than previous schemes in the literature. We believe that this study will offer benefits in terms of the future development of CAD tools for polyp segmentation for colorectal cancer diagnosis and management. In the future, we intend to embed our proposed network into a medical capsule robot for practical usage and trial it in a hospital setting with clinicians.
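To illustrate the "four dilated convolutions applied in parallel" modification described in the abstract, below is a minimal PyTorch sketch of a parallel dilated-convolution block whose branch outputs are concatenated along the channel dimension. The specific dilation rates (1, 2, 4, 8), the 3 × 3 kernel size, the even channel split across branches, and the class name are assumptions for illustration; the abstract does not specify these details of the authors' implementation.

import torch
import torch.nn as nn

class ParallelDilatedBlock(nn.Module):
    """Sketch: four dilated convolutions in parallel, concatenated.

    The dilation rates (1, 2, 4, 8) and the equal channel split are
    hypothetical choices; the paper only states that four dilated
    convolutions are applied in parallel and concatenated.
    """

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        branch_channels = out_channels // 4  # split channels evenly across branches
        self.branches = nn.ModuleList([
            # padding == dilation keeps the spatial size unchanged for 3x3 kernels
            nn.Conv2d(in_channels, branch_channels, kernel_size=3,
                      padding=rate, dilation=rate)
            for rate in (1, 2, 4, 8)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Every branch sees the same input; concatenating along the channel
        # axis fuses features gathered at multiple receptive-field sizes.
        return torch.cat([branch(x) for branch in self.branches], dim=1)

if __name__ == "__main__":
    # Example: a 256-channel feature map keeps its spatial resolution.
    block = ParallelDilatedBlock(256, 256)
    features = torch.randn(1, 256, 32, 32)
    print(block(features).shape)  # torch.Size([1, 256, 32, 32])

Because each branch uses a different dilation rate, the block aggregates context at several receptive-field sizes without pooling, which is a common motivation for this kind of multi-scale design in segmentation networks.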

Details

Language :
English
ISSN :
1424-8220
Volume :
21
Issue :
16
Database :
MEDLINE
Journal :
Sensors (Basel, Switzerland)
Publication Type :
Academic Journal
Accession number :
34451072
Full Text :
https://doi.org/10.3390/s21165630