Noise-Powered Disentangled Representation for Unsupervised Speckle Reduction of Optical Coherence Tomography Images.

Authors :
Huang, Yongqiang
Xia, Wenjun
Lu, Zexin
Liu, Yan
Chen, Hu
Zhou, Jiliu
Fang, Leyuan
Zhang, Yi
Source :
IEEE Transactions on Medical Imaging. Oct2021, Vol. 40 Issue 10, p2600-2614. 15p.
Publication Year :
2021

Abstract

Due to its noninvasive character, optical coherence tomography (OCT) has become a popular diagnostic method in clinical settings. However, the low-coherence interferometric imaging procedure is inevitably contaminated by heavy speckle noise, which impairs both visual quality and the diagnosis of various ocular diseases. Although deep learning has been applied to image denoising and has achieved promising results, the lack of well-registered clean and noisy image pairs makes it impractical for supervised learning-based approaches to achieve satisfactory OCT image denoising results. In this paper, we propose an unsupervised OCT image speckle reduction algorithm that does not rely on well-registered image pairs. Specifically, by employing the ideas of disentangled representation and generative adversarial networks, the proposed method first disentangles the noisy image into content and noise spaces via corresponding encoders. Then, the generator is used to predict the denoised OCT image from the extracted content features. In addition, noise patches cropped from the noisy image are utilized to facilitate more accurate disentanglement. Extensive experiments have been conducted, and the results suggest that our proposed method is superior to the classic methods and demonstrates competitive performance with several recently proposed learning-based approaches in both quantitative and qualitative aspects. Code is available at: https://github.com/tsmotlp/DRGAN-OCT.
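To make the disentanglement idea concrete, the following is a minimal, non-learned sketch of the pipeline the abstract describes: a content encoder, a noise encoder, and a generator that reconstructs a denoised image from content features alone. All components here are hypothetical stand-ins (a local-mean filter for the content encoder, a residual for the noise encoder, identity for the generator) chosen only to illustrate the data flow under a multiplicative speckle model; the paper's actual method (DRGAN) uses learned CNN encoders and adversarial training, for which see the linked repository.

```python
import numpy as np

def content_encoder(noisy, kernel=5):
    """Stand-in for a learned content encoder: a local mean filter
    that keeps slowly varying anatomy and suppresses speckle."""
    pad = kernel // 2
    padded = np.pad(noisy, pad, mode="edge")
    out = np.zeros_like(noisy)
    for i in range(noisy.shape[0]):
        for j in range(noisy.shape[1]):
            out[i, j] = padded[i:i + kernel, j:j + kernel].mean()
    return out

def noise_encoder(noisy, content):
    """Stand-in for a learned noise encoder: the residual left
    after the content component is removed."""
    return noisy - content

def generator(content):
    """Stand-in for the learned generator that maps content
    features back to a clean-looking image (identity here)."""
    return content

# Synthetic example: a smooth "anatomy" image corrupted by
# multiplicative gamma-distributed speckle (unit mean).
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.1, 1.0, 64), (64, 1))
speckle = rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
noisy = clean * speckle

content = content_encoder(noisy)
noise = noise_encoder(noisy, content)
denoised = generator(content)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
```

Even this crude decomposition reduces the error against the clean reference, which is the intuition the learned version exploits: if content and noise can be separated, the generator only ever sees the content space and never has to model speckle explicitly.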

Details

Language :
English
ISSN :
0278-0062
Volume :
40
Issue :
10
Database :
Academic Search Index
Journal :
IEEE Transactions on Medical Imaging
Publication Type :
Academic Journal
Accession number :
153710547
Full Text :
https://doi.org/10.1109/TMI.2020.3045207