Anomaly Detection for Medical Images Using Self-Supervised and Translation-Consistent Features
- Authors
- Zhao, He; Li, Yuexiang; He, Nanjun; Ma, Kai; Fang, Leyuan; Li, Huiqi; Zheng, Yefeng
- Subjects
- ANOMALY detection (Computer security); COMPUTER-assisted image analysis (Medicine); DIAGNOSTIC imaging; SUPERVISED learning; OPTICAL coherence tomography; GENERATIVE adversarial networks; DEEP learning
- Abstract
As labeled anomalous medical images are usually difficult to acquire, especially for rare diseases, deep-learning-based methods, which rely heavily on large amounts of labeled data, cannot yield satisfactory performance. Compared to anomalous data, normal images, which require no lesion annotation, are much easier to collect. In this paper, we propose an anomaly detection framework, namely SALAD, extracting Self-supervised and trAnsLation-consistent features for Anomaly Detection. The proposed SALAD is a reconstruction-based method that learns the manifold of normal data through an encode-and-reconstruct translation between image and latent spaces. In particular, two constraints (i.e., a structure similarity loss and a center constraint loss) are proposed to regulate the cross-space (i.e., image and feature) translation, enforcing the model to learn translation-consistent and representative features from the normal data. Furthermore, a self-supervised learning module is integrated into our framework to further boost anomaly detection accuracy by deeply exploiting useful information from the raw normal data. An anomaly score, as a measure to separate anomalous data from healthy data, is constructed from the learned self-supervised and translation-consistent features. Extensive experiments are conducted on optical coherence tomography (OCT) and chest X-ray datasets, and the experimental results demonstrate the effectiveness of our approach. [ABSTRACT FROM AUTHOR]
- Published
- 2021
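The abstract describes a reconstruction-based anomaly detector: learn the manifold of normal data only, then score test samples by how poorly they map back through the learned encode-and-reconstruct translation. The sketch below illustrates that general idea only; it is not the authors' SALAD model (which uses a GAN-based network with structure-similarity and center constraints). The linear PCA encoder/decoder, the function names, and the score definition here are all illustrative assumptions.

```python
import numpy as np

def fit_normal_model(X_normal, k=2):
    """Fit a linear encode/decode pair (truncated PCA) on *normal* data only.
    A toy stand-in for a learned encoder-decoder; SALAD itself learns a
    nonlinear translation between image and latent spaces."""
    mu = X_normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_normal - mu, full_matrices=False)
    W = Vt[:k].T  # basis of the k-dim subspace spanned by the normal data

    def encode(x):
        return (x - mu) @ W

    def decode(z):
        return z @ W.T + mu

    return encode, decode

def anomaly_score(x, encode, decode):
    """Reconstruction error as the anomaly score: samples off the
    normal-data manifold reconstruct poorly and score high."""
    return float(np.linalg.norm(x - decode(encode(x))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "normal" data lying on a 2-D plane inside R^6.
    X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 6))
    encode, decode = fit_normal_model(X, k=2)
    print(anomaly_score(X[0], encode, decode))        # near zero: on-manifold
    print(anomaly_score(X[0] + 5.0, encode, decode))  # larger: off-manifold
```

In practice one would set a detection threshold from the score distribution of held-out normal samples, mirroring how a reconstruction-based method separates anomalous from healthy data without any lesion annotations.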