
Overview of image-to-image translation by use of deep neural networks: denoising, super-resolution, modality conversion, and reconstruction in medical imaging

Authors :
Kaji, Shizuo
Kida, Satoshi
Publication Year :
2019

Abstract

Since the advent of deep convolutional neural networks (DNNs), computer vision has seen extremely rapid progress that has led to huge advances in medical imaging. This article does not aim to cover all aspects of the field but focuses on a particular topic, image-to-image translation. Although the topic may not sound familiar, it turns out that many seemingly unrelated applications can be understood as instances of image-to-image translation. Such applications include (1) noise reduction, (2) super-resolution, (3) image synthesis, and (4) reconstruction. The same underlying principles and algorithms work for various tasks. Our aim is to introduce some of the key ideas on this topic from a unified point of view. We introduce the core ideas and jargon that are specific to image processing with DNNs. An intuitive grasp of these core ideas and a knowledge of the technical terms should greatly help the reader to understand existing and future applications. Most of the recent applications that build on image-to-image translation are based on one of two fundamental architectures, called pix2pix and CycleGAN, depending on whether the available training data are paired or unpaired. We provide computer codes that implement these two architectures with various enhancements; they are available online under the permissive MIT license. We also provide a hands-on tutorial for training a denoising model based on our codes. We hope that this article, together with the codes, will provide both an overview and the details of the key algorithms, and that it will serve as a basis for the development of new applications.

Comment :
Many typos are fixed; to appear in Radiological Physics and Technology.
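To make the paired (pix2pix-style) setting that the abstract contrasts with the unpaired (CycleGAN) setting more concrete, here is a minimal sketch of a single training step for conditional-GAN denoising. This is an illustrative toy example only, not the authors' released code: the Generator and Discriminator classes, the weight lambda_l1, and the random stand-in tensors for "noisy" and "clean" images are all hypothetical placeholders.

    # Minimal pix2pix-style paired training step (illustrative sketch, PyTorch).
    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        """Toy convolutional generator: noisy image -> denoised image."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 1, 3, padding=1),
            )
        def forward(self, x):
            return self.net(x)

    class Discriminator(nn.Module):
        """Toy PatchGAN-like discriminator scoring (input, output) pairs."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(32, 1, 4, stride=2, padding=1),
            )
        def forward(self, x, y):
            return self.net(torch.cat([x, y], dim=1))

    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()
    lambda_l1 = 100.0  # weight of the pixel-wise term, as in the original pix2pix paper

    # Stand-in paired batch: a "clean" target and its noise-corrupted input.
    clean = torch.randn(4, 1, 64, 64)
    noisy = clean + 0.1 * torch.randn_like(clean)

    # --- update discriminator: real pairs vs. generated pairs ---
    fake = G(noisy).detach()
    d_real = D(noisy, clean)
    d_fake = D(noisy, fake)
    loss_d = adv_loss(d_real, torch.ones_like(d_real)) + \
             adv_loss(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- update generator: fool D while staying close to the paired target ---
    fake = G(noisy)
    d_fake = D(noisy, fake)
    loss_g = adv_loss(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1_loss(fake, clean)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

The key point of the paired setting is the L1 term against a pixel-aligned ground truth; in the unpaired (CycleGAN) setting no such target exists, and a cycle-consistency loss between the two translation directions takes its place.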

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1905.08603
Document Type :
Working Paper