
Deep Translation Prior: Test-time Training for Photorealistic Style Transfer

Authors :
Kim, Sunwoo
Kim, Soohyun
Kim, Seungryong
Publication Year :
2021

Abstract

Recent techniques to solve photorealistic style transfer within deep convolutional neural networks (CNNs) generally require intensive training on large-scale datasets, and thus have limited applicability and poor generalization to unseen images or styles. To overcome this, we propose a novel framework, dubbed Deep Translation Prior (DTP), that accomplishes photorealistic style transfer through test-time training on a given input image pair with untrained networks, learning an image-pair-specific translation prior and thus yielding better performance and generalization. Tailored to such test-time training for style transfer, we present a novel network architecture with two sub-modules, a correspondence module and a generation module, together with loss functions consisting of contrastive content, style, and cycle-consistency losses. Our framework does not require an offline training phase, which has been one of the main challenges of existing methods; instead, the networks are learned solely at test time. Experimental results show that our framework generalizes better to unseen image pairs and even outperforms state-of-the-art methods.

Comment: Accepted to AAAI 2022. Code is available at https://github.com/sunshower76/Deep_Trainslation_Prior
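To make the test-time training idea concrete, below is a minimal sketch, assuming PyTorch, of optimizing untrained correspondence and generation modules on a single (content, style) pair with a combined content/style/cycle objective. The module definitions, the simplified loss stand-ins, and all hyperparameters here are illustrative assumptions, not the authors' DTP implementation; see the linked repository for the actual code.

```python
# Hedged sketch of per-image-pair test-time optimization in the spirit of DTP.
# All modules, losses, and hyperparameters are illustrative placeholders.
import torch
import torch.nn as nn


class CorrespondenceModule(nn.Module):
    """Placeholder: aligns style information toward the content layout."""
    def __init__(self, ch=3):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(2 * ch, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, ch, 3, padding=1))

    def forward(self, content, style):
        return self.net(torch.cat([content, style], dim=1))


class GenerationModule(nn.Module):
    """Placeholder: synthesizes the stylized output from aligned features."""
    def __init__(self, ch=3):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(ch, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, ch, 3, padding=1), nn.Sigmoid())

    def forward(self, aligned):
        return self.net(aligned)


def test_time_style_transfer(content, style, steps=300, lr=1e-3):
    """Optimize untrained networks on a single (content, style) pair."""
    corr, gen = CorrespondenceModule(), GenerationModule()
    params = list(corr.parameters()) + list(gen.parameters())
    optim = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        aligned = corr(content, style)
        output = gen(aligned)
        # Simplified stand-ins for the paper's contrastive content, style,
        # and cycle-consistency objectives.
        content_loss = (output - content).abs().mean()
        style_loss = (output.mean(dim=(2, 3)) - style.mean(dim=(2, 3))).pow(2).mean()
        cycle_loss = (gen(corr(output, content)) - content).abs().mean()
        loss = content_loss + style_loss + cycle_loss
        optim.zero_grad()
        loss.backward()
        optim.step()
    return output.detach()


if __name__ == "__main__":
    c = torch.rand(1, 3, 128, 128)  # stand-in content image
    s = torch.rand(1, 3, 128, 128)  # stand-in style image
    result = test_time_style_transfer(c, s, steps=10)
    print(result.shape)
```

The key design point this sketch illustrates is that no offline training or external dataset is involved: the optimization loop fits the networks to the single input pair at inference time.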

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2112.06150
Document Type :
Working Paper