
Photographic style transfer.

Authors :
Wang, Li
Wang, Zhao
Yang, Xiaosong
Hu, Shi-Min
Zhang, Jianjun
Source :
Visual Computer; Feb 2020, Vol. 36 Issue 2, p317-331, 15p
Publication Year :
2020

Abstract

Image style transfer has attracted much attention in recent years. However, results produced by existing methods still exhibit many distortions. This paper examines CNN-based artistic style transfer and identifies two key sources of distortion: the loss of the content image's spatial structure during content preservation, and unexpected geometric matching introduced by the style transformation process. To tackle this problem, the paper proposes a novel approach consisting of a dual-stream deep convolutional network as the loss network and edge-preserving filters as the style fusion model. Our key contribution is an additional similarity loss function that constrains both the detail reconstruction and the style transfer procedures. Qualitative evaluation shows that our approach suppresses these distortions and produces faithful stylized results compared with state-of-the-art methods. [ABSTRACT FROM AUTHOR]
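For readers who want a concrete picture of how such a combined objective can be assembled, the sketch below shows a generic neural-style-transfer loss (a content term on deep features plus a Gram-matrix style term) extended with a structure-similarity term computed from Sobel edge maps of the content image and the stylized output. This is only a minimal illustration of the idea described in the abstract, not the authors' dual-stream loss network: the feature shapes, loss weights, and the Sobel-based similarity term are assumptions.

# Illustrative sketch only (PyTorch): a generic style-transfer objective with an
# extra "similarity" term that penalises edge-structure differences between the
# content image and the stylized output. The Sobel edge map and all weights are
# assumptions for illustration, not the paper's actual loss network.
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) feature map -> (B, C, C) Gram matrix (style statistics).
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def sobel_edges(img):
    # Cheap edge map used here as a stand-in for an edge-preserving structure cue.
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
    ky = kx.t()
    weight = torch.stack([kx, ky]).unsqueeze(1)   # (2, 1, 3, 3) conv kernels
    gray = img.mean(dim=1, keepdim=True)          # (B, 1, H, W) grayscale
    return F.conv2d(gray, weight, padding=1)      # (B, 2, H, W) x/y gradients

def total_loss(out_feats, content_feats, style_feats, out_img, content_img,
               w_content=1.0, w_style=1e3, w_sim=10.0):
    # Content term: match deep features of the output to those of the content image.
    l_content = sum(F.mse_loss(o, c) for o, c in zip(out_feats, content_feats))
    # Style term: match Gram matrices of the output to those of the style image.
    l_style = sum(F.mse_loss(gram_matrix(o), gram_matrix(s))
                  for o, s in zip(out_feats, style_feats))
    # Similarity term (assumed form): keep the output's edge structure close to
    # the content image's, one way to suppress spatial distortion.
    l_sim = F.l1_loss(sobel_edges(out_img), sobel_edges(content_img))
    return w_content * l_content + w_style * l_style + w_sim * l_sim

if __name__ == "__main__":
    # Toy check with random tensors standing in for images and CNN features.
    img = lambda: torch.rand(1, 3, 64, 64)
    feats = lambda: [torch.rand(1, 64, 32, 32), torch.rand(1, 128, 16, 16)]
    print(float(total_loss(feats(), feats(), feats(), img(), img())))

In a full pipeline the feature lists would come from a pretrained CNN (for example VGG) evaluated on the output, content, and style images, and the similarity weight would control how strongly the content image's spatial structure is preserved during stylization.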

Details

Language :
English
ISSN :
0178-2789
Volume :
36
Issue :
2
Database :
Complementary Index
Journal :
Visual Computer
Publication Type :
Academic Journal
Accession Number :
141531538
Full Text :
https://doi.org/10.1007/s00371-018-1609-4