
AttCST: attention improves style transfer via contrastive learning.

Authors :
Zhou, Lei
Zhang, Tao
Source :
Journal of Electronic Imaging; May/Jun 2023, Vol. 32 Issue 3, p033018-1-033018-12, 12p
Publication Year :
2023

Abstract

Arbitrary style transfer has broad applicability: it aims to learn the painting styles of different artists with a single trained model and re-render everyday photos in those styles. Existing domain-to-domain style transfer methods based on generative adversarial networks achieve good image generation quality but lack flexibility across varied style transfer tasks. Image-to-image style transfer methods, which focus on fusing the content image with the style image, learn local detail texture very well but have shortcomings in preserving the content structure. To alleviate these two problems, we combine the advantages of both, using attention to improve style transfer via contrastive learning. Specifically, an attention mechanism fuses the multi-scale features of the content and style images, and the earlier shallow features are fully exploited to supplement the texture information lost in the deep features. A contrastive learning module then distinguishes the currently learned style image from other style images, strengthening the network's ability to learn styles. We also derive a local loss based on the transferred blocks to improve the visual quality of the generated images. Extensive qualitative and quantitative analyses demonstrate the effectiveness of our approach. [ABSTRACT FROM AUTHOR]
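
The abstract names three mechanisms: cross-attention fusion of content and style features, a contrastive module that separates the target style from other styles, and a patch-level local loss. As a reading aid only, here is a minimal PyTorch sketch of the first two ideas; every class, function, and hyperparameter name below (AttentionFusion, style_contrastive_loss, tau) is an illustrative assumption, not the authors' AttCST implementation.

    # Minimal sketch, assuming scaled dot-product cross-attention for fusion
    # and an InfoNCE-style contrastive style loss. Illustrative only.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentionFusion(nn.Module):
        """Fuse style features into content features with cross-attention
        (content supplies queries; style supplies keys and values)."""

        def __init__(self, channels: int):
            super().__init__()
            self.q = nn.Conv2d(channels, channels, 1)  # queries from content
            self.k = nn.Conv2d(channels, channels, 1)  # keys from style
            self.v = nn.Conv2d(channels, channels, 1)  # values from style

        def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
            b, c, h, w = content.shape
            q = self.q(content).flatten(2).transpose(1, 2)   # (B, HW, C)
            k = self.k(style).flatten(2)                     # (B, C, H'W')
            v = self.v(style).flatten(2).transpose(1, 2)     # (B, H'W', C)
            attn = torch.softmax(q @ k / c ** 0.5, dim=-1)   # (B, HW, H'W')
            fused = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
            return content + fused  # residual path helps keep content structure

    def style_contrastive_loss(anchor: torch.Tensor,
                               positive: torch.Tensor,
                               negatives: torch.Tensor,
                               tau: float = 0.1) -> torch.Tensor:
        """InfoNCE loss: `anchor` (B, D) embeds the stylized output,
        `positive` (B, D) embeds its target style image, and
        `negatives` (N, D) embed other style images."""
        anchor = F.normalize(anchor, dim=-1)
        positive = F.normalize(positive, dim=-1)
        negatives = F.normalize(negatives, dim=-1)
        pos = torch.exp((anchor * positive).sum(-1) / tau)          # (B,)
        neg = torch.exp(anchor @ negatives.t() / tau).sum(-1)       # (B,)
        return -torch.log(pos / (pos + neg)).mean()

    # Usage with random features, just to show the shapes involved:
    fuse = AttentionFusion(256)
    out = fuse(torch.randn(2, 256, 32, 32), torch.randn(2, 256, 32, 32))
    loss = style_contrastive_loss(torch.randn(2, 128), torch.randn(2, 128),
                                  torch.randn(8, 128))

The residual connection in the fusion step reflects the abstract's emphasis on preserving content structure, and the negative set in the loss reflects its idea of pushing the learned style away from all other styles; how AttCST actually wires multi-scale features and the local block loss is only described in the full paper.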

Details

Language :
English
ISSN :
1017-9909
Volume :
32
Issue :
3
Database :
Complementary Index
Journal :
Journal of Electronic Imaging
Publication Type :
Academic Journal
Accession number :
164660781
Full Text :
https://doi.org/10.1117/1.JEI.32.3.033018