1. Deep photographic style transfer guided by semantic correspondence
- Author
- Zhijiao Xiao, Xiaoyan Zhang, and Xiaole Zhang
- Subjects
Computer science, Image processing and computer vision, Image segmentation, Information retrieval, Software engineering, Computer Networks and Communications, Hardware and Architecture, Media Technology, Software
- Abstract
The objective of this paper is to develop an effective photographic style transfer method that preserves the semantic correspondence between the style and content images for both scenery and portrait images. A semantic-correspondence-guided photographic style transfer algorithm is developed, which ensures that the semantic structure of the content image remains unchanged while the color of the style image is migrated. The semantic correspondence is constructed at a large scale over regions obtained by image segmentation, and at a local scale over patches using nearest-neighbor field search in the deep feature domain. Based on this semantic correspondence, a matting optimization is applied to refine the style transfer result, ensuring semantic accuracy and transfer faithfulness. The proposed method is further extended to automatically retrieve style images from a database, making style transfer more user-friendly. Experimental results show that our method successfully performs style transfer while preserving semantic correspondence across a diverse range of scenes. A user study also shows that our method outperforms state-of-the-art photographic style transfer methods.
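As a rough illustration of the local-scale correspondence step described in the abstract, the sketch below builds a nearest-neighbor field between deep feature maps of a content image and a style image. It is a minimal reconstruction, not the authors' implementation: the VGG-19 backbone, the chosen layer, the cosine-similarity patch matching, and the function names (`extract_features`, `nearest_neighbor_field`) are all assumptions made here for illustration.

```python
# Hypothetical sketch of nearest-neighbor field (NNF) search between deep
# feature maps, one building block of the local-scale semantic correspondence.
# The backbone (torchvision VGG-19) and layer choice are assumptions.
import torch
import torch.nn.functional as F
from torchvision import models

def extract_features(image, layer_index=20):
    """Return a (C, H', W') feature map from a pretrained VGG-19
    (index 20 is roughly relu4_1; the actual layer used is an assumption)."""
    vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
    with torch.no_grad():
        x = image.unsqueeze(0)              # (1, 3, H, W), already normalized
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i == layer_index:
                return x.squeeze(0)

def nearest_neighbor_field(content_feat, style_feat, patch_size=3):
    """For every patch in the content feature map, return the index of the
    most similar patch in the style feature map (cosine similarity)."""
    def to_patches(feat):
        # Unfold into overlapping patches: (num_patches, C * p * p)
        patches = F.unfold(feat.unsqueeze(0), kernel_size=patch_size,
                           padding=patch_size // 2)
        return F.normalize(patches.squeeze(0).t(), dim=1)  # unit-norm rows

    c_patches = to_patches(content_feat)    # (Nc, D)
    s_patches = to_patches(style_feat)      # (Ns, D)
    similarity = c_patches @ s_patches.t()  # cosine similarity (rows are unit-norm)
    return similarity.argmax(dim=1)         # (Nc,) best-matching style patch indices
```

For realistically sized feature maps the dense similarity matrix grows quadratically in the number of patches, so a practical implementation would restrict matching to segmentation regions, in line with the large-scale correspondence described in the abstract, or use an approximate search.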
- Published
- 2019