StyleBrush: Style Extraction and Transfer from a Single Image

Authors :
Feng, Wancheng
Feng, Wanquan
Huang, Dawei
Pei, Jiaming
Cheng, Guangliang
Wang, Lukun
Publication Year :
2024

Abstract

Stylization for visual content aims to add specific style patterns at the pixel level while preserving the original structural features. Compared with using predefined styles, stylization guided by reference style images is more challenging, where the main difficulty is to effectively separate style from structural elements. In this paper, we propose StyleBrush, a method that accurately captures styles from a reference image and "brushes" the extracted style onto other input visual content. Specifically, our architecture consists of two branches: ReferenceNet, which extracts style from the reference image, and Structure Guider, which extracts structural features from the input image, thus enabling image-guided stylization. We utilize LLMs and T2I models to create a dataset comprising 100K high-quality style images, encompassing a diverse range of styles and contents with high aesthetic scores. To construct training pairs, we crop different regions of the same training image. Experiments show that our approach achieves state-of-the-art results through both qualitative and quantitative analyses. We will release our code and dataset upon acceptance of the paper.
Comment: 9 pages, 6 figures, Under Review
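
The abstract states that training pairs are built by cropping different regions of the same training image, so that the two crops share a style but differ in structure. Below is a minimal sketch of that idea, assuming random square crops; the function and parameter names (make_training_pair, crop_size) are illustrative and not taken from the paper.

    import random
    from PIL import Image

    def make_training_pair(image_path, crop_size=512, seed=None):
        """Sketch of the paired-crop construction described in the abstract:
        two regions of the same image share the same style, so one crop can
        act as the style reference and the other as the structure/content
        input. Names here are hypothetical, not from the released code."""
        rng = random.Random(seed)
        img = Image.open(image_path).convert("RGB")
        w, h = img.size

        def random_crop():
            x = rng.randint(0, max(0, w - crop_size))
            y = rng.randint(0, max(0, h - crop_size))
            return img.crop((x, y, x + crop_size, y + crop_size))

        style_ref = random_crop()    # fed to the style branch (ReferenceNet)
        content_in = random_crop()   # fed to the structure branch (Structure Guider)
        return style_ref, content_in

Under this construction, a model trained to reproduce one crop's style on the other crop's structure never sees the style and structure come from the same pixels, which is consistent with the paper's goal of separating the two.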

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2408.09496
Document Type :
Working Paper