Photo Stylistic Brush: Robust Style Transfer via Superpixel-Based Bipartite Graph
Fig. 1. The flowchart of the SuperBIG algorithm.
Abstract
In this paper, we propose a photo stylistic brush, an automatic and robust style transfer approach based on a Superpixel-based BIpartite Graph (SuperBIG). A two-step bipartite graph algorithm operating at different granularity levels is employed to aggregate pixels into superpixels and to find their correspondences. In the first step, a bipartite graph built on extracted hierarchical features describes content similarity among pixels, and partitioning this graph produces superpixels. In the second step, the superpixels of the input and reference images form a new superpixel-level bipartite graph, and bipartite matching on this graph yields superpixel-level correspondences. Finally, the refined correspondences guide SuperBIG to perform the transformation in a decorrelated color space. Extensive experimental results demonstrate the effectiveness and robustness of the proposed method for transferring the styles of diverse exemplar images, even in challenging cases such as night images.
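The final step above, transferring color statistics in a decorrelated color space, follows the classic formulation of Reinhard et al. [1]: convert to the lαβ space, match per-channel mean and standard deviation, and convert back. The sketch below illustrates only this global version of the step; SuperBIG's superpixel correspondences (which would apply the same matching per matched region) are not modeled, and the function names are illustrative, not from the paper.

```python
import numpy as np

# RGB <-> LMS matrices from Reinhard et al. [1].
RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                    [0.1967, 0.7244, 0.0782],
                    [0.0241, 0.1288, 0.8444]])
LMS2RGB = np.array([[ 4.4679, -3.5873,  0.1193],
                    [-1.2186,  2.3809, -0.1624],
                    [ 0.0497, -0.2439,  1.2045]])
# log-LMS -> decorrelated l-alpha-beta, and its exact inverse [1].
LMS2LAB = np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)]) @ \
          np.array([[1, 1, 1], [1, 1, -2], [1, -1, 0]])
LAB2LMS = np.array([[1, 1, 1], [1, 1, -1], [1, -2, 0]]) @ \
          np.diag([np.sqrt(3) / 3, np.sqrt(6) / 6, np.sqrt(2) / 2])

def rgb_to_lab(img):
    """RGB image (H, W, 3) in (0, 1] -> decorrelated l-alpha-beta."""
    lms = np.clip(img.reshape(-1, 3) @ RGB2LMS.T, 1e-6, None)  # avoid log(0)
    return (np.log10(lms) @ LMS2LAB.T).reshape(img.shape)

def lab_to_rgb(lab):
    """Inverse of rgb_to_lab, with the result clipped to valid RGB."""
    lms = 10.0 ** (lab.reshape(-1, 3) @ LAB2LMS.T)
    return np.clip(lms @ LMS2RGB.T, 0.0, 1.0).reshape(lab.shape)

def transfer_stats(src, ref):
    """Match the per-channel mean/std of src to ref in l-alpha-beta space."""
    s, r = rgb_to_lab(src), rgb_to_lab(ref)
    s_mu, s_sd = s.mean((0, 1)), s.std((0, 1)) + 1e-8
    r_mu, r_sd = r.mean((0, 1)), r.std((0, 1))
    return lab_to_rgb((s - s_mu) / s_sd * r_sd + r_mu)
```

Because the channels of lαβ are largely decorrelated, matching each channel's statistics independently is sufficient for a global color-mood transfer; the region-wise matching of SuperBIG restricts the same operation to corresponding superpixels.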
Results
We conducted both qualitative and quantitative evaluations of our method, SuperBIG, comparing it against Reinhard et al. [1], Harmonization [2], Landmark [3], Image Morphing [4], NeutralArt [5], SuperMatch [6], and Data-driven [7].
Fig. 2. Visual comparisons of style transfer among different algorithms.
Fig. 3. Visual comparisons of SuperBIG style transfer for different reference images.
References
[1] Erik Reinhard, Michael Ashikhmin, Bruce Gooch, and Peter Shirley, “Color transfer between images,” IEEE Computer Graphics and Applications, vol. 21, no. 5, pp. 34–41, 2001.
[2] Kalyan Sunkavalli, Micah K. Johnson, Wojciech Matusik, and Hanspeter Pfister, “Multi-scale image harmonization,” ACM Transactions on Graphics, vol. 29, p. 125, 2010.
[3] Tzu-Wei Huang and Hwann-Tzong Chen, “Landmark-based sparse color representations for color transfer,” IEEE Int’l Conf. Computer Vision, 2009, pp. 199–204.
[4] YiChang Shih, Sylvain Paris, Connelly Barnes, William T. Freeman, and Frédo Durand, “Style transfer for headshot portraits,” ACM Transactions on Graphics, vol. 33, no. 4, pp. 1–14, 2014.
[5] Leon A. Gatys, Alexander S. Ecker, and Matthias Bethge, “Image style transfer using convolutional neural networks,” IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 2414–2423.
[6] Raj Kumar Gupta, Alex Yong-Sang Chia, Deepu Rajan, Ee Sin Ng, and Huang Zhiyong, “Image colorization using similar images,” ACM International Conference on Multimedia, 2012, pp. 369–378.
[7] YiChang Shih, Sylvain Paris, Frédo Durand, and William T. Freeman, “Data-driven hallucination of different times of day from a single outdoor photo,” ACM Transactions on Graphics, vol. 32, no. 6, pp. 2504–2507, 2013.