Texture Synthesis

Presenter: Kaijian Chen
Course Advisor: Prof. Hui Huang

2020-05-25

Visual Computing 2020

Texture

• Definition of Texture - Images containing repeating patterns

Texture example

可视计算研究中心 (Visual Computing Research Center)

Texture acquisition

• Can we acquire texture in another way?
1. Hand-drawn texture - hard to make photo-realistic;
2. Scanned texture image
• Restricted to a fixed resolution;
• Non-uniform lighting, shadows, etc. lead to visible seams or repetition;

Not photo-realistic Visible seams and repetitions

Outline

• Example-based Texture Synthesis • MRF – pixel-based, patch-based, optimization-based • Auxiliary algorithm – Nearest Neighbor Search: K-coherence, PatchMatch

• Deep-based Texture Synthesis • Decoupling content and style • Texture synthesis with Generative Adversarial Network

• Procedural Texture Synthesis • Perlin noise & Turing patterns • Inverse Procedural Texture Synthesis

Example-based Texture Synthesis

• Goal • Given a texture sample, synthesize a new texture that, when perceived by a human observer, appears to be generated by the same underlying process.

input output

Example-based Texture Synthesis

• Analysis - Markov random field (MRF) • model a texture as a realization of a local and stationary random process.

p(x_p | x_1, x_2, x_3, … , x_N)

Local

Non-stationary Stationary

Example-based Texture Synthesis

• More specifically • For each pixel in the generated texture, its spatial neighborhood is similar to at least one neighborhood in the input.

input output

Example-based | Pixel-based method

• Main idea - Given an input texture and output size, generate the target texture pixel-by-pixel in raster-scan order.

Input Z, Output X - treat the input image as a dataset of pixels with neighborhoods N; synthesize each pixel based on its neighborhood.
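The raster-scan idea above can be sketched in a few lines of Python. This is a toy version (exhaustive search over a tiny grayscale exemplar, causal window only), not the authors' accelerated tree-structured VQ implementation:

```python
import random

def best_match(inp, out, x, y, half=1):
    """Find the input pixel whose causal neighborhood best matches
    the already-synthesized neighborhood around (x, y) in the output."""
    h, w = len(inp), len(inp[0])
    best, best_err = (half, half), float("inf")
    for iy in range(half, h - half):
        for ix in range(half, w - half):
            err = 0
            for dy in range(-half, 1):            # causal (raster-scan) window
                for dx in range(-half, half + 1):
                    if dy == 0 and dx >= 0:
                        continue                  # not yet synthesized
                    oy, ox = y + dy, x + dx
                    if 0 <= oy < len(out) and 0 <= ox < len(out[0]):
                        err += (inp[iy + dy][ix + dx] - out[oy][ox]) ** 2
            if err < best_err:
                best_err, best = err, (iy, ix)
    return inp[best[0]][best[1]]

def synthesize(inp, out_h, out_w, seed=0):
    random.seed(seed)
    out = [[random.choice(random.choice(inp)) for _ in range(out_w)]
           for _ in range(out_h)]
    for y in range(out_h):                        # raster-scan order
        for x in range(out_w):
            out[y][x] = best_match(inp, out, x, y)
    return out

# Toy grayscale exemplar: vertical stripes
exemplar = [[(x % 2) * 255 for x in range(8)] for _ in range(8)]
result = synthesize(exemplar, 6, 6)
```

Every output value is copied from the exemplar, so the synthesized image stays inside the input's color set.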

[Wei L. Y., et al.: Fast texture synthesis using tree-structured vector quantization]

Example-based | Pixel-based method

• Method illustration

Figure: effect of neighborhood size; histogram matching.

Example-based | Pixel-based method

• An efficient way to represent large-scale structure • Gaussian pyramid - represents large-scale structures more compactly, as a few pixels at a lower-resolution pyramid level;

Gaussian pyramid
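A minimal sketch of building such a pyramid, using a 2x2 box average as a stand-in for the Gaussian blur-and-subsample step:

```python
def downsample(img):
    """Halve resolution by averaging 2x2 blocks (box filter stand-in
    for the Gaussian blur-and-subsample step)."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def gaussian_pyramid(img, levels):
    """Level 0 is the full-resolution image; each level halves it."""
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(downsample(pyr[-1]))
    return pyr

img = [[float(x + y) for x in range(8)] for y in range(8)]
pyr = gaussian_pyramid(img, 3)   # 8x8 -> 4x4 -> 2x2
```

Each coarse pixel summarizes a whole block of the finer level, which is exactly why a small neighborhood at a coarse level can constrain large-scale structure.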

Example-based | Pixel-based method

• Coarse-to-fine synthesis • Construct pyramids for Z & X; synthesize X from low resolution to high resolution; • Large-scale structure constraint - when searching for a match for pixel X, the neighborhood vector is constructed to include the O's, Q's, and Y, in scanline order.

Figure: coarsest-level generation (as before); Gaussian pyramids of Z & X; results with 1, 2, and 3 layers; the extended neighborhood.

Example-based | Pixel-based method

Results

Example-based | Pixel-based method

Texture replacement Extrapolation

Applications

Disadvantage - Slow synthesis

Example-based | Patch-based method

• Main idea – like the pixel-based method, but the unit of synthesis is a block. • Much faster: synthesizes all pixels in a block at once.

Input Output

[Alexei Efros, Bill Freeman: Image Quilting for Texture Synthesis & Transfer]

Example-based | Patch-based method

• Overlapping region - combine patches with a minimal-error boundary cut (dynamic programming);

Figure: neighboring blocks B1 and B2 constrained by their overlap; the minimal-error boundary cut through the overlap (input texture, min. error boundary).
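The minimal-error boundary cut is a small dynamic program over the overlap-error matrix. A sketch, assuming a per-pixel squared-difference error matrix between the two overlapping blocks has already been computed:

```python
def min_cut_path(err):
    """Dynamic programming: cheapest top-to-bottom vertical seam
    through an overlap-error matrix (rows x cols)."""
    rows, cols = len(err), len(err[0])
    cost = [err[0][:]]
    for y in range(1, rows):
        row = []
        for x in range(cols):
            lo = max(0, x - 1)
            hi = min(cols - 1, x + 1)
            row.append(err[y][x] + min(cost[y - 1][lo:hi + 1]))
        cost.append(row)
    # Backtrack from the cheapest bottom cell
    x = min(range(cols), key=lambda i: cost[-1][i])
    path = [x]
    for y in range(rows - 1, 0, -1):
        lo = max(0, path[-1] - 1)
        hi = min(cols - 1, path[-1] + 1)
        x = min(range(lo, hi + 1), key=lambda i: cost[y - 1][i])
        path.append(x)
    return path[::-1]   # seam column index per row

# Overlap error between two blocks; the seam should follow the zeros.
err = [[9, 0, 9],
       [9, 0, 9],
       [0, 9, 9]]
seam = min_cut_path(err)
```

Pixels left of the seam come from B1 and pixels right of it from B2, which hides the seam where the two blocks already agree.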

Example-based | Patch-based method

Generation process Texture Transfer Results

Example-based | Optimization-based method

• Main idea • Rather than generating greedily, the optimization-based method directly minimizes the error between each patch of the output X and its nearest neighbor in the input Z.

• Energy function

E(X) = Σ_p ‖X_p − Z_p‖²

• X_p – patch around pixel p in X • Z_p – nearest patch to X_p in Z

[Vivek Kwatra, et al.: Texture Optimization for Example-based Synthesis]

Example-based | Optimization-based method

• Iterative optimization - for i in range(n_iters):

• Expectation – fix the nearest patches {Z_p}; update the output X according to them;

• Maximization – fix the output patches {X_p}; find their nearest neighbors {Z_p} in the input;

Figure: source Z and target frame X during the expectation and maximization steps.
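The EM-style loop can be sketched as follows. This toy version uses exhaustive nearest-neighbor search and plain averaging in the E-step; the paper additionally uses robust weighting and coarse-to-fine scheduling:

```python
import random

def patches(img, size, step):
    """All (size x size) patches, flattened, with top-left coords."""
    ps = []
    for y in range(0, len(img) - size + 1, step):
        for x in range(0, len(img[0]) - size + 1, step):
            ps.append(((y, x),
                       [img[y + i][x + j] for i in range(size) for j in range(size)]))
    return ps

def texture_optimize(src, out, size=2, step=1, n_iters=5):
    src_patches = [p for _, p in patches(src, size, size)]
    for _ in range(n_iters):
        # M-step: for each output patch, find its nearest input patch
        nn = []
        for pos, p in patches(out, size, step):
            z = min(src_patches,
                    key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))
            nn.append((pos, z))
        # E-step: each pixel becomes the average of the overlapping
        # nearest-patch values covering it (the least-squares solution)
        acc = [[[0.0, 0] for _ in row] for row in out]
        for (y, x), z in nn:
            for i in range(size):
                for j in range(size):
                    c = acc[y + i][x + j]
                    c[0] += z[i * size + j]
                    c[1] += 1
        for y in range(len(out)):
            for x in range(len(out[0])):
                s, n = acc[y][x]
                if n:
                    out[y][x] = s / n
    return out

random.seed(0)
src = [[0, 255, 0, 255] for _ in range(4)]            # stripe exemplar
out = [[random.uniform(0, 255) for _ in range(4)] for _ in range(4)]
out = texture_optimize(src, out)
```

Because every exemplar patch here is the same stripe pattern, the loop reaches a fixed point where corner pixels exactly copy the exemplar.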

Example-based | Optimization-based method

• Flow-guided Texture Animation

Figure: input texture + input flow field = output sequence (warp only vs. with texture optimization).

Example-based | Optimization-based method

• Flow-guided Texture Animation • Texture similarity – shape, size, and orientation of texture elements similar to the source; • Flow consistency – perceived motion similar to the flow;

Flow Consistency

Texture Similarity

Source Texture Flowing Target

Example-based | Optimization-based method

• Flow-guided Texture Animation • E – find X by solving a linear system: minimize ‖X_p − Z_p‖² + λ‖X_p − W_p‖²;

Figure: blending source Z into target frame X (weight w).
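Because the E-step objective is quadratic in X, each pixel has a closed-form solution (standard least squares; z and w below denote the corresponding source-patch and warped-frame values at that pixel):

```latex
\min_x \; \|x - z\|^2 + \lambda \|x - w\|^2
\quad\Rightarrow\quad
x^{*} = \frac{z + \lambda w}{1 + \lambda}
```

With λ large the output follows the warped previous frame (flow consistency); with λ small it follows the exemplar patches (texture similarity).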

Warped Frame W

Example-based | Optimization-based method

• Flow-guided Texture Animation • M – find Z_p by nearest-neighbor search: minimize ‖X_p − Z_p‖² + λ‖X_p − W_p‖²;

Figure: source Z and target frame X (weight w).

Warped Frame W

Example-based | Optimization-based method

• Flow-guided Texture Animation • E – minimize ‖X_p − Z_p‖² + λ‖X_p − W_p‖² over X; • M – minimize the same objective over the nearest patches {Z_p}; • Initialize – X ← W; • Multiple: 1. resolution levels; 2. neighborhood sizes;

Example-based | Optimization-based method

Results

Example-based | Optimization-based method

Results comparison: input; Wei-Levoy [Wei'00]; Image Quilting [Efros'01]; Texture Optimization [Kwatra'05]

Example-based | Neighborhood search

• K-Coherence • Observation – pixels that appear together in the input tend to also appear together in the output. • Main idea – propagate the neighbors' correspondence vectors to the target pixel;

[X. Tong, et al.: Synthesis of bidirectional texture functions on arbitrary surfaces]

Example-based | Neighborhood search

• Main components of K-Coherence

• Nearest-neighbor search – search only among the K candidates in the k-coherence set;

• Analysis – precompute the K nearest neighbors (the candidate set) for each pixel z in the input;

Figure: the candidate set is built from each neighbor's nearest neighbors, shifted by the relative position (K = 2).
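A sketch of how the candidate set is assembled at synthesis time. The positions and k-NN sets below are hypothetical, purely to illustrate the shift-and-union step:

```python
def k_coherence_candidates(src_pos, out_y, out_x, k_sets):
    """Candidate source positions for output pixel (out_y, out_x):
    take each already-synthesized neighbor's source position, shift it
    by the relative offset, and add that pixel's precomputed k-NN set."""
    cands = set()
    for dy, dx in [(-1, -1), (-1, 0), (-1, 1), (0, -1)]:   # causal neighbors
        ny, nx = out_y + dy, out_x + dx
        if (ny, nx) in src_pos:
            sy, sx = src_pos[(ny, nx)]
            shifted = (sy - dy, sx - dx)          # undo the relative offset
            for c in k_sets.get(shifted, [shifted]):
                cands.add(c)
    return cands

# Hypothetical tiny example: neighbor (0,0) of output pixel (0,1) was
# copied from source (2,2); the shifted pixel's k-NN set (K=2) is given.
src_pos = {(0, 0): (2, 2)}
k_sets = {(2, 3): [(2, 3), (5, 5)]}
cands = k_coherence_candidates(src_pos, 0, 1, k_sets)
```

The full search then compares the target neighborhood only against these few candidates instead of the whole exemplar, which is what makes K-coherence fast.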

Example-based | Neighborhood search

• PatchMatch • Basic idea – augments the coherence (propagation) stage of earlier methods with a random-search step that can find good correspondences across the entire exemplar image.

• Optimize NNF – iteratively refine the Nearest Neighbor Field (NNF) until convergence.

Method overview

[Connelly Barnes, et al.: PatchMatch: A Randomized Correspondence Algorithm for Structural Image Editing]

Example-based | Neighborhood search

• Method of PatchMatch 1. Initialization – assign a random NN vector f(p) to each pixel in X;

Example-based | Neighborhood search

• Method of PatchMatch 2. Propagation – compare the loss of the current NN vector f(p) with the losses of the neighbors' vectors applied at p, and keep the argmin;

Figure: propagating NN vectors under the similarity criterion; the minimal-loss candidate wins.

Example-based | Neighborhood search

• Method of PatchMatch

3. Random search – improve f(p) by testing a sequence of candidate offsets at exponentially decreasing distances from f(p);
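Putting the three steps together, a compact (unoptimized) PatchMatch over tiny integer images might look like this; the patch size, iteration count, and alternating scan direction are implementation choices, not prescribed values:

```python
import random

def patch_dist(a, b, ay, ax, by, bx, size):
    """Sum of squared differences between two size x size patches."""
    return sum((a[ay + i][ax + j] - b[by + i][bx + j]) ** 2
               for i in range(size) for j in range(size))

def patchmatch(a, b, size=2, n_iters=4, seed=0):
    """Approximate nearest-neighbor field from patches of a to patches of b."""
    random.seed(seed)
    ah, aw = len(a) - size + 1, len(a[0]) - size + 1
    bh, bw = len(b) - size + 1, len(b[0]) - size + 1
    # 1. Initialization: random NN vector per patch
    nnf = [[(random.randrange(bh), random.randrange(bw)) for _ in range(aw)]
           for _ in range(ah)]
    for it in range(n_iters):
        ys = range(ah) if it % 2 == 0 else range(ah - 1, -1, -1)
        xs = range(aw) if it % 2 == 0 else range(aw - 1, -1, -1)
        d = 1 if it % 2 == 0 else -1
        for y in ys:
            for x in xs:
                best = nnf[y][x]
                best_d = patch_dist(a, b, y, x, best[0], best[1], size)
                # 2. Propagation: try the shifted matches of preceding neighbors
                for py, px in [(y - d, x), (y, x - d)]:
                    if 0 <= py < ah and 0 <= px < aw:
                        cy = min(bh - 1, max(0, nnf[py][px][0] + (y - py)))
                        cx = min(bw - 1, max(0, nnf[py][px][1] + (x - px)))
                        dd = patch_dist(a, b, y, x, cy, cx, size)
                        if dd < best_d:
                            best, best_d = (cy, cx), dd
                # 3. Random search: exponentially shrinking window around best
                r = max(bh, bw)
                while r >= 1:
                    cy = min(bh - 1, max(0, best[0] + random.randint(-r, r)))
                    cx = min(bw - 1, max(0, best[1] + random.randint(-r, r)))
                    dd = patch_dist(a, b, y, x, cy, cx, size)
                    if dd < best_d:
                        best, best_d = (cy, cx), dd
                    r //= 2
                nnf[y][x] = best
    return nnf

# a is an exact crop of b, so every patch of a has a zero-error match in b.
b = [[(3 * y + x) % 7 for x in range(6)] for y in range(6)]
a = [row[1:5] for row in b[1:5]]
nnf = patchmatch(a, b)
```

Propagation spreads a good match to its neighbors via the shifted offset, and random search prevents the field from getting stuck in a local basin; the combination converges in only a handful of sweeps.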

Example-based | Neighborhood search

Hole filling · Image retargeting · Image reshuffling

Applications of PatchMatch

Example-based | Interactive Texture Transfer

• Task – Interactive texture transfer • User control - the spatial distribution of stylized textures via semantic channels; • Inner structural guidance & propagation;

[Yifang Men, et al.: A Common Framework for Interactive Texture Transfer]

Example-based | Interactive Texture Transfer

Results

Example-based | Pro & Con

• Pro
✓ Generality;

• Con
✗ Consumes more memory;
✗ Difficult to edit (adding constraints);
✗ Resolution limit (optimization-based);

Deep-based Texture Synthesis

• Fitting ability – classification & regression

Figure: classification of dog and cat datasets with a CNN.

Deep-based | Neural Style Transfer

• What can be the content & style representation? • Content (deep feature) - when CNNs are trained on object recognition, they develop a representation of the image that makes object information increasingly explicit along the processing hierarchy.

Deep feature visualization

Deep features capture the content of the image rather than its detailed pixel values.

[Leon A. Gatys, et al.: A Neural Algorithm of Artistic Style]

Deep-based | Neural Style Transfer

• Experiment • Optimize a noise image so that the content representations of the input and target match.

Figure: the noise input and the target image are passed through a pretrained VGG; the loss is backpropagated (BP) to the noise input.

Content loss only

Deep-based | Neural Style Transfer

• What can be the content & style representation? • Style (Gram matrix) – the correlations (statistics) between the different filter responses over the spatial extent of the feature map. • Multi-scale (multiple layers)

Figure: feature maps F1…F4 (n_c = 4) of the input and the resulting Gram matrix.
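Computing the Gram matrix from a set of flattened feature maps is one inner product per entry. A minimal sketch with toy 2x2 feature maps (the normalization convention varies between implementations; here it is by the number of spatial positions):

```python
def gram_matrix(features):
    """features: list of n_c flattened feature maps (each of length h*w).
    Returns the n_c x n_c matrix of channel-wise correlations,
    normalized by the number of spatial positions."""
    n = len(features[0])
    return [[sum(fi * fj for fi, fj in zip(f1, f2)) / n for f2 in features]
            for f1 in features]

# Two toy channels over a 2x2 feature map
f1 = [1.0, 2.0, 3.0, 4.0]
f2 = [1.0, 0.0, 1.0, 0.0]
g = gram_matrix([f1, f2])
```

Because the spatial dimension is summed out, the Gram matrix discards the arrangement of features and keeps only their co-occurrence statistics, which is what makes it a style (texture) descriptor.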

Deep-based | Neural Style Transfer

• Experiment • Optimize a noise image so that the style representations of the input and target match.

Figure: the noise input and the target image are passed through a pretrained VGG; the loss is backpropagated (BP) to the noise input.

Style loss only

Deep-based | Neural Style Transfer

Results

Deep-based | Neural Style Transfer

• Overview • 1 CNN, 1 texture - train a feed-forward network mapping noise to one type of texture; • Dataset - patches from a single texture image; • Loss function - content and style loss;

[Dmitry Ulyanov, et al.: Texture Networks]

Deep-based | Neural Style Transfer

Texture synthesis (without content loss) · Texture transfer (with content loss)

Results

Deep-based | GAN

• Generator & Discriminator – an unsupervised way to model p_data(x) • Generator – maps noise to fake data resembling samples from the real data; • Discriminator – classifies real vs. fake data;

Figure: the discriminator D measures how close the generated distribution is to the real one.

[Ian J. Goodfellow, et al.: Generative Adversarial Networks]

Deep-based | GAN

• Iteratively train D & G • Train D to classify real & fake data; • Train G to fool the discriminator;
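The two objectives combine into the minimax value function from the GAN paper:

```latex
\min_G \max_D V(D, G) =
\mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)] +
\mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]
```

Training alternates gradient steps on D (maximizing V) and on G (minimizing V), which is the iterative scheme in the bullets above.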

Figure: alternately optimize D on real images D(x) and fakes D(G(z)) (D_Loss), and optimize G to fool the discriminator (G_Loss).

Deep-based | GAN

• Advantage – synthesizes sharper, more realistic unseen results;

Results synthesized from BigGAN, StyleGAN2

Deep-based | Periodic Spatial GAN

• Task • Learn texture manifolds of a single image or of texture datasets;

Input Patches from input

[Urs Bergmann, et al.: Learning Texture Manifolds with the Periodic Spatial GAN]

Deep-based | Periodic Spatial GAN

• Overview • 1 CNN, multiple textures - map latent codes to different textures; • Dataset - patches from the dataset; • GAN - but use a global code, a local code, and a periodic signal to control different aspects of the synthesized image; 1. Global code - d × 1 × 1, shared across spatial positions; 2. Local code - d × 1 × 1, sampled independently per position; 3. Periodic signal – predicted from the global code;

Deep-based | Periodic Spatial GAN

Training on a single image · Training on a texture dataset · Texture interpolation

Results

Deep-based | Results of Periodic Spatial GAN

Results

Deep-based | Non-stationary Texture Synthesis

• Task – synthesize non-stationary textures. • MRF-based methods cannot model spatially varying local statistics, so they fail to preserve a non-stationary texture's globally varying factors;

Figure: varying local statistics (orientation, scale, etc.); failure cases of SOTA MRF-based methods.

[Yang Zhou, et al.: Non-stationary texture synthesis with adversarial expansion]

Deep-based | Non-stationary Texture Synthesis

• Main idea • When human extracts the varying factor from micro patch, we can indicate the macro-patch from the corresponding micro-patch;

Deep-based | Non-stationary Texture Synthesis

• Method • 1 CNN, 1 texture – map the texture's micro-patches to macro-patches; • Dataset – patches from the texture; • Loss function

Deep-based | Non-stationary Texture Synthesis

Results

Deep-based | Non-stationary Texture Synthesis

Training Image

Input

Texture Transfer (one column per network)

Deep-based | SinGAN

• Task – capture the internal distribution of a single natural image;

Input Synthesized results

[Tamar Rott Shaham, et al.: SinGAN: Learning a Generative Model from a Single Natural Image]

Deep-based | SinGAN

• Overview • 1 CNN, single-image patterns – map noise maps to images; • Multi-scale synthesis – multiple G & D pairs, modeling patterns at different levels; • Loss function

Deep-based | SinGAN

Results

Deep-based | SinGAN

Applications

Deep-based | Pro & Con

• Pro
✓ Strong fitting ability
1. End-to-end regression;
2. Learns the desired representation rather than relying on heuristic design;

• Con
✗ Data-driven – synthesis diversity depends on the dataset's diversity;
✗ Unexplainable;
✗ Hard to train;

Procedural Texture Synthesis

• Mathematically describe a texture; synthesize the texture from a parameterized procedure.

F(x, y)

Figure: procedural texture synthesis producing a synthesized texture.

Procedural | Perlin noise

• Task - Synthesize natural-looking noise;

Figure: white noise vs. interpolated (Perlin) noise.

[Perlin: Perlin noise]

Procedural | Perlin noise

• Method • Randomly initialize gradient vectors V on the lattice points; • For each position inside a lattice cell, the value is a weighted sum of the dot products between each corner's V and the deviation vector P from that corner;

Figure: Perlin noise illustration — lattice gradient vector V and deviation vector P.
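A 1D sketch of the lattice-gradient construction (2D Perlin noise follows the same recipe with four corners per cell). The fade curve is Perlin's improved quintic:

```python
import math
import random

def perlin1d(x, grads):
    """1D Perlin-style gradient noise: dot products of lattice gradients
    with the deviation to x, blended by Perlin's fade curve."""
    x0 = int(math.floor(x))
    x1 = x0 + 1
    d0, d1 = x - x0, x - x1                      # deviation vectors P
    g0 = grads[x0 % len(grads)]                  # lattice gradients V
    g1 = grads[x1 % len(grads)]
    t = x - x0
    fade = t * t * t * (t * (t * 6 - 15) + 10)   # 6t^5 - 15t^4 + 10t^3
    return (1 - fade) * g0 * d0 + fade * g1 * d1

random.seed(0)
grads = [random.uniform(-1, 1) for _ in range(16)]
samples = [perlin1d(i / 8.0, grads) for i in range(33)]
```

Note that the noise is exactly zero at every lattice point (the deviation vanishes there), which is why the result looks smooth rather than blocky.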

Procedural | Perlin noise

Applications - simulating hand-drawn line shake, explosions, and terrain

Procedural | Turing Pattern

• How do the spots and stripes on animals form? • Reaction-diffusion - two or more chemicals diffuse through an embryo and react with each other until a stable pattern of chemical concentrations is reached;

[Turk: Generating Textures on Arbitrary Surfaces Using Reaction-Diffusion]

Procedural | Turing Pattern

• Reaction-diffusion model - Activator & Inhibitor • Diffusion – the activator and inhibitor diffuse according to local concentration;

Activator Inhibitor

Procedural | Turing Pattern

• Reaction-diffusion model - Activator & Inhibitor • Reaction – the activator promotes production of both itself and the inhibitor, while the inhibitor suppresses the activator;

Activator Inhibitor

Procedural | Turing Pattern

• RD Equation

∂a/∂t = F(a, b) + D_a ∇²a
∂b/∂t = G(a, b) + D_b ∇²b

• a & b - the chemical concentrations of the activator and inhibitor;
• F & G - the corresponding reaction functions;
• D_a & D_b - hyper-parameters controlling diffusion;
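A 1D sketch of Euler-integrating a reaction-diffusion system. Note this uses the Gray-Scott reaction functions as a stand-in for F and G (the cited paper uses Turing's original equations), but the diffuse-and-react structure is the same:

```python
def laplacian(u):
    """1D discrete Laplacian with wrap-around boundary."""
    n = len(u)
    return [u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n] for i in range(n)]

def react_diffuse(a, b, steps, Da=0.2, Db=0.1, f=0.04, k=0.06, dt=0.5):
    """Euler integration of the Gray-Scott reaction-diffusion system:
    da/dt = Da*lap(a) - a*b^2 + f*(1 - a)
    db/dt = Db*lap(b) + a*b^2 - (f + k)*b"""
    for _ in range(steps):
        la, lb = laplacian(a), laplacian(b)
        ab2 = [ai * bi * bi for ai, bi in zip(a, b)]   # reaction term
        a = [ai + dt * (Da * lai - r + f * (1 - ai))
             for ai, lai, r in zip(a, la, ab2)]
        b = [bi + dt * (Db * lbi + r - (f + k) * bi)
             for bi, lbi, r in zip(b, lb, ab2)]
    return a, b

# Uniform activator with a small inhibitor seed in the middle
a = [1.0] * 32
b = [0.0] * 32
b[16] = 0.5
a, b = react_diffuse(a, b, steps=40)
```

The seed spreads and depletes the activator around it; on a 2D grid with suitable parameters, the same loop produces the spot and stripe patterns shown on the slide.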

Results

Procedural | Pro & Con

• Pro
✓ Infinite detail and scale;
✓ No seams in surface texturing;

• Con
✗ Not general - complicated tuning for desired textures;
✗ Hard to describe complicated textures;

Figure: nodes for editing a procedural texture.

Procedural | Inverse Procedural Texture Synthesis

• Input - user-provided exemplar • Output - the corresponding procedural texture's parameters

[Yiwei Hu, et al.: Novel Framework For Inverse Procedural Texture Modeling]

Summary

• Example-based Texture Synthesis • MRF – pixel-based, patch-based, optimization-based • Auxiliary algorithm – Nearest Neighbor Search: K-coherence, PatchMatch

• Deep-based Texture Synthesis • Decoupling content and style • Texture synthesis with Generative Adversarial Network

• Procedural Texture Synthesis • Perlin noise & Turing patterns • Inverse Procedural Texture Synthesis

Thank You

Q/A?

- Visual Computing 2020 -