Texture Generation with Neural Cellular Automata

Alexander Mordvintsev*  Eyvind Niklasson*  Ettore Randazzo
Google Research
{moralex, eyvind, [email protected]

arXiv:2105.07299v1 [cs.AI] 15 May 2021

Abstract

Neural Cellular Automata (NCA¹) have shown a remarkable ability to learn the required rules to "grow" images [20], classify morphologies [26] and segment images [28], as well as to do general computation such as path-finding [10]. We believe the inductive prior they introduce lends itself to the generation of textures. Textures in the natural world are often generated by variants of locally interacting reaction-diffusion systems. Human-made textures are likewise often generated in a local manner (textile weaving, for instance) or using rules with local dependencies (regular grids or geometric patterns). We demonstrate learning a texture generator from a single template image, with the generation method being embarrassingly parallel, exhibiting quick convergence and high fidelity of output, and requiring only some minimal assumptions around the underlying state manifold. Furthermore, we investigate properties of the learned models that are both useful and interesting, such as non-stationary dynamics and an inherent robustness to damage. Finally, we make qualitative claims that the behaviour exhibited by the NCA model is a learned, distributed, local algorithm to generate a texture, setting our method apart from existing work on texture generation. We discuss the advantages of such a paradigm.

* Contributed equally.
¹ We use NCA to mean both Neural Cellular Automata and Neural Cellular Automaton in this work.

1. Introduction

Texture synthesis is an actively studied problem of computer graphics and image processing. Most of the work in this area is focused on creating new images of a texture specified by the provided image pattern [8, 17]. These images should give the impression, to a human observer, that they are generated by the same stochastic process that generated the provided sample. An alternative formulation of the texture synthesis problem is searching for a stochastic process that allows efficient sampling from the texture image distribution defined by the input image. With the advent of deep neural networks, feed-forward convolutional generators have been proposed that transform latent vectors of random i.i.d. values into texture image samples [35].

Many texture patterns observed in nature result from local interactions between tiny particles, cells or molecules, which lead to the formation of larger structures. This distributed process of pattern formation is often referred to as self-organisation. Typical computational models of such systems are systems of PDEs [34, 4], cellular automata, and multi-agent or particle systems.

In this work, we use the recently proposed Neural Cellular Automata (NCA) [20, 26, 28] as a biologically plausible model of distributed texture pattern formation. The image generation process is modelled as an asynchronous, recurrent computation, performed by a population of locally-communicating cells arranged in a regular 2D grid. All cells share the same differentiable update rule. We use backpropagation through time and a grid-wide differentiable objective function to train the update rule, which is able to synthesise a pattern similar to a provided example.

The proposed approach achieves a very organic-looking dynamic of progressive texture synthesis through local communication and allows great flexibility in post-training adaptation. The decentralized, homogeneous nature of the computation performed by the learned synthesis algorithm potentially allows the embedding of their implementations in future media, such as smart fabric displays or electronic decorative tiles.

2. Neural CA image generator

We base our image generator on the Neural Cellular Automata model [20]. Here we summarize the key elements of the model and place them in the context of PDEs, cellular automata, and neural networks to highlight different features of the model.

2.1. Pattern-generating PDE systems

Systems of partial differential equations (PDEs) have been used to model natural pattern formation processes for a long time. Well-known examples include the seminal work by Turing [34], or the Gray-Scott reaction-diffusion patterns [22]. It seems quite natural to use PDEs for texture synthesis. Specifically, given a texture image sample, we are looking for a function f that defines the evolution of a vector function s(x, t), defined on a two-dimensional manifold x:

    ∂s/∂t = f(s, ∇_x s, ∇²_x s)

where s represents a k-dimensional vector, whose first three components correspond to the visible RGB color channels: s = (s^0 = R, s^1 = G, s^2 = B, s^3, ..., s^{k-1}). The RGB channels should form the texture, similar to the provided example. ∇_x s denotes a matrix of per-component gradients over x, and ∇²_x s is a vector of Laplacians². The evolution of the system starts with some initial state s_0 and is guided by a space-time uniform rule f. We don't imply the existence of a static final state of the pattern evolution, but just want the system to produce an input-like texture as early as possible, and perpetually maintain this similarity.

² We added the Laplacian kernel to make the system general enough to reproduce the Gray-Scott reaction-diffusion system.
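Footnote 2 points at the Gray-Scott system as a reference pattern-forming PDE. As a concrete illustration of the formulation above (a minimal sketch, not part of the paper's method), the following Python/NumPy code integrates Gray-Scott with explicit Euler steps on a toroidal grid. It reuses the 9-point Laplacian stencil introduced in section 2.2, divided by 4 for numerical stability, and the diffusion, feed and kill parameters are generic illustrative values.

import numpy as np

def lap9(a):
    # Normalised 9-point Laplacian (the K_lap stencil of section 2.2, divided by 4),
    # evaluated with wrap-around (torus) boundary conditions.
    edge = (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1))
    diag = (np.roll(np.roll(a, 1, 0), 1, 1) + np.roll(np.roll(a, 1, 0), -1, 1) +
            np.roll(np.roll(a, -1, 0), 1, 1) + np.roll(np.roll(a, -1, 0), -1, 1))
    return (2.0 * edge + diag - 12.0 * a) / 4.0

def gray_scott_step(u, v, Du=0.2, Dv=0.1, F=0.040, k=0.060, dt=1.0):
    # One explicit Euler step of du/dt = Du*lap(u) - u*v^2 + F*(1 - u)
    #                         and dv/dt = Dv*lap(v) + u*v^2 - (F + k)*v.
    uvv = u * v * v
    du = Du * lap9(u) - uvv + F * (1.0 - u)
    dv = Dv * lap9(v) + uvv - (F + k) * v
    return u + dt * du, v + dt * dv

# Seed the uniform steady state (u = 1, v = 0) with a small perturbed patch.
h = w = 128
u, v = np.ones((h, w)), np.zeros((h, w))
u[60:68, 60:68] = 0.5
v[60:68, 60:68] = 0.25
for _ in range(5000):
    u, v = gray_scott_step(u, v)
# v now holds a self-organised spot/stripe pattern (which one depends on F and k).

The point of the example is the shape of the computation rather than the specific chemistry: a local rule, applied uniformly in space and time, grows global structure from a nearly uniform state, which is the kind of behaviour the NCA is trained to reproduce for an arbitrary target texture.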
2.2. From PDEs to Cellular Automata

In order to evaluate the behaviour of a PDE system on a digital computer, one must discretize the spatio-temporal domain, provide discrete versions of the gradient and Laplacian operators, and specify the integration algorithm. During training we use a uniform Cartesian raster 2D grid with torus topology (i.e. wrap-around boundary conditions). Note that the system now closely mirrors a cellular automaton: there is a uniform raster grid, with each point undergoing time evolution dependent only on the neighbouring cells. The evolution of the CA state s_t(x, y), where x and y are integer cell coordinates, is now given by

    p_t = concat(s_t, K_x * s_t, K_y * s_t, K_lap * s_t)
    s_{t+1} = s_t + f(p_t) δ_{x,y,t}

Discrete approximations of the gradient and Laplacian operators are provided by linear convolutions with a set of 3x3 kernels K_x, K_y and K_lap. We use Sobel filters [31] and a 9-point variant of the discrete Laplacian:

    K_x   = [ -1  0  1 ;  -2   0  2 ;  -1  0  1 ]
    K_y   = [ -1 -2 -1 ;   0   0  0 ;   1  2  1 ]
    K_lap = [  1  2  1 ;   2 -12  2 ;   1  2  1 ]

We call p the perception vector, as it gathers information about the neighborhood of each cell through the convolution kernels. The function f is the per-cell learned update rule that we obtain through the optimisation process described later. The separation between perception and update rules allows us to transfer learned rules to different grid structures and topologies, as long as the gradient and Laplacian operators are provided (see section 4.4).

Stochastic updates. The cell update rate is denoted by δ_{x,y,t}. In the case of a uniform update rate (δ_{x,y,t} = c), the above rule can be interpreted as a step of the explicit Euler integration method. If all cells are updated synchronously, the initial conditions s_0 have to vary from cell to cell in order to break the symmetry. This can be achieved by initializing the grid with random noise. A physical implementation of the synchronous model would also require the existence of a global clock, shared by all cells. In the spirit of self-organisation, we decouple the cell updates. Following [20], we emulate asynchronous cell updates by independently sampling δ_{x,y,t} from {0, 1} for each cell at each step, with Pr(δ_{x,y,t} = 1) = 0.5. Asynchronous updates allow the CA to break the symmetry even for a uniform initial state s_0.

2.3. From CA to Neural Networks

The last component that we have to define is the update function. We use f(p) = relu(p W_0 + b_0) W_1 + b_1, where p is the perception vector and W_0, W_1, b_0, b_1 are the learned parameters. If we look at the resulting system from the differentiable programming perspective, we can see that the whole CA image generator can be represented by a recurrent convolutional neural network (Fig. 1) that can be built from standard components available in modern deep learning frameworks. Readers can interact with learned NCA models in real time; we refer them to the supplemental materials and the code release.

Parameters. The cell-state vector size (including the visible RGB channels) is 12, s_t ∈ R^12. The perception vector size is 4 × 12 = 48, p ∈ R^48. The hidden layer size is 96. Thus, matrices W_0 and W_1 have dimensions 48x96 and 96x12, and the total number of CA parameters is 5868.

Figure 1. Texture NCA model architecture.
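To make the discrete update concrete, here is a minimal NumPy sketch of a single NCA step; it is a simplified stand-in rather than the authors' reference implementation, and the random weight initialisation is purely illustrative. It filters the state per channel with K_x, K_y and K_lap on a torus, applies the two-layer per-cell rule f, and gates the result with the stochastic mask δ_{x,y,t}. With 12 state channels, a 48-dimensional perception vector and 96 hidden units, the parameter count is 48·96 + 96 + 96·12 + 12 = 5868, matching the Parameters paragraph above.

import numpy as np

# 3x3 filters from section 2.2: Sobel x/y and the 9-point Laplacian.
K_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], np.float32)
K_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], np.float32)
K_LAP = np.array([[1, 2, 1], [2, -12, 2], [1, 2, 1]], np.float32)

def filter3x3_wrap(s, k):
    # Per-channel 3x3 filtering of state s (H, W, C) with wrap-around boundaries.
    out = np.zeros_like(s)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += k[di + 1, dj + 1] * np.roll(s, (-di, -dj), (0, 1))
    return out

def perceive(s):
    # Perception vector p = concat(s, K_x*s, K_y*s, K_lap*s): shape (H, W, 4C).
    return np.concatenate([s, filter3x3_wrap(s, K_X),
                           filter3x3_wrap(s, K_Y), filter3x3_wrap(s, K_LAP)], -1)

def nca_step(s, W0, b0, W1, b1, rng, update_prob=0.5):
    # One asynchronous NCA step: s_{t+1} = s_t + f(p_t) * delta_{x,y,t}.
    p = perceive(s)                                        # (H, W, 48) for C = 12
    h = np.maximum(p @ W0 + b0, 0.0)                       # relu(p W0 + b0)
    ds = h @ W1 + b1                                       # per-cell residual update
    delta = rng.random(s.shape[:2] + (1,)) < update_prob   # Pr(delta = 1) = 0.5
    return s + ds * delta

# Sizes from the Parameters paragraph: 12 state channels, 48 -> 96 -> 12 update rule.
C, HIDDEN = 12, 96
rng = np.random.default_rng(0)
W0 = rng.normal(0.0, 0.1, (4 * C, HIDDEN)); b0 = np.zeros(HIDDEN)
W1 = np.zeros((HIDDEN, C)); b1 = np.zeros(C)   # zero output layer: early updates are no-ops
print(W0.size + b0.size + W1.size + b1.size)   # 5868 parameters
s = np.zeros((128, 128, C))                    # uniform initial state on a 128x128 grid
for _ in range(100):
    s = nca_step(s, W0, b0, W1, b1, rng)
rgb = np.clip(s[..., :3], 0.0, 1.0)            # first three channels are shown as RGB

Applying W_0 and W_1 independently at every cell is equivalent to a pair of 1x1 convolutions, which is how the rule is typically expressed inside a deep-learning framework (Fig. 1).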
3. Training the Neural CA

3.1. Objective

In order to train an NCA we need to define differentiable objective (loss) functions that measure the current performance of the system and provide a useful gradient to improve it. We experiment with two objectives: a VGG-based texture synthesis loss [11] and an Inception-based feature visualisation loss [23]. Hereinafter, we refer to these as "texture-loss" and "inception-loss", respectively. These losses are applied to snapshots of the CA grid state s, and are only affected by the first three values of the state vectors, which are treated as RGB color channels.

Texture Loss. Style transfer is an extensively studied application of deep neural networks. L. Gatys et al. [11] introduced the approach common to almost all work since: recording and matching neuronal activations in certain layers of an "observer" network, a network trained to complete a different task entirely, whose internal representations serve as a perceptual description of the image.
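The texture-loss itself is not restated here, but the Gatys-style objective the paragraph refers to is usually implemented by matching Gram matrices of observer-network activations between the generated image and the template. The sketch below shows that computation on plain arrays; the choice of layers, the feature extractor and the equal weighting are placeholders rather than the authors' exact configuration, and during training the same computation would run inside a differentiable framework so gradients flow back through the observer network and the unrolled CA steps.

import numpy as np

def gram(feats):
    # Gram matrix of an (H, W, C) feature map: channel-by-channel co-activation
    # statistics, normalised by the number of spatial positions.
    h, w, c = feats.shape
    f = feats.reshape(h * w, c)
    return f.T @ f / (h * w)

def texture_loss(gen_feats, tmpl_feats):
    # Sum of mean squared Gram-matrix differences over the chosen observer layers.
    # gen_feats / tmpl_feats: lists of (H, W, C) activation maps, one per layer,
    # recorded from the generated image and from the texture template.
    return sum(np.mean((gram(g) - gram(t)) ** 2)
               for g, t in zip(gen_feats, tmpl_feats))

# Illustrative shapes only: pretend two observer layers were recorded.
rng = np.random.default_rng(0)
gen = [rng.random((64, 64, 64)), rng.random((32, 32, 128))]
tmpl = [rng.random((64, 64, 64)), rng.random((32, 32, 128))]
print(texture_loss(gen, tmpl))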