Quantum-Inspired Identification of Complex Cellular Automata
Matthew Ho,1,2,* Andri Pradana,1,† Thomas J. Elliott,3,2,1,‡ Lock Yue Chew,1,2,§ and Mile Gu1,2,4,¶

1 School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore 637371, Singapore
2 Complexity Institute, Nanyang Technological University, Singapore 637335, Singapore
3 Department of Mathematics, Imperial College London, London SW7 2AZ, United Kingdom
4 Centre for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543, Singapore

(Dated: March 29, 2021)

Elementary cellular automata (ECA) present iconic examples of complex systems. Though described only by one-dimensional strings of binary cells evolving according to nearest-neighbour update rules, certain ECA rules manifest complex dynamics capable of universal computation. Yet, the classification of precisely which rules exhibit complex behaviour remains a significant challenge. Here we approach this question using tools from quantum stochastic modelling, where the quantum statistical memory (the memory required to model a stochastic process using a class of quantum machines) can be used to quantify the structure of a stochastic process. By viewing ECA rules as transformations of stochastic patterns, we ask: Does an ECA generate structure as quantified by the quantum statistical memory, and if so, how quickly? We illustrate how the growth of this measure over time correctly distinguishes simple ECA from complex counterparts. Moreover, it provides a more refined means for quantitatively identifying complex ECAs, providing a spectrum on which we can rank the complexity of ECA by the rate at which they generate structure.

We all have some intuition of complexity. When presented with a highly ordered periodic sequence of numbers, we can often spot a simple pattern.
Meanwhile, highly disordered processes, such as a particle undergoing Brownian motion, can be described using tractable mathematical models [1]. Other processes, such as stock markets, living systems, and universal computers, lack such simple descriptions and are thus considered complex; their dynamics often lie somewhere between order and disorder [2]. Yet, a quantitative criterion for identifying precisely what is complex remains challenging even in scenarios that appear deceptively simple.

Elementary cellular automata (ECA) provide an apt example of systems that may conceal vast complexities. Despite featuring dynamics involving only nearest-neighbour update rules on a one-dimensional chain of binary cells (see Fig. 1), they can exhibit remarkably rich behaviour. Wolfram's initial classification of their behaviour revealed both simple ECAs that were either highly ordered (e.g., periodic dynamics) or random (e.g., chaotic dynamics), as well as others complex enough to encode universal computers [3-5]. Yet, an identification and classification of which ECA rules are complex remains a challenge; Wolfram's initial classification was soon met with a myriad of alternative approaches [6-17]. While they agree in extremal cases, the classification of many borderline ECA lacks a common consensus.

FIG. 1. (a) The state of each cell in an ECA takes on a binary value, and is deterministically updated at each timestep according to the current states of it and its nearest neighbours. A rule number codifies this update by converting a binary representation of the updated states to decimal. Shown here is Rule 26, (00011010)₂. (b) The evolution of an ECA can be visualised via a two-dimensional grid of cells, where each row corresponds to the full state of the ECA at a particular time, and columns the evolution over time. Depicted here are examples of ECA for each of Wolfram's four classes.

arXiv:2103.14053v1 [quant-ph] 25 Mar 2021

In parallel to these developments, there has also been much interest in quantifying and understanding the structure within stochastic processes. In this context, the statistical complexity has emerged as a popular candidate [18, 19]. It asks "How much information does a model need to store about the past of a process for statistically-faithful future prediction?". By this measure, a completely-ordered process that generates a homogeneous sequence of 0s has no complexity, as there is only one past and thus nothing to record. A completely random sequence also has no complexity: all past observations lead to the same future statistics, and so a model gains nothing by tracking any past information. However, modelling general processes between these extremes, such as those that are highly non-Markovian, can require the tracking of immense amounts of data, and thus indicate high levels of complexity. These favourable properties, together with the clear operational interpretation, have motivated the wide use of the statistical complexity as a quantifier of structure in diverse settings [20-23].

Further developments have shown that even when modelling classical data, the most efficient models are quantum mechanical [24-31]. This has led to a quantum analogue of the statistical complexity, the quantum statistical memory, with distinct qualitative and quantitative behaviour [27, 29, 32-37] that demonstrates even better alignment with our intuitive notions of what is complex [32, 38], and a greater degree of robustness [39].
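To make the nearest-neighbour update of Fig. 1 concrete, the following is a minimal sketch (not the authors' code) of one ECA timestep. The function name and the choice of periodic boundary conditions are illustrative assumptions; the lookup itself is the standard rule-number encoding, in which the 3-bit neighbourhood indexes a bit of the rule's binary expansion.

```python
def eca_step(cells, rule):
    """Apply one timestep of an elementary cellular automaton.

    Each cell's next value is bit v of the rule number, where v is the
    3-bit neighbourhood (left, self, right) read as an integer.
    Periodic boundary conditions are assumed (an illustrative choice).
    """
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 26 = (00011010)_2: neighbourhoods 001, 011, 100 map to 1, all others to 0.
state = [0, 0, 0, 1, 0, 0, 0]
state = eca_step(state, 26)  # -> [0, 0, 1, 0, 1, 0, 0]
```

Iterating `eca_step` and stacking the successive states row by row reproduces the space-time grids of the kind shown in Fig. 1(b).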
Here we ask: Can these developments offer a new way to identify and classify complex ECAs? At each timestep $t$ of the ECA's evolution, we interpret the ECA state as a stochastic string, for which we can assign a quantum statistical memory cost $C_q^{(t)}$. We can then chart the evolution of this over time, and measure how the amount of information that must be tracked to statistically replicate the ECA state grows over time. Asserting that complex ECA dynamics should yield ever more complex strings, and thus enable continual growth in structure, we can then interpret the growth of $C_q^{(t)}$ as a measure of an ECA's complexity. Our results indicate that this interpretation has significant merit: ECA that were unanimously considered to be complex in prior studies exhibit continual growth in $C_q^{(t)}$, while those unanimously considered simple do not. Meanwhile, its application to more ambiguous cases offers a more refined view of their relative complexities, and indicates that certain seemingly-chaotic ECA may possess some structure.

Stochastic Processes and Structure. To formalise our approach, we first need to introduce some background on stochastic processes and measures of structure. A bi-infinite, discrete-time stochastic process is described by an infinite series of random variables $Y_i$, where the index $i$ denotes the timestep. A consecutive sequence is denoted by $Y_{l:m} := Y_l Y_{l+1} \ldots Y_{m-1}$, such that if we take 0 to be the present timestep, we can delineate a past $\overleftarrow{Y} := \lim_{L\to\infty} Y_{-L:0}$ and a future $\overrightarrow{Y} := \lim_{L\to\infty} Y_{0:L}$. Associated with this is a set of corresponding variates $y_i$, drawn from a distribution $P(\overleftarrow{Y}, \overrightarrow{Y})$. A stationary process is one that is translationally invariant, such that $P(Y_{0:\tau}) = P(Y_{k:\tau+k}) \; \forall \, \tau, k \in \mathbb{Z}$.

To quantify structure within such stochastic processes, we make use of computational mechanics [18, 19], a branch of complexity science. Consider a model of a stochastic process that uses information from the past to produce statistically-faithful future outputs. That is, for any given past $\overleftarrow{y}$, the model must produce a future $\overrightarrow{y}$, one step at a time, according to the statistics of $P(\overrightarrow{Y}|\overleftarrow{y})$. Since storing the entire past is untenable, operationally this requires a systematic means of encoding each $\overleftarrow{y}$ into a corresponding memory state $S_{\overleftarrow{y}}$.

These memory states are determined by means of an equivalence relation $\overleftarrow{y} \sim \overleftarrow{y}' \iff P(\overrightarrow{Y}|\overleftarrow{y}) = P(\overrightarrow{Y}|\overleftarrow{y}')$, equating different pasts iff they have coinciding future statistics. This partitions the set of all pasts into a collection of equivalence classes $\mathcal{S}$, called causal states. An $\varepsilon$-machine then operates with memory states in one-to-one correspondence with these equivalence classes, with an encoding function $\varepsilon$ that maps each past to a corresponding causal state $S_j = \varepsilon(\overleftarrow{y})$. The resulting memory cost is given by the Shannon entropy over the stationary distribution of causal states:

$$ C_\mu = H[P(S_j)] = -\sum_j P(S_j) \log_2 P(S_j), \qquad (1) $$

where $P(S_j) = \sum_{\overleftarrow{y} \,|\, \varepsilon(\overleftarrow{y}) = S_j} P(\overleftarrow{y})$.

When the model is quantum mechanical, further reduction of the memory cost is possible, by mapping causal states to non-orthogonal quantum memory states $S_j \to |\sigma_j\rangle$. These quantum memory states are defined implicitly through an evolution operator, such that the action of the model at each timestep is to produce output $y$ with probability $P(y|S_j)$, and update the memory state [28, 30]. The details of how such models are constructed, and how the memory states are determined, can be found in the Supplementary Material. The memory cost of such quantum models is referred to as the quantum statistical memory [40], given by

$$ C_q = -\mathrm{Tr}(\rho \log_2 \rho), \qquad (2) $$

where $\rho = \sum_j P(S_j) |\sigma_j\rangle\langle\sigma_j|$ is the steady state of the quantum model's memory. The non-orthogonality of the quantum memory states ensures that in general $C_q < C_\mu$ [24], signifying that the minimal past information needed for generating future statistics, if all methods of information processing are allowed, is generally lower than $C_\mu$. This motivates $C_q$ as an alternative means of quantifying structure [24, 32, 33, 41], where it has been shown to exhibit stronger agreement with intuitions of complexity [32, 38]. In addition to this conceptual relevance, the continuity of the von Neumann entropy makes $C_q$ much more well-behaved compared to $C_\mu$: a small perturbation in the underlying stochastic process leads to only a small perturbation in $C_q$, which becomes particularly relevant when inferring complexity from a finite sample of a process [39].
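Equations (1) and (2) can be illustrated with a small numerical sketch. The two-state distribution and the overlap between the quantum memory states below are illustrative assumptions chosen for the example; in the actual models, both are fixed by the statistics of the process (see the Supplementary Material for the construction).

```python
import numpy as np

def shannon_entropy(p):
    """C_mu, Eq. (1): Shannon entropy of the causal-state distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """C_q, Eq. (2): von Neumann entropy of the steady-state memory rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Two causal states, occupied with equal probability (illustrative).
probs = [0.5, 0.5]

# Classical memory cost: C_mu = H[P(S_j)] = 1 bit here.
c_mu = shannon_entropy(probs)

# Quantum memory states |sigma_0>, |sigma_1> with overlap cos(theta);
# the overlap value is an assumption for this sketch.
theta = np.pi / 8
sigma0 = np.array([1.0, 0.0])
sigma1 = np.array([np.cos(theta), np.sin(theta)])

# rho = sum_j P(S_j) |sigma_j><sigma_j|
rho = sum(p * np.outer(s, s) for p, s in zip(probs, [sigma0, sigma1]))
c_q = von_neumann_entropy(rho)

assert c_q < c_mu  # non-orthogonal memory states reduce the memory cost
```

With equal weights, the eigenvalues of $\rho$ are $(1 \pm |\langle\sigma_0|\sigma_1\rangle|)/2$, so any non-zero overlap pushes $C_q$ strictly below $C_\mu$, in line with the inequality quoted above.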