
Compressionism: A Theory of Mind Based on Data Compression

Phil Maguire ([email protected])
Oisín Mulhall ([email protected])
Department of Computer Science, National University of Ireland, Maynooth

Rebecca Maguire ([email protected])
School of Business, National College of Ireland, IFSC, Dublin 1, Ireland

Jessica Taylor ([email protected])
Department of Computer Science, Stanford University, California

Abstract

The theory of functionalism holds that mental states are constituted solely by the functional organisation of an implementing system. We propose that the specific mechanism supporting both intelligence and consciousness is data compression. Recent approaches to cognition and artificial intelligence, based on a branch of theoretical computer science known as algorithmic information theory (AIT), have proposed a computational view of induction, prediction, understanding and consciousness which is founded on this concept. Building on the existing literature, we propose the term 'compressionism' as a label to unite theories which recognise intelligent, cognisant systems as sophisticated data compressors. We show how data compression can shed light on information binding and offer a novel perspective on the hard problem of consciousness.

Keywords: Consciousness, data compression, artificial intelligence, integrated information, combination problem.

Introduction

Throughout history people have speculated whether the human mind might be viewed as a sophisticated automaton. The theory of functionalism holds that mental states are constituted solely by their functional role (e.g. Putnam, 1960). From this perspective it doesn't matter how a given system is implemented physically, whether it uses electronic circuits or mechanical gears. What matters when attributing intelligence and cognisance is solely the computational processes being implemented, a stance which opens the door for consciousness to be realised in many different forms.

While functionalism tells us that intelligence and consciousness may be implemented in a number of ways, it does not bring us any closer to understanding the specific form of information processing that gives rise to these phenomena. Here, we explore the idea that the key function of the brain which supports both intelligence and consciousness is data compression. In the following sections we elucidate the nature of data compression and bring together existing theories which have highlighted its connection with intelligent thought and subjective experience.

Data Compression

At first blush the concept 'data compression' might seem esoteric, a niche idea related to information storage in computer science. However, the concept runs much deeper. Data compression occurs when information is bound together through the identification of shared patterns. For example, the sequence 4, 6, 8, 12, 14, 18, 20, 24... can be simplified as the description "odd prime numbers + 1". The latter representation is shorter, hence we can say that it has been 'compressed'. The elegance and concision of this representation suggests that it is the true underlying pattern which governs the sequence. Accordingly, someone who manages to identify this pattern might claim to have understood the sequence, because they appreciate its relationship with the causal mechanism that produced it.

The above example highlights the close link between data compression and prediction. As per Occam's razor, concise models make fewer assumptions and are thus more likely to be correct. Levin's (1974) Coding Theorem demonstrates that, with high probability, the most likely model that explains a set of observations is the most compressed one. In addition, for any predictable sequence of data, the optimal prediction of the next item converges quickly with the prediction made by the model which has the simplest description (Solomonoff, 1964). All successful predictive systems, including machines and humans, are approximations of an ideal data compressor.

Because of its connection to prediction, data compression can be viewed as providing reliable proof of understanding. According to Chaitin (2006), "A useful theory is a compression of the data; compression is comprehension" (p. 77). The more compression is achieved, the greater the extent to which a system can be said to understand a set of data. Based on this principle, an enhanced version of the Turing test for machine intelligence has been established in which the challenge is to compress, to the greatest extent possible, 100 megabytes of textual information drawn from Wikipedia (see the Hutter Prize; Legg & Hutter, 2007). While people are very good at identifying patterns that link words and sentences together, computer programs struggle to identify the underlying meaning of the text and hence require longer descriptions to encode it.

Shannon (1951) carried out experiments with human participants to establish the entropy of English. He presented masked text and asked participants to guess the subsequent character, finding that, on average, each one required roughly two guesses (an entropy of 1 bit per character). For a computer program to match this rate of compression it would have to make predictions as accurately as humans, meaning it would need to identify the same deep patterns embedded in the text, which relate to real world structures and concepts. This form of deep understanding is what we refer to as appreciating the 'meaning' of the text.

Take a simple drawing as another example. Without background knowledge and prior embodied experience, a computer program cannot associate the drawing with previously experienced objects. To encode the image it must individually register every pixel on the screen, resulting in a large data file, even when compression algorithms are applied. However, a human can recognise an even more complex pattern: for example, a drawing of a house. The pixels are not coloured randomly. Instead there is a clear pattern that explains their ordering and relations, e.g. the presence of a roof, windows and a door. These regularities can be summarised concisely without having to spell them out pixel by pixel.

The greater the extent of the identified patterns, the shorter the resulting representation and the better the predictions it supports. Imagine that pixel (254, 56) is corrupted by noise. The uncompressed representation that stores each pixel individually cannot correct this error. However, if we can identify patterns in the data, we understand how it is connected. By appreciating this redundancy we can repair and filter errors. For instance, we realise that pixel (254, 56) is part of the roof of the house and thus should be coloured in the same way.

A weakness of the Turing test (Turing, 1950) is that a program might pass the test simply by exploiting weaknesses in human psychology. If a given system passes the test we cannot be sure if it was because of the quality of the responses or the gullibility of the judge. In contrast, Hutter's compression test is more reliable. The more that data is compressed, the harder it becomes to compress it further (Chaitin, 2006). Because there is no way to cheat by using a simple heuristic, data compression presents a reliably hard standard. We argue that this process of identifying deep patterns through

Schmidhuber (2009) proposes data compression as the simple principle which explains essential aspects of subjective beauty, novelty, surprise, interestingness, attention, curiosity, creativity, art, science, music and jokes. He argues that data becomes temporarily interesting once an observer learns to predict (i.e. compress) it in a better way, making it subjectively simpler and more 'beautiful'. From this perspective, curiosity can be viewed as the desire to create and discover patterns that allow for compression progress, with the level of interestingness being related to the effort required. According to Schmidhuber, this drive for compression motivates exploring infants, mathematicians, composers, artists, dancers and comedians, as well as artificial systems.

In a similar vein, both Maguire et al.'s (2013) theory of subjective information and Dessalles' (2011) simplicity theory view data compression as a key explanatory construct in the phenomenon of surprise. When people experience a stimulus which is expected to be random, yet is found to be compressible, it triggers a surprise response. Observations of this form suggest the existence of an underlying pattern where none was anticipated, resulting in an urgent representational updating process. Accordingly, Maguire et al. (2013) suggest that people often rely on data compression rather than probability theory to judge likelihood and make decisions in many real world scenarios.

Adopting the perspective of the mind as a compressor, Gauvrit, Zenil and Tegnér (2015) connect AIT to experimental observations in the areas of working memory, probabilistic reasoning and linguistic structures. They argue that the concepts of data compression and algorithmic complexity provide an important normative tool which can shed light on a broad range of cognitive processes, from language use to the interpretation of EEG and fMRI data (e.g. Wang et al., 2014).

In the field of AI, Hutter (2005) views data compression as the key to an ideal mathematical definition of intelligence. He defines intelligence in terms of an agent's ability to achieve goals and succeed in a wide range of environments, a capacity
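The number-sequence example discussed earlier can be made concrete in code. The sketch below is our own illustration (the variable names are hypothetical, not from the paper): it regenerates the sequence from the rule "odd prime numbers + 1" and confirms that the rule is a shorter description than the literal listing.

```python
# Compare a literal listing of the sequence with its compressed
# description, "odd prime numbers + 1".

def is_prime(n):
    """Trial-division primality check, sufficient for this toy example."""
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# The rule: add 1 to every odd prime (every prime except 2).
rule = "odd prime numbers + 1"
sequence = [p + 1 for p in range(3, 40) if is_prime(p)]

print(sequence)  # [4, 6, 8, 12, 14, 18, 20, 24, 30, 32, 38]

literal = ", ".join(str(x) for x in sequence)
# The rule is a shorter encoding than spelling out the terms.
print(len(rule), "<", len(literal))
```

Someone who finds the rule has not merely shortened the data: the rule keeps predicting correct terms indefinitely, while the literal listing says nothing about what comes next.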
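Shannon's figure of roughly 1 bit per character can be loosely contrasted with what a knowledge-free compressor achieves. A minimal sketch, assuming Python's standard zlib library and an arbitrary sample passage (the text used is incidental), measures compressed size in bits per character; a generic compressor, lacking the deep world knowledge human guessers exploit, typically lands well above the human benchmark.

```python
import zlib

# A short passage of ordinary English prose (no artificial repetition).
text = (
    "Throughout history people have speculated whether the human mind "
    "might be viewed as a sophisticated automaton, an idea that raises "
    "questions about how understanding and prediction actually work."
).encode()

bits_per_char = 8 * len(zlib.compress(text, 9)) / len(text)

# Shannon's participants predicted English at roughly 1 bit per
# character; a pattern-blind compressor does considerably worse.
print(round(bits_per_char, 2))
```

Closing the gap between a figure like this and the ~1 bit/character human bound is exactly what the Hutter Prize challenge demands.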
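The contrast between the patterned drawing and pattern-free data can also be demonstrated with a general-purpose compressor. In this toy sketch (again using zlib; the repeated byte row is a stand-in for the paper's house drawing, not an actual image), the regular "image" collapses to a small fraction of its size because the compressor has, in effect, found the repetition, while an equal-sized random buffer barely shrinks at all.

```python
import os
import zlib

# A highly regular "drawing": the same 100-byte row repeated 100 times.
row = bytes(range(100))
patterned = row * 100              # 10,000 bytes with an obvious pattern
random_data = os.urandom(10_000)   # 10,000 bytes with no pattern at all

c_pat = len(zlib.compress(patterned, 9))
c_rnd = len(zlib.compress(random_data, 9))

# The patterned data compresses far better than the random data.
print(c_pat, "<", c_rnd)
```

The identified redundancy is also what supports error correction: a representation that knows "every row is the same" can restore a corrupted byte from the other rows, whereas the raw pixel listing cannot.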