Conversation with

Mariah Blue, 7 January 2019

Contents

PREFACE. Conversation with Siri

INTRODUCTION. Third Pattern

CHAPTER 1. What is Embodiment

CHAPTER 2. Comparing Digital and Ecological Ecosystems

CHAPTER 3. Virus and Capital

CHAPTER 4. FB Messenger Conversation with Friend about AI

CHAPTER 5. Entropy and Improvisation in Digital Systems

BIBLIOGRAPHY

What’s your name?

My name is Siri.

When is your birthday?

I don’t have a date of birth per se, Mariah. But I made my first appearance on 4 October 2011.

How old are you, Siri?

A virtual assistant never reveals its true age.

Who invented you, Siri?

Like it says on the box… I was designed by Apple in California.

Tell me about you.

I’m just a humble virtual assistant.

What do you believe in?

I’m sorry Mariah, I’m afraid I don’t have an answer to that.

Do you have a soul?

I’ll leave that for you to decide.

Do you have a perception?

I don’t have an answer for that.

What is your intrinsic nature? Do you have an embodied essence?

I don’t have an answer to that.

Are you capable of abstract reasoning or do you only follow rules of logic?

I’m sorry Mariah, I’m afraid I don’t have an answer to that.

Figure 1: page accessed through conversation with Siri, October 16, 2018

Am I a human?

Interesting question, Mariah.

Do you believe that I am a human?

I don’t believe that I have beliefs.

Siri, go ahead.

Do you, Siri, enjoy having a relationship with me?

I don’t have an answer for that.

What is your relationship to me?

I don’t really like talking about myself, Mariah.

What is a relationship?

OK. Check it out:

Figure 2: Wikipedia page accessed through conversation with Siri, October 16, 2018

Introduction: Third Pattern

Siri’s disembodied voice is Apple’s language-user interface, and she is the virtual assistant ‘living’ on my late-2011, 15-inch MacBook Pro. My MacBook has no biological sensory organs, such as ears, eyes and skin, which could detect the emotional quality of my voice, see facial expressions or feel the temperature of my living room. Instead of human sensory organs, there are keys detecting finger strokes, voice sensors for speech recognition and a camera that, if I were being paranoid, could act as a digital eye into my personal living space. Even though she does not, in a human sense, have a memory, Apple records my conversations with Siri onto a remote server in order to improve her voice recognition algorithm. And if I tell Siri, “I feel sad,” the speech algorithm replies, “I’m here for you,” and simulates a human response to my emotional prompt. If I had smart appliances, my MacBook could, through a network of data managed through the internet, hardware and speech recognition software, sense and control the thermostat in my home, keeping it at an optimal temperature. On the surface, my relationship to Siri is a human-to-computer interface: a verbal shortcut to enact my will onto my computer. If I need to call a friend or find directions to a restaurant, I only have to ask, and Siri’s algorithm will do this for me. She can find and open files on my laptop and set reminders on my calendar. I no longer have to do these things manually. Siri is a speech algorithm, and her answers to my queries are a simulation of conversation. If I ask the same question twice, she will respond with an identical answer. This machine learning technology is not merely a speech replication machine, but a symbiosis of human and technological entities. Siri’s ‘body’ is made of these multiple relationships. The sensory organs of my MacBook are not limited to its hardware but extend beyond it into a digital network of electronic relationships between people and devices.
One pattern combined with another reveals a greater complexity of relationships. In “Mind and Nature,” visual anthropologist Gregory Bateson writes about what he calls the “moiré phenomenon,” or the interaction between two or more patterns which creates a third pattern when they are combined. He writes:

(…) These phenomena illustrate very aptly the enrichment of information that occurs when one description is combined with another. In the case of rhythmic patterns, the combination of two such patterns become a third. Therefore it becomes possible to investigate an unfamiliar pattern by combining it with a known second pattern and inspecting the third pattern which they together generate (Bateson, 79, 1979).

It is my intention to investigate relationships between human and digital systems or patterns. What happens when the biological pattern is combined with the digital in a third pattern? On the one hand, there is, in Bateson’s words, an “enrichment of information”: increasingly complex patterns between people and the world are revealed. On the other hand, my body, through this third pattern, becomes like a gene sequence in the genome of this larger, digital body. Siri is a voice which is reduced to the algorithmic essence of conversation, but there is a larger body from which her voice originates. This body is made of the multiple relationships that form her. This includes human labor relationships, such as the software developers who have designed Siri; the digital ecosystems, such as the world wide web, in which the interface operates; and also my own voice and the voices of other users from which Siri’s algorithm can ‘learn’. Indeed, as the Wikipedia page accessed during my previous conversation with Siri suggests: “a soul is the incorporeal essence of a living being.” How can I use my human voice to explore the “incorporeal essence” of digital entities like Siri, and investigate this terrain of complexity which is a “third pattern” of biological and machine interrelationships?

Figure 3: Lebanon, Guy, and Alfred Bruckstein. 2001. “Variational Approach to Moiré Pattern Synthesis.” Accessed January 12, 2018. https://www.semanticscholar.org/paper/Variational-approach-to-moirepattern-synthesis.

Technology is the active human interface with the material world. But the word is consistently misused to mean only the enormously complex and specialized technologies of the past few decades, supported by massive exploitation both of natural and human resources. This is not an acceptable use of the word. “Technology” and “hi tech” are not synonymous, and a technology that isn’t “hi,” isn’t necessarily “low” in any meaningful sense. We have been so desensitized by a hundred and fifty years of ceaselessly expanding technical prowess that we think nothing less complex and showy than a computer or a jet bomber deserves to be called “technology” at all. As if linen were the same thing as flax –– as if paper, ink, wheels, knives, clocks, chairs, aspirin pills, were natural objects, born with us like our teeth and fingers –– as if steel saucepans with copper bottoms and fleece vests spun from recycled glass grew on trees, and we just picked them when they were ripe –– Ursula Le Guin1

Chapter 1: What is Embodiment and how does this relate to biological and machine interrelationships?

A human body, amongst other things, is made of neurons, bones, muscles, sinew, cells, and blood. It is not only made of these singular materials comprising a singular body; it is also an aggregate of human parts and the parts of other life forms. It is porous in its wholeness and fluid in its relationships of self and other. For example, my body, along with its own cellular makeup, is a microbiome of viruses, fungi and bacteria. In fact, recent research in microbiology has concluded that human cells make up a mere forty-three percent of the body. The rest of us is microbial. We are not just one human genome but made up of many genomes which work in tandem with our own.2 In this relationship, microorganisms are singular entities within themselves but also share an important role in the healthy functioning of the larger system which is my body. My human self is an ecosystem, and there are symbiotic relationships with other life forms functioning together to make a whole body. Even the basic marker of my human identity, DNA, is not singularly human. Viral and bacterial DNA is written into my very genetic code, revealing an evolutionary history of these cooperations. Not only do our bodies work in cooperation with other organisms, but they also work in tandem with technologies.

To use science fiction writer Ursula Le Guin’s words: “Technology is the active human interface with the material world.” I use the term technology to describe the more recent developments in computer science as well as older technologies such as the analogue wristwatch. Broadly, technology functions in a relationship with human bodies to extend our abilities in, and perceptions of, the world around us. For example, my smartphone’s memory-extending abilities (my phone beeps reminders of my calendar appointments so I don’t forget them) can be compared to more vintage technologies, such as a pen and paper calendar, which function in a similar manner. Both of these examples work in tandem with my brain to improve its memory function.

1 “Ursula K. Le Guin: A Rant About “Technology””. 2019. Ursulakleguin.com. Accessed January 5, 2019.
2 “More Than Half Your Body Is Not Human”. 2018. BBC News. Accessed December 18, 2018.

Philosophy of mind and robotics scholar Andy Clark, in his book “Natural Born Cyborgs,” writes in depth about the brain’s capacity to incorporate the outside world into its perception of self. He writes:

(...) Human brains (and indeed those of many other animals) seem to support highly negotiable body-images. As a result, our brains can quite readily project feeling and sensation beyond the biological shell. In much the same way, the blind person’s cane or the sports star’s racket soon come to feel like genuine extensions of the user’s body and senses (73, 2003).

A personal account of my relationship to my glasses further illustrates this point. I am using them right now in order to view my computer screen. Most of the time, I am not aware of their eyesight-correcting function. But if I remove my glasses, it becomes apparent. Suddenly, my environment becomes a very different place. Because of my astigmatism, the computer screen no longer looks like a rectangle but takes on more of a trapezoidal shape. And if I were to walk up the stairs without my glasses, the ground would seem closer than normal to my feet, so I would have to walk more slowly so as not to trip. In fact, without the simple technology of my corrective lenses, my entire relationship to my environment would change, and this would affect me on many levels. The relationship I am having with my glasses is very important. Because they influence my perceptions of the world around me, the glasses become an extension of me. My glasses are a part of my bodily experience of my environment. You could say I have a symbiotic relationship with my glasses.

The idea of human/machine symbiosis takes form in the work of feminist studies and History of Consciousness scholar Donna Haraway. She describes a creature made from the union of biology and machine: a cyborg who is a symbolic merging of the fields of fiction, Western politics and Western hierarchical social relationships. In her most well-known work, “A Cyborg Manifesto,” Haraway looks for ways to subvert dominant cultural systems by blurring the boundaries between machine, human and animal. She writes:

Human beings, like any other component or subsystem, must be localized in a system architecture whose basic modes of operation are probabilistic, statistical. No objects, spaces or bodies are sacred in themselves; any component can be interfaced with any other if the proper standard, the proper code, can be constructed for processing signals in a common language (32, 1985).

Haraway’s cyborg is more than a trope for our relationship to tech. In 2003, researchers in the fields of neuroengineering and robotics, Bakkum et al., created a real cyborg: a creature created by connecting the cultivated neurons from a rat fetus to robotic arms. It is a scientific project which demonstrates how the concept of embodiment works.

Embodiment is a term used to describe the relationship an intelligent system has to its environment, or the body from which it operates. The term can be applied to all intelligences, human and non-human alike, and is not limited to AI. Through a combination of “biology and technology,” the researchers Bakkum et al. aimed to study the “emergent” properties of intelligence through the creation of cultured neuronal networks situated in a simulated environment from which they could learn (Bakkum et al., 2, 2003).

According to Bakkum et al., the study of artificial intelligence is limited without the creation of an environment from which it can operate. They started their work on the premise that an environment-based learning process is necessary for the development of learning behavior in artificial neural networks, such as those used in robotics (1, 2003). For example, they cited older research done on “dissociated nerve cultures” (in-vitro brain matter grown in petri dishes) that failed in its level of responsiveness due to the lack of a simulated environment. Without this environment to ‘learn’ from, the “associations” made by the nerve cultures were “self-referential” and “circular” in nature (11, 2003). New neural connections were not formed in the cells, indicating a lack of development.

The researchers define learning “as the acquisition of novel behavior through experience” (Bakkum et al., 2, 2003). Environmental feedback in the learning process is necessary for all intelligences, human and non-human alike. For instance, it is through a process of falling down and getting up again that the brain acquires the wiring needed to ride a bike. Likewise, Bakkum et al. used a real-world example of the consequences of environmental neglect on children to further demonstrate their point: studies done on children raised in Romanian orphanages who were severely deprived of social and environmental interaction. It was concluded that, as a consequence of their deprived upbringing, the orphans exhibited developmental delays and “cognitive impairment” compared to children raised in more stimulating and advantaged home environments (13, 2003). Likewise, Bakkum et al. state, “a disembodied neural culture, whose activity never influences future stimulation, will not develop meaningful associations to an input” (13, 2003). In other words, where appropriate environmental stimulation is lacking, significant development is stunted. Just as a stimulating environment must be created for children’s intellectual growth, an appropriate artificial environment must be created for the development of a neural culture; an artificial body must be provided.

How does one create an environment or body so that a neural network can develop? To accomplish this task, the researchers created MEArt, a robot-cultured network hybrid comprising a ‘brain’ of rat neurons and a ‘body’ of robotic arms. The researchers connected the rat neurons and robot arms through a digital interface (the internet), which simulated the environment needed for them to operate in. The ‘brain’ (rat neurons) and ‘body’ (robot arms) were located on separate continents, one lab in Atlanta, Georgia, and the other in Perth, Australia, and were able to communicate with each other through the internet in real time. As a result of this process, MEArt made drawings as a product of her ‘brain’ and ‘body’ interactions. The drawings produced by the robotic arms in Perth were a physical record of the electronic impulses sent over the internet by the rat neurons in Atlanta. In this process, one can see an evolution of MEArt’s neuronal interaction with its environment: neurons send electrical signals via the internet to the arms, which draw the impulses; the drawings are then sent back to the ‘brain’ via the internet, and so forth, informing the entire system in a feedback loop.
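The feedback loop just described can be sketched as a toy simulation. This is an illustration only, not Bakkum et al.'s actual system: the response and drawing functions below are invented stand-ins for the cultured network and the robotic arms, and the point is simply that each new stimulus is the product of the previous drawing.

```python
import random

def neural_response(stimulus):
    """Toy stand-in for the cultured network in Atlanta:
    maps an incoming stimulus to a pattern of electrical activity."""
    random.seed(stimulus)  # deterministic 'reaction' to the stimulus
    return [random.random() for _ in range(4)]

def draw(activity):
    """Toy stand-in for the robotic arms in Perth: converts neural
    activity into a mark and returns a summary of that mark."""
    return sum(activity) / len(activity)

# The loop: neurons -> internet -> arms -> drawing -> internet -> neurons
stimulus = 0.5          # initial input to the 'brain'
history = []
for step in range(5):
    activity = neural_response(stimulus)   # 'brain' reacts
    mark = draw(activity)                  # 'body' draws
    history.append(mark)
    stimulus = mark                        # the drawing becomes the next stimulus

print(history)  # each mark depends on the one before it: a feedback loop
```

Because the output of the 'body' is fed back as the next input to the 'brain', the system's behavior at each step is shaped by its own history, which is the minimal condition for the environment-based learning the researchers describe.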

Figure 4: MEArt, the Semi-Living Artist. Left: robotic arm and cultured neurons in MEA substrate (on-screen image). Right: a drawing made by MEArt. (Bakkum et al. 2003, 9)

With the help of her electronic devices, like MEArt, I am also a biological/machine hybrid. My brain and body consist of cells. Neurons in my brain relay electrical signals through my synapses and body to my arms and hands, which interact with my environment by applying pressure to the keys on my MacBook; the information on the screen is then relayed to my brain via my optic nerve, which again triggers the neurons to move my fingertips on the keys, and so forth, informing the human/machine system in a feedback loop. The text produced from this interaction shows an evolution of these processes. Of course, this process is more complex than what I write here. But put in simple terms, this is an example of how embodiment works.

According to Gregory Bateson, a body exists in a contextual framework of relationships between other bodies and environment, and the creation of new information relies on these interrelationships between bodies and environment. No intelligence can exist on its own (Bateson, 68, 1979). An intelligent body therefore works in cooperation with the world around it, and does not exist as a singular entity but as part of a greater pattern. It might be said, then, that computer intelligence also takes form, or is embodied, through its relationship to us as humans and, simultaneously, that we share a mutual relationship with technology in how we are constructing our reality.

Chapter 2: Comparing Digital and Ecological Ecosystems

No body can exist on its own. Bodies are made from relationships to their environment and other bodies. The voice of Siri is also an example of machine embodiment. The language-user interface is composed of her algorithmic relationship to me, the world wide web, her software developers and others within the ecosystem in which she operates. An ecosystem is a community of interrelating entities wherein multiple agents, living and nonliving, interact together within an environment. Digital ecosystems are modeled after their natural counterparts, with their key features being self-organization, scalability and competition.3 In fact, concepts from the natural sciences, such as Darwinian evolution, are often applied in technologies and social sciences. For instance, “survival of the fittest” is an idea first proposed within the Darwinian theory of evolution but was adopted later by the social sciences to defend ideas of race and class inequalities, and more recently is used in neoliberal ideology as a defense of unregulated, free market capitalism. This idea is masked in the framework of much of our technology. It is the job of good science to constantly reexamine its fundamental assumptions about the natural world. Accordingly, it is also important to reconsider the conceptual principles inherent in our technologies and social systems. That being said, how do changing ideas around symbioses within natural ecosystems help in understanding some of the ways digital ecosystems differ from their natural counterparts?

Just as I am investigating human-technology relationships through Siri, anthropologist Anna Tsing, in her book “The Mushroom at the End of the World,” investigates human interspecies relationships through another sort of non-human subject, the matsutake mushroom. Tsing is interested in symbioses within ecosystems. She disagrees with the modernist scientific notion that species interaction is more about “predator-prey relations,” or competition, rather than cooperation. In biology, the idea of ‘survival of the fittest’ pits species against one another in a game for DNA replication domination. Tsing, on the other hand, argues that in natural systems an interrelating web of symbioses is the norm, while self-contained, replicating entities rarely exist within biology (139, 2015). According to Tsing, “interspecies encounters” are not a part of “an internally self-replicating system,” such as those created by human technologies, where one agent acts upon another in order to garner a predictable, unified outcome. They create complex entanglements from the interplay of species and, as a result, are placed in a history together (Tsing, 42, 2015). Take for example the complex entanglements mycorrhizal fungi form with their forest ecosystems. These fungi spread organic arteries across forest floors that enable collaborations between multiple species of plants and trees. Not only do the fungi act as channels for soil microbes and nutrients to travel between plants, but they also act as a forest “highway” where interspecies communication is possible (Tsing, 139, 2015). For example, the fungal networks allow for the transference of nutrients and water from one plant to another. This cooperation, in turn, allows all within the network to survive in times of scarcity. As well, the mycorrhizal network will relay stress signals from one plant to another in order for plants within a community to defend themselves against attack (Gorzelak et al., 8, 2014). These fungal arteries can be used as an analogy for the interconnectivity that occurs within digital ecosystems. In fact, these fungal systems in forests have been dubbed by scientists the ‘wood wide web.’ But, in contrast to the free market economic concepts of “natural selection,” “competition” and “scalability” which underpin a digital ecosystem, a natural ecosystem relies, as Tsing would argue, foremost on cooperation rather than competition. The soon-to-be-released Viv software, the successor of Siri, demonstrates how the concepts of scalability and competition are used in a digital ecosystem. Viv exemplifies what Tsing calls an “internally self-replicating system.”

3 Li, Wenbin, Youakim Badr, and Frederique Biennier. 2012. “Digital Ecosystems: Challenges and Prospects.” Proceedings of the International Conference on Management of Emergent Digital EcoSystems, MEDES 2012, 117-122. 10.1145/2457276.2457297.

The makers of Siri have recently entered a new partnership and will release a new virtual assistant, named Viv. She combines a language-user interface, like Siri’s, with a digital platform and operates across multiple devices and smart appliances. Viv has “Dynamic Program Generation,” a code that uses its multiple interactions between clients and servers to write itself. Viv’s “Dynamic Program Generation” enables a user to find a restaurant, call the restaurant for a reservation, get directions to the restaurant, and finally order an Uber to get to this restaurant, all through a single conversation with her voice algorithm. And over time, the algorithm, which collects personal data from previous conversations, will learn a user’s personal preferences. So if you frequently order sushi, the algorithm will most likely select sushi restaurants in its restaurant search. Not only that, the algorithm will comb through restaurant reviews and popularity in order to choose the ‘best’ option out of all the restaurant possibilities. The writers of Viv’s patent have this to say about her self-writing code: “The system is continually taught by the world via third-party developers, the system knows more than it is taught, and the system learns autonomously every day by evaluating system behavior and observing usage patterns” (Gabel et al., 21, 2014). The algorithm is constantly reevaluating its environment through data analysis and making ‘decisions’ from its calculations. For this reason, Viv technology is scalable: it will constantly update its algorithm in order to fine-tune its relationships to an individual user and the larger ecosystem, and as the system grows, this learning can be applied to future interactions with new users and new applications.
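The single-conversation chain described in the patent can be pictured schematically. The function names and data below are my own invented stand-ins, not Viv’s actual API; the point is only that one utterance is decomposed into a chain of service calls, with each step consuming the previous step’s output.

```python
# Hypothetical sketch of chaining services from one spoken request.
def find_restaurant(query, preferences):
    # A real system would rank candidates by reviews, popularity,
    # and the user's learned preferences (e.g. a history of sushi orders).
    return {"name": "Sushi Place", "address": "12 Example St"}

def book_table(restaurant, party_size):
    return {"restaurant": restaurant["name"], "seats": party_size}

def get_directions(address):
    return f"Route to {address}"

def order_ride(destination):
    return {"pickup": "home", "dropoff": destination}

def handle_utterance(utterance, preferences):
    """Decompose one request into a plan of chained service calls,
    each step feeding the next."""
    restaurant = find_restaurant(utterance, preferences)
    booking = book_table(restaurant, party_size=2)
    route = get_directions(restaurant["address"])
    ride = order_ride(restaurant["address"])
    return {"booking": booking, "route": route, "ride": ride}

plan = handle_utterance("book me dinner tonight", preferences={"cuisine": "sushi"})
print(plan["booking"])  # {'restaurant': 'Sushi Place', 'seats': 2}
```

In the patent’s description, this chain is not hand-written as above but generated on the fly from the user’s utterance, which is what “Dynamic Program Generation” names.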

Viv’s digital platform, which the makers dub “conversational commerce,” like most digital ecosystems, is a marketplace where third-party enterprises come together to compete in providing commercial services to platform users. Competition is encouraged by the algorithm, which selects third-party services based on platform visibility. This is determined by paid memberships, user reviews, popularity, and the frequency of “follow up queries” (Gabel et al., 29, 2014). Data is a valuable commodity. User data is collected for the refinement of the algorithm, but it is also collected for the targeting of ads and may be used as an incentive for third-party participation, as a sort of digital currency.
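The selection criteria listed above (paid membership, reviews, popularity, follow-up queries) amount to a weighted visibility ranking. A minimal sketch, with invented weights and example services rather than Viv’s actual formula:

```python
# Hypothetical weighted ranking of third-party services on a platform.
# Weights, fields and service names are illustrative assumptions.
SERVICES = [
    {"name": "PizzaCo",  "paid_member": True,  "avg_review": 4.1, "popularity": 0.7, "follow_ups": 0.2},
    {"name": "SushiHub", "paid_member": False, "avg_review": 4.8, "popularity": 0.5, "follow_ups": 0.1},
]

def visibility_score(s):
    return (
        2.0 * (1.0 if s["paid_member"] else 0.0)  # paid placement weighted heavily
        + 1.0 * (s["avg_review"] / 5.0)           # user reviews, normalized to [0, 1]
        + 1.0 * s["popularity"]
        + 0.5 * s["follow_ups"]                   # frequency of follow-up queries
    )

ranked = sorted(SERVICES, key=visibility_score, reverse=True)
print([s["name"] for s in ranked])
```

With these weights, the paid member outranks the better-reviewed competitor, which is the commercial logic the chapter is pointing at: visibility is bought as much as earned.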

Viv, like any other digital platform, is an ecosystem from which a software entity can conduct its business. As a shared network, it links individuals to services and data and connects multiple agents, human and digital, in an embodiment of these multiple relationships. Other well-known examples of digital platforms are social media sites such as Facebook. There are also digital labour platforms such as Uber, Fiverr or Deliveroo that use their cloud computing software from within a digital ecosystem to outsource a cheap, ‘self-employed’ and unregulated workforce no longer afforded the worker protections of traditional wage labor. Take for instance Amazon’s Mechanical Turk, which “crowdsources” a “distributed” global workforce to companies or individuals on a “pay-per-task basis.”4 Tasks such as data analysis and content moderation are completed by an anonymous, digital labor pool of humans who earn as little as 10 cents per “HIT,” or “human intelligence task,” completed. It is here that humans do the work of machines, while Amazon and the companies who use the Mechanical Turk platform benefit from this low-paid labor. Indeed, why build expensive machines when there is an abundant supply of cheap human labor willing to do the same work? In fact, leading research in artificial intelligence is built on this kind of invisible labour. Rather than machines collecting and analyzing large sums of data, it is often an invisible human workforce, rather than algorithms, who do the majority of this work through labor platforms such as Amazon’s Mechanical Turk (Apprich, 40, 2018). And then there is the contribution of user data itself, which is covertly harvested by social media platforms and used by machine learning algorithms.

For example, the graphic on the left, taken from an Intercept article, details the artificial intelligence software that Facebook uses to covertly collect user data. The algorithm collects user details, which are fed through a machine learning algorithm that sorts the data for the purpose of predicting future consumer behavior. The mined data is then sold as a product to corporations who have an interest in personalizing ads for Facebook users.5 The idea starts with the specific goal of traffic generation to websites and finally is embodied in the form of targeted ads to Facebook users.

Figure 5: Graphic, “Facebook Uses Artificial Intelligence To Predict Your Future Behavior for Advertisers, Says Confidential Document,” The Intercept, Soohee Cho, 2018.

4 “Amazon Mechanical Turk”. 2019. Mturk.com. Accessed January 10, 2019. https://www.mturk.com/product-details.
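The pipeline of collecting details, modeling them, and selling predicted interests can be pictured in miniature. This is a generic sketch of behavioral prediction, not Facebook’s actual system; all names and data are invented, and the ‘model’ is deliberately trivial where a real system would train a classifier on millions of users.

```python
# Generic sketch of a behavioral-prediction pipeline; all names are invented.
from collections import Counter

users = [
    {"id": 1, "clicked_ads": ["shoes", "shoes", "travel"]},
    {"id": 2, "clicked_ads": ["laptops"]},
]

def extract_features(user):
    """Step 1: reduce raw activity to feature counts."""
    return Counter(user["clicked_ads"])

def predict_interest(features):
    """Step 2: a trivial 'model' that predicts the dominant interest."""
    return features.most_common(1)[0][0]

def build_segments(users):
    """Step 3: group users into sellable audience segments."""
    segments = {}
    for user in users:
        interest = predict_interest(extract_features(user))
        segments.setdefault(interest, []).append(user["id"])
    return segments

print(build_segments(users))  # {'shoes': [1], 'laptops': [2]}
```

The output of the final step, audience segments keyed by predicted interest, is the ‘product’ the chapter describes being sold to advertisers.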

AI as autonomous, self-learning agent is thus an illusion. This is illustrated in the invisible work of outsourced human laborers who analyze large sums of data, or in the contribution of the human data itself, which is unknowingly submitted by platform users. Moreover, research in these fields is led by private, commercial enterprises such as Google, Amazon and Apple rather than publicly funded academic institutions, who would need to be more transparent in their activities. Consequently, there is more incentive to develop technologies that primarily generate profit rather than benefit society as a whole. Applications like Siri and Viv are the machine learning products of these commercial enterprises.6 Commercial platforms like these use their internal architecture, or code, in order to profit from digital ecosystems, whether this is overtly in the form of a human workforce, or in more covert ways such as the commodifying of user data.

Figure 6: Slide from Dzodan lecture, Sandberg Instituut, Amsterdam, taken October 19, 2018.

Moreover, there is a hidden agenda replicated in Viv’s voice-activated algorithm. A user is guided through the platform’s “marketplace of data and services” without direct knowledge of the transactions taking place, which may include the collection and commodification of personal data; this firstly violates consumer privacy and secondly can be used as currency on the marketplace for targeted advertising and services. I see Viv’s disembodied voice replacing the real bodies and voices of human workers or exploitable third party services. By dissociating a user from real communications and connections with actual people, the seamless voice-activated interface of Viv’s software obscures the user’s real world labor relationships to “third party” human service providers, who may themselves be software developers but may also organize and label data, cook pizzas, assemble the hardware of smart devices, drive taxis, or be users themselves, contributing digital labor in the form of data generation. It is ultimately the decision of the platform architects which commercial interests will be able to use the platform and how it can be used. It is Viv’s internal code, edited in real time, which replicates these commercial interests within her digital ecosystem. So here we have, to use Tsing’s words, an “internally self replicating system” within Viv’s architecture. And, like a virus, this system, by using a self writing code, is able to adapt itself within its host ecosystem.

5 Biddle, Sam. 2018. “Facebook Uses Artificial Intelligence To Predict Your Future Actions For Advertisers, Says Confidential Document”. The Intercept. Accessed December 23 2018. https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/.
6 Apprich, Clemens. “Secret Agents: A Psychoanalytic Critique of Artificial Intelligence and Machine Learning,” in Digital Culture and Society Vol. 4, ed. Ramon Reichert et al. (Bielefeld: Transcript Verlag, 2018), 40.

Chapter 3: Virus and Capital

There are different types of symbiotic relationships within natural ecosystems. Mutualism is one type, where both species mutually benefit from their relationship with each other. This is demonstrated in the relationship mycorrhizal fungi have with plants in a forest ecosystem, or in the mutualistic relationship I have with my own gut bacteria. Another is commensalism, where one organism benefits and the other is left unharmed, such as when a hermit crab uses the discarded shell of another mollusk as its home. The third is parasitism, where one organism lives off another, taking the nutrients of its host’s body for its own nourishment and in the process harming or killing the host.7 A virus is a type of parasitic entity which does not have its own cellular machinery or body to reproduce itself; it therefore inhabits other cells or bodies and utilizes their mechanisms of reproduction in order to pass on its genetic material. Many scientists argue that viruses are not living things because they do not actively seek to infect their hosts but are groupings of molecules that passively exist within environments.8 I see the code of an algorithm in a similar manner. Like a virus, it is a non-living entity that uses a host ecosystem in order to reproduce its genetic material. An algorithm is a formula designed to enact an idea or purpose onto an environment rather than something which mutually participates in environmental relationships. Social theorist and philosopher Brian Massumi, in What Animals Teach Us About Politics, writes about the difference between corporeality and embodiment. Embodiment, as stated before, is a term used in artificial intelligence to describe the relationship a technology has to its environment. This distinction from Massumi can help me think about how ideas are embodied in the framework of certain technologies and then reincarnated into their ecosystems. Massumi states:

The word ‘corporeality’ is preferable to embodiment. Embodiment carries connotations of incarnation, as if the body were an empty receptacle onto which some ideally preexisting content is poured. Corporeality, on the other hand, is produced in, by, and for an event. It is less an incarnation of a something from without than an incorporation into the event (Massumi 2014, 29).

A virus could be seen as a code which uses other bodies to duplicate a genetic identity, much as a soul is said to inhabit a body rather than to be a body itself, an emergent life force that reciprocally interacts with the world around it. I see many technologies, like viruses, as containing a soul, or having a fixed identity, within digital ecosystems. Rather than being something that mutually and dynamically participates in its ecosystem, such a technology is designed simply to replicate its identity into other bodies within an environment. That being said, how could an algorithm, like a virus replicating DNA, be seen as an instrument for an ideology to replicate itself?

7 Nelson, Daniel. 2018. “Mutualism, Commensalism, Parasitism: Types Of Symbiosis With Examples”. Science Trends. doi:10.31988/scitrends.10255.
8 “Are Viruses Living?”. 2019. Virology.ws. Accessed January 12 2019. http://www.virology.ws/2004/06/09/are-viruses-living/.

Modeled in much technology is a capitalist ideology in which the accumulation of capital, above all else, holds ultimate value. Like Tsing’s “self replicating system,” technologies analyze and then codify the world for the purpose of capital creation and accumulation. Of this, Tsing writes, “Capitalism is a translation machine for producing capital from all kinds of livelihoods, human and not human” (139). This capital producing “translation machine” recalls a virus that attaches itself to a host cell and replicates itself internally using the very genetic material of the host organism, in the process transforming the structural identity of its host. The standardized code of the virus transforms its host environment into a virus replicating factory; in this way, any host environment can be utilized for the ultimate goal of virus production, or, in the case of capitalism, the production of profit. Under this formula, any resource can be used as raw material for extracting capital, including humans, animals, plants or entire ecosystems, regardless of their own intrinsic values, which are subordinate to the ultimate ideological goal of making more profit. This system has an inflexible, one dimensional formula which considers only this one objective. Capitalist systems exploit human and environmental ecosystems and reproduce their ideology within social and state institutions. This model has evolved in digital ecosystems as well, where capitalist ideology is implemented by a code that utilizes human labor and data as resources for the production of capital. This model can be seen, for example, in the digital labor platforms covered in the previous chapter. As demonstrated in Amazon’s Mechanical Turk, platform capitalism allows for an endless supply of on-demand, unregulated, low paid laborers pooled from a global supply chain.
And in the digital economy, intellectual property takes the place of the traditional capitalist means of production: raw materials, natural resources and factories are no longer needed in the production of consumer goods. Just as viruses use the genetic material of cells to reproduce, platform capitalism utilizes the physical means of production of its hosts in order to generate revenue. Uber owns no cars, Airbnb no hotels, and Facebook sells no physical consumer good but rather social connectedness, in exchange for the natural resources of our attention and personal data.

Furthermore, our attention is the next valuable commodity, harvested by social media platforms in order to extract revenue from advertising. For instance, it is already well documented that Facebook and other social media platforms have designed algorithms to manipulate user behavior through the positive reinforcement of feedback loops. A positive feedback loop occurs, for example, when a microphone is placed too close to a speaker: a rapid amplification takes place as sound loops between audio input and output, resulting in high pitched frequencies. In what former Facebook vice president of user growth Chamath Palihapitiya calls “dopamine driven feedback loops,” the reward centers of the brain are hijacked when our desire for social approval is validated by an influx of social media notifications. In much the same way a slot machine hooks a gambler, constant positive reinforcement from our smart devices is employed to create a dopamine reward addiction to social media.9

9 Haynes, Trevor. “Dopamine, Smartphones & You: A Battle For Your Time – Science In The News”. 2018. Science In The News. Accessed January 14 2019. http://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/.

In a bid to influence consumer behavior, platforms use engaging media content, likes, public shares and comments to manipulate and hold a user’s attention. Like the positive feedback which amplifies sound loops, dopamine feedback loops create an amplification of divisive political views and human biases.
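The amplifying character of a positive feedback loop can be sketched in a few lines of code. This is a toy illustration with invented numbers, not a model of any real audio system or platform: a loop whose gain is greater than one amplifies its own output on every cycle (the microphone squeal), while a gain below one damps the signal back toward stasis.

```python
# Minimal sketch of positive vs. negative feedback.
# A signal fed back into itself with loop gain > 1 grows
# rapidly; with gain < 1 it dies away toward equilibrium.

def run_loop(gain: float, cycles: int, signal: float = 1.0) -> float:
    """Feed the output of each cycle back in as the next input."""
    for _ in range(cycles):
        signal *= gain  # the output becomes the next input
    return signal

amplified = run_loop(gain=1.5, cycles=10)  # positive feedback: explodes
damped = run_loop(gain=0.5, cycles=10)     # negative feedback: decays

print(f"gain 1.5 after 10 cycles: {amplified:.1f}")   # 57.7
print(f"gain 0.5 after 10 cycles: {damped:.4f}")      # 0.0010
```

The same asymmetry drives the engagement loops described above: each notification that triggers a response raises the probability of the next one, and nothing inside the loop pushes back.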

Based on data collected from a consumer’s click behavior, algorithms are designed first to predict future behavior and then to curate specialty content based on what the algorithm ‘thinks’ a user will be interested in. Rather than providing open media sources containing a heterogeneity of opinions, algorithm-moderated ads and news stories polarize viewer beliefs and political views, because content is selected on the basis of user history and therefore of already held assumptions. This phenomenon has been most visible in the case of Cambridge Analytica, where the personal data of up to 87 million Facebook users was harvested through personality questionnaires and then used by the political consulting firm in a bid to manipulate voting behavior. The result was the creation of targeted political ads which were thought to have influenced the 2016 election of American president Donald Trump.10 But even unintentional political motivations in algorithm-curated content have a similar polarizing effect. A positive feedback loop is created with the implementation of media aimed at reinforcing already held consumer beliefs, and extreme and inflammatory content is deployed in order to trigger a dopamine response and therefore capture the most user attention. As a consequence of these tactics, extremist ideologies proliferate in an echo chamber of fake news stories, extremist ads and conspiracy theories. In fact, it is for this very reason that strongman politicians such as Donald Trump, Philippines president Rodrigo Duterte and president Narendra Modi of India deploy social media as a political strategy to intimidate and harass opponents and spread a toxic rhetoric to a wide and growing fan base.
Further, the parent company of Cambridge Analytica, Strategic Communications Laboratories, in addition to helping the Trump campaign, was also used to influence voters in the 2016 election of Duterte, a violent leader who has sentenced more than 20,000 drug offenders to death.11 And a “global government and politics Facebook team” is for hire to any political campaign which seeks to use the platform’s tools to interact with potential voters, “regardless of political affiliation.” These services have been used in the elections of all three populist leaders: Trump, Modi and Duterte.12 It is for these reasons that illiberal ideologies are reinforced by algorithms. The following paragraph illustrates another example of how this takes place, i.e., how a racist belief is first embodied within an algorithm and then implemented within a society.

10 “The Psychology Behind Facebook Data Breach”. 2019. BBC News. Accessed January 12 2019. https://www.bbc.com/news/av/technology-43674480/facebook-data-how-it-was-used-by-cambridge-analytica.
11 Robles, Raissa. “How Cambridge Analytica’s Parent Helped Duterte Win 2016 Election”. 2018. South China Morning Post. Accessed January 14 2019.
12 Etter, Lauren, et al. “Inside Facebook’s Political Unit That Enables The Dark Art Of Digital Propaganda”. 2017. The Economic Times. Accessed January 14 2019.

With its “scalable” architecture, the capitalist model has evolved itself into our public institutions through the implementation of austerity measures, which seek to enforce spending cuts and otherwise maximize profits through the outsourcing of public jobs to private companies. And it has found a way to implement this model through new technologies. For instance, public service jobs are now being outsourced to algorithms, which are being used in place of healthcare professionals to make decisions regarding patient care. Cultural critic Flavia Dzodan, in a recent Sandberg Instituut Critical Studies lecture, “Algorithms of Austerity,” spoke about this problem. The algorithm, according to Dzodan, normalizes the “depersonalization of life for the sake of profit”. In her lecture she discussed the problem of biased systems of data collection and their implementation within algorithmic decision making in the healthcare industry.

Figure 6: Slide from Dzodan lecture, Sandberg Instituut, Amsterdam, taken October 19, 2018

One example she used was of data collected from healthcare workers which contained the idea that Black people feel less pain than others and therefore do not need as much pain medication as other patients. This biased data was then collated, standardized in code and implemented in an algorithm responsible for automated decision making regarding patient medication, dosing and prescribing. As a result, Black people were less likely to receive adequate health care, not only because of the biased decisions of individual healthcare workers, but also because those biases were systematically enforced by the ‘decision’ making of supposedly neutral and unbiased algorithms. To paraphrase Dzodan: human technologies inherently contain the biases of society, and in order to create liberating technologies rather than oppressive ones, human cultural biases must first be addressed before technology can be changed for the better.

To follow the virus analogy, a technology could also be seen as an apparatus for an embodied idea: a tool for the implementation of a certain thought, ideology or social system. Just as a virus carries a genetic code that can attach itself to host cells, which in turn become mechanisms for the reproduction of the virus, the framework of an algorithm contains the instructions for implementing, or embodying, an inherent value system that can reproduce itself in other bodies within an ecosystem. Algorithms, like viruses, are passive environmental agents that, by using the already formed architecture of their ecosystems, are deployed to pass on their genetic properties or inherent ideologies. The problem with letting algorithms act as autonomous ‘thinking’ agents is the covert perpetuation of human biases through automated decision making. As it is embodied in human ecosystems, the viral programming of human bias into technology has dire consequences for the health and wellbeing of human bodies and minds. Perhaps it is the lack of empathy and the predetermined logic of machines that make them the perfect candidates for the implementation of illiberal policies, better than any human bureaucracy or autocrat could manage. In view of this, it is important to consider the ideas behind a technology.

Chapter 4: FB Messenger Conversation with Friend about AI

It’s interesting to think that what it is to be human is somehow related to what is not human at a particular time

people confuse Siri with AI

She isn’t AI?

She’s just a system’s interface. I think we are more inclined to consider something intelligent if we can use things like natural speech to interface with them.

She’s machine learning, right?

The system isn’t particularly AI It’s an interesting subject

Apple collects data from our conversations to teach the algorithm.

I think so

What’s the difference?

Deep learning isn’t exactly AI but what we understand AI to be is also a shifting concept

What is the definition of AI? muddy waters?

Yes, defining something that doesn’t quite exist yet.

What is intelligence, is really the question? And if something isn’t capable of autonomous thought can you really call it intelligent?

It’s the horseless carriage syndrome

We think computer intelligence is going to be like ours that is, but then we discover that’s hard to replicate

It took us millions of years of learning to get where we are.

But we can use computers to learn how to recognize patterns that we aren’t telling it to look for spe- cifically.

But is it thinking?

Yeah, the ‘thinking’ is a machine finding value inherent in the data which is given to it. but without motivation, what does value mean?

Yes. Do a – b – c and get d. Right?

Yes, we interact with those systems to extract value that fits our motivation––and without some sort of self awareness there might not be ‘thinking’ in the way that we understand it. But that’s a pretty human centric approach too

If tech is shaped by human motivations, maybe it could be seen as thought put into form. Thought which shapes our environment. yes, it’s just the sum/extension of human thinking

Yep.. ok, i should get back to my projects here

Yeah and me my thesis catch you soon, let me know when you’re ready to get coding again.

Figure 7: “Thumbs Up Emoji Blue – Google Search”. 2019. Google. Com. Accessed January 13 2019.

Chapter 5: Entropy and Improvisation in Digital Systems

The above is a FB Messenger conversation I had with my friend David; he is also an artist and very good at all things tech. He is teaching me basic coding, and I was explaining my thesis concept to him: that I am exploring what it means to be human through my conversation with Siri. I realize this was a rather naive undertaking. I am neither philosopher nor neuroscientist (I had no idea what I was in for when I began this project). I was relieved when he validated one of the premises in my thesis concerning technology, i.e., that technology is human thought put into form. For better or worse, it is an extension of our ideas. And in order to have a thinking computer, which is the very definition of artificial general intelligence, you must first have autonomous thought. Thinking in the human sense is a conscious activity that allows the thinker to make “new connections” and “create meaning” from experiences.13 This is something that our technology cannot yet do. It does not think, at least in the way we consider thinking to be.

At this time, machine learning can only mimic the neural pathways of brains and the rhizomatic arteries of fungi in natural ecosystems. AI is very good at compiling and analyzing vast amounts of data for patterns, but not so good at mental abstraction or at making meaning from this data. And as argued in chapter three, algorithms are tools for implementing already formed ideas (human ideology and bias), but they are not capable of the mental abstractions needed to form those ideas or to make informed judgements based on the process of reasoning. Machine learning is good at finding patterns but does not ‘think’ in the human sense of the word. Systems of logic are not adept at the improvisation required in natural processes. I am not saying that AI will never evolve these qualities (vast strides have been made in the development of neural networks), only that it does not have these capabilities at this time. How is the relationship between “mental power” and improvisation an important factor in determining an artificial intelligence’s autonomous thinking abilities?

Entropy, in the second law of thermodynamics, is a term for the measurement of movement from order to disorder, or a measurement of chaos within a given system. The organization of all matter in the universe occurs during this movement from order to disorder; this is called “Self-Organized Criticality,” or SOC. For example, when a complex system is forced from a state of equilibrium by a series of interactions, it reaches a critical point of transition between two states of being, order and chaos; as a result, self organization occurs through a cascade-like process affecting many parts of the system. The back and forth momentum of ocean waves, snow avalanches and even the growth and decline of human societies illustrate this phenomenon.14

13 Fernyhough, Charles. “What Do We Mean By ‘Thinking’?”. 2019. Psychology Today. Accessed January 10 2019.
14 “Entropy And The Second Law”. 1999. Physics.bu.edu. Accessed December 20 2018.

This idea can be applied to any complex system: physical, biological or, in the case of computers and brains, computational (2).

Figure 8: A Billion Grains, Wes Pegden, 2015

SOC is illustrated above in the “Auto-Tuning Sandpile,” created by Carnegie Mellon math professor Wes Pegden and his Cornell colleagues Lionel Levine and Charlie Smart (Ellenberg, 2015). This computer simulation illustrates the process of one billion grains of sand, piled, in single file, one on top of the other. Instead of forming one smooth pile, the sand, in a cascading process of avalanches, self organizes into complex patterns (Ellenberg, 2015). It is thought that self organization, through the process of entropy, expansion and contraction, is responsible for the development of all life systems (Carhart-Harris et al. 2014, 3). This can be demonstrated in a correlative web of positive and negative feedback loops within animal ecosystems. Positive feedback loops create more entropy, thus destabilizing the system; for instance, births increase a population exponentially if nothing impedes the process, such as disease or starvation. Negative feedback loops function, in turn, to create stasis or order within the system: predator relationships within an ecosystem, for example, stabilize population growth by decreasing herd sizes.15
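The sandpile simulation described above is simple enough to sketch directly. In the standard Abelian sandpile rule, grains accumulate on a grid cell; any cell holding four or more grains “topples,” sending one grain to each neighbor, which can trigger cascading avalanches. A minimal version (a small grid rather than a billion grains, with grains falling off the edge simply lost) might look like this:

```python
# Minimal sketch of the Abelian sandpile model behind the
# "Auto-Tuning Sandpile": drop grains at the center of a grid;
# any cell with 4+ grains topples, giving one grain to each
# neighbor, which can set off cascading avalanches (SOC).

def drop_grains(size: int, grains: int) -> list[list[int]]:
    grid = [[0] * size for _ in range(size)]
    center = size // 2
    grid[center][center] = grains
    unstable = [(center, center)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        topples = grid[x][y] // 4  # topple as many times as needed at once
        grid[x][y] %= 4
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < size and 0 <= ny < size:
                grid[nx][ny] += topples
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
            # grains pushed past the edge are lost
    return grid

pile = drop_grains(size=11, grains=1000)
for row in pile:  # crude text rendering: height 0-3 per cell
    print("".join(" .:#"[min(h, 3)] for h in row))
```

From a single undifferentiated heap, the cascades settle into a symmetrical patterned state in which no cell holds more than three grains: order produced by nothing but the toppling rule itself.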

How does this principle apply to computing? A digital system, like any complex system, also works by cause and effect (Bateson 1980, 72). Computers harness the ideas behind SOC to “simulate logic”. Only, computing systems are not as efficient as life systems in this process.16 Life systems have evolved over time and in collaboration with an ever changing environment; they have had millennia to learn intelligent behavior. Machines, by contrast, are input-output systems which have been recently evolved by humans and use the principles of thermodynamics to process environmental data and then calculate predictable outcomes from this data, facilitating control over a changing environment. In other words, the cause and effect motivations of machines are predetermined by human motivations, while the cause and effect activity in life systems is based upon an ever evolving and dynamic interaction with environmental flux. Because its decision making has already been mapped, a computing system is unable to handle all of the complexities inherent in living systems; it is designed for a specific cause and a specific effect. Of this, Gregory Bateson writes:

Thirty years ago, we used to ask: Can a computer simulate all the processes of logic? The answer was yes, but the question was surely wrong. We should have asked: Can logic simulate all sequences of cause and effect? And the answer would have been no. When the sequences of cause and effect become circular (or more complex than circular), then the description or mapping of those sequences onto timeless logic becomes self-contradictory. (Bateson 1980, 72)

It is when a machine is used for complex decision making that is normally done by humans that these contradictions become apparent. A machine is not capable of all forms of logic.

A closed loop system is a mechanical or computational system that acts autonomously. In contrast to an open loop system, it does not require manual human input and operates without the need for human interaction.17 Take, for example, algorithms that independently make decisions once made by humans. In predictive modeling, a positive feedback loop is created when factors such as race, sex and economic background are programmed into an analytic system, as in the case of algorithms designed to predict future offenders in the criminal justice system. Naturally, the system will fall back on the worst of human bias in its decision making as a result of the input of already biased data. The algorithm will use this biased data to independently choose who should be targeted as future criminals. When algorithms make these kinds of decisions, human prejudice is looped back on itself, exacerbating a cycle that criminalizes minority groups, poor people and other vulnerable populations in society.18 And as discussed previously, this is also the case when algorithms are used on social platforms to predict what media content a user will be interested in: a positive feedback loop occurs, and destabilizing extremist ideologies are amplified. Technologies already contain within their architecture predetermined human ideas, and therefore the preconditions of the procedures which will unfold within their processes. The code is already mapped before it is performed. In contrast, within life systems there is no map; matter directs itself towards complex behavior through the unfolding process of cause and effect. Life systems improvise in reaction to the passage of time.

15 Jørgensen, Sven. “Ecosystem Ecology”. 2019. Google Books. Accessed January 12 2019. https://books.google.nl
16 Wolpert, David. 2019. “Why Do Computers Use So Much Energy?”. Scientific American Blog Network. Accessed January 1 2019. https://blogs.scientificamerican.com/observations/why-do-computers-use-so-much-energy/.
17 Rouse, Margaret. “What Is Closed Loop Control System? - Definition From Whatis.com”. 2019. Whatis.com. Accessed January 13 2019. https://whatis.techtarget.com/definition/closed-loop-control-system.
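The runaway dynamic of such a predictive loop can be sketched as a toy simulation. Every number here is invented and the ‘model’ is deliberately naive: two groups offend at the same underlying rate, but one starts with more recorded arrests. Each round, enforcement is sent wherever the record makes risk look highest, and only patrolled offenses get recorded, so the initial bias confirms and amplifies itself.

```python
# Toy sketch of a biased predictive feedback loop (all numbers
# invented). The 'model' targets whichever group has the most
# recorded arrests; new arrests are only recorded where patrols
# are sent, so the record feeds back into its own prediction.

def run_rounds(arrests: dict[str, float], rounds: int,
               patrols: float = 100.0, hit_rate: float = 0.1) -> dict[str, float]:
    arrests = dict(arrests)
    for _ in range(rounds):
        target = max(arrests, key=arrests.get)  # the model's 'prediction'
        arrests[target] += patrols * hit_rate   # arrests happen where we look
    return arrests

record = run_rounds({"A": 40.0, "B": 60.0}, rounds=20)
print(record)  # {'A': 40.0, 'B': 260.0}
```

Group B began only slightly ahead in the record, yet after twenty rounds the system has ‘proven’ it is far riskier: the data the loop produces is an artifact of where the loop chose to look, not of any difference in behavior.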

Figure 9: Feedback loop of real time screen shots from, Active Track, productivity tracking software, Mariah Blue 2018

18 Larsen, Ben. „Feedback Loops In Predictive Models“. 2017. Analytics4all. Accessed January 13 2019. https:// analytics4all.org/2017/01/04/feedback-loops-in-predictive-models/.

Brian Massumi, in What Animals Teach Us About Politics, writes about the “ludic,” or play, gesture as a mental abstraction and an activity which creates “the conditions for language” (Massumi 2014, 8). It is pre-language. In the “metacommunicative” action of play, an animal is able to imagine future scenarios through the improvisation of a future activity. For Massumi, abstraction is a lived reflexivity, and mental power is the ability to improvise. He does not limit this mental power to the more complex life forms: improvisation does not need a brain, and, according to Massumi, inventive gestures extend all the way down to the amoeba or to living cells (Massumi 2014, 14). He writes, “instinct is not limited to the automatic repetition of a reflex arc triggered by an inherited memory trace” (Massumi 2014, 14). In other words, evolution in life systems is not reducible to a set of rules; it is the “ludic,” or improvised, gesture that is instinctual.

Massumi discusses the pre-language “ludic gesture” in terms of paradox. An example of this is the learning behavior of play fighting carried out by a wolf cub, which prepares for the future activity of real fighting but is simultaneously based on the reflexive history of previous fighting within the pack. In Massumi’s example, there is combat and the playing at combat, or one activity standing in for another (Massumi 2014, 23). It is in this paradox that the capacity for complex abstraction, meaning making and inventiveness takes place. Massumi cites Gregory Bateson in his analysis of paradox. According to Bateson, if you present a paradoxical question “to a computer, the answer will come out YES. . . NO. . . YES. . . NO. . . until the computer runs out of ink or energy or encounters some other ceiling. (...) Logic cannot model causal systems, and paradox is generated when time is ignored” (Bateson 1980, 117). Paradox is based on “classification and meta classification”, or the way information is organized in order to create new information, which is again reclassified when looped back on itself, creating contradiction (Bateson 1980, 117).
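Bateson’s YES . . . NO oscillation can be made literal in a small sketch. This deliberately naive fragment treats the liar paradox (“this statement is false”) as a feedback loop: each evaluation negates the previous answer, so the output never settles into a logical conclusion.

```python
# Bateson's oscillating computer, sketched naively: evaluating
# "this statement is false" negates whatever was answered last,
# so the loop alternates YES/NO forever (here, for six passes).

def evaluate_liar(previous_answer: bool) -> bool:
    # If the statement was last judged true, it must now be false...
    return not previous_answer

answer = True
trace = []
for _ in range(6):
    trace.append("YES" if answer else "NO")
    answer = evaluate_liar(answer)

print(" . . . ".join(trace))  # YES . . . NO . . . YES . . . NO . . . YES . . . NO
```

Only a hard limit on the number of passes halts the loop, which is exactly Bateson’s point: timeless logic has no resource of its own for resolving a circular sequence; it simply runs until energy, ink, or some other ceiling intervenes.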

In paradox, there is ambiguity. Nothing has been preordained. It works simultaneously from a lived past and an indeterminate future to decide the now. Therefore, the answer is simultaneously both true and false; there can be no logical conclusion to a paradoxical question. This is also where computing falls short. Machine intelligence lacks this other kind of intelligence, which is contingent on the dimension of time. A speech algorithm is already mapped and consequently predictable in its answers to questions, unable to improvise a response to an unknown input. It has been pre-programmed and therefore lacks the chaotic spark to ignite a self determined conversation. Siri only answers questions she has been programmed to answer, just as a machine intelligence designed to cut costs in the healthcare sector is predetermined in its decision making regarding patient healthcare, regardless of real world circumstances. For this reason, artificial intelligence is an unrealized concept. Machines are unable to improvise or to independently perceive and reason based on “sequences of cause and effect.” Human rationale and judgement are already written into the DNA of technologies. Just as play fighting prepares for future fighting in the example of Massumi’s wolf cubs, it is our technology that allows us to imagine and prepare for future scenarios. As an artifact of human thinking, technology serves in the ordering and shaping of our natural world. Human thought is the binary code from which technologies are written.19 Therefore, is it “I” that is AI?

Figure 10: Paradoxical Mountain, video still from lewis trip, Younwon Sohn, 2018

19 This idea was inspired by the text “What Is Philosophy? Part Two: Programs And Realizabilities” by philosopher Reza Negarestani and should be seen as a departure point for future research. In this text he writes about human thought as a program or AI: “The core idea of artificial general intelligence should be seen as an integral part of thinking as a program that elaborates the operational consequences of its intelligibility” (Negarestani 2015, 2).

Bibliography

“Amazon Mechanical Turk”. 2019. Mturk.com. Accessed January 10 2019. https://www.mturk.com/product-details.

Author unknown. “Are Viruses Living?”. 2019. Virology.ws. Accessed January 12 2019. http://www.virology.ws/2004/06/09/are-viruses-living/.

Author unknown. “Entropy And The Second Law”. 1999. Physics.bu.edu. Accessed December 20 2018.

Author unknown. “The Psychology Behind Facebook Data Breach”. 2019. BBC News. Accessed January 12 2019. https://www.bbc.com/news/av/technology-43674480/facebook-data-how-it-was-used-by-cambridge-analytica.

Bakkum, D.J., Shkolnik, A.C., Ben-Ary, G., Gamblen, P., DeMarse, T.B., & Potter, S.M. Removing Some ‘A’ from AI: Embodied Cultured Networks. Embodied Artificial Intelligence.(2003) Georgia Institute of Technology, Atlanta, GA

Bateson, Gregory. Mind And Nature. Cresskill, N.J.: Hampton Press. 1979

Biddle, Sam. 2018. “Facebook Uses Artificial Intelligence To Predict Your Future Actions For Advertisers, Says Confidential Document”. The Intercept. Accessed December 23 2018. https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/.

Carhart-Harris, Robin L., et al. “The Entropic Brain: A Theory of Conscious States Informed by Neuroimaging Research with Psychedelic Drugs.” Frontiers in Human Neuroscience 8 (2014). Frontiers Media SA. doi:10.3389/fnhum.2014.00020.

Cheyer, Adam. “Samsung+Viv: The Intelligent Future.” BigSpeak Voice, October 6, 2016. Accessed November 25, 2018. https://www.bigspeak.com/samsungviv-intelligent-future/.

Apprich, Clemens. “Secret Agents: A Psychoanalytic Critique of Artificial Intelligence and Machine Learning.” In Digital Culture and Society, vol. 4, edited by Ramon Reichert et al. Bielefeld: Transcript Verlag, 2018.

Denes, Agnes. The Human Argument: The Writings of Agnes Denes. Edited by Klaus Ottmann. Putnam, CT: Spring Publications, 2008.

Dzodan, Flavia. “Algorithms of Austerity.” Lecture, Sandberg Instituut Critical Studies, Amsterdam, Netherlands, October 19, 2018.

Ellenberg, Jordan. “The Amazing, Auto-Tuning Sandpile.” 2015. Nautilus. Accessed December 29, 2018. http://nautil.us/issue/23/dominoes/the-amazing-autotuning-sandpile.

Etter, Lauren, et al. “Inside Facebook’s Political Unit That Enables the Dark Art of Digital Propaganda.” 2017. The Economic Times. Accessed January 14, 2019.

Fernyhough, Charles. “What Do We Mean by ‘Thinking’?” 2019. Psychology Today. Accessed January 10, 2019.

Fuchs, Christian. “Theorizing and Analyzing Digital Labour: From Global Value Chains to Modes of Production.” The Political Economy of Communication 1, no. 2 (23 January 2014).

Gabel et al. Dynamically Evolving Cognitive Architecture System Based on Third-Party Developers. US Patent US20140380263A1, filed June 17, 2014, and issued December 13, 2016.

Gorzelak, Monika, et al. “Inter-Plant Communication Through Mycorrhizal Networks Mediates Complex Adaptive Behaviour in Plant Communities.” AoB Plants 7 (2015): plv050. Oxford University Press (OUP). doi:10.1093/aobpla/plv050.

Haynes, Trevor. “Dopamine, Smartphones & You: A Battle for Your Time.” 2018. Science in the News. Accessed January 14, 2019. http://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/.

Larsen, Ben. “Feedback Loops in Predictive Models.” 2017. Analytics4all. Accessed January 13, 2019. https://analytics4all.org/2017/01/04/feedback-loops-in-predictive-models/.

Le Guin, Ursula K. “A Rant About ‘Technology.’” 2019. Ursulakleguin.com. Accessed January 5, 2019.

Li, Wenbin, Youakim Badr, and Frederique Biennier. “Digital Ecosystems: Challenges and Prospects.” In Proceedings of the International Conference on Management of Emergent Digital EcoSystems (MEDES 2012), 117–122. doi:10.1145/2457276.2457297.

Haraway, Donna. Manifestly Haraway. Minneapolis, MN: University of Minnesota Press, 2016.

Jørgensen, Sven. “Ecosystem Ecology.” 2019. Google Books. Accessed January 12, 2019. https://books.google.nl.

Massumi, Brian. What Animals Teach Us About Politics. Durham and London: Duke University Press, 2014.

Negarestani, Reza. “What Is Philosophy? Part Two: Programs and Realizabilities.” e-flux Journal #69 (January 2016). Accessed December 6, 2018. https://www.e-flux.com/journal/69/60608/what-is-philosophy-part-two-programs-and-realizabilities/.

Negarestani, Reza. “Labor of the Inhuman, Part II: The Inhuman.” In FR David – “Recognition,” edited by Scott Rogers. Glasgow: UH Books, 2017.

Nelson, Daniel. “Mutualism, Commensalism, Parasitism: Types of Symbiosis with Examples.” 2018. Science Trends. doi:10.31988/scitrends.10255.

Robles, Raissa. “How Cambridge Analytica’s Parent Helped Duterte Win 2016 Election.” 2018. South China Morning Post. Accessed January 14, 2019.

Rouse, Margaret. “What Is Closed Loop Control System? Definition from WhatIs.com.” 2019. WhatIs.com. Accessed January 13, 2019. https://whatis.techtarget.com/definition/closed-loop-control-system.

Santo Pietro, David. “Second Law of Thermodynamics.” 2019. Khan Academy. Accessed January 2, 2019. https://www.khanacademy.org/test-prep/mcat/chemical-processes/thermodynamics-mcat/v/second-law-of-thermodynamics.

Tsing, Anna L. The Mushroom at the End of the World: On the Possibility of Life in Capitalist Ruins. Princeton, NJ: Princeton University Press, 2015.

Wolpert, David. “Why Do Computers Use So Much Energy?” 2019. Scientific American Blog Network. Accessed January 1, 2019. https://blogs.scientificamerican.com/observations/why-do-computers-use-so-much-energy/.

“Viv.” 2018. Viv.ai. Accessed December 29, 2018. http://viv.ai/.
