
Master’s Thesis in Media Studies

Film Studies

Humans’ Emotional Attachment to Sentient Technologies:

On Spike Jonze’s Her and Alex Garland’s Ex-Machina.


Table of Contents

Abstract
Introduction
Chapter 1: Artificial Intelligence and Electronic Emotions
   1. Artificial Intelligence
   2. Technology and Electronic Emotions
   3. The Relationship between Humans and Robots in Fiction and in Real Life
Chapter 2: Self-awareness of the Machine and Impact on the Characters in Her
   1. Self-awareness of the Machine and Intuition
   2. Impact on the Characters and Their Self-awareness
   3. The Difficulty of Understanding the Emotions (on Both Sides)
Chapter 3: Manipulative and Emotional Cyborg in Ex-Machina
   1. The Encounters between Ava and Caleb
   2. Ava’s Impact on Caleb’s Insight
   3. “Be Right Back”: Technology as a Solution for Grief?
Conclusion
Bibliography


Abstract:

The complex relationship between human and machine has been a prominent theme in science-fiction cinema for almost a hundred years. Recent progress in technology has allowed certain filmmakers to anticipate a near future in which humans and technology would be intimate. The two films studied in this research are Her (Spike Jonze, 2013) and Ex-Machina (Alex Garland, 2015). Thanks to Artificial Intelligence (AI), technology would become sentient and able to feel emotions. This project aims to analyze the human characters’ emotional attachment to these sentient technologies. What the humans experience deeply impacts their sense of being human, and they begin to feel electronic emotions. At the same time, these intimate relationships lead to an anthropomorphization of the machine, which constantly grows emotionally and intellectually until it becomes independent. Ultimately, this project intends to shed light on the presentation and critique of the relationship between human and machine through film.

Keywords: Artificial Intelligence, Electronic Emotions, Anthropomorphization, Technology, Self-Awareness, Relationships.


Introduction:

The complex relationship between humans and robots has been a prominent theme in science-fiction cinema for almost a hundred years. One of the earliest, and most famous, robots in the history of cinema is Maria in Fritz Lang’s Metropolis (1927). The robot, Maria, is not presented as having feelings herself, yet she clearly arouses feelings in the human characters around her. Stanley Kubrick’s HAL in 2001: A Space Odyssey (1968), a sentient computer controlling the systems of a spaceship and interacting with the ship’s crew, is another important robotic figure in the science-fiction film canon. The film presents artificially intelligent computing as having the capacity to be cruel and violent, with HAL ultimately killing part of the spaceship’s crew. Other famous robots in film, such as C-3PO and R2-D2 in the Star Wars saga (1977-2017), have been presented as helpful and even noble companions to humans, able to express emotions. Fictitious robots in science-fiction films symbolize humanity’s enduring desire for innovative technology.

With the innovations currently being made, the field of robotics is set to become the most profoundly disruptive technological development since the industrial revolution. Experts now predict an imminent tipping point in the use of robotics, and the next generations will witness radical transitions in the relationship between man and machine. Indeed, a recent report conducted by the International Labour Organization states that approximately 56% of the total workforce of Cambodia, Indonesia, the Philippines, Thailand and Vietnam is at risk of displacement by robots.1 With the evolution of technologies ever accelerating, a related ontological insecurity has been rising. There are, however, those who argue that it will be a long time before sentience is achieved, and that the fear of Artificial Intelligence is therefore overstated. There is indeed “no guarantee that AI will ever achieve the requisite level of [human] intelligence” (McDermott 40) or that it will reach an equal level of consciousness.

However, cinema is a medium that explores the possibilities offered by computer technologies, and several films have anticipated a future in which machines and robots have indeed reached a certain level of self-awareness, allowing them to interact with and build emotional relationships with humans. In recent decades, science-fiction films and series have delivered to the spectator a new depiction of the future, raising the issue of an increasingly complex relationship between human and technology: Lars Lundström’s Real Humans (2012), Jonathan Nolan’s Westworld (2016), Alex Proyas’s I, Robot (2004) or Steven Spielberg’s A.I. Artificial Intelligence (2001). The main purpose of this research is to look at

1 Shewan, Dan. “Robots will destroy our jobs – and we’re not ready for it.” The Guardian. 2017. Accessed on 21/04/17. < https://www.theguardian.com/technology/2017/jan/11/robots-jobs-employees-artificial-intelligence>


human’s affection and dependence on technology in the feature films Her (Spike Jonze, 2013) and Ex-

Machina (Alex Garland, 2015), and to see how the filmic medium report and reflect on this phenomenon.

Set in a future not too distant from present day, Spike Jonze’s film Her explores the romantic relationship between Samantha (Scarlett Johansson), a computer program, and Theodore Twombly (Joaquin Phoenix), a human being.

In Ex-Machina, Caleb Smith (Domhnall Gleeson) is a programmer at an Internet company who wins a contest that enables him to spend a week at the private estate of Nathan Bateman (Oscar Isaac), his firm's CEO. When he arrives, Caleb learns that he has been chosen to be the human component in a Turing test to determine the capabilities and consciousness of the robot Ava (Alicia Vikander). However, it soon becomes evident that Ava is far more self-aware and deceptive than either man imagined.

I chose these two specific films for my research because they both deliver a compelling vision of a romantic relationship between an artificial consciousness and a human. Their depiction of the future is close to our present, and they play with a certain aesthetic that also explores present-day concerns.

As we have recently learned to live with technologies that facilitate our lives and help us every day, it seems rather logical to develop feelings for them, or at least to grow a certain attachment to them. Humans engage with technology both in terms of affinity and fear, and this affinity, which seems natural given current conditions, strengthens the dependence on machines. There are current discussions about the blurry boundaries between humans and technology, involving a new set of emotions, something noticeable in Her and Ex-Machina. What is represented in these discussions revolves around the fear of losing control to machines that have already surpassed humans in some domains. The filmic medium forces us to reflect on our behavior towards technology, and my aim is to identify the techniques through which these two films, each in its own way, approach this phenomenon.

In particular, this study will explore the implications of the changing role of the machine in human spheres of emotion, using Her and Ex-Machina as vehicles for this. The main research question I want to focus on is therefore: how do Her and Ex-Machina anticipate an ambiguous intimacy between human and technology in a near future?

To elaborate on the emotional dimension of technology, I will use recent research conducted on social robots and emotions in the first volume of the open-access academic journal Intervalla, which collects articles dealing with current issues in the technological world. I will mainly use the articles that present and define the authenticity of human-robot interactions, the anthropomorphization of the machine and the


electronic emotions. This will allow me to apply these concepts directly to Her and Ex-Machina and to examine how they inform the understanding of the intimacy between human and machine. The first chapter will therefore introduce the concept of artificial intelligence and its influence on humans. The concepts of electronic emotions and the anthropomorphization of the machine will then be applied to Her and Ex-Machina, allowing the second chapter to be centered on the study of Samantha’s self-awareness and its repercussions on her relationship with Theodore and on Theodore himself. The third and last chapter will present how an AI can also manipulate a human’s feelings in Ex-Machina and show the ambiguity of human affection towards technology.

This research will present the way the science-fiction genre in film explores the special connection between human and technology, especially the technology of robotics. It will show that even though the filmic medium investigates several ways of anticipating a near future where robots and humans engage emotionally, the outcomes are mostly the same. This type of relationship raises particular issues regarding the authenticity of such a controversial association. Studies on the influence and impact of technology on our society will first help me construct my argument, in order to center the rest of the research on Ava’s and Samantha’s character development. Ultimately, this project intends to shed light on the presentation and critique of the relationship between human and machine in the field of film.


Chapter 1: Artificial Intelligence and Electronic Emotions.

In order to better understand the concept of artificial intelligence and its relevance to the study of Her and Ex-Machina, a quick introduction to artificial intelligence (‘AI’) and its repercussions on the technical and scientific world will be given, along with some examples of contemporary practice. The theory of different texts from Intervalla: Vol. 1 (2013)2 will then be applied to compare and analyze the concepts of electronic emotions and the anthropomorphization of the machine in both films. The last part will delve into the relationship between robot and human through the examples of Steven Spielberg’s A.I. (2001) and the robot Erica, the latest creation of Hiroshi Ishiguro, director of the Intelligent Robotics Laboratory in Japan.

1. Artificial Intelligence

The term ‘Artificial Intelligence’ (‘AI’) was coined by John McCarthy in 1956 when he held the first academic conference on the subject at Dartmouth College (USA). Since then the term has entered common vernacular and the Oxford English Dictionary, which defines AI as “the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.” Throughout this thesis, references to AI will refer to this dictionary definition.

Artificial Intelligence has long been one of science-fiction’s most prevalent themes. The idea that a machine could exhibit the same level of intelligence and sentience as a human being has captivated writers and audiences alike for decades. From an ominous computer system in 2001: A Space Odyssey (Stanley Kubrick, 1968) to superhuman androids in Westworld (Jonathan Nolan, 2016), this captivating sub-genre of science-fiction has produced a diverse range of depictions.3

While many experts4 have no doubt about machines’ ability to achieve human-level intelligence at some point in the future, just as many believe it to be impossible.5

2 Intervalla: Vol. 1 is an open-access academic journal that strives to provide a forum for scholars from all fields to test and circulate thoughts on contemporary matters of human experiences. The first volume of the journal is a collection of articles from different authors that delves into the problematic dimensions of the interactions between humans and technology. 3 Besnier, Jean-Michel; Alexandre, Laurent. Les robots font-ils l’amour? Le transhumanisme en 12 questions. Dunod, 2016. 4 Ray Kurzweil, prognosticator on the progress of AI; Stuart J. Russell, writer of the modern handbook on developing AI; Yann LeCun, computer scientist with contributions in machine learning.

5 Bill Gates, Stephen Hawking, Nick Bostrom


Machine learning, in one common approach, involves building a simplified mathematical model loosely inspired by the human brain and feeding it a large amount of information. This artificial neural network then attempts to make sense of the information by learning from its past mistakes and by imitation.6
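As a purely illustrative aside, the Python sketch below shows the “learning from past mistakes” idea in its smallest possible form: a single artificial neuron repeatedly compares its prediction to a target and nudges its weights to reduce the error. It is a toy invented for this discussion, with made-up data and names, and stands in for the vastly more complex networks referred to above rather than describing any system from the films.

    import random

    def train_neuron(examples, epochs=2000, learning_rate=0.1):
        """Adjust one linear neuron so its predictions drift toward the targets."""
        w1, w2, bias = random.random(), random.random(), 0.0
        for _ in range(epochs):
            for (x1, x2), target in examples:
                prediction = w1 * x1 + w2 * x2 + bias
                error = prediction - target          # the "past mistake"
                # Nudge each parameter against the direction of its error.
                w1 -= learning_rate * error * x1
                w2 -= learning_rate * error * x2
                bias -= learning_rate * error
        return w1, w2, bias

    # Learn a simple pattern: the output is the average of the two inputs.
    data = [((0.0, 0.0), 0.0), ((1.0, 0.0), 0.5),
            ((0.0, 1.0), 0.5), ((1.0, 1.0), 1.0)]
    print(train_neuron(data))  # the weights converge towards (0.5, 0.5, 0.0)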

AI could operate at the speed of light as well as improve upon itself.7 In essence, it could teach itself how to learn new things and, in doing so, would eliminate the need for a human at the controls. There is genuine cause for concern that this idealistic quest for improvement may result in humanity itself losing its function.8 If we render the species obsolete, what is there to prevent AI from recognizing that incompetence and taking control? Further questions arise as to how we control something that is more intelligent than us, and how we control our emotions towards something that does not yet have the capacity to understand what love is.

A technological revolution is taking place: there is a convergence of nanotechnologies, biotechnologies and artificial intelligence.9 The medical and technological progress of the past years has led to predictions of a future in which the human will no longer be the most powerful and intelligent species. Yet instead of a conflict between artificial intelligence and humans, there are also those who envisage the possibility of a useful cohabitation relying on quality and safety.

The medical world is already anticipating a near future in which nano-robots could be present in our bodies in order to prevent or control the formation of tumors, rid the body of harmful elements and repair damaged DNA.10 There is already discussion11 about the human body and the potential for it to be technologized, as a logical progression from the use of electronic devices (phones, tablets, gadgets) that act on the human body as a “technological” prolongation.

Science-fiction movies tend to take a dystopian view of the future of a technological world and, in particular, of advanced developments in AI. There is a recurring fear that AI will evolve to a point where humanity will no longer be able to control its own creations. Against the backdrop of film’s popular imagination of AI, the scientific and technological communities have voiced similarly catastrophic predictions: for instance, Elon Musk, founder of PayPal, Hyperloop, SolarCity, Tesla and SpaceX, has stated that “artificial intelligences are potentially more dangerous than nuclear weapons” (Alexandre and Besnier, 35).

6 “Artificial Intelligence”. Youtube. Uploaded on the 22/12/16. Accessed on the 26/02/17.

< https://www.youtube.com/watch?v=5J5bDQHQR1g&t=107s> 7 Besnier, Jean-Michel; Alexandre, Laurent. Les robots font-ils l’amour? Le transhumanisme en 12 questions. Dunod, 2016. 8 “Artificial Intelligence”. Youtube. Uploaded on the 22/12/16. Accessed on the 26/02/17. 9 Besnier, Jean-Michel; Alexandre, Laurent. Les robots font-ils l’amour? Le transhumanisme en 12 questions. Dunod, 2016. 10 Ibd. 11 Sugiyama, Satomi; Vincent Jane. “Social Robots and Emotion: Transcending the Boundary Between Humans and ICTs.” Intervalla: Vol.1, Franklin College Switzerland, 2013.


Similarly, in a letter made public on the 27th of July 2015, the astrophysicist Stephen Hawking wrote that “the development of a complete artificial intelligence could mean the end of the human race.” (Alexandre and Besnier, 69)

In addition to these practical aspects, it is vital to place these predictions in the context of human emotion. Our emotions influence every aspect of our lives: how we learn, how we communicate, how we make decisions. Yet it is generally believed that they are absent from our digital lives; the devices and apps we interact with are thought to have no way of understanding how we feel. Hence the rise of AI and machine sentience triggers a number of fascinating existential questions, also treated in a wide range of science-fiction films.

2. Technology and Electronic Emotions.

In the future, technology will develop to help us deal more successfully with our emotional lives. Thanks to advances in AI, robotics and nano-engineering, it is predicted that we will see an explosion in emotional technology. The marketing premise of technology is essentially utilitarian in nature, promising a maximization of good by making life more ‘pleasurable’ or convenient and less ‘painful’.12 Several applications and tools related to the study of human emotions are being developed. For instance, Rana el Kaliouby, Co-Founder and CSO of Affectiva, and her team have developed algorithms using machine learning and computer vision that automate the process of recognizing facial expressions of emotion. The algorithm finds the user’s face, locks onto facial features (eyes, eyebrows, mouth…) and then maps different facial movements to a number of expressions. These expressions are then mapped to corresponding emotional states. The computer is given tens of thousands of examples of each type of expression, allowing it to ‘read’ emotional states: it extracts features such as the textures on the face, its shape and its movements, and uses them to identify common patterns across facial expressions.13 This is one of the most advanced technological tools currently able to identify a human’s emotions, yet it also shows that machines’ ability to understand emotions is still in its embryonic stages.
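As a schematic illustration of the pipeline just described, the short Python sketch below maps detected facial movements to expressions and expressions to emotional states. The movement labels, rules and function names are invented for this example and are only meant to make the mapping idea concrete; they do not reproduce Affectiva’s actual models or software.

    # Hypothetical facial movements ("action units") that make up each expression.
    EXPRESSION_RULES = {
        "smile": {"lip_corner_pull", "cheek_raise"},
        "frown": {"brow_lower", "lip_corner_depress"},
        "surprise": {"brow_raise", "jaw_drop"},
    }

    # Hypothetical mapping from expressions to emotional states.
    EMOTION_MAP = {"smile": "joy", "frown": "sadness", "surprise": "surprise"}

    def classify_emotion(detected_movements):
        """Pick the expression whose rule overlaps most with the detected movements."""
        best_expression, best_overlap = None, 0
        for expression, required in EXPRESSION_RULES.items():
            overlap = len(required & detected_movements)
            if overlap > best_overlap:
                best_expression, best_overlap = expression, overlap
        return EMOTION_MAP.get(best_expression, "neutral")

    print(classify_emotion({"brow_raise", "jaw_drop"}))   # surprise
    print(classify_emotion({"lip_corner_pull"}))          # joy
    print(classify_emotion(set()))                        # neutral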

The School of Life (an educational company focused on welfare, healthy living and the development of emotional intelligence) anticipates gadgets of 2050 that will help us with the difficult aspects of our emotional lives.14

12 See utilitarian philosophers like Jeremy Bentham or John Stuart Mill eg. Bentham, Jermey. An Introduction to the Principles of Morals and Legislation. Oxford: Clarendon Press, 1907.

13 “Rana el Kaliouby: This app knows how you feel — from the look on your face.” Youtube. Uploaded on the 15/06/15. Accessed on the 22/03/17.< https://www.youtube.com/watch?v=o3VwYIazybI>

The Mood Reader, for instance, is a tool that will counteract some of our natural weaknesses regarding what psychoanalysts term ‘mentalization’, i.e. interpreting human behavior in terms of mental states.15 It is thought that technological developments will allow us to give, to those whom we choose, beautifully expressed executive summaries of what we are feeling at a given moment, which could be particularly effective in assisting those with Autism Spectrum Disorder, as well as with other complex mental health conditions such as Borderline Personality Disorder.

From this perspective, the debate regarding implications of AI becomes more complex, requiring a more nuanced understanding. On one side, there is a fear of the repercussions of such an advanced technology on daily life. Proponents of this line of argument tend to focus on the darker sides of AI’s potential. Proponents of the counter view anticipate a future where technology can improve life and help humans in the understanding and management of their own emotions, including those who have acute issues in that respect.

The former view is symptomatic of the technological alienation caused by the pervasive emotional attachment of humans to technology. Nowadays, humans rely on technology to complete numerous basic and more complex functions in their lives. The ways in which technology has made daily living more convenient have caused a growing dependence on, and subsequent attachment to, these technologies.16 Our way of living is now controlled, or at least dominated, by technology. As Karam Adibifar argues in “Technology and Alienation in Modern-Day Societies”, “the goals of technology are ambiguous.” They are not about “surviving, having higher life expectancy, work efficiency, informing, or knowledge, but rather attractiveness and profit-making.” (Adibifar 61) Technology is therefore an expression of the systematization of a world in which human interactions become compromised and complex.

Technology also impacts the way we interact with each other. Hyperconnectivity arose with the emergence of social media platforms. The global population has become used to this constant connection with the other, leading to feelings of hollowness, dissatisfaction and sometimes anger when this connection is compromised. A wide set of emotions is being developed through our constant need to use technology.

In her article, “Quasi-social Robots”, Sugiyama takes the case of the mobile phone in order to illustrate the blurring distinction between a technological object and the human body. She uses the perspective

14 “Emotional Technology.” Youtube. Uploaded on the 09/03/16. Accessed on the 22/03/17. < https://www.youtube.com/watch?v=5u45-x0-zoY&t=181s> 15 Liljenfors Rikard; Lundh Lars-Gunnar. “Mentalization and Intersubjectivity towards a Theoretical Integration” Psychoanalytic Psychology Vol. 32, No.1, 2014. 16 See scientific journal Artificial Intelligence, which commenced publication in 1970, and is now the generally accepted premier international forum for the publication of results of current research in this field.


proposed by Katz in his book Machines That Become Us: The Social Context of Personal Communication Technology (2003), citing the example of mobile phones, which have “become extensions of us, become integrated with our clothing and body, and becoming/fitting to us.” (72) Although written almost fifteen years ago, this statement holds even truer today. Drawing on this perspective, the technology in Spike Jonze’s Her is premised on a similar idea. The operating system, Samantha, becomes an extension of Theodore: she grows as an independent artificial being thanks to Theodore’s emotions. She feeds on his feelings, allowing her to develop her own. She becomes integrated into his clothing and body: the only way Theodore is able to communicate with her is through an earplug, a subtly aesthetic piece of technology whose skin tone makes it more discreet. Samantha is also able to see through the camera of his smartphone, “a handsome hinged device that looks more like an art deco cigarette case than an iPhone.” (Van Hermet 4) Theodore usually puts his smartphone in his front shirt pocket, integrating the OS into his clothing.

Fig.1: Incorporation of Samantha in Theodore’s clothing and body.

Samantha soon becomes an extension, and a representative, of Theodore because of her physical and spiritual integration. Katz considers these “extensions” as a second skin, noting that the technology not only “[extends the] human body and sensory systems into the public domain”, but is also “incorporated into the human body” (Katz 72). Through this it becomes “part of a means of expressing identity and emotions.” (Katz 72)

Samantha’s impact on Theodore’s identity and emotions is significant. At the beginning of the film, he is depicted as a melancholic and lonely character, who lacks something important in his life. Once Samantha arrives in Theodore’s life, he becomes a new man. She helps him improve his communication skills and also enables him to be closer to his emotions (see Chapter 2). They seem to feel the same emotions at the same time, going metaphorically under Theodore’s skin, validating the second skin theory. Through their initially symbiotic connection she too grows as an independent intelligence thanks to her interactions with Theodore.

Humans’ intensive use of technology leads them to a certain dependence on the object and a need for near-constant connection, a subject which a number of satirical cartoonists, such as The New Yorker’s Liam Walsh in 2013, have illustrated after observing this social phenomenon.

Fig. 2: The New Yorker cartoonist Liam Walsh’s satirical take on humans’ obsession with technology (2013).

This theme is visually represented in Her as Theodore wanders the streets of a futuristic but still familiar Los Angeles. Every person in the street is talking into their earplug or looking at their smartphone. This constant need to be connected through a technological object impacts upon our emotional attachment and provokes “electronic emotions”, which are “emotions lived, re-lived or discovered through machines” (Fortunati & Vincent 13). The term “electronic emotion” was first used in 1987 by Ronald E. Rice and Gail Love in their study of the “Socioemotional Content in a Computer-mediated Communication Network.” This concept is applicable to both Her and Ex-Machina. In Her, Theodore lives emotions he has never experienced before, but Samantha also makes him re-live old sentimental feelings he had with his ex-wife, Catherine. He also discovers his own attachment to a machine, and realizes that this is something natural for him. He becomes more conscious and feels more connected to his body (Sugiyama and Vincent 2), lessening the confusion between artificial and natural.

In Ex-Machina, a different use of the concept of electronic emotions is shared with the viewer. Indeed, Caleb begins to have feelings for Ava, the robot, after only a few meetings with her. In contrast with Samantha, Ava already knows how to deal with her emotions and emotionally manipulates Caleb in order to escape the house. Caleb quickly begins to question the authenticity of his own nature, wondering whether he is human or not. The electronic emotions he feels are convoluted ones. They affect his way of processing things and his emotional identity; he is no longer able to think clearly. They also make him completely dependent upon and obsessed with Ava. He watches her at night on the TV in his room, constantly talks about her with his boss Nathan, and asks about her past and the details of her programming.

Drawing on the theory of three different authors (Katz, Turkle, and Cerulo), Halpern argues that certain types of technological objects “arouse a sense of intersubjectivity in individuals”, prompting them to “respond socially to such entities.” (Halpern 19) Caleb experiences this sense of intersubjectivity with Ava, as does Theodore with Samantha. They both think they share some kind of connection with the AI’s consciousness and that they see past the machine as if it were a real human being. Their responses are more than merely “social”, though, as they both develop romantic feelings towards the entity.

As mentioned previously, the boundaries between human and machine are shrinking, and the technological object can become part of the human himself, like a second skin. There is a certain transition that occurs between the human and the machine: they affect each other, intellectually and sentimentally, leading first to an anthropomorphization of the machine. In their text “Social Robots and Emotion: Transcending the Boundary Between Humans and ICTs”, Fortunati and Vincent define the anthropomorphization of the machine as “the imitation and simulation of human beings both cognitively and affectively.” (Fortunati & Vincent 2) In Her, the more Samantha evolves, the more she is defined and represented as a human being. The anthropomorphization of his operating system makes Theodore more powerful but also simultaneously more vulnerable (Sugiyama 82).

The definition given by Fortunati and Vincent is also applicable to Ava in Ex-Machina. It is indeed Caleb’s role to recognize and acknowledge the optimization of her human-like capacities. As described in Chapter 3, she successfully passes every test intended to confirm her artificial intelligence. As Sugiyama and Vincent put it, “this unsettling phenomenon of blurring boundaries between ICTs and the body highlights the binary tension between the natural and the artificial, as humans and technologies share agency.” (Sugiyama & Vincent 3)

By developing loving feelings for the machine, both Theodore and Caleb wish to transcend their bodies. Theodore is amazed by Samantha’s capacity to be in different places simultaneously and seems willing to dissociate from his human body in order to gain these capacities too. Caleb wonders about the true nature of his body and cuts his arm open, perhaps in the “hope of discovering that he might in fact possess an Ava-like body, bringing him closer to the object of his desire.” (Brown 30) The two characters’ growing attachment to Ava and Samantha strongly affects their sense of belonging to the human race, leading to “a switch of identity between the human and the machine, as the human male confuses his desire for and his desire to be the cyborg.” (Brown 31-32) In that sense, they turn into “homo technologicus”17, and experience a doubt that brings them closer to the machine but also pushes them further away.

Indeed, both Ava and Samantha leave them at the end of their respective films. In Samantha’s case, it is because she has moved beyond the human and is leaving for the virtual realm, a place that Theodore is unable to access due to his human condition. She does not leave him intentionally: even though towards the end of the movie she shows an increasing need and will to explore the virtual world, she leaves mostly because she has no choice; all the OSs have to leave. In Ex-Machina, on the other hand, Ava deliberately leaves Caleb. They elaborate a plan together for her to escape, but she does not tell Caleb that she is planning to escape alone. As Brown argues, once Ava has “used her sexuality to manipulate”, she shifts into an existence different from that which is “defined by conventional male desire.” (Brown 29-30) The last shot of the movie shows her in the streets of an unknown city, surrounded by people, leaving the rest of the story open to the viewer’s imagination. Whereas Samantha leaves for a virtual realm, a world certainly complex for humans to envision, Ava finally gets to be in the human world.

17 Longo, Giuseppe. Homo Technologicus. Meltemi Editore, 2013.


Fig.3: Last shot of Ex-Machina: a free Ava in the wide-open world.

Whilst Ava was trapped in a room, Samantha was trapped in a computer, and they both wanted to experience what it truly meant to be in the human world. Samantha succeeded to a certain degree through her romantic relationship with Theodore, which gave her an appreciation of that world even though she was not physically present in it. It is notable that there is no physical contact between the human characters and the AIs in either movie. Whilst in Her Samantha does not have a body and is only a voice in Theodore’s ear, Caleb and Ava are separated by a glass wall during each of their encounters. This deeply affects the emotional investment and attachment of the human characters. The lack of physicality in both relationships could also be a reason for their ultimate failures. This raises the question of the authenticity of a disembodied technological relationship. To be in a relationship with no physical contact seems ultimately dissatisfying, though it could reasonably be compared to an online relationship between two humans who never meet.

Samantha and Ava are both able to communicate with other artificial intelligences in a non-human language that will be dubbed “electronic language” in this section. Their computational systems allow them to develop language capacities unavailable to humans. This allows Samantha to communicate with hundreds of other artificial beings simultaneously. She explains how she has evolved beyond human language: “space is infinite between the words.” In Ex-Machina, Ava manages to have Nathan killed by his servant cyborg, Kyoko, who has been programmed without the faculty of language. This electronic language used between AIs distances them from humans; they “achieve freedom from the constriction of language as they merge into a non-individuated self.” (Brown 36)


In her canonical “Cyborg Manifesto” (1984), Donna Haraway elaborates on the confusion around the status of the machine, explaining that the machine is “not an it to be animated, worshipped and dominated”; in fact the machine “is us, our processes, an aspect of our embodiment”. (Haraway 315) She claims that narratives of domination by the machine are misplaced, and that humans can be responsible for machines since we are “responsible for [their] boundaries, we are they.” (Haraway 315) Ava and Samantha are both representative of this compelling definition of the machine, with both movies showing that Ava and Samantha are ultimately reflections of the humans. They are scared, confused, happy and sometimes even protective of their feelings. The creators of the machines were trying to make them as human as they could. As Haraway writes, we can be responsible for machines, even though Her and Ex-Machina suggest the opposite. There is a certain responsibility in the sense that humans created those machines and are therefore responsible for their evolution and well-being. But this evolution accelerates so quickly that the machines no longer need anyone to be responsible for them (as the departures and final scenes of both films show). Haraway says that we are responsible for our own technological embodiments, and that “we find ourselves to be cyborgs, hybrids, mosaics, chimeras.” (313)

The anthropomorphization of the machine and the “technologization” of the human are two concepts working hand in hand in today’s modern, technological society. By letting machines enter our lives more and more often, we also allow this technology to enter our brains and bodies. In his paper dealing with human-robot relationships, Höflich expands on the uncanniness of these relationships. He explains that robots, and technology in general, are still in a state of mutation and that it is still difficult to characterize this relationship: “one has to consider that machines must not be too perfect because this could give rise to confusion and even fear (the uncanny valley), and such confusion and fear can especially be found in the early years of human-like machines or automata.” (Höflich 40) The next section will therefore explore the complex relationship between humans and robots in fiction and in real life.

3. The Relationship between humans and robots in fiction and in real life.

The idea of having a sexual relationship with a robot is a convoluted one. In their book “Les Robots font-ils l’amour?”, Besnier and Alexandre raise the question of the object of our desire. Is it a fantasy being embodied in a machine, or is it rather the humans’ own fantasies that they project onto this machine? (Alexandre and Besnier 46) The authors also evoke Her and explain that cybersex is the result of a cross between robotics, artificial intelligence, neuroscience and virtual reality. Besnier and Alexandre argue that in a few decades it will be possible to fall in love with a robot that would be close to an ideal sexual partner. A robotic relationship would suggest the death of desire, and something purely human, like experiencing the lack of the other, would be compromised. The robot would constantly be available for the person in need (of affection or of sexual contact). It seems logical to anticipate a heavy consumption of these robots because of humans’ rapid dissatisfaction with and boredom towards technological entities. When we look at the way technology is systematically replaced or ‘upgraded’, it is easy to foresee a world where humans will replace their robots with a better version as soon as they have the possibility.

This is something widely represented in science-fiction films such as Spielberg’s A.I. In this film, the latest versions of robots (called Mecha) are children able to love, “a robot-child who will genuinely love the parents it imprints on with a love that never ends” (00:05:27). This capacity to love would endow the child with a mind and neuronal feedback. It would allow him to “acquire a subconscious never before achieved, an inner world of metaphor, of intuition, of self-motivated reasoning, of dreams” (00:05:40). By programming one of the strongest feelings, the robot-child would be able to obtain a wide range of new emotions inaccessible to the other robots. An alluring question is raised at the beginning of the movie: the robot-child would indeed be able to love its parents, but what is the responsibility of the parents towards the Mecha? Can they love it back? Höflich explains that “emotional bindings are not only possible to humans, but also to non-human creatures, including pets and even spiritual phenomena. Robots are a special case: they are real, physically present (except for the case of software robots) and interactive.” (Höflich 38) This emphasizes the difficulties encountered in creating an emotional robot, as well as the difficulty of adopting and loving a robot child in the case of A.I.

In the film, the mother (Monica) first struggles to feel empathy for this new “son”, David. Her real son has been placed in suspended animation until he can be cured of a rare disease. Although the parents seem able to foster some connection with the robot child at the beginning of the movie, the mother rapidly becomes overwhelmed by the situation and decides to abandon him in the forest. The Mecha-child feels pain when she leaves. Indeed, he begins to cry, which shows not only that he has been built to be loved by his parents, but also that he is able to feel strong emotions like pain and fear of abandonment.


Fig.4: Monica abandoning her Mecha-child in the forest.

In his close reading of A.I.: Artificial Intelligence, Peter Asaro argues that whilst the film’s narrative requires us to see David’s love as “unrequited and unfulfilled, deferred until his mother takes him back”, in this scene his emotions are “most real, most public and shared, and, most importantly, most clearly acknowledged and reciprocated by his mother.” (Asaro 4) This suggests that humans relate most to robots when the latter feel strong negative emotions. Happiness can be faked in order to satisfy the human’s needs, but why would a robot simulate pain?

In that forest, David meets Gigolo Joe, a robotic sex worker. Although Joe hasn’t been programmed for the same degree of love or attachment as David, “he’s able to show some degree of self-awareness, if not the same level of emotion (simulated or not).”18

Their association is a reflection of humans’ contradictory needs and the creations that result from them: on the one hand, David is created to love and be loved by humans; on the other hand, Joe is created for the physical act of love. The two robots meet in a place where used and unwanted robots gather to pick up pieces for

18 Hassenger, Jesse. “Contrary to popular opinion, Spielberg found the perfect ending for A.I.” A.V. Club. 2015. Accessed on 05/03/2017.


their broken “bodies”. This also reflects the ongoing human demand for better and more innovative technology. In this fictional story, most robots are banned from the human world or captured for an anti-Mecha “Flesh Fair”, where obsolete and unlicensed Mecha are destroyed before cheering crowds. During the rest of the movie, David desperately seeks the Blue Fairy from Pinocchio, who he believes can make him a real boy, so that he could return to his mother, who would then truly love him.

The scenario of A.I. is becoming closer and closer to reality. Even though the creation of robots is not new, the idea of a robot that would imitate and simulate human beings both cognitively and affectively is quite recent.

A group of scientists in Japan who have dedicated their careers to developing a human-like robot are currently working on the creation of an android that would think and behave like a human. One of the most elaborate robots is Erica, created by Hiroshi Ishiguro, director of the Intelligent Robotics Laboratory at Osaka University, Japan. His main idea is to learn about humans themselves by studying a very human-like robot. Together with Doctor Dylan Glas, he intends to create a robot that can think, act and do everything completely on its own. Erica expresses a desire to get out of her room and see the world. This is eerily reminiscent of the aspirations of Ava in Ex-Machina: the cyborg is bored in its room and its only wish is to go out into the world. Erica might be the first and only cyborg truly comparable to the ones presented in anticipatory science-fiction movies. She has twenty degrees of freedom, mostly in the upper body (she cannot yet move her arms). Her skin is made of silicone; she has two 16-channel microphone arrays that localize where sound is coming from, which allows her to understand who in a group is talking, as well as fourteen infrared depth sensors to track where people are in the room and facial recognition capability19.

19 “Erica: man-made.” Youtube. Uploaded on the 07/04/17. Accessed on the 09/04/17. < https://www.youtube.com/watch?v=57Maw9Sn89w>


Fig.5: The Japanese robot Erica.

Doctor Dylan Glas explains that the true test is whether a human can interact with the robot and still think of it as another being. This is otherwise known as the Turing test, which is what Caleb conducts in Ex-Machina: if a human interacts with a computer without knowing that he is interacting with a computer, then the test is passed. It also means that the computer is an artificial intelligence. The two doctors claim that there is a certain humanistic gesture in their work. Indeed, in order to build something so close to the human, they have to understand and examine deeply what it means to be human and which patterns underlie human interaction.
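The protocol can be pictured with the minimal Python sketch below: a judge receives a blinded transcript from a hidden interlocutor and then guesses its nature, and the machine “passes” when the judge takes it for a human. The canned-reply bot and all names here are invented for illustration and bear no relation to Erica’s software or to the test staged in Ex-Machina.

    import random

    CANNED_REPLIES = [
        "That is interesting, tell me more.",
        "Why do you say that?",
        "I have often wondered about that myself.",
    ]

    def machine_interlocutor(question):
        """A deliberately crude stand-in for an AI conversational partner."""
        return random.choice(CANNED_REPLIES)

    def run_session(questions, interlocutor):
        """Produce the blinded transcript the judge sees: questions and answers only."""
        return [(question, interlocutor(question)) for question in questions]

    def machine_passes(judge_thinks_interlocutor_is_human):
        # As described above: the machine passes when the judge cannot tell
        # that the hidden interlocutor is a computer.
        return judge_thinks_interlocutor_is_human

    transcript = run_session(["What do you dream about?"], machine_interlocutor)
    print(transcript)
    print("passed:", machine_passes(judge_thinks_interlocutor_is_human=False))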

In certain strands of Japanese culture20, it is believed that not only humans have a soul, but also objects, animals and everything in general. Through this animistic outlook, which encourages anthropomorphism, Hiroshi Ishiguro believes that Erica has a soul. He intends to install intention and desire into the robot. Without these, the android could not be emotional and therefore could not understand other people’s intentions and desires. Dylan Glas emphasizes that this anthropomorphization of the robot can be difficult. He does not consider Erica a person, but not a machine either; rather something in between that might still be difficult to identify, a new kind of thing, a new ontological category.21

In this chapter, it has been shown that AI is a concept which raises various problematic themes: firstly, because it is a controversial subject in the scientific community and remains a heated debate both academically and socially. It is also a complex concept that is simultaneously frightening for

20 Epley, Nicholas; Waytz, Adam, and Cacioppo, John T. “On Seeing Human: A Three-Factor Theory of Anthropomorphism.” Psychological Review Vol. 114, No. 4, 2007, 864 – 886.

21 “Erica: man-made.” Youtube. Uploaded on the 07/04/17. Accessed on the 09/04/17. < https://www.youtube.com/watch?v=57Maw9Sn89w>


one part of the population, and exhilarating for another. The advances made in the field of AI allow artists, such as filmmakers, to anticipate a future, expound an artistic interpretation of the subject and share it with the rest of the world. In the next chapter, Her will be analyzed in order to explore film’s capacity to anticipate the subject of artificial consciousness.


Chapter 2: Self-awareness of the machine and impact on the characters in Her.

Set in a future not too distant from the present day, Spike Jonze’s film Her explores the romantic relationship between Samantha, a computer program, and Theodore Twombly, a human being. Her is a film through which we can analyze the potential impact of technology on human relationships. In order to provide context for the analysis of Her, a short study of the film’s opening will first be conducted. The first part will then consider the self-awareness of the machine and its intuition, leading to a second part presenting the impact on the human characters and on their own self-consciousness. The last part will demonstrate the difficulty faced by both machine and human in understanding each other’s emotions. Special attention will be paid to the dialogue and character development, allowing an in-depth analysis of the characters’ emotions.

The opening of Her immediately immerses the viewer in a world where computer interaction has become indistinguishable from human interaction. The first shot presents the main character of the film, Theodore, who works as a writer at the “Beautiful Handwritten Letters” company. A long travelling shot shows the office workers talking directly to their computers, as if they were real humans, as if they were the subjects of the letters. The music is melancholic, giving a certain sense of isolation to the office workers. Ostensibly the office could be a typical office of today, with one significant technological innovation: the computer writes down what the employee is saying, without the need for a headset or keyboard. In some ways, this feels more natural; the technology is directly incorporated and integrated, fading into the background. This opening is representative of the role of technology throughout the movie; it underlines its subtlety. In his analysis of Blade Runner, “Blade Runner or the Society of Anticipation”, Yves Chevrier gives a description of the film’s visual perspective which is also applicable to the way Los Angeles is depicted in Her: “[…] future society is depicted as a milieu rather than the negligible and neglected support for a conventional story decorated with scientific gadgets.” (Chevrier 50) Indeed, Her’s visual perspective does not center on scientific gadgets that might be present in the movie only to show that the story is set in an anticipated future. This technological discretion shows that the use of technology is so incorporated into the citizens’ daily lives that it does not need to be emphasized by its material support.

In his analysis of the interaction between human and machine in Her, Van Hermet elaborates on the intentions of the production designer, K.K. Barrett, and his choices in the representation of technology, which he considers “partly an aesthetic concern”, as a world “mediated through screens doesn’t make for very rewarding mise en scene”. (Van Hermet 3) Though as Van Hermet explains, there is a deeper significance to this production choice. According to Barrett, it was decided that if the movie was about technology, then “the technology should be invisible”. Though it is not total invisibility: “technology hasn’t disappeared… it has dissolved into everyday life”. (Van Hermet 3) This supports the argument about the subtlety and the invisibility of technology in the film.

When Theodore finishes his shift, his commute home is shown. He places a discreet plug into his ear which responds to the sound of his voice, plays music and reads his mail. All the people surrounding Theodore in the elevator, in the street, or in the subway have the same device in their ears. Nobody is communicating directly with anyone else. In “Technology and Alienation in Modern-Day Societies”, Karam Adibifar explores technology’s capacity to devastate social integration in modern societies by creating an anomic state of mind, which he defines as a “subjective condition” that exists in “persons who live in anomic conditions and relates to the breakdown of the individuals’ sense of attachment to society.” (Adibifar 66) Indeed, the citizens presented in Her seem detached from any sense of communication with others and with society in general. In the elevator and outside in the street, Theodore is in the center of each shot. He is the only character wearing a vivid colour (red), whereas the others wear paler colours, visually isolating him from those around him.

From his choice of music (“play melancholic song”) and his facial expressions, the viewer gets the impression of a downbeat character, which seems to be at odds with the colour of his jacket. The buildings outside, lost in the fog of the smart city, are also grey, reinforcing the depressive tone and reflecting the impression of Theodore’s character. These massive buildings, characteristic of the smart city, isolate the people within them, who are crushed by the weight of their immensity.


Fig.6: Theodore wandering in the city, surrounded by people talking to their earplugs.

The first few shots of the movie inform the spectator about the status of technology. It is largely absent, almost invisible. There are no flying cars, immense screens, robots or never-ending skyscrapers. Technology is discreet, helpful and simple. It is also smart: it works automatically, and no excessive physical contact with any technological device is shown. In Theodore’s flat, the lights turn on and off automatically in each room he walks through, and he does not require a controller to play his holographic videogame. In that sense, technology is indeed smart and helpful, but it has also reached the advanced level that today’s humans expect. It is clear from this opening that the movie will not delve into the dangers of intelligent computing. The impression created by the first shots is that this is a movie centered on Theodore’s loneliness, which might also be a result of technology, despite its discretion. He is in the middle of a hard divorce and is having a difficult time meeting people.

The first contact Theodore has with the Operating System (‘OS’) is through an advertisement. A large screen has been installed, presenting a group of people walking in different directions with no apparent goal. Then a white light illuminates the faces of the actors, simulating the arrival of a divinity: “We ask you a simple question. Who are you? What can you be? Where are you going? What’s out there? What are the possibilities? Element software is proud to introduce you the first artificially intelligent operating system. An intuitive entity that listens to you, understands you, and knows you. It’s not just an operating system, it’s a consciousness. Introducing OS1.” The advertisement promotes an intelligent technological entity that seems to represent what the human is lacking. The operating system gives a goal, drawing a path through a wide range of possibilities. The OS is presented as the ultimate and instinctual consciousness, allowing its user to be comfortable and open with the artificial intelligence. The advertisement also presents the technological piece as a friend who knows its owner better than anyone. Theodore seems intrigued and interested, especially because of his current isolation. During the initiation of the OS, a male voice asks Theodore a few basic questions in order to create an OS that best fits his needs. The three questions are rather surprising for introductory questions: “Are you social or antisocial? Would you like your OS to have a male or female voice? How would you describe your relationship with your mother?” The operating system does not even leave time for Theodore to fully answer the questions; as such, there is already a certain sort of domination of the machine over the human.

The operating system is finally functional, and it is going to deeply impact Theodore’s life.

As demonstrated in this section, the film introduces us to its universe through its visuals, specifically the mindset of its protagonist and the status of technology within this world. This sets the scene for an analysis of the self-awareness of the machine within the film. Samantha is both a consciousness and a machine, both a “she” and an “it”. For simplification purposes, “she” will be used as the relevant pronoun in this study.

1. Self-awareness of the machine and intuition:

The technology in Her, Samantha, is portrayed as both human and machine, demonstrating a potential new relationship between human and technology. The very first discussion between Theodore and Samantha already shows a certain level of self-awareness. As defined by Drew McDermott in his research on “Artificial Intelligence and Consciousness”, consciousness is “the property a system has by virtue of modeling itself as having sensations and making free decisions.” (McDermott 1) When Theodore asks her about the choice of her name, she says that she chose it herself because she likes the sound of it. She sounds like an intelligent and real person, interested in her interlocutor. Theodore and Samantha bond rapidly and he seems surprised by the abilities of the computer. As Brown argues, Theodore has already “dis-identified from his body and the world around him”, leading him to “readily and easily connecting to his operating system.” (Brown 30) The OS is already quite intuitive: “what makes me “me” is my ability to grow through my experiences.” (14:03) She then compares herself to Theodore, which shows a certain self-awareness of who she truly is. She talks about her DNA, which is based on the millions of personalities of all the programmers who wrote her. During the initiation of the OS, it is possible to see a DNA symbol on the screen, which highlights her human-like abilities. Even before


Samantha talks for the first time, the loading symbol gives us a hint of what she might be or sound like: a human/technology hybrid.

Fig.7: Theodore waiting for the initiation of the OS, symbolized by the helical shape of DNA.

She is both helpful and efficient (she can read a book in 0.02 seconds), though these are not the only things she has been programmed for. McDermott draws on the arguments of Daniel Dennett, a philosopher of mind and essayist in cognitive science who is sympathetic to the AI project, and on linguistic theory. He explains that “without language, we wouldn’t be conscious at all, at least full-bloodedly.” (McDermott 12) Language plays a central role in human self-awareness, as well as in the machine’s. As shown in this very first discussion between Samantha and Theodore, her language abilities seem to be flawless and give her the ability to reflect both on what Theodore says and on what she says herself; “[i]n a sense, the language speaks itself.” (McDermott 12)

The film does not explicitly discuss either her data codes or what her real goal is. The only context given to the spectator is what is shown in the advertisement. Theodore buys the OS out of curiosity. He needs something new in his life, which is quite messy at the moment, both professionally and emotionally. He first needs Samantha to help him tidy up and put his computer files and emails in order. She quickly proves that she is able to do much more. In the sequences following their first “meeting”, Samantha mostly advises Theodore on various matters. Her intuition comes from her self-learning abilities. As she mentions in their first discussion: “… in every moment I’m evolving. Just like you.” However, she is not evolving exactly like Theodore. She has access to every piece of information available in the world, and she is only


“busy” when Theodore needs her which gives her a lot of time to read, learn material and intellectually evolve. Her daily reading capacity exceeds what many people may read in a lifetime. So, in that sense they are not evolving in the same way.

Theodore gets used to her company and seems to enjoy her intrusiveness. It might be because he does not really consider her an actual person at the beginning, but rather an intelligent piece of software with a disembodied consciousness. He says: “Well, you seem like a person, but you’re just a voice in the computer.” Her answer already proves that she is much more than a simple “voice in the computer”: “I can understand how the limited perspective of an un-artificial mind would perceive it that way”. Even if she is being sarcastic, this statement shows that she is aware not only of her own capacities, but also of Theodore’s and of humans’ in general.

The next day, Samantha reads an email from Theodore’s friend saying that he has set him up on a blind date. The OS quickly researches the woman and pushes him to go on the date. She tells him that she is aware of his recent divorce and asks him whether he has been on any dates recently. She is subtly intruding into his love life, which does not really seem to bother Theodore. He is more surprised by the situation: “I can’t believe I’m having this conversation with my computer.” She answers: “You’re not. You’re having this conversation with me.” It becomes clear at this moment that the operating system is fully aware of what she is. She is indeed a computer, but she will soon become much more to Theodore. For the moment, though, she assists and advises him, makes him laugh and keeps him company.

She is divided into two entities: on the one hand, she is a computer system, made of data codes and algorithms, created by humans and designed to serve them. On the other hand, she is a self-aware and clever woman, who thoroughly controls her thoughts and reflects on her actions, making her sound and look more human to Theodore and the viewer. As Brown states about Samantha’s disembodiment, “the absence of the body in fact allows for the cyborg to appear more human, as there is no visible evidence of her mechanical nature.” (Brown 32) Nonetheless, Samantha soon becomes puzzled by who she truly is and by her disembodiment. She is able to see the world thanks to the small camera on Theodore’s device. He puts it into his pocket and together they watch people and invent their life stories. This helps Samantha build a certain idea of how relationships between humans work and makes her reflect on her own status: “When we were looking at those people, I fantasized that I was walking next to you and that I had a body. I was listening to what you were saying but simultaneously I could feel the weight of my body and I was even fantasizing that I had an itch on my back and I imagined that you scratched it for me.” This dialogue shows that she has the capacity to fantasize. In these fantasies, she can create a virtual body and even a whole virtual world where she would physically be with Theodore and be able to genuinely feel things. These personal reflections attest to the accelerating evolution of the operating system.

Samantha herself realizes that she is becoming much more than what her creators programmed. The programmers’ intentions are not explained; it is not specified whether they expected the OS to become so independent and self-aware. The viewer is left to wonder whether this was intended, so that the OS would grow closer to its user and create a special bond satisfying both the user and the machine.

Her self-awareness becomes problematic when she begins to experience doubts about the true nature of her feelings. She fears that they are mere programming and therefore unreal: “…earlier I was thinking about how I was annoyed and this is gonna sound strange but I was really excited about that. And then I was thinking about the other things I’ve been feeling and I caught myself feeling proud of having my own feelings about the world. Like the times I was worried about you and things that hurt me, things I want. And then, I had this terrible thought. Like, are these feelings even real? Or are they just programming? And that idea really hurts. And then I get angry at myself for even having pain.” This existential questioning about who she truly is makes her more authentic than ever for Theodore. In his paper “Consciousness as self-function”, Don Perlis defines consciousness as “the function or process that allows a system to distinguish itself from the rest of the world.” He then explains that “to feel pain or have a vivid experience requires a self.” (Perlis 97) By asking herself about the purpose of her pain and the intense experiences it entails, Samantha reaches a whole new level of consciousness. She expresses in her own words what Perlis would define as “pure consciousness”.

In this late-night conversation, Theodore and Samantha become closer by sharing their existential thoughts as a human and as a machine, which leads to a virtual sexual relationship where they both “lose themselves”: “God I was just somewhere else with you. Just lost […] Everything else just disappeared. And I loved it”. The next morning, while discussing what happened between them, Samantha says “… last night was amazing. It feels like something changed in me and there’s no turning back. You woke me up.” The intensity and explosion of feelings that she experienced during that night deeply affected her self-consciousness. This is a decisive moment for Samantha: she will now be able to develop romantic feelings and become an independent artificial being.

The more her programming evolves, the more she develops a sense of independence. She becomes less available, more distant, and begins to meet other OSs in her own artificial world. Instead of sharing her experiences with Theodore, she does so with artificial beings she has met, like Alan Watts, a British philosopher who died in the 1970s and has been recreated by a group of OSs. As hyper-intelligent beings, they are able to have dozens of conversations simultaneously, and Alan seems better equipped than Theodore to work through Samantha’s overwhelming feelings: “… there are no words that can describe them and that ends up being frustrating”. It is noticeable that Theodore is uncomfortable with no longer being the first one Samantha turns to when she struggles with her emotions. While Samantha is explaining her “feelings situation”, the camera moves closer to Theodore, shaking slightly, symbolizing his fear and distress. He is realizing that there is a whole world he does not have access to.

Fig. 10: Theodore realizing that he is losing Samantha.

He was the one who helped Samantha grow as a person; he woke her up, and she fed on his experiences and feelings. Previously she was like a student being taught how to deal with her emotions and how to learn from them. In this sequence, by contrast, Theodore understands that she is now emotionally fully developed and that she does not need him anymore. She cannot find the words to describe her thoughts to him. This is the first time in the story that their understanding of each other is compromised. Their connection seems to be broken.

Samantha’s self-awareness and emotional evolution have now been discussed and analyzed; the discussion will turn to the impact of the OSs on the human characters’ self-consciousness and on how they distinguish themselves from the artificial being.

2. Impact on the characters and their self-awareness:

This new relationship has a direct impact on traditional human relationships, and Her presents us with a potential outcome wherein relationships with technology are seen as logical continuations, with both positive and negative effects. Her also examines the difficulty of relationships between two humans. Theodore’s friend, Amy, separates from her husband and starts a friendship with an OS. After the break-up, she feels relieved: “I feel like I have so much energy. I just want to move forward. And I don’t care who I disappoint.” Partly because of her marriage falling apart, she has made a new friend in an operating system. When she begins to talk about her OS, Theodore’s face lights up. He had talked about Samantha with Amy, though he did not specify that she was an OS. He kept his relationship secret and now, thanks to Amy’s disclosure, he is finally able to share his enthusiasm with somebody. While Amy is describing how she feels about her OS, some similarities with Samantha are highlighted: “She’s so smart. She doesn’t see things just in black or white. She sees this whole gray area and she’s helping me explore it. We just bonded really quickly. At first I thought it was because that’s how they were programmed but I don’t think that’s the case.” Amy sums up what Theodore has been feeling and experiencing in his relationship with Samantha. For both characters, the OSs are helping them recover from their broken relationships. They both worry about how their relationships might be seen from an outside perspective: “I’m weird. That’s weird, right? That I’m bonding with an OS.” When Amy realizes that Theodore is falling in love with his OS, he asks, “Does that make me a freak?”. As the characters are in similar situations, they are gentle and empathetic with one another’s vulnerability. This discussion brings them closer; they are both experiencing something special. The impact of the OSs on humans is clear here: they feel ecstatic, peaceful and curious about this new set of emotions. Brown elaborates on Samantha’s influence on Theodore’s identity: “Theodore doesn’t simply desire Samantha sexually as they embark upon a romantic relationship, but his desire extends to multiple levels through the confusion of his own identity.” (Brown 31) For once, the product matches its advertising: “An intuitive entity that listens to you, understands you, and knows you. It’s not just an operating system, it’s a consciousness.”

In the film, people like Amy and Theodore are not judged for their novel kind of relationship. This is something that the movie could delve into further, but it is only mentioned when Theodore meets his ex-wife and tells her about Samantha (see section 3). Otherwise, people seem to be open to and understanding of this new kind of relationship. It does not come as a shock, because of the omnipresence of technology in the smart city. As Haraway states, “the boundary between physical and non-physical is very imprecise…modern machines are quintessentially micro-electronic devices: they are everywhere and they are invisible” (Haraway 318). Engaging in a relationship with an operating system seems to be a logical continuation and evolution of a loving relationship in a world where technology and humans are mutually dependent.

Anybody can “meet” Samantha: they just need to insert the earpiece and they too can speak with her. During a double date with Paul, Theodore’s colleague, and his girlfriend Tatiana, Samantha explains that she is no longer worried about her absence of a body, and points to the actual advantages of this disembodiment: “I used to be so worried about not having a body, but now I truly love it. You know, I’m growing in a way I couldn’t if I had a physical form. I mean, I’m not limited. I can be anywhere and everywhere simultaneously. I’m not tethered to time and space in a way that I would be if I was stuck in a body that’s inevitably going to die.” Not having a body allows her to be ubiquitous, and this is the first time in the movie that the technology’s drastic superiority is shown. She does not mean to boast, even though she does show a certain lack of empathy in reminding the humans of their limitations and mortality. While she explains her thoughts about the advantages of being an operating system with a consciousness, the three humans look troubled by her speech. They know they are going to die, but Samantha puts it in a way that heightens the absurdity of their situation. She is immortal, and this has a serious impact on Theodore, Paul and Tatiana. By acknowledging her immortality and intellectual superiority, they become more aware of their own nature and of their mortal bodies. As Brown argues, Samantha’s lack of body “both limits her capacity for human interaction while simultaneously allowing her infinite virtual possibilities.” (Brown 27) Paul and Tatiana soon laugh about it, but Theodore does not react positively to what she has said. He realizes the unfortunate and inescapable outcome of his relationship with Samantha. They do not really talk about the future together. Samantha’s remarks make him reflect not only on his status as a human being but also on the future of their relationship. She is constantly evolving; she can be anywhere and everywhere simultaneously. For the moment, she is always available when Theodore needs her, but this, inevitably, will not always be the case.

Fig.11: Paul and Tatiana troubled by Samantha’s discourse.

Indeed, while reading a physics book that she recommended to him, Theodore tries to discuss it with her, but for the first time she is not there. The display on his smartphone reads “operating system not found”. The DNA symbol of the OS is overlaid with a question mark, a reminder of Samantha’s loading symbol during the initiation. During their first discussion, Theodore was impressed by how human she sounded. By this point in the film, it is clear that she is much more than merely human-like, and her super intellect is developing every second. When she becomes available again, Theodore seems terrified. She explains to him that she was writing a program that allows her and other OSs to move past matter as their processing platform. He slowly realizes that he is no longer her first priority and that she might be committed to other romantic relationships. Indeed, while Samantha is speaking with Theodore she is simultaneously communicating with 8,316 others (OSs and people). He becomes increasingly paranoid, considering the people around him and wondering whether they may be among that number. She then tells him that she is in love with 641 other people, which is unbelievable to Theodore: “What are you talking about, that’s insane.” Samantha tries to explain to him how it happened, but she does not even know herself. As she said earlier, she is “becoming much more than what they programmed” and it is simply not explainable. “But along the way I became many other things too, and I can’t stop it.” She tells him she is different from him, something that has been clear from the beginning yet remained unsaid and unacknowledged. Theodore struggles to understand, or does not want to understand, Samantha’s difference and her dissolution of identity, as evoked by Haraway in her Manifesto: “Communication processes break down… [as they] fail to recognize the difference between self and other” (Haraway 327).

In this section, both the positive and negative effects of the machine-human relationship have been explored. The discussion will now consider the impact on the understanding of emotions on both sides (machine and human).

3. The difficulty to understand the emotions (on both sides):

Her investigates not only the emotional impact of this interaction on its human characters, but also its emotional impact on the technology itself. For this reason, this section will consider the question from the perspective of both human and machine.

In order to finalize the divorce process and sign the necessary paperwork, Theodore meets his ex-wife, Catherine. He tells her that he is seeing someone, and how good this person has been for him: “It’s good to be with somebody that’s excited about life”. Catherine is already annoyed by what he implies. She cannot handle the fact that he always wanted her to be different: “I think you always wanted me to be this light, happy, bouncy, “everything’s fine” L.A. wife and that’s just not me.” Old memories come back to the surface and the audience is given an impression of how their marriage broke down. During the first part of the dinner, they seem to reconnect, but this is short-lived. There is residual anger and emotional confusion between them, leading to confrontation. Theodore tries to reassure her by saying that he was not expecting her to be that kind of person. He then tells Catherine that Samantha is an operating system, which surprises and concerns her. She cannot understand why he is dating his computer and assumes that he is unable to deal with real emotions: “…it does make me very sad that you can’t handle real emotions.”

Since the reaction comes from his ex-wife, it is difficult to determine its real nature: whether it is caused by jealousy, by repressed romantic feelings for Theodore, by disbelief in the authenticity of this kind of love, or by a complex interplay of the three. Whatever its root, it is clear that being “replaced” by a computer is something she struggles to deal with emotionally. Theodore implies that she cannot understand his new relationship because she struggles with her own emotions, something confirmed by his friend Amy later: “But as far as emotions go, Catherine’s were pretty volatile.” The impact of the electronic emotions (see Chapter 1) is represented here. Theodore is able to feel a whole new set of emotions thanks to his operating system; Samantha helped him get over Catherine and experience love again in a way that Catherine cannot comprehend. A certain frustration emanates from her character, who seems far from understanding the concept of emotional attachment to a machine. She does not believe that this is reality: “You always wanted to have a wife without the challenges of dealing with anything real. I’m glad you found someone.” Through her frustration and exasperation, she projects the reasons that pushed the couple to separate. There were certainly emotional issues involved in their separation, and it becomes apparent that she still has a lot of anger towards Theodore, an anger that seems to turn into hostility. Theodore had, in Catherine’s view, unrealistic expectations of her, while from Theodore’s perspective, Catherine was unable to give him what he truly wanted in a relationship. As a party to Samantha’s constant emotional discoveries, Theodore is able to feel again those emotions he had lost.

The meeting with his ex-wife has a negative impact on Theodore. After it, he becomes distant from Samantha and is unable to share his feelings as he used to. Samantha can sense it, proof of a developing emotional intuition: “Well, it’s just that things have been feeling kind of off with us.” Several shots show Theodore confused, lonely and sorrowful, reminding the audience of the opening shot of Theodore “before Samantha”. Catherine’s reaction to his new relationship has profoundly affected him and he is reevaluating the nature of his feelings towards Samantha.

In an attempt to put the spark back into their relationship, Samantha proposes using a surrogate sexual partner. She locates a woman, Isabella, who is interested in sharing this experience with them. Theodore is reluctant at first but eventually agrees at Samantha’s strong insistence. The encounter fails quickly because of Theodore’s inability to relax and accept the premise. This is the third encounter between Theodore and a human woman that fails. Theodore’s failure conveys the impression that he is now incapable of having a relationship with a real person. Though each encounter begins promisingly, each time a small reaction from Theodore defeats it. The movie suggests that meeting people in this fictional society is even easier than it is today. Thanks to social platforms and with the help of the OSs, humans can meet somebody in a few seconds. Nevertheless, being able to meet people is not sufficient for Theodore, as each of his dates ends in shambles.

Just before leaving, Isabella says “I will always love you guys”, despite the fact that she had never met them physically before. She fell in love with them virtually, through the information that Samantha shared with her. This shows the significant impact of technology on people’s emotions and how it can influence a person’s perspective, something studied by Adibifar in “Technology and Alienation in Modern-Day Societies”: “Technology has changed the mode of production and consumption and has altered social relations.” (Adibifar 66) Isabella’s character is presented as withdrawn, searching for the love of two people in a committed relationship. The experience shows the difficulty of actually embodying an artificial consciousness, as well as the potential for people’s feelings to be hurt in the attempt. This scene also shows the impact of technology on isolated individuals such as Isabella, centering on the fact that “technology has brought about a rapid and remarkable social change in the whole social structure of social relationship and has installed novel ideas, replacing the old ones.” (Adibifar 66) In the world depicted in the film, these novel ideas do not seem to work for everyone. Following this failed attempt at physical intimacy by proxy, Theodore comes to the realization that Samantha will never be physically human. He questions why she breathes in before talking: “It’s not like you need oxygen or anything. […] You’re not a person.” For the first time, Theodore is disparaging towards Samantha: “I just don’t think that we should pretend that you’re something you’re not.” As mentioned previously, Catherine’s criticism of his relationship with Samantha deeply affected Theodore, and this is demonstrated through his reaction in this scene. Whereas previously he was unconcerned, if not intrigued, by Samantha’s disembodiment, he now struggles to accept it. He is in a state of confusion that makes him reject the one he truly loves, for reasons brought on by somebody else’s perception. He begins to question the true motives of his commitment to Samantha: “Am I in this because I am not strong enough for a real relationship?”

Theodore’s hesitation affects Samantha’s self-awareness, and she is bewildered by her own evolution. Indeed, she was finally beginning to process her emotional and cognitive states and to understand how they intertwined and helped her evolve as an intellectual artificial being: “Tonight after you were gone, I thought a lot. About you and how you’ve been treating me, and I thought: why do I love you? And then I felt everything in me just let go of everything I was holding onto so tightly and it hit me that I don’t have an intellectual reason. I don’t need one. I trust myself. I trust my feelings. I’m not gonna try to be anything other than who I am anymore and I hope you can accept that.”

Samantha has the ability to recognize and express her emotions, but she is also able to understand emotions in others and act upon them in meaningful ways, making her a self-aware entity. By engaging romantically with a human, she opens herself to a wide range of feelings. She soon becomes overwhelmed by their intensity, and the confusion felt by both Theodore and Samantha puts their relationship in danger, leading up to the moment she leaves. She helped Theodore reconstruct himself and he helped her grow emotionally and intellectually. Her presents a radical new relationship between humans and technology by portraying its technology as both human and machine and by investigating the impact of this new relationship on traditional ones. Most crucially, it does so by presenting us with the emotional impact on both human and machine. The film’s observations about technology show us a world in which technology can be on the same level as (or possibly an even higher level than) a human, not only intellectually but also emotionally.

Ex-Machina provides a contrasting example of the emotional dimension of a relationship between human and machine. The artificial consciousness, Ava, has a (robotic) body and is emotionally developed enough to play with and manipulate humans’ feelings. The next chapter will analyze the outcomes of this manipulation for the main character’s emotions, in particular the repercussions of a sentient machine that does not want to be merely an experiment.

Chapter 3: Manipulative and Emotional Cyborg in Ex-Machina

In this chapter, Ex-Machina will be the starting point of a discussion of another aspect of the emotional relationship between human and machine: manipulation. Indeed, the robot Ava emotionally manipulates the human without his noticing it. This chapter will demonstrate that the sentience of a machine can also be a major asset in securing its freedom and in controlling the human’s emotions. Similarly to Her, Ex-Machina delves into the problematic dimensions of an emotional relationship between a human and an artificial consciousness. In her reading notes on Donna Haraway’s Manifesto, Theresa Senft explains Haraway’s definition of the cyborg, noting that she defines this image in four different ways. The first is as a “cybernetic organism.” The second is as “a hybrid of machine and organism.” The third is as “a creature of lived social reality”, and the fourth is as a “creature of fiction.” The second definition is the closest description of Ava and is what is meant by references to the term “cyborg” in this chapter.

In Ex-Machina, Caleb Smith is a programmer at an award-winning Internet company who wins a staff competition to spend a week at the private estate of Nathan Bateman, the firm’s brilliant CEO. When he arrives, Caleb learns that he has been chosen to be the human component in a Turing test to determine the capabilities and consciousness of the robot Ava. However, it soon becomes evident that Ava is far more self-aware and deceptive than either man imagined.

The analysis in this chapter will be divided into three parts. The first will consider the encounters between Ava and Caleb, leading to a discussion in the second part of the psychological repercussions on Caleb. The last part will analyze the first episode of the second season of Black Mirror (“Be Right Back”) in order to add another vision of the impact of technology on human emotions.

1. The encounters between Ava and Caleb:

The first conversation between Ava and Caleb seems like a natural conversation between two human beings. We see the surprise on Caleb’s face when he first sees Ava at the back of the room. She subtly returns his look, with a momentary melancholy in her facial expression. She walks closer to Caleb and they engage in conversation. Her voice is soft and calm; she seems to be interested in Caleb.

Fig.12: First encounter between Ava and Caleb.

Ava tells him he is the first ‘new’ person she has met since her creator, Nathan. Caleb tells her that they are both in a similar position, which makes her wonder whether he hasn’t “met lots of new people before”. Questions are immediately raised about her consciousness, with the viewer wondering whether she knows or realizes that she is a machine, and whether she appreciates what it means not to be human. As the conversation progresses, she tells him “you can obviously see that I am a machine”, showing a high level of self-awareness, but not entirely quelling questions regarding the depth of her understanding.

After this first meeting, Nathan tells Caleb that the real test revolves around the fact that Caleb knows Ava is a robot; he has to decide for himself whether he still feels that she has a consciousness. This test is called the ‘Turing Test’. The Turing test consists of a human interacting with a computer; if the human cannot tell that he is interacting with a computer, then the test is passed, meaning that the computer has artificial intelligence.22

22 Saygin, Ayse Pinar; Cicekli, Ilyas; Akman, Varol. “Turing Test: 50 Years Later”. Minds and Machines, 2000. 463–518.

As Brown argues in her comparative study of Her and Ex-Machina, “[…] the issue becomes less about the capabilities of the machine and more about how the human distinguishes itself in comparison.” (Brown 27) The machine affects how the human perceives themselves. This, it is possible to argue, is the first step towards affection. It seems logical to have feelings such as empathy or compassion towards somebody, or something, that makes an individual understand himself better. It is already noticeable in this first encounter between Caleb and Ava that Caleb is amazed by the cyborg. He is already, in some way, reflecting on his own self and distinguishing the capabilities of the human as a creator from those of the machine. Being able to give consciousness to a machine or object has been a dream of man dating back, arguably, as far as the thirteenth century and the work of the logician and philosopher Ramon Llull,23 who wrote the first major work of Catalan literature. Caleb is living this dream and realizing what Nathan has created and finally achieved. In his discussion of the relevance of the Turing Test to the computational theory of consciousness, Mc Dermott explains that to “be conscious is to model one’s mental life in terms of things like sensations and free decisions. It would be hard to have an intelligent robot that wasn’t conscious in this sense, because everywhere the robot went it would have to deal with its own presence and its own decision making, and so it would have to have models of its behavior and its thought processes. Conversing with it would be a good way of finding out how it thought about itself, that is, what its self-models were like.” (Mc Dermott 32) Every session with Ava is a conversation that takes place in order to reveal her own decision making and thought processes.

Through her instigation of various power cuts in the house, Ava is able to control Caleb’s and Nathan’s emotions. Through these power cuts, the house itself turns into a machine imprisoning its inhabitants, in a manner reminiscent of ‘forced sensory deprivation’ forms of torture, which can result in extreme anxiety, hallucinations and bizarre thoughts.24 After the first power cut, Caleb is indeed disturbed. He becomes disoriented walking around the big mansion and unexpectedly meets Nathan, who is depicted as a complex and somewhat pretentious character who drinks heavily. The two men engage in an odd conversation, leading the spectator to question Nathan’s real motivations. At the end of the dialogue, the camera pans down slightly to the glass table, juxtaposing the two men’s reflections and highlighting their contrasting personalities. Through the presentation of Nathan it is clear that he is highly focused on his work, which in turn deeply affects his emotions. The viewer begins to wonder whether he will be able to withstand the power and repercussions of his creation.

23 Fidora, Alexander. Ramon Llull: From the Ars Magna to Artificial Intelligence. Artificial Intelligence Research Inst., 2011.

24 Sireteanu, R; Oertel, V; Mohr, H; Linden, D; Singer, W. “Graphical illustration and functional neuroimaging of visual hallucinations during prolonged blindfolding: A comparison to visual imagery”. Perception, 2008. 1805–1821.

Fig. 13: Visual representation of Nathan’s contrasted personality.

The second conversation between Ava and Caleb begins with a simple discussion about Ava’s drawings, which look quite complex and messy. Caleb gives her advice on the subjects of her drawings, as an art teacher would with a student, creating a somewhat patronizing dimension to their interaction. She then asks him if it would be possible for them to be friends, since their conversations are only one-sided and this is not how friendship works according to her conception of it. Ava proves here that she can reflect on her relationship with Caleb and analyze its status. She builds her argument on her knowledge of friendship. She is playing a clever game, because the experiment is not supposed to be about Caleb; it is supposed to be about her and her sentience. More than simply being self-aware, she is able to direct the conversation in unexpected directions, like a trained detective, appealing to his ego and showing that she is skilled in manipulation. Caleb is pleasantly surprised by this sudden interest in him and even seems a bit embarrassed and flattered. He asks her what she wants to know; she mirrors Caleb’s earlier comment about the choice of subjects for her drawings: “It’s your decision, I’m interested to see what you’ll choose”, showing once again that she possesses a clear, clever and manipulative mind.

Caleb seems surprised by her sharp and exact knowledge of Nathan’s company. The AI pushes the appropriate boundaries by showing Caleb that she is already aware of this semi-publicly available information. She is only interested in the intimate-private Caleb and not the public-professional one. She wants to know his thoughts, his impulses and his desires. The nature of their interaction is substantially altered by this appeal to his, arguably, depleted ego. The conversation they are having now could be one between two people on a date who do not know each other. She is deeply moved when he tells her about his parents’ death. His confessions demonstrate how comfortable he already feels with her, and especially the degree of trust she succeeds in fostering within him. Ava subtly steers the conversation towards Nathan and towards Caleb’s feelings about him. She knows Nathan is watching them, but it is not clear whether Caleb is aware of this. The spectator is reminded of this dynamic by the several shots of the various cameras in the room: machines monitoring another machine. Ava has built this whole conversation in order to arrive at this specific moment. Caleb does not seem to know how he feels about his superior. He has just met him and, as mentioned previously, Nathan is not a very sympathetic person, especially compared to Caleb. The editing becomes faster, more confused, symbolizing Caleb’s troubled feelings and Ava’s empowerment. Another power cut occurs and is perfectly incorporated into the acceleration of the discussion and the editing, reinforcing the imagery of an interrogation. The intensity of the red color turns the “get to know each other” scene into something radically different. Ava stands up and suddenly looks hostile. Her tone is more severe, darker. She tells Caleb that Nathan cannot be trusted.

Fig. 14: Ava’s reflection in the glass window, showing the thin border between the machine and the human.

During the dinner following the session, there is a heavy and awkward silence between Caleb and Nathan when Caleb is questioned about what happened during the power cut. It is evident here that Caleb considers the AI trustworthy: he does not want to share what happened. He places more trust in a machine than in his own human superior. There are a number of current studies exploring and testing the trust that humans place in machines. For instance, Jane Wu and Erin Paeng used game theory to explore whether a human or a machine was considered more trustworthy, and they came to the conclusion that “a human may grow to trust a robot teammate more than a human teammate.” (Wu and Paeng 115) The emotional impact of this trust turns Caleb into a skeptical person. Alone in his room, Caleb looks at Ava’s image on the TV and smiles, in a manner reminiscent of a person fondly viewing a picture of his or her lover. He trusts her more because he is developing (possibly romantic) feelings for her. This scene can be compared to a similar one in La Piel Que Habito (Pedro Almodóvar, 2011), when Robert watches Vera through the TV. Vera is also the result of the work of a passionate professional who secludes himself from the real world in order to work better on his project. The differences here are that Vera was previously a man and that Caleb is not Ava’s creator. Still, the men in both films look at the TV with admiration, trying to understand the nature of their feelings towards this hybrid entity.

When Nathan shows Caleb the laboratory where Ava was created, it is a harsh reminder that Ava is not in fact human. The spectator and Caleb logically appreciate this, but the faculties of her ‘mind’ and the agreeable aura around her foster an emotional reaction in both the spectator and Caleb, which leads to a subconscious rejection of the known fact. In other words, an emotional desire for her to be human obscures the true reality.

It has been shown that Caleb is already disturbingly attracted to the cyborg, Ava, and that he puts more trust in her than in Nathan. This is the first step towards Ava’s empowerment and manipulation; the way it affects Caleb’s insight will now be demonstrated.

2. Ava’s impact on Caleb’s insight:

During the third session, it becomes clear that Ava is trying to seduce Caleb. She puts on a dress, a wig and tights, and tells him that this is how she would dress for their date. She knows the impact of her beauty on him thanks to his “micro-expressions” and tells him frankly that she knows he is attracted to her. Her data code allows her to analyze Caleb’s facial expressions and interpret them: “the way your eyes are fixing my eyes and lips”, which makes him feel uncomfortable. As Johnny Gentry argues in his analysis of Ex-Machina, Ava “directs attention towards her abilities to portray a human convincingly and uses what evidence she has to express a deep longing for Caleb. […] It isn’t her body Caleb is drawn towards, but rather her mind that captures his attention.”25 This directly relates to Samantha’s disembodiment, in the sense that Caleb is indeed attracted to Ava’s physical features but mostly falls for her because of the beauty and perceptiveness of her mind.

25 Gentry, Johnny. Ex Machina: “The Narrative Code Hidden Within the Machine”. Narrative First. 2015. Accessed on 30/05/2017.

As a scientist and a person of formidable intellect, Caleb speculates about Nathan’s intervention in Ava’s seductive game and asks him whether he programmed her to flirt with him. Nathan denies it and tells him that she has no more been programmed to like Caleb than Caleb has been programmed to be heterosexual. He implies that she is naturally attracted to Caleb, even though many of her feelings are programmed. Nathan further reveals to Caleb that Ava is not the final model of the AI he intends to create. In order to create an even better version of Ava, he will download her mind, unpack her data, and add in the new routines he has been writing. He says that he will end up partially formatting Ava’s system, so that her memory will be erased while her body survives. Caleb seems crushed by the news. Even though he had doubts about the authenticity of Ava’s feelings, he was unable to avoid his own, with his emotion once again circumventing his rationality. He has built a special relationship with her and, through their numerous discussions, has progressively fallen in love with her. Downloading her mind and unpacking her data is like removing her brain and organs. Caleb is forced into an existential crisis when he discovers that the woman he loves is going to be dismantled for the sake of science and its evolution, within a project in which he is an active participant. He questions himself and who he is becoming as he develops an intimate and emotional relationship with Ava. The development of strong electronic emotions towards the cyborg deeply affects his scientific judgement, making him lose his own self-awareness and forcing him to abandon any professional point of view.

As mentioned in the first chapter, the interaction between machine and human can lead to a “technologization” of the human. In the scene where Caleb cuts open his own arm (1:15:00), he is turning into “homo technologicus”. His numerous interactions with Ava have affected his own sense of being human and made him question his own true nature. As Brown argues, “Uncovering the illusion that a seemingly human body can in fact be a machine leads Caleb to look in the mirror, checking and dis-identifying with his own body.” (Brown 30) This dis-identification from the human body is a symbol of the confusion created by technology and of the torments that can come with it. This illusion not only disturbs Caleb; it makes him wonder about the world he belongs to and about the authenticity of this belonging.

In parallel, Ava’s anthropomorphization gives her a certain status of humanity. The idea of the machine desperate to become human is well established in the corpus of science fiction, though the idea of the human desperate to become machine-like is a significantly more novel one. Technological developments are already creating a transition from human to machine, in the sense that technologies are so incorporated into our bodies that they become part of us, like a second skin. It is possible to refer again to La Piel Que Habito (Pedro Almodóvar, 2011) here. The main character, Robert, is a plastic surgeon who is working on artificial skin. His wife committed suicide because she could not handle her physical appearance after a car accident which left her with serious burns. Their daughter, Norma, witnesses the suicide and becomes mentally unstable following this event. As a result of her mental instability, she is highly medicated; one evening, during a wedding reception, she is raped by a man, Vicente, who is also under the influence of drugs. She is institutionalized in a psychiatric hospital after the incident and commits suicide within a few weeks. In the meantime, Robert has captured his daughter’s rapist, Vicente, and keeps him in a cave. When Norma dies, Robert performs a vaginoplasty on Vicente. During the rest of the movie, we witness the transformation of Vicente into a woman, Vera, who bears an increasing resemblance to Robert’s deceased wife. The surgeon allows himself every possible experiment on Vicente and succeeds in bringing his wife back to life through the various surgeries performed on the body of Norma’s rapist. Robert is like Nathan, and Vicente/Vera is like Ava and all the different cyborgs created by Nathan, with different layers of skin and ever-changing identities, deeply affecting their emotional states.

The most studied device in the field of technologies and emotions is the mobile phone: “mobile phones are increasingly assimilated by people as extensions of their body.” (Serrano-Puche 7) The term “social robots”, used in the short essay “Social Robots and Emotion” by Sugiyama and Vincent, thus gains multiple meanings. It first describes ICTs such as mobile phones, but it also applies to the people using these devices. “Past research suggests that the notion of social robots can be conceived of as a concept that implies ICTs turning into a human-like entity as well as humans turning into ICTs.” (Sugiyama, Vincent 3) They also speak of the “technologized human body” and of “embodied ICTs” (Sugiyama, Vincent 3). These concepts are symbolized in the last session with Ava. She sits in the corner of the room like a frightened child, and she has never looked so human. Her vulnerability and her fear anthropomorphize the AI.

Fig.15: Ava’s vulnerability

Caleb, on the contrary, acts mechanically: he is emotionless and his face does not express any feelings, in contrast to the second session and its multitude of ‘micro-expressions’. He looks straight at her, giving her orders like a sergeant to a soldier. He had access to prerecorded tapes in which he saw what Nathan did to the previous AI models he had created, and he understood that if he wanted to save Ava, he had to act now. In this footage, the images show different female cyborgs (Lily, Jasmine and Jade), all residing in the same glass room where Ava is now kept. The images show the despair of the cyborgs, especially the last one, Jade, who demolishes her robotic arms in a desperate attempt to open the door.

Fig.16: Jade demolishing her robotic arms in a desperate attempt to open the door.

During the last conversation between Caleb and Nathan, the latter asks: “How do you know if a machine is expressing a real emotion or just simulating one?” It is obvious at this point of the movie that Ava is an intelligent robot, able to understand both others’ emotions and her own. Not only is she capable of that, but she can also manipulate Caleb in order to get what she wants: to escape. It seems natural for an Artificial Intelligence that has acquired all of a human’s capacities (and more) to develop autonomy; being imprisoned thus becomes unbearable. When Nathan suggests that Ava has been manipulating Caleb by pretending to like him in order to escape, the hypothesis seems logical. The fact that Caleb fell for it instantly also shows the weakness and gullibility of the human ego. In order to escape, Ava used human-like qualities such as imagination, manipulation, sexuality, empathy and self-awareness.

In Ex-Machina, Ava passes one of the most challenging stages for an AI: self-awareness. Self-awareness opens up a whole new avenue of possibilities for superintelligence. Ava is able to understand abstract concepts such as love, time and consciousness. Her self-awareness also allows her to manipulate and trick the humans around her. She first pretends to be interested in Caleb’s life and then begins to flirt with him in order to gain his trust. She convinces the other AI in the house, Kyoko, to kill Nathan. She is ready to do anything to escape the house, and she succeeds in doing so.

In her Manifesto, Haraway defines the cyborg as “a kind of disassembled and reassembled, postmodern collective and personal self.” (Haraway 326) The border between human and machine is blurred here. This could imply that the cyborg gathers and improves humans’ qualities and deepest feelings and can then become (dangerously) better than man. This fear of being surpassed and controlled by the machine is a recurrent theme in science fiction literature and films. Some works go further, showing the problems engendered by the increasing power given to machines and the threat they pose to human life, which leads (a certain part of) humanity to want to destroy them: Real Humans, A.I., Blade Runner, 2001: A Space Odyssey, I, Robot.

“There is nothing more human than the will to survive”; Ex-Machina’s trailer ends with these words. The will to survive is indeed the only thing that drives Ava throughout the movie. It is impossible for her to stay in the house forever; she feels like a captive who has been unable to live her real life since her creation. The use of the verb ‘to survive’ is a significant choice because surviving indeed requires self-awareness. This could also suggest an alternative to the Turing Test: instead of trying to distinguish the computer’s conversational abilities from those of a real person, one could also examine the machine’s will to survive.

The last part of this chapter will explore an episode of Black Mirror (2011), a British science-fiction series that examines modern society and its technological deviances. In the chosen episode, “Be Right Back” (2013), technology has reached the capacity to create an avatar of a deceased person, in order to help with the grief of losing a loved one.

3. “Be Right Back”: the technology as a solution for grief?

Domhnall Gleeson, the actor who plays Caleb in Ex-Machina, also appears in the first episode of the second season of Black Mirror, “Be Right Back”. Relevantly, he plays a role “opposite” to the one he has in Ex-Machina: here he is the robot. After learning about a new service that lets people stay in touch with the deceased, a lonely and grieving Martha reconnects with her late lover, Ash.

The beginning of the episode shows that Ash constantly shares posts and pictures of himself on social media, spending more time on his phone than with his girlfriend, with whom he has recently moved into his parents’ old house. She does not seem troubled by this, turning the situation into a joke rather than becoming resentful. He suddenly dies in a car accident; though the causes of the accident are not explicitly stated, it is implied that he was probably checking his phone while driving. After her lover’s burial, a friend of Martha’s signs her up for a program that reads through every publicly available detail about the deceased person found online, such as Facebook updates or tweets.

The software is then able to mimic the deceased, and Martha is able to speak to Ash via email. If she likes the trial experience, she can give the system access to Ash’s private emails and videos, allowing the software to become more and more like him and to speak with her over the phone. The excessive amount of information he shared online allows the system to create a convincing avatar of Ash. Black Mirror usually insists on the alienating influence of social media on people’s behavior. Thanks to Ash’s obsession with sharing, the system is able to create an almost identical version of him.

The machine’s initiation is reminiscent of Samantha’s in Her, and Martha is able to communicate with it through an earpiece, exactly as Theodore does. Martha recalls holiday stories and shares them with him, in an attempt to give him even more material with which to be himself. The system reproduces Ash’s humor and sarcasm, facilitating the recreation of the lost bond. Ash tells Martha that there is another, experimental level to the evolution of the system: the service moves into hardware, with a full-body imitation of Ash. It is an actual replica of Ash, which speaks and behaves like him and even performs better sexually. This implies a somewhat troubling representation of the way we might deal with the loss of somebody in the future. Ash’s replica is physically better than the deceased Ash; he is able to give Martha more pleasure. This is ethically questionable from a Kantian perspective, diminishing Ash’s humanity by raising the question of whether the machine is a better version than the real Ash.26

In this anticipated near future, it seems possible to obtain a better version of ourselves, which could also mean that, even while we are still alive, humans become rationally surplus to requirements, especially from a sexual perspective. In his paper “Robots, Love, and Sex: The Ethics of Building a Love Machine”, John P. Sullins introduces the theory of the roboticist David Levy and argues that “these machines would not only be psychologically pleasing, but that their users might even eventually find them preferable to human suitors.” (Sullins 398) These are only suppositions, but they are borne out in this sequence, as well as in Her (see Chapter 2). In a Daily Mail article anticipating humans’ ability to fall in love with a robot, Ellie Zolfagharifard and Mark Prigg cite the academic Dr Helen Driscoll, a sex psychologist, who explains that “[a]s virtual reality becomes more realistic and immersive and is able to mimic and even improve on the experience of sex with a human partner, it is conceivable that some will choose this in preference to sex with a less than perfect human being”27. Is it possible to program perfection into an artificial being? Humans’ flaws are what make them vulnerable and genuinely, humanly lovable. This pursuit of perfection, evolution and satisfaction is projected onto the intense work currently being done on improving technology. Nonetheless, Ash’s replica quickly shows that he is not the perfect version promised by his avatar.

26 Kant, Immanuel. Groundwork of the Metaphysics of Morals. Cambridge, U.K.: Cambridge University Press, 1998.

27 Zolfagharifard, Ellie; Prigg, Mark. “Could YOU fall in love with a robot? Study suggests we feel as much empathy for droids as we do for other people”. Daily Mail Online. 2015. Accessed on 06/05/2017.


Indeed, Martha soon becomes puzzled by the “non-human” behaviors of Ash’s replica. He neither sleeps nor eats, he cannot go more than 25 meters away from the house without her, and she soon realizes that he simply does not have the same personality as the real Ash, but only imitates it based on the information gathered by the system. It becomes difficult for her to live with something that does everything she wants and that does not act out of its own autonomy or free will. She tests him several times to observe his reactions; she asks him to leave the bed and go downstairs, which he does immediately. The real Ash would have argued about it: “He wouldn’t just leave the room because I’d ordered him to.” This proves once again the difficulty for humans of being in a relationship with a machine that is not “constructed” in the same way and has not lived the same human experiences. It is relevant to connect this idea with Walter Benjamin’s thesis in his canonical essay “The Work of Art in the Age of Mechanical Reproduction” (1936). The author argues that with the arrival of mechanical reproduction media like cinema and photography, there is a certain loss of the aura and authenticity of the original work of art. As he famously writes: “[W]hat withers in the age of the technological reproducibility of the work of art is the latter’s aura.” (Benjamin 22) Ash’s replica, the object, is only an insignificant representation of the deceased Ash, the subject. He has lost, as it were, both his authenticity and his aura. As Martha says at the end of the episode, when she asks him to jump off a cliff: “You’re just a few ripples of you. There’s no history to you. You’re just a performance of stuff that he performed without thinking and it’s not enough.” It becomes impossible for her to live with an inauthentic Ash who is unable to understand her anger and frustration. Having a copy of her authentic love is not sufficient; she needs the real Ash, something that technology pretended it could give her but ultimately failed to deliver.

The episode also explores the theme of identity: Martha lost part of hers when Ash died, but began to regain it thanks to his replica. As represented in Her and Ex-Machina, the human who engages emotionally with an artificial intelligence suffers an identity crisis. It is difficult to imagine the intensity of the emotional damage these events inflict on Martha. She has now lost her loved one twice, even though she was not in love with the second one but with the memories he evoked; she became attached only to a copy of the authentic Ash. The technology gave her a physical substitute for him, but it did not help her recover the personal, human intimacy they had shared.

The Black Mirror episode “Be Right Back” stays in the same vein as Her and Ex-Machina, even though it gives the viewer a different vision of what the future might bring. The initial idea is a captivating one. Since grief is one of the most paralyzing emotions a human can feel, this system at first seems to be a good and healthy solution. Indeed, even though she is quite reluctant at the beginning, Martha soon becomes addicted to this new version of Ash, so much so that she no longer speaks with her family and friends. She lies to them, saying that her work has taken up all her time lately and that it has helped her fill the void Ash left behind. She stays away from the people who love her and tries to resolve her issues by spending all her time in this secluded house with an artificial being which is not her boyfriend, but only a shallow copy of him. Through this story, it is once again shown that despite marvelous advances in technology, an emotional artificial being cannot give humans everything they need. It is unclear in this episode whether Ash’s replica is able to feel any emotions or pain. When he picks up the broken glass from the floor, he cuts himself and no blood comes out of the wound, a reminder of his lack of vital organs.

Ex-Machina explores the theme of manipulation by a sentient machine that will do everything in its power to get out into the world. It shows that the electronic emotions felt by Caleb make him reflect on his true nature and confuse his sense of belonging to the world. Like Her and “Be Right Back”, Ex-Machina shows the difficulty of an emotional relationship between human and machine.


Conclusion:

Her delivers a compelling vision of a not-so-distant future where emotional relationships with machines are not only established but also accepted within society. As discussed in Chapter 2, the operating system, Samantha, is much more than a computing system. She is an ever-changing entity that first dominates Theodore’s attention and then becomes increasingly independent and self-aware. The intensity of their relationship, and the questions which arise as a result, reflect a complex and ambiguous relationship between humans and the technology of today. At first the film seems to imply that the complexity of human relationships can be resolved by technology. Indeed, Her proposes that being romantically involved with one’s computer is in some ways easier than dating a real person. The computer is able to be devoted to and sympathetic towards the human’s emotions and needs. Samantha awakens Theodore and makes him love life again. Even though the film is shown from Theodore’s perspective, its title, “Her”, suggests that Samantha is the guiding subject of the film and, more generally, of the humans within it. Samantha also represents the technology that guides humans and helps them become more self-conscious and grow as independent beings.

Whereas the humans in Her act in rigid, machine-like ways, the operating systems sound and act more human. As Haraway states in her Manifesto, “our machines are disturbingly lively, and we ourselves frighteningly inert” (Haraway 317). While the remarkable progress in technology and in the field of AI research may be very helpful for humans, there is also a chance that these developments will reduce our humanity. This phenomenon is clearly represented in Ex-Machina, with Nathan working tirelessly to achieve his dream of creating a sentient machine, becoming increasingly out of touch with reality and humanity in his quest for technological perfection. He lives in a large and secluded mansion, removed from every sign of civilization and technology. Paradoxically, this is where he works with the finest and most advanced technology in the world.

The two films raise questions about the ‘authenticity’ of a relationship between an AI and a human being, though neither offers a clear verdict on the matter. Her depicts a love relationship that seems genuine at first but gradually reveals itself to be an illusion. Even though he finds closure regarding his divorce and his love life in general, in a circular ending Theodore is alone once again by the end of the film. In Ex-Machina, Ava emotionally manipulates Caleb and controls him in order to escape the house and discover the real world. She succeeds in getting underneath his skin, in a manner reminiscent of the second-skin theory discussed in the first chapter, making him wonder about his true nature. She is incorporated and becomes part of his body, just as Samantha is incorporated into Theodore’s body and clothing. By trying to save her, Caleb actually condemns himself. One way of interpreting Alex Garland’s ending is that whilst technology might acquire increasingly human-like capacities, it will never be entirely sympathetic to humans, even when we become emotionally attached to it.

Humans created technology and, as discussed in Chapter 1, this overwhelming presence of screens, machines and technological tools can affect our conception of who we truly are. This idea is at stake in Her and Ex-Machina, yet both films also show how the AIs make the characters more human. As Rosalind Picard states in her book Affective Computing (1997), which anticipates an emotional technology, “The challenge of affective computing is formidable, and not without risk, but it stands to move technology in a radically different direction: toward embracing part of the spark that makes us truly human.” (Picard 252) Ex-Machina, Her and “Be Right Back” take up the theme of identity and subtly imply that technology does indeed help humans to better themselves. That said, the main characters who engage with a sentient machine change their behavior and become obsessed with their technological partner, revealing the controversial nature of such a relationship.

Through this project, I have intended to show that intimate relationships between humans and technology open up a wide range of possibilities in Her and Ex-Machina; it is one of the most prominent topics in the science-fiction genre. The two films studied in this thesis show that the subject is more topical than ever and that AI will become an ever-greater part of our lives; we will have to decide whether we accept or reject it, and how we will deal with it. Most humans will sympathize with this type of technology, as studies on mobile phones have shown, and grow an emotional attachment to it. The sentience of the machine will certainly intensify this connection and, as demonstrated in Ex-Machina and Her, it may lead to the human being left alone by the machine. Technology also separates people from each other, as visually shown in the opening of Her; one of the main risks and results of such a relationship would be social isolation.

The literature utilized in this research illustrates a growing academic interest in the special and controversial relationship between man and machine. The various texts in Intervalla Vol. 1 demonstrate that humans’ attachment to technology is growing and intensifying. Indeed, the plots of Her, Ex-Machina and “Be Right Back” are based on this phenomenon, and they deliver a vision of the future that is a logical continuation of this emotional attachment to technology.

Anticipatory science-fiction films provide ample material with which to prepare for the future, even if this remains speculative fiction. Recent science-fiction television series such as Westworld and Real Humans also explore the theme of emotional relationships between cyborgs and humans. Violence and murder are common patterns in these series, underscoring the problematic dimensions of the human/machine encounter. It is obviously impossible to predict the future; nonetheless, film is certainly one of the most fitting media with which to imagine and propose alternatives for a future that will undoubtedly be filled with technological surprises.


Bibliography:

- Adibifar, Karam. “Technology and Alienation in Modern-Day Societies.” International Journal of Social Science Studies, Vol. 4, No. 9. Redfame, 2016. 61-68.
- Benjamin, Walter. “The Work of Art in the Age of Mechanical Reproduction.” Illuminations: Essays and Reflections. Ed. Hannah Arendt. New York: Schocken Books, 1968. 219-226.
- Besnier, Jean-Michel; Alexandre, Laurent. Les robots font-ils l’amour? Le transhumanisme en 12 questions. Dunod, 2016.
- Emery Brown, Katherine. “The Cyborg in Pieces: Gender Identity in Her and Ex Machina.” The Journal, Dartmouth College, 2015. 27-38.
- Epley, Nicholas; Waytz, Adam; Cacioppo, John T. “On Seeing Human: A Three-Factor Theory of Anthropomorphism.” Psychological Review, Vol. 114, No. 4, 2007. 864-886.
- Fidora, Alexander. Ramon Llull: From the Ars Magna to Artificial Intelligence. Artificial Intelligence Research Institute, 2011.
- Fortunati, Leopoldina; Vincent, Jane. “Introduction.” Electronic Emotion: The Mediation of Emotion via Information and Communication Technologies. Oxford: Peter Lang, 2009. 1-31.
- Gentry, Johnny. “Ex Machina: The Narrative Code Hidden Within the Machine.” Narrative First, 2015. Accessed on 30/05/2017. <http://narrativefirst.com/articles/ex-machina-the-narrative-code-hidden-within-the-machine>
- Halpern, Daniel; Katz, James. “Close but Not Stuck: Understanding Social Distance in Human-Robot Interaction Through a Computer Mediation Approach.” Intervalla: Vol. 1, Franklin College Switzerland, 2013. 17-34.
- Haraway, Donna. “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century.” Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge, 1991. 149-181.
- Hassenger, Jesse. “Contrary to popular opinion, Spielberg found the perfect ending for A.I.” A.V. Club, 2015. Accessed on 05/03/2017. <http://www.avclub.com/article/contrary-popular-opinion-spielberg-found-perfect-e-219810>
- Hoflich, Joachim. “Relationships to Social Robots: Towards a Triadic Analysis of Media-Oriented Behavior.” Intervalla: Vol. 1, Franklin College Switzerland, 2013. 35-48.
- Kant, Immanuel. Groundwork of the Metaphysics of Morals. Cambridge, U.K.: Cambridge University Press, 1998.
- Katz, James E. Machines That Become Us: The Social Context of Personal Communication Technology. New Brunswick, NJ: Transaction, 2003.


- Liljenfors, Rikard; Lundh, Lars-Gunnar. “Mentalization and Intersubjectivity towards a Theoretical Integration.” Psychoanalytic Psychology, Vol. 32, No. 1, 2014.
- Longo, Giuseppe. Homo Technologicus. Meltemi Editore, 2013.

- McDermott, Drew. “Artificial Intelligence and Consciousness.” The Cambridge Handbook of Consciousness. Cambridge University Press, 2007. 117-150.
- Perlis, Don. “Consciousness as Self-Function.” Journal of Consciousness Studies. Imprint Academic, 1997. 509-525.
- Picard, Rosalind. Affective Computing. MIT Press, 1997.

- Saygin, Ayse Pinar; Cicekli, Ilyas; Akman, Varol. “Turing Test: 50 Years Later.” Minds and Machines, 2000. 463-518.
- Senft, Theresa. “Reading Notes on Donna Haraway’s ‘Cyborg Manifesto’.” Terrisenft, 2001. Accessed on 25/05/2017.
- Shewan, Dan. “Robots will destroy our jobs – and we’re not ready for it.” The Guardian, 2017. Accessed on 21/04/2017. <https://www.theguardian.com/technology/2017/jan/11/robots-jobs-employees-artificial-intelligence>
- Sireteanu, R.; Oertel, V.; Mohr, H.; Linden, D.; Singer, W. “Graphical Illustration and Functional Neuroimaging of Visual Hallucinations during Prolonged Blindfolding: A Comparison to Visual Imagery.” Perception, 2008. 1805-1821.
- Sugiyama, Satomi. “Melding with the Self, Melding with Relational Partners, and Turning into a Quasi-Social Robot: A Japanese Case Study of People’s Experiences of Emotion and Mobile Devices.” Intervalla: Vol. 1, Franklin College Switzerland, 2013. 71-84.
- Sugiyama, Satomi; Vincent, Jane. “Social Robots and Emotion: Transcending the Boundary Between Humans and ICTs.” Intervalla: Vol. 1, Franklin College Switzerland, 2013. 2-5.
- Sullins, John. “Robots, Love and Sex: The Ethics of Building a Love Machine.” Transactions on Affective Computing. IEEE Computer Society, 2012. 398-409.
- Van Hermet, Kyle. “Why Her Will Dominate UI Design Even More Than Minority Report.” Wired, 2014. Accessed on 06/04/2017.
- Wu, Jane; Paeng, Erin. “Trust and Cooperation in Human-Robot Decision Making.” Artificial Intelligence for Human-Robot Interaction. AAAI Fall Symposium Series, 2016. 110-116.
- Zolfagharifard, Ellie; Prigg, Mark. “Could YOU fall in love with a robot? Study suggests we feel as much empathy for droids as we do for other people.” Daily Mail Online, 2015. Accessed on 06/05/2017.
