Approaching Algorithmic Power

Luke Munn

A thesis submitted for the Degree of Doctor of Philosophy (PhD) at the Institute for Culture & Society, Western Sydney University, February 2019

Contemporary power manifests in the algorithmic. Emerging quite recently as an object of study within media and communications, cultural research, gender and race studies, and urban geography, the algorithm often seems ungraspable. Framed as code, it becomes proprietary property, black-boxed and inaccessible. Framed as a totality, it becomes overwhelmingly complex, incomprehensible in its operations. Framed as a procedure, it becomes a technique to be optimised, bracketing out the political. Struggling to adequately grasp the algorithmic as an object of study, to unravel its mechanisms and materialities, these framings offer limited insight into how algorithmic power is initiated and maintained.

This thesis instead argues for an alternative approach: first, that the algorithmic is coordinated by a coherent internal logic, a knowledge-structure that understands the world in particular ways; second, that the algorithmic is enacted through control, a material and therefore observable performance which purposively influences people and things towards a predetermined outcome; and third, that this complex totality of architectures and operations can be productively analysed as strategic sociotechnical clusters of machines. This method of inquiry is developed with and tested against four contemporary examples: Uber, Airbnb, Amazon Alexa, and Palantir Gotham. Highly profitable, widely adopted and globally operational, they exemplify the algorithmic shift from whiteboard to world. But if the world is productive, it is also precarious, consisting of frictional spaces and antagonistic subjects. Force cannot be assumed as unilinear, but is incessantly negotiated—operations of parsing data and processing tasks forming broader operations that strive to establish subjectivities and shape relations. These negotiations can fail, destabilised by inadequate logics and weak control.

A more generic understanding of logic and control enables a historiography of the algorithmic. The ability to index information, to structure the flow of labour, to exert force over subjects and spaces—these did not emerge with the microchip and the mainframe, but are part of a longer lineage of calculation. Two moments from this lineage are examined: house-numbering in the Habsburg Empire and punch-card machines in the Third Reich. Rather than revolutionary, this genealogy suggests an evolutionary process, albeit uneven, linking the computation of past and present.

The thesis makes a methodological contribution to the nascent field of algorithmic studies. But more importantly, it renders algorithmic power more intelligible as a material force. Structured and implemented in particular ways, the design of logic and control constructs different versions, or modalities, of algorithmic power. This power is political: it calibrates subjectivities towards certain ends, it prioritises space in specific ways, and it privileges particular practices whilst suppressing others. In apprehending operational logics, the practice of method thus foregrounds the sociopolitical dimensions of algorithmic power. As the algorithmic increasingly infiltrates and governs the everyday, the ability to understand, critique, and intervene in this new field of power becomes more urgent.
Practice-based Strand:

Where the theoretical strand breaks down Uber, Alexa, Airbnb, and Palantir into a series of machines, the artistic strand of this project creates a series of machines in the form of browser-based artworks. These artworks are not 1:1, mirror-like recreations of the previous case studies, nor are they offered as reiterations of them. While this practice draws upon discoveries from the theoretical strand, it reverse engineers them, rearranging elements in order to foreground new aspects that either critique the original or posit alternative models. A theoretical inquiry into the history of the algorithmic, for example, led to A Machine for Reducing Risk, a data visualisation of an 18th-century maritime insurance company that underpinned slave shipping. Realised as standalone works, the artworks have been presented in a range of contexts: online articles, digital galleries, and film festivals. Through both the making process itself and the feedback from various audiences, they offer insights into the algorithmic that can be read alongside the theoretical strand of the project.

Two basic motivations informed the creation of these artworks. The first was critique, a desire to foreground the often overlooked pathological politics of these technical regimes. By transforming these darker aspects of the algorithmic into image, sound, and code, they become less abstract and more embodied—able to be seen, heard or experienced. A Machine for Sonifying Toxic Space, for example, generates a soundtrack from the chemical spills of the Silicon Valley technology industry. As a tool for critique, the artwork also provides a circumscribed object, a form able to intensify subtle tendencies or distil a typically complex set of conditions. A Machine for Sorting Skin, for example, orders photographs of bodies from lightest to darkest, explicitly foregrounding the history of racial classification inherent to technology. In crystallising aspects like bodily harm and environmental degradation, these artworks provide an alternative mode of critique to theory.

The second motivation was more speculative, less concerned with critiquing existing systems and more interested in exploring unrealised alternatives. Freed from the dictates of the functional or feasible, art provides a productive medium for posing the question: what if? A Machine for Witnessing and Recounting a Crime, for example, extrapolates from a real-world murder with a simulated desktop that recounts the evening’s events via smart home notifications. Other artworks speculate on future possibilities by applying existing techniques like machine learning to domains such as terrorism. A Machine for Dreaming Up New Anxieties, for example, uses a character-based neural network trained on war reports to generate new ‘suspicious’ incidents, which are then illustrated with found footage. Rendering these alternative visions into prototypes not only presents new possibilities, but also provides a way of aesthetically registering the political stakes of algorithmic power.

All works can be viewed at http://darkmttr.xyz. Artwork descriptions can also be found in the Appendix of this document.

Statement of Authentication:

The work presented in this thesis is, to the best of my knowledge and belief, original except as acknowledged in the text. I hereby declare that I have not submitted this material, either in full or in part, for a degree at this or any other institution.
Acknowledgements:

Thanks to my primary supervisor, Ned Rossiter, for his unwavering yet critical commitment to this work. This holistic support, ranging from feedback to opportunities to present and publish, has developed this work immeasurably. Thanks to the Institute for Culture & Society for the seminars and workshops that not only allowed me to develop my own ideas but also to learn from others. Thanks to Western Sydney University for the Australian Postgraduate Award, a scholarship which financially supported this work over three years.

Thanks to those who hosted me in various seminars, summer schools and workshops: ‘Historiographies of Digital Cultures’ at the Centre for Digital Cultures, Leuphana University Lüneburg; ‘What is Digital Media?’ at ICS; ‘Research Values’ at the Brandenburgische Zentrum für Medienwissenschaften in Potsdam, later presented at Transmediale in Berlin; ‘Investigating Logistics’ at Humboldt University in Berlin; and ‘The Performance and Performativity of Violence’ at Otago University.

Thanks to those who published earlier versions of this material. This includes:

Munn, Luke (2018) Unmaking the Algorithm. Lüneburg: Meson Press.
Munn, Luke (2018) ‘Rendered Inoperable: Uber and the Collapse of Algorithmic Power’, APRJA 7(1), http://www.aprja.net/rendered-inoperable-uber-and-the-collapse-of-algorithmic-power/.
Munn, Luke (2018) ‘Alexa and the Intersectional Interface’, Angles 7, http://angles.saesfrance.org/index.php?id=1492.
Munn, Luke (2017) ‘I am a Driver-Partner’, Work Organisation, Labour & Globalisation 11(2): 7-20.
Munn, Luke (2017) ‘Seeing with Software: Palantir and the Regulation of Life’, Studies in Control Societies 2(1), https://studiesincontrolsocieties.org/seeing-with-software/.

Finally, thanks to my family: to my parents, Stephen and Glenys, for supporting the work in numerous ways, from office space to afternoon coffees; to my brothers Daniel, Seth, Josh, Caleb and sister Rachel for the interest and socialising that kept me sane; and most of all to my immediate family: my wife Kimberlee, for her steadfast support over the years, and my children, Ari and Sol, who care far more about post-trip presents than boring work stuff.

Formatting:

The text adheres to the Chicago Manual of Style, 17th edition (CMOS), for citations and punctuation, but uses Australian spellings unless directly quoting from an American source.

Contents

Introduction 1
Method 6
Origins and Redefinition 6
Control 11
Logic 14
Machines 21
1. Legitimate Power: Palantir 26
Delineating Life 27
Stack-Tools-Analyst machine 29
Life-DynamicOntology machine 35
Analyst-Thunderbird-LosAngeles machine 40
Deporting