Researching Algorithmic Institutions

Liam Magee and Ned Rossiter

Like any research centre that today investigates the media conditions of social organization, the Institute for Culture and Society modulates functional institutional governance with what might be said to be, operatively and with a certain conscientious attention to method, algorithmic experiments. In this essay we convolve these terms. As governance moves beyond Weberian proceduralism toward its algorithmic automation, research life itself becomes subject to institutional experimentation. Parametric adjustment generates sine wave-like ripple effects in the allocation of time, labour, thought and practice. Reflexivity, long an imperative of social research, now demands attentiveness to how these entwined forces of governance and experimentation produce the research subject.

Modes of governance within institutional settings are increasingly shaped by algorithmic architectures of organization. Algorithmic governance describes not only the application of computational procedures to issues of operative management, control and decision-making but also the re-engineering of organizations to the demands of those procedures. In idealised terms, algorithmic governance begins with a problem that is first tested and formalized within the parametric constraints of a model.2 As Tarleton Gillespie explains, the problem is first articulated through the model in mathematical terms, then operationalized as a procedural task performed by an algorithm. As algorithms are amended, hacked, refined or substituted, the fidelity of the model to a mathematical proceduralism divorced from environment signals how modes of governance frequently collapse in the wild. Needless to say, such formalistic variability does not contradict the core thesis that we develop in this essay, namely that algorithmic governance produces an experimental condition of institutionality. That which becomes available for 'disruption' or 'innovation' — both institutional encodings — is equally prescribed within silicon test-beds that propose limits to political possibility.3 Experiments privilege the repeatability and reproducibility of action. This is characteristic of algorithmic routines that accommodate variation only through the a priori of known statistical parameters. Innovation, in other words, is merely a variation of the known within the horizon of fault tolerance. Experiments in algorithmic governance are radically dissimilar from the experience of politics and culture, which can be understood as the constitutive outside of passion and affect that infiltrates models in the process of designing parameters that inform the operational rules of algorithms. Once operationalized as algorithms, politics is enacted as a procedural routine and culture is made accountable within metrics of calculation. Both instances necessarily externalize experience. Too often economics takes command. As an analytical mode, conversely, political economics does not take command often enough. The contemporary demarcation of politics and culture from the economy itself exhibits a nostalgia for pure power or an unpolluted aesthetics — affective modalities which, against a collective despair over the obdurate and thoroughly determined state of the economy, seem plausible escape routes, and which in mass-form return as devices for the securitization of institutional control.

A tension continues to persist between experiment and experience in the generation of new institutional forms immanent to algorithmic modes of governance. If invention subsists and foments within the phenomenal realm of experience, then experiment is akin to laboratory life that verifies knowledge incrementally under controlled conditions.4 Where experience is expansive and contingent, experiments are necessarily harnessed to the pursuit of procedures and the realization of rules. Experiments lend themselves to the goal-oriented world of algorithms. As such, the invention of new institutional forms and practices would seem antithetical to experiments in algorithmic governance. Yet what if we consider experience itself as conditioned and made possible by experiments in algorithmic governance?

Surely enough, the past few decades have seen a steady transformation of many institutional settings. There are many studies that account for such change as coinciding with, and often directly resulting from, the ways in which neoliberal agendas have variously impacted organizational values and practices. Our focus is on the struggle of governance immanent to the relation between experience and experiment, algorithm and institution. What, in short, are the propagating effects of algorithmic governance as a routine complex of institutional practices?

This essay examines computational conditions that organize the world and, increasingly, life. We ask how the operational logic of digital technology might furnish concepts of power able to describe and explain the empirical world. Specifically, the essay focuses on how power is generated within and by digital infrastructures, systems, operations and practices. The broader objective here — one beyond the scope of the present essay — is to establish empirical co-ordinates that provide an analytical basis for populating disciplines in the humanities and social sciences with a conceptual vocabulary coextensive with contemporary technological conditions. For the purposes of this essay we consider the data centre — also known as server farms, colocation centres or the cloud — as the infrastructural core that governs the production of space, time, subjectivity and economy. Off-the-grid computing aside, most data will at some point in time traffic through a data centre.
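The movement Gillespie describes, from model to procedural task, can be sketched in a few lines of code. The scenario, weights and threshold below are entirely our own hypothetical illustration, not drawn from any actual system: a problem ('which cases get flagged for review?') is first formalized as a parametric model, then enacted as a routine.

```python
# Hypothetical illustration of the model-to-algorithm movement: the "problem"
# is formalized with fixed parametric constraints, then run as a procedure.

# 1. The model: weights and a threshold fixed in advance (a priori parameters).
WEIGHTS = {"late_payments": 0.6, "account_age_years": -0.2}
THRESHOLD = 1.0

def score(case):
    """Apply the model's parameters to a single case."""
    return sum(WEIGHTS[k] * case[k] for k in WEIGHTS)

def flag_for_review(cases):
    """The algorithm: a procedural routine that enacts the model."""
    return [c["id"] for c in cases if score(c) > THRESHOLD]

cases = [
    {"id": "a", "late_payments": 3, "account_age_years": 2},
    {"id": "b", "late_payments": 1, "account_age_years": 4},
]
print(flag_for_review(cases))  # → ['a']  (3*0.6 - 2*0.2 = 1.4 > 1.0)
```

Amending or 'hacking' the algorithm here amounts to adjusting WEIGHTS or THRESHOLD — variation within the horizon of known parameters, in the essay's terms.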

1 Thanks to Paul James for suggestions on phrasing and framing.
2 See Gillespie, T 2016, 'Algorithm', in B Peters (ed.), Digital keywords: a vocabulary of information society and culture, Princeton University Press, Princeton, pp. 19–20.
3 For an analysis and critique of 'testing' and 'demoing' of 'solutions' to address crises (urban, financial, health, and environmental, etc.), see Halpern, O, LeCavalier, J & Calvillo, N 2017, 'Test-bed urbanism', Public Culture, vol. 25, no. 2, pp. 273–306. See also Halpern, O, Mitchell, R & Geoghegan, BD 2017, 'The smartness mandate: notes toward a critique', Grey Room, no. 68, pp. 106–29.
4 See Neilson, B & Rossiter, N 2006, 'Towards a political anthropology of new institutional forms', ephemera: theory & politics in organization, vol. 6, no. 4, p. 394, viewed 1 May 2019, http://www.ephemerajournal.org/contribution/towards-political-anthropology-new-institutional-forms.

Western Sydney University

SOVEREIGN MEDIA, TERRITORIALITY AND THE TRAFFIC OF DATA

We occasionally have a sense that our quotidian decisions are governed by algorithmic architectures humming away in the background. Mostly, we are resigned to machinic authority taking command of the extraction of value and the monetization of computational life generated from the surfeit of data. Certainly, the last few years have brought greater attention to the power exercised by algorithms, without yet producing much by way of a counter-power that can refuse or deflect the assertion of control. Whether it is social media routines, digital accounting systems, military operations in theatres of war or the governing of populations and migration across geographic scales and sovereign spaces, there is a transactional logic that attends the traffic in data. Blockchain technologies manage data transactions through a distributed public ledger. High-frequency trading, by contrast, buries millions of financial derivatives and credit default swaps, resulting in the 'social abstraction of risk'.5 Well documented in the scholarly literature, even in a mode of critique, blockchains and HFT evoke a shared awe and dread of the technological sublime. Less attended to are the enduring, consequential IT banalities of administrative Excel formulas, JavaScript functions and database procedures that calculate hourly rates, leave accruals and performance indicators. Alongside global blockchains, microwave data broadcasting and neural networks, twenty-year-old software architectures help administer organizations and run the world.

The non-representational architectures that underscore many computational transactions unsettle a politics of intervention predicated on the visibility of things. Yet neither is there a panacea to be found in making transactions visible. The often moralistically imbued calls for data transparency and accountability are no less vulnerable at functional, operational levels to unforeseen crashes, hacks or inexplicable events that may result in economic stress for many. For all the transactional visibility of distributed public ledgers such as Bitcoin, the now frequent security breaches and 'disappearance' of crypto-currency from exchanges should be prompt enough to caution against at least a first-order valorization of the merits of openness.6

Digital infrastructures such as data centres are chief among the milieu of technological forms in which transactions in data impact on a world external to the operational logic of signals, transmission, processing and storage. Hosting the servers that support the software and data analytics of 'the cloud', data centres can be considered the invisible shadow of institutional forms such as universities, banks, social media companies and logistical firms, among many others. Indeed, data centres provide the operational core of institutional practices dependent on transactions in data. Similarly, the algorithmic procedures that organize data in ways that make action possible are central to the governance of institutions.

From the purview of algorithmic governance, subjectivity and the presence of human personnel are merely functionaries to be managed. A peculiar form of technological unconscious also defines the relation between machine and subject. This is despite critiques of algorithmic governance that consider managers and gestures of leadership as indistinct from the parametric horizon of algorithms. Without the distinct activity of human labour expressed through computational systems, the machine-as-institution is a soul depleted of the substance upon which its data operations depend.7 Yet this human vessel of unsubstantiated yearning may be just an interval that terminates with the social accommodation of the fully automated organization.

Part media-infrastructure and part locus of algorithmic decision-making, the data centre indicates one possible form of such organization. The territoriality of data centres carves out new geographies of power, giving rise to forms of infrastructural sovereignty that contest, intersect, multiply and depart from the modern sovereign power of the nation-state.8 But how do we specify the operational logic of data centres? Our response is to return to an analysis of their computational architecture, here represented by parallel processing frameworks: obscure yet indispensable parts that make possible, for example, neural networks, machine learning and artificial intelligence applications. We consider this intervention in part as constituting a genealogy of power tied to computational architectures. And in distilling an infrastructural object such as the data centre to computational operations specific to parallel processing, we are also suggesting that there remains an unavoidable necessity to consider technological forces of determination. This genealogy takes us back to an operational core from which we can begin to make sense of a structuring of the world not reducible to black-box impenetrability. There is no tabula rasa upon which fantasies and fears may be projected.

westernsydney.edu.au/ics

5 See LiPuma, E & Lee, B 2004, Financial derivatives and the globalization of risk, Duke University Press, Durham.
6 For a critique of the celebration of openness, see Tkacz, N 2015, Wikipedia and the politics of openness, University of Chicago Press, Chicago.
7 As Stefano Harney quips in an interview with Michael Schapira and Jesse Montgomery, 'most managers have already been replaced by machines … We know they work not only within the parameters of an algorithm but with its predictions and prescriptions. They are only there to implement and call it leadership'. Schapira, M & Montgomery, J 2017, 'Stefano Harney (Part 1)', Full Stop Quarterly, 8 August, viewed 1 May 2019, http://www.full-stop.net/2017/08/08/interviews/michael-schapira-and-jesse-montgomery/stefano-harney-part-1/.
8 On the geography of data centres, see https://cloudscene.com/.

THE LOGISTICS OF DATA

Despite a tendency within business and academic circles to think on 'global' scales, digital operations are not as planetary, or as totalizing, as critics like Benjamin Bratton would have it. Indeed, the computational metaphor of the 'stack' is a tempting but limited figure for the current reconfiguration of organizational life through algorithmic governance. While not exclusive to the digital, Bratton's model of 'The Stack' refers to planetary-scale technical infrastructures consisting of Earth, Cloud, City, Address, Interface and User layers. Bratton's insistence on interoperability across and between these sectional layers provides the conceptual schema needed to then propose The Stack as a model of planetary computation, which he assumes as 'a coherent totality'.9 It is on the basis of this totality that Bratton then proposes The Stack as inaugurating a new political geography that multiplies sovereign power beyond the Westphalian state. 'Platform sovereignty' and 'infrastructural sovereignty' are the two primary modes of technical power resulting from The Stack.

We don't deny that sovereign power is multiplied beyond, and not limited to, modern state polities. But The Stack is not required as a heuristic device or infrastructural condition to explain the transformations to state sovereignty wrought by economic and technological globalization. Sovereign power in any case is always contested. And this is the key point overlooked by Bratton, whose predilection for totalization produces a significant analytical and political oversight: interoperability between systems (layers) is never absolute. Protocological conflicts, technological propensities, geological properties and institutional disputes — to say nothing of social struggles — prevail across networks of relations that, frequently enough, are notworking.10 The failure of interoperability not only unsettles the totalizing if contingent logic of sovereign power Bratton attributes to The Stack but also makes for a considerably more complicated operation of power and conceptualization of the political at the current conjuncture.11 Moreover, digital capitalism in some sense depends upon this constitutive gap between the fantasy of control and the reality of breakdown. Such gaps 'ask', indeed, to be cleverly exploited, to manufacture the customer 'need' and to introduce new disruptions and innovations in response. At a global scale, Tesla, one of the finest exponents of need-production, exists precisely to bridge present carbon and future renewable energy economies. Such bridging, in turn, produces new operative fissures. Academia is of course not immune to the relentless detection of notworking in action and, as it finds its new place as a junior partner to corporatism, has become increasingly adept at translating gaps in the literature into gaps in the market.

Analytically, algorithmic governance denotes the conjunction of models that limit just as much as they open up political possibilities, a conjunction that, in turn, produces a phantasmic situation of total control (Bratton) or, through its failures, re-admits contingency into a new, computationally mediated dialectic of determination and possibility. Refusing the neoliberal characterization of such a data politics as inherently open, transparent and liberatory has become the de facto means of strategic operation for its actors: the significant shifts of the past few years have practically confirmed this point.12 The rush to securitize data infrastructures through physical borders and barricades, privacy regulation and appeals for Internet regulation — amazingly, from the heart of Silicon Valley itself — makes no secret of a privatized and explicitly political re-territorialisation of what appears to us already, not as an object of a present or near-future but through the form of nostalgia: a digital public sphere.13

At such junctures it is tempting to withdraw to the inner salons of what György Lukács once described, in reference to the first generation of the Frankfurt School, as the Grand Hotel Abyss.14 Accommodation there today, precisely through the affordances of digital social media, is available to nearly everyone at Airbnb rack rates. We prefer to turn toward the resources afforded by the computational architectures themselves — less the salons occupied by an erudite and déshabillé commentariat than the warehouses, rack rooms and 'non-places' of cool alienation within which, perversely, hum the machines of twenty-first-century politics.

The stacks which operate here can indeed be described as Bratton suggests: they are layers of cabling, network protocols, servers, operating systems, databases and, in a gesture that would appear to reaffirm its privilege at the uppermost level of this architecture, the algorithm. Yet this characterization is limited in its technical precision as much as in its political terminology. Indeed, the stack can never accomplish very much without a fine-grained articulation of operations within and between its specific layers, and these operations turn out to be as or more significant, both in the determination and contingency of governance, than the coarse-grained organization of the entire data apparatus itself. In short, alternate forms of algorithmic governance must today intervene at the level of parameters as much as at the level of the coal and rare-earth mines that power and produce the materials for any algorithmic operations.

Each of these layers contains its own historical unfoldings. The 'model/algorithm' distinction offers a nice point-in-time encapsulation, but does little to convey the historical reconfiguration over time of the algorithm itself. Today, precisely as terms like 'algorithmic governance' appear to register a certain new confluence of power/knowledge, the algorithm is undergoing substantial reconfiguration.15 In machine learning, algorithms are constituted by other algorithms, through a posteriori statistical patterning rather than a priori models. In data centres, the algorithm is turned inside out, defined and distributed according to the physical and logical arrangement of data rather than by the model's idealization.
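The contrast between a priori models and a posteriori statistical patterning can be made concrete in a few lines. The data and the least-squares rule below are our own hypothetical illustration, not drawn from any system discussed here: one function applies a parameter fixed in advance by a designer, the other derives its parameter from the patterning of observed data.

```python
# Hypothetical sketch: an a priori model fixes its parameter in advance,
# while a learning routine estimates the parameter from observations.

def a_priori_predict(x):
    # Parameter chosen by the model's designer before seeing any data.
    return 2.0 * x

def fit_a_posteriori(pairs):
    # Least-squares estimate of w for y ≈ w * x, derived from the data itself.
    num = sum(x * y for x, y in pairs)
    den = sum(x * x for x, _ in pairs)
    return num / den

observations = [(1, 3.1), (2, 5.9), (3, 9.2)]  # pattern is roughly y = 3x
w = fit_a_posteriori(observations)
print(round(w, 2))  # → 3.04, whatever the designer assumed in advance
```

The a priori function will keep predicting with its designed parameter regardless of what the world does; the fitted parameter shifts whenever the observations shift.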

9 Bratton, BH 2015, The stack: on software and sovereignty, MIT Press, Cambridge, p. 375.
10 See Lovink, G 2005, The principle of notworking: concepts in critical internet culture, public lecture, Hogeschool van Amsterdam, 24 February, http://networkcultures.org/blog/publication/the-principle-of-notworking-geert-lovink/.
11 See Mezzadra, S & Neilson, B 2019, The politics of operations: excavating contemporary capitalism, Duke University Press, Durham.
12 An earlier draft of this paper was presented as Magee, L & Rossiter, N 2016, 'Operationalising the data centre: algorithmic platforms and the distribution of computational labour', Crossroads in Cultural Studies, Sydney, 14–17 December, http://crossroads2016.org/.
13 See Zuckerberg, M 2019, The internet needs new rules: let's start in these four areas, 31 March, https://www.facebook.com/zuck/posts/10107013839885441.
14 See Jeffries, S 2016, Grand Hotel Abyss: the lives of the Frankfurt School, Verso, London.
15 See Rouvroy, A & Stiegler, B 2016, 'The digital regime of truth: from the algorithmic governmentality to a new rule of law', trans. A Nony & B Dillet, La Deleuziana, no. 3, pp. 6–29, viewed 1 May 2019, http://www.ladeleuziana.org/wp-content/uploads/2016/12/Rouvroy-Stiegler_eng.pdf; and Amoore, L & Raley, R 2017, 'Securing with algorithms: knowledge, decision, sovereignty', Security Dialogue, vol. 48, no. 1, pp. 3–10.

Algorithmic parallelization — running a software implementation of an algorithm concurrently on multiple processors or machines — is one example of this restructuring. Parallel processing can be applied to tasks that are independent, where, for instance, one task is not dependent upon the results of another. It encompasses systems for serving web content, registering user clicks and 'likes', tabulating and storing smart-city sensor-generated data, image analysis of photo streams and trend analysis of financial transactions. These systems are widely deployed within the data infrastructures that support what has been termed 'platform capitalism': this includes data centres operated by Google, Facebook and independent business exchanges (or IBX) such as those run by Equinix in the inner-city suburb of Alexandria in Sydney.16 Parallelized pipeline architectures run on top of clusters of machines housed within these data centres, making it possible for these computational resources to be used efficiently.

Despite its affordances, enabling algorithms to operate in a parallel rather than serial fashion can be a complex, specialized and expensive task.17 Strategies to do so typically enlist an approach of 'divide and conquer', redesigning or refactoring the software implementation of an algorithm to separate and process data inputs independently, before joining or merging their outputs. The occasion of refactoring can substantially alter the properties of the algorithm itself.18 Conventional use of parallelized algorithms also requires detailed knowledge of the operating environment: the type of machines the algorithms will run on, how datasets need to be partitioned and how the network is configured to enable copies of the algorithm to communicate progress.

The arrival of cloud computing represents the commodification of the expertise and resources required for managing enormous circuits of data. As the volume of data processed increases, cloud computing permits the algorithm's operation to span multiple machines automatically. Increases in computational power are paid for by the hour or by other discrete quantities, and without the intervention of human labour to add machines, install software and configure networks. This commodification is facilitated through software frameworks, or 'middleware', that mediate between the hardware facilities of the data centre and the performance of the algorithm. Major cloud companies such as Amazon and Google have produced such frameworks — one of which, Apache Beam, we describe in detail below — to simplify the redesign and ease the configuration of parallelized algorithms on their cloud-computing platforms. Frameworks such as these make possible the parametric adjustment of algorithms, producing the property of 'elasticity' famously associated with cloud computing: that is, the ability to add or subtract computing resources dynamically in response to demand, budget, dataset size and other constraints.19

If our understanding of digital economy, society and the production of subjectivity is to come to terms with this expansiveness and elasticity, a critique of data politics and algorithmic institutions needs to register the techniques, processes and operations special to media infrastructures. The technical, energy and commercial constraints of parallelized architectures articulate a data grammar of operations, a set of essential verbs of platform capitalism: mapping, reducing, cutting, filtering, partitioning, sequencing, transforming, flattening, merging and piping. Enumerating, rehearsing and evaluating the elements of this grammar across hardware and software platforms establishes a baseline from which the determining force of data politics is made discernible, at least in preliminary ways. Operations and techniques such as these define the parameters within which action protrudes into the world. In the next section, we transition to a more technical mode of description, pursuing the algorithm through the process of parallelization.

EXPERIMENTS IN PARALLEL COMPUTING

In the world of parallel computing, the term 'pipeline' is used to describe the conduits that link data sources, transformations and outputs. Like the UNIX pipe operator used in terminal commands ('|'), pipelines funnel data through input/output sequences (shown in Table 1 below). Parallel computing frameworks employ the metaphor of pipes and pipelines to describe the distribution of data processing across multiple copies of a given resource. These include the multiple cores of the central processing unit of a single machine, or the combined capacities of multiple machines on a network. Pipes funnel inputs across these resources and reassemble the outputs produced by their processing.

Such plumbing metaphors extend to the very naming of parallelisation frameworks. One example is DataFlow, a project developed by Google and later contributed by the company as an open-source project, now renamed Beam, to the Apache Foundation, a consortium that describes itself as 'the world's largest open-source foundation'.20 The process of migrating software developed by companies for internal use to open source is common. It is a means of building corporate goodwill in the wider software development community and of establishing de facto standards that help to realise other organisational objectives. Apache's projects, its members, licensing arrangements and its often informal decision-making processes — conducted on publicly accessible mailing lists — provide a glimpse into the confluences and complex relations of open-source software and cloud computing companies. The projects accepted by the Apache Foundation (Table 1) map schematically the history of web computing over the past quarter century: they range from content servers that deliver web pages and images to systems that enable big data management, virtualization, security and parallelization.
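The 'divide-and-conquer' strategy described above can be sketched in a few lines. This is a toy illustration of our own: the input is partitioned, each part is processed independently, and the partial outputs are merged. A thread pool stands in here for the processes or machines a parallel framework would coordinate across a cluster.

```python
# Divide and conquer: partition inputs, process each partition independently,
# then join ("reduce") the partial outputs into a single result.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk):
    """Process one partition of the input, independently of the others."""
    return Counter(chunk.split())

def parallel_word_count(chunks):
    # Threads stand in for the worker processes or machines of a real
    # framework; each worker receives one partition of the data.
    with ThreadPoolExecutor() as pool:
        partial_counts = list(pool.map(count_words, chunks))
    # Merge the independent partial outputs.
    total = Counter()
    for partial in partial_counts:
        total += partial
    return total

chunks = ["to be or not", "to be that is", "the question"]
print(parallel_word_count(chunks)["to"])  # → 2
```

Because each partition is processed without reference to the others, the same code can in principle be spread across cores, machines or a data centre — the refactoring, as noted above, is in how the task is cut and rejoined, not in the counting itself.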

16 See Srnicek, N 2017, Platform capitalism, Polity, Cambridge.
17 See Blelloch, GE & Maggs, BM 1996, 'Parallel algorithms', ACM Computing Surveys (CSUR), vol. 28, no. 1, pp. 51–54.
18 Blelloch & Maggs, 'Parallel algorithms'.
19 See Herbst, N, Kounev, S & Reussner, R 2013, 'Elasticity in cloud computing: what it is, and what it is not', Proceedings of the 10th International Conference on Autonomic Computing (ICAC 2013), San Jose, CA, 24–28 June.
20 See http://apache.org/#.

Beam operates like a virtual machine, translating data-processing requests made by software into instructions that can be executed in diverse operating environments. Algorithms on Beam can be tested on a single computer, deployed to a localized distributed computing environment or uploaded to a cloud-computing platform, where computing time is purchased by the hour. Running a simple example algorithm that extracts word frequencies from text, supplied as part of the Beam framework, showed several distinctive features of parallelisation. First, the code for the algorithm requires a specific operational logic: the text input is loaded from files; the word frequencies are calculated; the output containing the frequencies is formatted; and, finally, the output is written or 'piped' to a file. A conventional implementation of the algorithm might adopt this same procedure, but would not be required to do so. The second feature shows why this logic is necessary: when the parallel implementation is executed on a laptop, it runs several concurrent copies of the software, each of which processes separate parts of the data input. The third feature is that the same code, configuration and data execute identically on a laptop and on Google's Cloud Platform, on which we had established an account for testing purposes. This is significant given the quite different operations performed in each case.

On a single machine, the code executes on multiple processors, typically limited to two or four. On the cloud, multiple copies of the code may instead run on many machines, limited only by budget. The promise of the Beam framework seems partly fulfilled: so long as an algorithm conforms to the requirements of the framework, it is scaleable across multiple processes and machines, running seamlessly between both personal and cloud computing environments. While ostensibly an open-source project, Beam encourages developers to publish their software on commercial services like Cloud Platform, advertised as helping applications to 'grow from prototype to production to planet-scale, without having to think about capacity, reliability or performance'.21 In other words, Beam lays down the pipes that take the entrepreneur from the garage to the globe.

Despite the promises of Google's platform, our experiment still generated practical examples of notworking: conflicting software libraries and essential steps that are undocumented and discovered only through educated guesswork and trial and error. Rather than an idealized path from model to execution, the automation of scalable procedures requires all-too-human qualities of literacy, persistence and faith in the telos of computational reason. In the spirit of Gmail's decade-long beta status, everything remains provisional. It is stitched together not in a seamless web of interoperability but through numerous iterations of copy-and-paste, trial-and-error and searches through mailing lists and Q-and-A forums. Equally, we note that once the code ran successfully, determining whether to run it on a local machine or on Google's cloud is merely a case of changing parameters and territorial scale — even modified code is automatically recompiled and uploaded to the cloud, making for what has been termed a 'No-Ops' (or no technical operations) environment.
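The operational logic just described can be miniaturized. Beam's Python SDK overloads the '|' operator so that pipelines read like shell pipes; the Stage class below is our own toy sketch of that idiom, not Beam's actual API, chaining the four stages of the word-count walk-through (load, count, format, write).

```python
# A toy pipeline mimicking the pipe syntax of Beam's Python SDK (our own
# sketch, not Beam's API): '|' wires stages so each receives the previous
# stage's output.
from collections import Counter

class Stage:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, nxt):
        # Compose: feed this stage's output into the next stage.
        return Stage(lambda data: nxt.fn(self.fn(data)))

    def run(self, data):
        return self.fn(data)

load = Stage(lambda text: text.split())        # load the text input
count = Stage(lambda words: Counter(words))    # calculate word frequencies
fmt = Stage(lambda counts: [f"{w}: {n}" for w, n in sorted(counts.items())])
write = Stage(lambda lines: "\n".join(lines))  # 'pipe' output to a sink

pipeline = load | count | fmt | write
print(pipeline.run("the cloud in the machine"))
```

In Beam proper the analogous stages are PTransforms such as ReadFromText and WriteToText, and switching between a local run and a cloud run is a matter of changing the pipeline's runner option rather than its code — the parametric adjustment of territorial scale described above.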

Table 1: History of Major Apache Projects (some project names were lost in reproduction and are given by attribution and year only)

YEAR(S) | SELECTED APACHE PROJECTS | WEB COMPUTING ARCHITECTURES
1995 | Apache Web Server (Brian Behlendorf and others, 1995) | Web servers; data centres largely local ISPs
~2000 | (Sun Microsystems, 1999) | Middleware, enterprise computing
~2000 | Apache Struts (Sun Microsystems, 2000) | Dynamic web content
~2000 | (IBM, 1999) | XML
~2000 | (2001) | Text indexing
~2000 | Apache Software Foundation | Open-source software committees, including corporations as good tech citizens
~2000 | Apache Software License (2000) | Open source (permitting commercial derivatives)
Mid-2000s | Apache Web Services (formerly Apache SOAP, Axis) | Web services
Mid-2000s | (Google, Yahoo, 2006) | Elastic computing, distributed algorithms
2010s | Apache CloudStack (contributed by Citrix, 2011) | Cloud computing
2010s | Apache CouchDB (2008) | Alternative (NoSQL) databases
2010s | (2013); (2011) | Machine learning, parallel processing
2015/6 | Apache Beam (contributed by Google, 2016) | Parallel stream processing

21 See https://cloud.google.com/.

Western Sydney University

Our wider interest here is to interrogate the cloud-computational architecture that now mediates the abstract work of algorithms on data, and their spatial instantiation inside the data centre. Critical inspection neither ‘unboxes’ the ‘black box’ hypothetically common both to the algorithm and to the data centre, nor seeks to reify it as a happily unknowable technical artefact. Rather, it rehearses aspects of the technical conditions under which machinic interoperability as well as inoperability are performed. As the transfer of labour from human to the machine continues, this amounts to a renovation of the kind of inquiry into conditions of work on the factory floor that once preoccupied the social analysis of labour.

ALGORITHMIC CRITIQUE

The exercises we conducted here are inspired by the soporific use-cases of tutorials and training guides. Outputs are of course not the point. Rather, the development of datasets, code examples and cloud configuration illustrates the purpose of otherwise obscure investments by companies like Google in parallel-processing frameworks. Such investments belong to a long history of strategic corporate contribution to open-source software, and today these serve to pave the way for the outsourcing of labour and infrastructural needs to data centres and cloud services that in turn support the rise of large-scale data accumulation, processing and analysis. One predictable consequence is the deterioration of in-house IT staff that defined the preceding eras of mainframe and client-server computing. The incremental erasure of labour borne by the ongoing march of industrial modernity is coupled with a displacement of institutional autonomy by the sovereign media of platform capitalism and the automation of organizational routines. Alienation returns. Memory is further exteriorized to the machine.22

How to re-engineer the black-box of algorithmic apparatuses? What does a total knowledge of computational rules and procedures tell us about the governance of labour and life, economy and society? Notable research in the nascent field of critical algorithm studies has identified the political economy of Google’s PageRank and its capture of living labour to produce ‘network value’, crowd work on Amazon Mechanical Turk and the ‘crowd fleecing’ of drivers that underscores the growth model for Uber.23 Despite the attention in these studies to algorithmic power, the actual architectures often remain elusive, and this is not just because computer scientists are not on the scene to lend a critical hand.

As Trebor Scholz notes, ‘While people are powering the system, MTurk is meant to feel like a machine to its end-users: humans are seamlessly embedded in the algorithm. AMT’s clients are quick to forget that it is human beings and not algorithms that are toiling for them — people with very real human needs and desires’.24

When humans become indistinguishable from machines, what does this mean for a politics of operation? Does a critical dissection of algorithms, for instance, provide a point of entry into organizing networks of digital labour? For Scholz, the answer is ‘no’, at least not in any exclusive sense. Instead, Scholz advocates ‘platform cooperativism’ as a consortium model that clones and adapts technologies of the sharing economy, making use of web apps such as Loomio (Occupy, Podemos) and blockchain technologies such as Backfeed, D-CENT and Consensys for the autonomy of labour organization and social movements.

Frank Pasquale is not as quick in letting go of net-neutrality claims by internet giants. He maintains that algorithmic methods of extracting value from data, devising criteria for automated decision making and calculating procedures for finance capital must be subject to systematic critique and reorientation if society is to resist total submission to algorithmic authority.25 In a similar spirit, we suggest black boxes are demystified and indeed made more knowable once their operations are rehearsed, simulated, observed and replicated. Virtualization, containers and parallelization have become integral mediating technologies between the abstraction of the algorithm and the fortification of data centres. They belong, in other words, to a new grammar of algorithmic governance.

MULTIPLYING ALGORITHMIC INSTITUTIONS

The analysis above of Apache Beam registers two key features of emergent algorithmic institutions. First, it describes a shift in the operations of algorithms made possible by parallel architectures adaptable to a range of hardware and network configurations. Second, this seemingly banal feature illustrates the lack of totality and closure within a single operative environment. Driven by what appear to be solely technical considerations — precisely, in other words, through a desire for mastery and control — contingency is reintroduced to the algorithmic situation. The link from algorithmic parallelization to experimental modes of governance is, of course, at best suggestive. Nonetheless, this parallelization forms part of the ‘infrastructuring’ that has already realised WikiLeaks, BitCoin and other digitally led forms of what political scientist Cui Zhiyuan, in dialogue with Charles F. Sabel and Jonathan Zeitlin, has termed ‘experimental governance’.26 While this is a highly conciliatory view of experimentation, with elements of ‘Third Way’ exuberance retrieved from a decade marked, not coincidentally, by dot-com mania, there is nonetheless some critical and conceptual purchase to be had from considering algorithmic institutions as test-beds of computational modes of governance.
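The portability claim made above can be suggested with a minimal sketch in plain Python rather than the Beam SDK itself; the records and the transform are our own invented illustration, not code from the exercises discussed here. The same pipeline definition produces identical results whether its ‘runner’ is a serial loop or a pool of concurrent workers, which is the property that lets a framework such as Apache Beam migrate one pipeline across hardware and network configurations.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    """A stand-in pipeline stage: count tokens in a text field."""
    return (record["id"], len(record["text"].split()))

def run_serial(records):
    # A 'direct runner': one machine, one thread.
    return sorted(transform(r) for r in records)

def run_parallel(records, workers=4):
    # A 'distributed runner': the same transform, fanned out to workers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sorted(pool.map(transform, records))

records = [
    {"id": 1, "text": "algorithms mediate institutional life"},
    {"id": 2, "text": "data centres fortify the cloud"},
]

# The runner is interchangeable; the pipeline definition is not.
assert run_serial(records) == run_parallel(records) == [(1, 4), (2, 5)]
```

What frameworks like Beam generalize from this sketch is the contract: transforms are written once against an abstract pipeline, and the choice of execution environment (local process, cluster, cloud service) is deferred to configuration.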

22 See Stack Overflow, http://stackoverflow.com/.
23 See, respectively, Pasquinelli, M 2009, ‘Google’s PageRank algorithm: a diagram of cognitive capitalism and the rentier of the common intellect’, in K Becker & F Stalder (eds), Deep search: the politics of search beyond Google, Studien Verlag, Innsbruck, pp. 152–62; Irani, L 2015, ‘Difference and dependence among digital workers: the case of Amazon Mechanical Turk’, South Atlantic Quarterly, vol. 114, no. 1, pp. 225–34; and Scholz, T 2017, Uberworked and underpaid: how workers are disrupting the digital economy, Polity, Cambridge.
24 Scholz, Uberworked and underpaid, p. 20. See also Munn, L 2018, Ferocious logics: unmaking the algorithm, Meson Press, Lüneburg, https://meson.press/books/ferocious-logics/.
25 Pasquale, F 2015, The black box society: the secret algorithms that control money and information, Harvard University Press, Cambridge.
26 See Sabel, CF & Zeitlin, J 2012, ‘Experimentalist governance’, in Oxford handbook of governance, Oxford University Press, Oxford, pp. 169–83.

westernsydney.edu.au/ics

Neither top-down autocracy nor bottom-up anarchy, but ‘directly deliberative polyarchy’: the emphasis on ‘recursion’ signals a form of governance indebted to characteristics of the algorithm. Despite an idealised affinity with the experimental culture of the start-up, or indeed the iterative cultural logic of ‘fail fast’ typical of R&D for platform monopolies such as Google, Facebook and Amazon, we acknowledge the vastly different political and historical context in which this discourse on experimental culture is situated in the case of China and its laboratories in governance across provincial cities and spaces. How to cultivate and instil aspects of democratic politics within political systems and market economies stemming from socialist and communist models of collective ownership is an especially delicate and complex challenge, one that is further complicated since the fall of Bo Xilai and the demise of the ‘Chongqing experiment’.27 In the spirit of wilful adaptation and the redesign of terms and ideas harnessed to experiments in modes of governance, we might endorse a certain accelerationist logic that invites further amplification of algorithmic transformation of institutional practices as a force of technical failure.

Close to home, the cognitive jolt or shock that accompanies systemic breakdown could be sufficient to question how, for instance, universities govern the production of knowledge in ways not submissive to the tyranny of metrics and the calculation of ‘performance’. Such endorsement comes with its own political complications, as recent histories of shock therapy, from Jeffrey Sachs’ re-engineering of national economies to Žižek’s endorsement of far-right ‘disruptive’ politics as a precursor to a revolutionary alternative, have been either ruinous or irrelevant. Nothing yet in the history of the Internet, which consists of an endless series of shocks (from modem disconnection and browser incompatibility to Cambridge Analytica), suggests that ‘shock therapy’ works. Academia’s version of Facebook’s dictum — to ‘move fast and break things’ — has, often enough, motivated retreat towards even more forbidding levels of governance, performativity and risk management.

At an organizational level, such governance becomes exteriorized in the cloud, or, more precisely, outsourced to the combined processing power of computers owned and operated by Big Tech corporations. Yet this tendency toward homogenization and standardization contains, as it were, the seeds of its own demise. As organizations become increasingly reliant on off-the-rack technical ‘solutions’, the qualities that distinguish one organization from the next steadily disappear. When organizational requirements become retro-fitted to software capabilities with little variation or minimal ‘customization’ from one organization to the next, the prospect of a crisis of legitimacy is waiting just around the corner.

Together, both technical operations and organizational cultures make evident the contours delineating the territory of contemporary data politics. This combinatory model of data-and-institutional organization extends to the governance of logistical populations. One key example can be found in the recent challenges faced by the Australian Bureau of Statistics (ABS), which in 2014 confronted a very public institutional crisis of legitimacy based on a perception of computational failure. This crisis was precipitated by the multiplication of sites and points of data agglomeration: the ‘monopoly of knowledge’ (Innis) enjoyed by the ABS for many years has now become rivalled by a diversity of institutional actors who also have considerable computational capacity to produce knowledge that bears upon how economies and populations are understood. This has been accelerated by cutbacks in the operating budget of the ABS from successive governments, which need to be seen in the context of a two-fold move toward, first, the marketization of governance enabled by computational processes, and second, the reintegration of markets through progressive regulation into a more dispersed but comprehensive system of control.

This is not only a case of the state increasingly outsourcing a once sacrosanct responsibility to private service providers. The multiple diffusions and aggregations of population data throughout a heterogeneous computational and institutional network mean that the ‘database’ is no longer physically or conceptually containable within the borders of a single institution. The era of distributed computing, of virtualized clusters of machines and software that can co-operate to resolve queries over structured data on heterogeneous network and computational topologies, has been paralleled by questions of the sovereignty of singular guardians of population data. Over the past fifteen years an array of new paradigms for arranging, connecting and querying data — the Semantic Web, Linked Data, service-oriented architectures (SOA) and software-as-a-service (SaaS) — continues to bring into question claims over institutional legitimacy.28

The increasing dependency of policy makers on the generation of numbers by machines is symptomatic of the automation of decision making. Such is the institutional over-reliance on the pure power of computation. No matter how many manual double-checks and regulatory procedures may comprise the repertoire of techniques deployed to guard against the sort of institutional risk exposed by the ABS debacle, the scale and distribution of computational calculation in the production of knowledge will most likely result in an increasing jostling for legitimacy among institutional actors seeking government contracts related to policy development. Implicit in this jostling is a challenge to the closed-world assumptions which accompany the traditional relational database form and, by association, the single institution that manages such infrastructure.29

Rival claims, multiple perspectives and contradictory or indeterminate datasets

27 See Cui, Z 2011, ‘Partial intimations of the coming whole: the Chongqing experiment in light of the theories of Henry George, James Meade, and Antonio Gramsci’, Modern China, vol. 37, no. 6, pp. 646–60. See also Frenkiel, E 2010, ‘From scholar to official: Cui Zhiyuan and Chongqing’s local experimental policy’, Books & Ideas, 6 December 2010, https://booksandideas.net/From-Scholar-to-Official.html.
28 See Magee, L & Rossiter, N 2015, ‘Service orientations: data, institutions, labour’, in I Kaldrack & M Leeker (eds), There is no software, there are just services, Meson Press, Lüneburg, pp. 73–89, http://meson.press/books/there-is-no-software-there-are-just-services/.
29 See Edwards, PN 1996, The closed world: computers and political discourse in cold war America, MIT Press, Cambridge.

form the new territory of informational contestation. The case of the ABS offers an optic into the emergence of such struggles. Our claim is that this is less a story about the decentralization and privatization of government within a neoliberal paradigm (although these are without doubt key forces), and more an instance of the technical logic of databases and distributed computing resulting in an unsettling of modern institutional authority. What are the implications for public institutions as they relate to the supply of knowledge on national populations when the technologies of insight have become distributed and increasingly unaccountable across a range of actors? And what affordances does this present for the disruption of parametric politics, or the establishment, at the very least, of alternative parameters through which political life can be constituted?

Despite the extensive literary, artistic and musical expressions on the maelstrom of modernity, and without questioning the barbarism of war and colonial violence, there nonetheless remained a peculiar continuum of relative institutional stability, or at least a semblance of coherence, across the modern epoch.30 Church, state, union, factory, firm: these were the chief organizational forms that comprised the institutional rhythms of daily life and economy. The current conjuncture of institutional disaggregation and computational geopolitics is in some ways a logical — even digital — ‘output’ stemming from the modern experience of a transformed world. As algorithmic modes of organizing decision-making and practices of governance increasingly remake modern institutional settings while simultaneously giving rise to ‘platform’ organizations and largely non-governable apparatuses such as high-frequency trading systems, a corresponding redefinition of authority, expertise, subjectivity (manifest especially as a crisis of masculine identity) and indeed political-economic hegemony is currently underway. This institutional transformation holds not only political-economic and geopolitical implications, the details of which we can observe on a daily basis in the innumerable accounts of an automated world accompanied by a shift in the global axis of power.

A more mundane, less ‘measurable’ adjustment is also at work in the process of computational systems integrating with social life and economy. The cognitive tendencies of the brain and the psycho-physiognomic composition of subjectivity and its body of flesh are also steadily, even quite rapidly, undergoing change. A data politics of the present is defined not only by battles over proprietary platforms, by extractivist economies and by claims of legitimacy over the right to govern.31 Data politics within social life, engineered by parametric designs and managed by forms of nonconscious cognition, is also about the invention of autonomy severed from terms of agreement. Data politics insists on disagreement as a condition of computational cultures.
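The closed-world assumption challenged earlier, and the condition of rival claims insisted on here, can be stated as a minimal sketch (plain Python; the figures, source names and regions are hypothetical, not ABS data). A single institutional database answers every query definitively, treating absence as falsity; a heterogeneous network of sources can instead return rival answers, or none at all.

```python
# Closed world: the single institution's table is assumed complete,
# so a missing row is read as 'no such fact'.
institutional_table = {"region_a": 0.061, "region_b": 0.058}

def closed_world_query(region):
    return institutional_table.get(region, "absent, hence false")

# Open world: several institutional sources may speak, disagree, or stay silent.
sources = {
    "bureau": {"region_a": 0.061},
    "private_provider": {"region_a": 0.064, "region_b": 0.059},
}

def open_world_query(region):
    answers = {name: data[region]
               for name, data in sources.items() if region in data}
    if not answers:
        return "unknown"                 # silence is not falsity
    if len(set(answers.values())) > 1:
        return ("contested", answers)    # rival claims coexist
    return next(iter(answers.values()))

assert closed_world_query("region_c") == "absent, hence false"
assert open_world_query("region_c") == "unknown"
assert open_world_query("region_a")[0] == "contested"
```

Schematic as it is, the sketch registers why a plurality of institutional actors with their own computational capacity unsettles the authority of any single guardian of population data: the ‘contested’ result has no place in the closed world of the lone relational database.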

30 The classic text here remains Berman, M 1983, All that is solid melts into air: the experience of modernity, Verso, London.
31 For a collection of essays that address these issues, see Bigo, D, Isin, E & Ruppert, E (eds), 2019, Data politics: worlds, subjects, rights, Routledge, London.
