
Vol 1 No 2 (Autumn 2020)

Online: jps.library.utoronto.ca/index.php/nexj Visit our WebBlog: newexplorations.net

Surveillance Capitalism Pandemic

Marshall Soules—[email protected]

The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

By Shoshana Zuboff

New York: Public Affairs, 2019

Who knows? Who decides? Who decides who decides?—Shoshana Zuboff (2019)

Under the Wire

In Talking to Strangers (2019), Malcolm Gladwell tells interesting stories to explain how, when we meet strangers for the first time, we should try to understand where they are coming from, the wider context of our encounter. Just so you know, I have written what follows as if in a fevered dream of discovery where timing is of the essence. The context? An unprecedented alignment of events. On January 15, 2020, my partner and I flew from Vancouver, changed planes in Seoul, and flew into Hanoi. Preparations were underway throughout Southeast Asia for Chinese New Year and the Vietnamese Tet Festival (coinciding on January 25th), and the streets of Hanoi were kaleidoscopic with preparations for the popular celebrations. There were flowers everywhere. Tourists from China, South Korea, Japan, Taiwan, and elsewhere merged with the Vietnamese to honour cultural traditions dating back millennia. We were swept along through crowded streets with their hectic celebratory atmosphere. We celebrated Tet in Hoi An on the central Vietnam coast on January 25. By this time, news of Wuhan and the spreading coronavirus had been widely circulated, and its threat confirmed—it was becoming a global pandemic. Increasingly, people wore masks, but social distancing was impossible in the chockablock streets. Within days of the celebrations, the number of Asian tourists declined noticeably; hotels and restaurants faced cancellations and rapidly dwindling business.

Traveling through Vietnam, Cambodia, and Laos in the following five weeks, we witnessed plunging tourist numbers, and talked to service industry workers facing job losses. Growing


paranoia about social distancing and news of exploding infection rates elsewhere around the world bloomed along with the spring flowers. While traveling, we use the internet to reserve hotels, schedule buses and planes, and discover restaurants and places to visit. Every search query is accompanied by recommendations of what to do, see, and buy. We are often surprised by the alignment of our preferences with the recommendations dished up by TripAdvisor, Booking.com, and other travel platforms. Google kept us current with the global pandemic while recommending what we must experience in our final week in Southeast Asia, not far from the Chinese border. Our flight back to Vancouver through Hanoi and Taipei on March 1 was not difficult or stressful, just strange. We passed through numerous medical checkpoints by following the arrows, and saw groups of travelers quarantined in the airports, but there were no long lineups passing through security. There was not an empty seat on our airplane from Taipei to Vancouver. We were returning home just under the wire. The pandemic soon caught up with us. Measures for social isolation and distancing, frequent handwashing, and wearing masks were successively rolled out by health officials. By mid-March, with lockdowns and quarantines increasingly imposed, we were plunged into uncertainty, isolation, and confusion. International and local news sources served up a torrent of data about the coronavirus pandemic.

Unprecedented Events and the Black Swan

Into this disorienting series of events, Shoshana Zuboff’s equally disturbing The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019) arrived in the mail. The book is a hefty read, compelling and challenging on many fronts, and exhaustively researched. I read it with a sense of compulsion, just as the coronavirus lockdown tightened its screws. Social isolation suddenly felt ominous and threatening. Dr. Shoshana Zuboff, professor emerita at Harvard Business School and Associate Professor at Harvard Law School, knows her way around technology, capitalism, and the law. Previous books explore the impacts of computers on business practices (In the Age of the Smart Machine, 1988) and the role of a “digitally distributed capitalism of services tailored to the individual” (The Support Economy, 2002).

In her exposé of surveillance capitalism, Zuboff is on a mission, and she is relentless. She names her villains, heroes, and victims and tells us what they are creating, even as they try to keep their true activities obscure. She repeats her arguments and concerns to make them indelible, to make us pay attention and remember. She wants us to know how our inner psychological resources are transformed into massive profits under ambiguous circumstances by the planet’s largest tech corporations. You may not always appreciate her self-assured and indignant tone, but her detective work is compelling and of global import. (Today, as I write this, Amazon’s Jeff Bezos and Facebook’s Mark Zuckerberg are testifying before the US Congress to defend their


enterprises against charges of excessive market dominance.) Zuboff’s text will put you squarely in the moment, and for years to come. Think, for a moment, of a viral pandemic: the importance of information, data, prediction, social control, surveillance, all on a global scale, and you will find yourself at the intersection of surveillance capitalism and Covid-19. In this maelstrom of powerful currents, we find ourselves wondering how to navigate in a game-changing media ecology. To make sure we understand her subject off the top, Zuboff begins with a description of her many-headed Hydra. She defines surveillance capitalism in her Preface:

1. A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales;
2. A parasitic economic logic in which the production of goods and services is subordinated to a new global architecture of behavioral modification;
3. A rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history;
4. The foundational framework of a surveillance economy;
5. As significant a threat to human nature in the twenty-first century as industrial capitalism was to the natural world in the nineteenth and twentieth;
6. The origin of a new instrumentarian power that asserts dominance over society and presents startling challenges to market democracy;
7. A movement that aims to impose a new collective order based on total certainty;
8. An expropriation of critical human rights that is best understood as a coup from above: an overthrow of the people’s sovereignty. (p. v)

An imposing agenda, one she holds to with dogged tenacity. In her view, the word unprecedented fuels the sense of urgency and significance for both surveillance capitalism and Covid-19. Nassim Taleb’s widely circulated analogy of the black swan expresses how unprecedented events create uncertainty, anxiety, and confusion before forcing us to reconsider our assumptions. If we believe all swans are white, the discovery of a black swan will shake confidence in our beliefs. If we believe the market will always correct itself, a dot-com bubble bursting will make investors question their assumptions and force tech capitalists to re-evaluate their business models. As with the 9-11 attacks, unprecedented events force a recalibration of risks and opportunities, new calls to action. Almost universally, Taleb explains, these calls to action are delivered in stories: “You need a story to displace a story. Metaphors and stories are far more potent (alas) than ideas” (Taleb 2007, xxvii). What are the stories that swirl around surveillance capitalism, around Covid-19? Which stories are supported by evidence, and which are persuasive attempts to move the masses? This is not to suggest that surveillance capitalism is analogous to Covid-19 in every respect. Their origin stories are quite dissimilar, but the speed of their emergence and transmission contributes equally to the climate of uncertainty and anxiety they generate. Zuboff frequently asserts that


surveillance capitalism is “unprecedented,” a “pandemic.” As the quote above suggests, both are “parasitic,” “rogue mutations,” “foundational framework[s] of a surveillance economy.” Both predict the possibility of a “new instrumentarian power” imposing a “new collective order based on total certainty,” and perhaps “an expropriation of critical human rights” and “an overthrow of the people’s sovereignty.” While the process of using algorithms for resource extraction differs from collecting data to contain a biological contagion, the impacts of these events on human society have too many parallels to ignore. So, just what is Zuboff warning us about?

Extracting Behavioral Data

“Surveillance capitalism is not technology; it is a logic that imbues technology and commands it into action” (Zuboff 2019, 14-15; subsequent references to this text will include only page numbers). We need to look at the puppet masters and not the puppets. Google, incorporated in 1998, was first and foremost involved in information capitalism, which relied on the value of information collection and retrieval subsidized by advertising. Ad revenues financed new information services, from Google Earth to Translate and Voice Recognition. Search was always central to this enterprise; total information dominance, the goal. Google co-founder Larry Page pursued the dream of Search as the bedrock of artificial intelligence, machines smarter and faster than Homo sapiens. In the early 2000s, “Amit Patel’s work with Google data logs persuaded him that detailed stories about each user—their thoughts, feelings, interests—could be constructed from the wake of unstructured signals that trailed every online action” (p. 68). This “wake of unstructured signals” came to be known as “data exhaust,” a by-product of user interactions. In 2002, Google began to exploit this underappreciated and free resource.
It turns out that people’s online decisions are less valuable than how they are feeling at the time—elated, vulnerable, lonely—while buying things, gathering information, reading stories, listening to music, watching video. Your search has been successful, and you have left behind a trail of tell-tale breadcrumbs revealing what you were thinking and feeling while engaged in Search. These clues—from how quickly you were keyboarding and your spelling mistakes, to where you clicked next—have considerable value for those with powerful algorithms, state-of-the-art computing machines, and massive data storage. But what are the real costs of this exchange? True, Google was continuing to improve its products for the benefit of users, but it had discovered a whole new revenue stream. “Behavioral surplus” is a valuable commodity for players who wish to modify behavior: from advertisers and promoters, to public agencies, politicians, and their proxies who wish to win elections and shape national and global agendas. For example, Chris Wylie (2019) describes how Cambridge Analytica (2013-2018) weaponized Facebook’s user profiles and behavioral surplus to influence the outcomes of the 2016 US presidential election and the Brexit referendum in the UK, among many other forays into social engineering. Cambridge Analytica became a central player in the exploding field of affective computing, emotional analytics, and sentiment analysis. More on this below.
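As a thought experiment only, the logic of condensing “data exhaust” into behavioral surplus can be sketched in a few lines of Python. The signal names, weights, and thresholds below are invented for illustration; no company’s actual pipeline is being described.

```python
# Illustrative toy: how a stream of incidental interaction signals
# ("data exhaust") might be condensed into a crude affect/engagement profile.
# All signal names and formulas are hypothetical.

def profile_from_exhaust(events):
    """Reduce raw interaction events to a rough behavioral profile."""
    keystrokes = [e for e in events if e["type"] == "key"]
    corrections = sum(1 for e in keystrokes if e.get("backspace"))
    dwell = sum(e.get("ms", 0) for e in events if e["type"] == "view")
    clicks = sum(1 for e in events if e["type"] == "click")
    return {
        "hesitation": corrections / max(len(keystrokes), 1),  # typo/correction rate
        "engagement": dwell / 1000.0,                         # seconds spent viewing
        "activity": clicks,                                   # click-through count
    }

session = [
    {"type": "key", "backspace": False},
    {"type": "key", "backspace": True},
    {"type": "view", "ms": 4200},
    {"type": "click"},
]
print(profile_from_exhaust(session))
```

The point of the sketch is that none of these inputs is something the user deliberately “gave” to anyone; they are the wake trailing every online action.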



In Zuboff’s narrative, first Google, then Facebook, Apple, Microsoft, Verizon, and others discovered new revenue streams based on behavioral surplus. In those early years between 1998 and 2002, data provided by users of Google’s services was reinvested into product development. Users were not the product but supplied the raw materials auctioned to advertisers (p. 69). The dot-com crash in 2000 pushed Google to innovate with AdWords (now Google Ads), matching ads with queries. User profile information (UPI) matches users to products. “Ads would no longer be linked to keywords in a search query, but rather a particular ad would be ‘targeted’ to a particular individual” (p. 74). The prediction accuracy achieved by Google’s algorithms added value to the cost of the targeted ads. Increasing machine intelligence and accumulating supplies of behavioral surplus fueled an “unprecedented logic of accumulation.” On this new frontier, territory is up for grabs, the law has yet to arrive, and the original inhabitants have not secured their rights. Zuboff uses the analogy of the Spanish conquistadors in the early 16th century who, to justify their imperialism, asserted their God-given right, sanctified by the Bible, to the land, people, and resources of the New World (pp. 175-77). This displacement of people in the quest for resources was repeated in the subsequent colonization of the Americas, from Canada to Chile. Many people believe Google and Facebook collect and sell the information we give them to advertisers, but they collect more information than we are aware of, more than we offer freely. The more we expose our public and private interests, the more data these companies accumulate to predict our behaviour and our futures. Selecting Facebook’s Like button signals much more about our moods and preferences than we might imagine, though how the algorithms technically do this is a proverbial black box. (Bring in the technical experts!)
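Why does prediction accuracy add value to a targeted ad? A minimal sketch, under simplifying assumptions: ad auctions rank candidates by expected value, the bid multiplied by a predicted probability of a click. The ad names, bids, and probabilities here are hypothetical, and real auctions are far more elaborate.

```python
# Minimal sketch of expected-value ad ranking: bid * predicted click
# probability. Better prediction (higher, more accurate p_click) lets a
# lower bid win, which is why prediction itself becomes the commodity.

def rank_ads(candidates):
    """Order ads by expected revenue: bid * predicted click-through rate."""
    return sorted(candidates, key=lambda ad: ad["bid"] * ad["p_click"], reverse=True)

ads = [
    {"name": "generic_keyword_ad", "bid": 2.00, "p_click": 0.01},   # keyword match only
    {"name": "targeted_profile_ad", "bid": 1.50, "p_click": 0.04},  # behavioral targeting
]
winner = rank_ads(ads)[0]
print(winner["name"])  # the better-targeted ad wins despite its lower bid
```

On this toy arithmetic, the behaviorally targeted ad is worth 0.06 expected units against 0.02 for the generic one: accuracy, not the bid, decides the auction.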
As their algorithms for data mining improve, so does their ability to predict our behaviours, whether concerned with travel or healthcare.

Asymmetries of Knowledge: Who Knows?

Both in the 16th and 21st centuries, asymmetries of knowledge and power provide unprecedented opportunities for resource extraction. The dot-com crash was felt as an existential threat to the tech giants, who entered a “state of exception” requiring them to redefine their values, principles, and vision for growth (p. 84). Their “hiding strategy”—not explaining exactly what they were up to with data collection—was rationalized as follows: “Behavioral surplus was necessary for revenue, and secrecy would be necessary for the sustained accumulation of behavioral surplus” (p. 88). Users would not appreciate the behavioral data grab without prior conditioning, and Google declared the right to do what it wanted with behavioral surplus. Users agreed to the terms and conditions of this expropriation, which were purposely opaque, misleading, and impossibly long-winded. Besides—here’s the tempting apple—the information served up in exchange is both powerful and convenient. We want to know the future. Privacy is not so much eroded as redistributed. Google uses Search to train its AI capabilities. The more data, the better its predictive power. Google claims it does not sell raw data; instead, it



“sells predictions that only it can fabricate from its world-historic private hoard of behavioral surplus” (p. 96). Google’s algorithms and its ability to aggregate large amounts of data change what users contribute into some new commodity. Consequently, Google and then Facebook (which went public in 2012) have lobbied strenuously against privacy protection legislation because their business model requires the extracted resources to be free (p. 105). Merging prediction with behaviour modification techniques (as promoted by B.F. Skinner and his protégés), these companies nudge and drive behaviours profitable for themselves and their sponsors. The companies claim they are collecting data to improve their products and their users’ lives; meanwhile, they are hoovering up data through all varieties of sensors connected to the Internet of Things (IoT). Think of the smart home with an Alexa to answer questions, a Nest thermostat to track heating and cooling, an iRobot Roomba vacuum cleaner, a Waymo (Google’s self-driving car), Google Earth, and ol’ familiar Search—all servants to our comfort, convenience, quality of life, desire to know, to go, to buy and save. Meanwhile, we are being closely tracked, our contacts traced, our community alerted. Big Other—Google, Facebook, Apple, Amazon, Alibaba, and their ilk—don’t want us to know that aggregated data surplus, previously thought to be a relatively worthless by-product of Search and Find technology, is much more valuable than ad revenues to people who want to know what motivates us when making choices or calculating risk. It is valuable to those who want to predict the future: not only where we will shop or travel, but how we will vote, what equities will rise and fall, where the virus will cause most disruption, where and when civil and political disturbances like Black Lives Matter will burst their restraints.
So powerful have their tools of prediction become that governments, their militaries, security and intelligence agencies, public health officials, and police departments are contracting out their surveillance, prediction, and motivational analyses to Big Other. How long will it be until contact tracing—where you were and who you came close to—is mandatory, like a driver’s licence, passport, or gun registration? Location services on our smartphones must always be turned on—no GPS, no service.

Social Confluence: The Charge in the Global Membrane

For Zuboff, surveillance capitalism is only possible in the digital environment: “a ubiquitous, sensate, networked, computational infrastructure” (p. 20). B.W. Powe’s The Charge in the Global Membrane (2019) brilliantly limns the promise and peril of this digital planetary medium, how it gets under our skin, recalibrates our nervous system and sense of affect, alters our moods. The charge has a propulsive, motivational power to move electrons one way or another. Both Zuboff and Powe want us to become more discerning in our search for information and convenience. What charge attracts the flow of data? What charge repels data, forcing it into hiding? How do memes sow dissension in culture wars?



Surveillance capitalism predicts affect (emotional climate) at the exact moment when people need a confidence boost, at the peak of their vulnerability. Who should I vote for? What does she stand for? Does he really like me? Should I make that investment? Facebook’s 2012 contagion experiments proved that subliminal cues influence offline events. Online click-throughs can be transformed into real-life footfalls. “Instrumentarian power aims to organize, herd, and tune society to achieve a similar social confluence, in which group pressure and computational certainty replace politics and democracy…” (p. 21, italics in original). If you can affect emotional states, you can influence actions. The Cambridge Analytica story illustrates the dark side of behavioral engineering. Whistleblower Chris Wylie, originally from Victoria, BC, revealed how this propaganda enterprise influenced voting behaviour by triggering “innermost demons” (Wylie, p. 132). To circumvent election interference legislation in both the US and UK, Cambridge Analytica was created as an offshoot of its UK parent company, SCL Group. Inspired by the alt-right thinking of Steve Bannon—Trump advisor, formerly of Breitbart News—and supported with funding by conservative US billionaire Robert Mercer, Cambridge Analytica wanted to capitalize on SCL’s extensive experience in voter manipulation in Africa and the Caribbean, among other locations. As a data programmer and engineer, Wylie would help build the tools Bannon needed to influence the outcome of the 2016 US presidential election and the Brexit vote (on the Leave side). Wylie reveals that, as an animating influence, Bannon wanted

to gain cultural power and informational dominance—a data-powered arsenal suited to conquer the hearts and minds in the new battlespace. The newly formed Cambridge Analytica became the arsenal…. In this new war, the American voter became a target of confusion, manipulation, and deception. Truth was replaced by alternative narratives and virtual realities. (p. 16)

To accomplish these goals, CA leveraged a “dark triad” of emotions—narcissism (extreme self-centeredness), Machiavellianism (ruthless self-interest), and psychopathy (emotional detachment)—to ignite anger, short-circuit rationality, and influence the vote. In doing so, CA unleashed an unprecedented pandemic of social confusion and vitriol. For many readers, the true costs of these interventions are beyond measure. To what extent did Cambridge Analytica’s secretive and highly charged operations contribute to divisiveness in the social fabric? Much has been written, and is still being written, on this vexing question.

Autonomy, Agency, Interdependence

Zuboff’s central quarrel with surveillance capitalism—that it undermines individual agency and self-determination—is problematic in an age when the interconnectedness of all species has been thrust to the forefront of social consciousness. The Covid-19 pandemic, climate change, racial and gender (in)equality all reflect current anxieties about specious hierarchies. When personal freedoms collide with collective benefits, remedies are difficult to identify and


implement. The notion of agency (the ability to act in the world) is ambiguously conditioned by many factors, including the law, tradition, culture, personality, and circumstance. Attempting to encourage social distancing and quarantining during Covid-19, politicians and health officials discovered deeply entrenched beliefs about individual freedoms and collective responsibilities. There were challenges: “Who says we have to wear protective masks?” “Who says we can’t work?” “Why can’t I visit my ailing mother in her long-term care home?” (One wonders if the rate of Covid-19 infection is a reliable measure of a nation’s personal autonomy expectations.) Zuboff wants us to answer the question, “Who decides?” Will it be those who have the best data and can make the most accurate predictions? Or will those with political power make life-and-death decisions for us? How does our sense of personal agency factor into this calculation of power? As the interdependence meme propagates, individual autonomy and rights will be increasingly difficult to justify. At its heart, Zuboff’s argument interrogates the central importance of individual freedom, agency, and privacy: “The new harms we face entail challenges to…the elemental rights that bear on individual sovereignty, including the right to the future tense and the right to sanctuary” (p. 54). Understandably, those who promote surveillance capitalism respond by claiming collective benefits: our predictions and contact-tracing apps will help us manage the Covid-19 pandemic; our data gathering will ensure elected officials will follow national priorities; sharing your life and experiences will benefit community-building and foster a sense of belonging. Zuboff counters that capitalism’s “laws of motion”—maximizing profit, reinvesting surplus, improving productivity—are the real motivations for mining behavioral surplus (p. 66).

Unseen Elites: Who Decides?
Central to the narratives of both surveillance capitalism and the Covid-19 pandemic is a resurgence of interest in B.F. Skinner’s theories of operant conditioning and behaviour modification (Skinner, 1971/2002). Operant conditioning rewards specific behavior and withholds rewards to leverage compliance. Related influence techniques such as priming (Iyengar, Peters, and Kinder, 1982; Nyhan and Reifler, 2010), and libertarian paternalism and nudging (Thaler and Sunstein, 2008), have seen renewed interest from behavioral economists who want to explain how these mechanics of persuasion can be put to work. While Nyhan and Reifler (2010) focus on resistance to argument, their research on the “backfire effect” provides useful insights. The backfire effect is engaged when strongly held beliefs are challenged with evidence, and resistance to the evidence increases. The facts of the matter do not outmatch the benefits of belief. If people cease to believe the truth of their holy book or patriotic sentiment, they will become exiles from their communities of faith. The backfire effect at work: “The Covid-19 pandemic is under control following our rapid response. It is safe to return to work to keep the economy strong. In God we trust.”
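The operant-conditioning mechanics behind likes and notifications can be caricatured in a few lines: a variable-ratio schedule delivers rewards unpredictably, which behavioral research associates with unusually persistent responding. The schedule below is a deterministic toy for illustration, not any platform’s actual mechanism.

```python
# Toy sketch of a variable-ratio reinforcement schedule: rewards arrive
# after a varying number of responses, so the subject cannot predict which
# press will pay off. The cycling schedule here stands in for randomness.

def variable_ratio_rewards(presses, schedule):
    """Return which presses are rewarded under a cycling ratio schedule."""
    rewarded, count, i = [], 0, 0
    for press in range(1, presses + 1):
        count += 1
        if count == schedule[i % len(schedule)]:  # reward after N presses, N varies
            rewarded.append(press)
            count = 0
            i += 1
    return rewarded

print(variable_ratio_rewards(10, [2, 3, 1]))  # unpredictable from the subject's side
```

From the subject’s point of view the payoffs look random, and it is exactly that unpredictability which keeps the thumb scrolling.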

Zuboff asks, “Who knows? Who decides? Who decides who decides?” (p. 174).



She anticipates a shift of focus from predicting buyer behavior to social prediction and crowd control: policing, medical emergencies, voting, legislation, military escapades, policy development (as in climate change). Surveillance capitalism retrieves Bernays’ argument in Propaganda (1928) that an “unseen elite” is better suited to make decisions of national importance than a quixotic democratic public. Is there any doubt that democracies are being challenged to hold their own against the scenario planning of those who decide who decides?

Mastering Chance

Alex Pentland (MIT Media Lab, Google Glass developer, Sociometric Solutions / Humanyze) visualizes populations coded not by race, income, occupation, or gender but rather by behavior patterns that predict disease, financial risk, consumer preferences, and political views with high degrees of accuracy (Pentland, 2011, p. 3). His social influence algorithms are designed to induce large numbers of people to seek guaranteed safety, stability, and efficiency based on these predictions. In his scheme, behavior modification is based “on imitation that can be manipulated for confluence” (Zuboff, p. 429). In Pentland’s imagined utopia, “continuous streams of data about human behavior” will be used to predict and modulate “everything from traffic, to energy use, to disease, to street crime” (p. 143). He predicts a “world without war or financial crashes, in which infectious disease is quickly detected and stopped, in which energy, water and other resources are no longer wasted, and in which governments are part of the solution rather than part of the problem.” Human society, nudged forward by collective intelligence, will pursue “social universals” in a “coordinated manner” (p. 143). Social networks will instrumentalize social pressure, especially among those with strong ties of affiliation, such as those cultivated by Facebook and other popular social media platforms (p. 152).
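Pentland’s notion of coding a population by behavior patterns rather than demographics can be sketched as a nearest-prototype classifier: assign each person to whichever “behavioral type” their activity vector sits closest to. The type labels, features, and numbers below are entirely invented for illustration.

```python
# Toy illustration of sorting people by behavior patterns, not demographics:
# each person is a vector of measured activity, assigned to the nearest
# behavioral prototype by Euclidean distance. Labels and data are invented.

import math

def nearest_type(vector, types):
    """Return the label of the behavioral prototype closest to this vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(types, key=lambda label: dist(vector, types[label]))

# hypothetical features: (nightly screen hours, daily purchases, km travelled)
prototypes = {
    "homebody": (5.0, 2.0, 1.0),
    "commuter": (1.5, 0.5, 40.0),
}
person = (4.2, 1.8, 3.0)
print(nearest_type(person, prototypes))  # grouped by conduct, not by census category
```

The unsettling part is not the arithmetic, which is trivial, but the premise: that continuously harvested conduct, not self-description, defines who you are for the system.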
Zuboff insists that individuality threatens Pentland’s instrumentarian society, since it introduces troublesome friction, sucking energy from collaboration, harmony, and integration. Self-determination and autonomous moral judgment, generally regarded as bulwarks of liberal humanism, are recast as threats to collective well-being. The Covid-19 pandemic might be considered a global instrumentarian experiment in which people are asked to sacrifice their self-determination—through social isolation and distancing, wearing masks—in exchange for collective benefits. It could well emerge that the highest rates of Covid-19 infection occur in regions with the highest expectations of personal freedom. Perhaps gun ownership, access to sunbathing beaches, political affiliation, or attendance at large sporting events will show a high correlation with numbers of infections. We see public health officials and political leaders encouraging the public to honour the restrictive safeguards; otherwise, collective solutions will be imposed by law. Of course, data will prove the necessity of following instructions, and data collected during the peak of the pandemic will tell us something about the benefits of cooperation. How successfully did we master chance?



In Lament for a Nation (1965), the Canadian philosopher George Grant comments that technology is “a spirit which excludes all that is alien to itself. As Heidegger has said, technique is the metaphysic of the age” (Grant, 1965/1997, p. 11). Elsewhere, Grant elaborates on Heidegger to characterize the “western” experiment: “The dynamism of technology has gradually become the dominant purpose in western civilisation because the most influential men in that civilisation have believed for the last centuries that the mastery of chance was the chief means of improving the race” (Grant, 1969, p. 113). Could any observation be more germane to the intersection of surveillance capitalism and Covid-19? Unlike totalitarianism, surveillance capitalism seeks not a violent coup from the margins but a “taming” by a technological elite and their political allies. They seek the “elimination of chaos, uncertainty, conflict, abnormality, and discord in favor of predictability, automatic regularity, transparency, confluence, persuasion, and pacification: We are expected to cede our authority, relax our concerns, quiet our voices, go with the flow, and submit to the technological visionaries whose wealth and power stand as assurance of their superior judgment” (pp. 515-16). In this instrumentarian world, Zuboff fears we will sacrifice “the will to will, the sanctity of the individual, the ties of intimacy, the sociality that binds us together in promises, and the trust they breed” (p. 516). We are living in a time of competing visions for the future, where information and belief are locked in a yin-yang embrace.

Surveillance Exceptionalism

Surveillance capitalism is sheltered and enabled by a series of factors.
Collectively, these factors construct its media ecology: neoliberal economic ideology and market self-regulation; freedom-of-expression legislation tied to property rights (corporate actions are considered “free speech” and not to be infringed upon); cyberlibertarian ideology that discourages oversight and external constraints limiting either platform content or the “algorithmic orderings of information produced by their machine operations” (p. 109); Section 230 of the U.S. Communications Decency Act of 1996, which “shields website owners from lawsuits and state prosecution for user-generated content” (p. 110); and an elective affinity between intelligence agencies and the emerging Google surveillance operations that evolved after the unprecedented 9-11 attacks “to produce a unique historical deformity: surveillance exceptionalism” (p. 115).

The convergence of these influences after 2001 created a cultural ecology charged with economic imperatives whose consequences continue to bear heavily in 2020. In this year of seeing clearly, we have much to witness and learn from. Following from Marshall and Eric McLuhan’s The Laws of Media (1988), we are watching the reversal of an overheated medium and the obsolescing of privacy and sanctuary. Total Information Awareness (TIA)—what the US government decided it needed after 9-11—happened to be Google’s holy grail in 2001. Google’s data extraction abilities soon surpassed government capabilities and made it a surveillance collaborator. Since then, revelations by WikiLeaks, Edward Snowden, Chris Wylie, and others; video surveillance by police and military; widespread CCTV installation; and broader awareness of Five Eyes intelligence sharing (by the US, UK, Canada, Australia, and New Zealand) have conditioned people to the ubiquity of citizen surveillance. (The Five Eyes initiative is a work-around for legislation preventing the participating governments from spying on their own citizens.) Intelligence sharing? No problem.

Google’s growing influence is built on its advantage in predicting elections; strenuous lobbying that blurs distinctions between private and public interests; collaboration with the Obama administration (2009-2016); and Google’s efforts to influence academic research and policy formulation (p. 122). Zuboff confirms how “Google licensed Android to mobile handset makers for free because it was intended to draw users into Search and other Google services, establishing a ubiquitous mobile supply apparatus to sustain known terrains of behavioral surplus and open up new ones, including geolocation and mobile payment systems” (p. 133). This is the perfect partnership for Covid-19 contact tracing, smart homes, and smart cities as proposed by Sidewalk Labs (an Alphabet/Google enterprise planning to use technology to solve the challenges facing cities in the future). In 2004, Google acquired Keyhole, a satellite mapping company founded by John Hanke. Keyhole evolved into Google Earth, Google Maps, and the controversial Street View project. “Google’s budding surveillance practices were coveted rather than contested” in the “new habitat of militarized demand” (p. 120). Despite the controversy over building 5G networks, installation continues apace, since 5G will be the backbone of ubiquitous computing and surveillance. One report from the EU identifies these privacy concerns:

We argue that smart-cities are likely to (a) generally weaken privacy by allowing massive data-sets to be cross-referenced, and (b) to obscure the purpose of data collection and thus trump individuals’ perceptions of privacy. We hypothesize that such phenomena may bring a cloud of suspicion on any collection of data and could create a general chilling effect. In such case, even the strong legal safeguards the Court has discovered in recent years may not be enough to prevent it. (Laudrain, 2019)

In July 2020, the US Congress called the chief executives of Amazon (Jeff Bezos), Apple (Tim Cook), Facebook (Mark Zuckerberg), and Google (Sundar Pichai) to account for their predatory surveillance practices and growing power and influence. Whether any legislation they can devise and approve will effectively address these issues remains to be seen; the answer lies beyond the sphere of our crystal balls.

Four Stages of Dispossession

Big Other successfully dispossesses people of their privacy in four stages: incursion, habituation, adaptation, and redirection. Incursion, like the "settling of the West," is a simple matter of theft: we will take this land / data because we can. Google Glass, for example, uses facial recognition software to mine the immediate environment for information. Incursion is typically accompanied by a declaration: "a particular way of speaking and acting that establishes facts out of thin air, creating a new reality where there was nothing. We are claiming this data because we discovered it (John Searle, qtd. Zuboff, p. 176)." With habituation, people learn to live with the reality of data extraction, often with a rationalization in tow: "I'll trade my privacy for convenience or knowledge." In the third stage, adaptation, people develop coping strategies or acceptance; perhaps they use the Tor browser or turn off location services on their smartphones when not in use. If, however, surveillance capitalists come under pressure from public opinion or politicians, they redirect: stonewalling, prevaricating, refusing to change, or tweaking the technology to avoid further controversy.

While Google, Twitter, Facebook, and other sources of news and opinion thrive in a climate of controversy and division—all good for ad revenues—they have been forced to introduce measures identifying fake news and curtailing hate speech. In 2020, Facebook was challenged to identify and quarantine patently false statements about the coronavirus. This redirection certainly does not prevent these companies from prospecting. Prediction in a time of crisis, whether of climate, health, or political power, will be the cash cow. Eric Schmidt, Google's CEO, observes of "modern technology platforms" that "almost nothing, short of a biological virus, can scale as quickly, efficiently, or aggressively as these technology platforms, and this makes the people who build, control, and use them powerful too (qtd. Zuboff, p. 179)." Prediction: Google's technology will contain the Covid-19 pandemic. Count on it. As Bernays predicted in 1928, with knowledge comes power in the age of technology and information. And those who secure monopolies of knowledge are inducted into the boardrooms of the hidden elite, where they strategize "the division of learning in society—the axial principle of social order in an information civilization (p. 179)." There are two texts in the surveillance capitalist's handbook: the text users provide knowingly, and the shadow text of behavioral data extracted from it. The shadow text is a trade secret ruled by exotic algorithms not to be revealed in courts of law or public opinion. Big Other uses the shadow text "to create the world they show us as a fact (p. 185)." They "make the future for the sake of predicting it," argues Zuboff (p. 201). Cambridge Analytica, with the help of Facebook profiles, nudged whole nations towards their predicted outcomes. Those who have access to the data will profit; those without access will be subject to its operations. What will happen when proprietary machine learning and AI algorithms outperform their human masters? Who will teach these machines what they need to know to outperform humans? Will there be a fifth stage of dispossession, and what would it look like?

Prediction Imperative

Information recalibrated as prediction drives, nudges, and conditions behavior. Zuboff does not like this outcome, especially for her children. I share her concern for my granddaughter, aged 14 at the time of writing. She is habituated to her mobile phone and deeply affected by her digital encounters. By her own admission, her self-esteem and sense of agency hang in the balance. She knows she is being influenced. Looking ahead to her adult years, we will be curious to observe her attitudes towards personal fulfilment and social and political engagement. "Personal information is increasingly used to reinforce standards of behavior," Zuboff writes, whereas democracy "depends on a reservoir of individual capabilities associated with autonomous moral judgment and self-determination (p. 190)." Relentless, she continues: "These [surveillance] operations challenge our elemental right to the future tense, which is the right to act free of the influence of illegitimate forces that operate outside our awareness to influence, modify, and condition our behavior (p. 194)." Yes, I am concerned about my granddaughter's future.

The prediction imperative extends extraction operations from the virtual to the real world. Adolescents and young singles experience psychic shocks when they move from initial virtual contact to their face-to-face encounters. Your personal digital assistants know more than you ever will, or you would not be asking for help! Your smart home has a better memory for home maintenance tasks. Telemetry used to monitor the environment or to track animals and people records more than humans are capable of. From tracking refugees to contact tracing during Covid-19, accurate prediction will be the gold standard. Dark data is data that remains in hiding. What remains dark for surveillance capitalists is "menacing, untamed, rebellious, rogue, out of control…Everything must be illuminated for counting and herding (p. 210)." The prediction imperative comes with its own inevitability rhetoric—computing is inevitable, the Internet of Things will become ubiquitous—and acts as a "Trojan horse for powerful economic imperatives (p. 220)." For example, Alphabet's Sidewalk Labs project plans to provide digital kiosks, traffic management, and digital omniscience. Sidewalk CEO Dan Doctoroff, in a talk to industry insiders in April 2016, sketched his vision:

In effect, what we're doing is replicating the digital experience in physical space…. So ubiquitous connectivity; incredible computing power including artificial intelligence and machine learning; the ability to display data; sensing, including cameras and location data as well as other kinds of specialized sensors…. We fund it all…through a very novel advertising model…. We can actually then target ads to people in proximity, and then obviously over time track them through things like beacons and location services as well as their browsing activity. (qtd. Zuboff, pp. 229-230)

A Sidewalk Labs city (once planned for Quayside, Toronto, but since cancelled) would be ideally suited to manage the data and crowd control for unprecedented events such as Covid-19 social distancing or Black Lives Matter demonstrations. Accurate prediction will help keep everyone safe.

Extreme Rendition: Surveillance Capitalism Hates a Vacuum

In surveillance capitalism, rendition is a double-edged sword. Technologies render our experience into data at various points of contact. At every digital interface, we make ourselves available to datafication when we "sur-render" our inner resources, often without full awareness (pp. 232-3). We are complicit in this rendering process when we use the extraction tools available to us. As noted above, steps can be taken to preserve our privacy, if we are willing to forego convenience and research how to become anonymous in the digital world. Rendition becomes more complex when data collection is required of us (mandatory contact tracing, paperless banking), or enforced without our awareness or consent (hidden sensors, social media aggregation). Data extraction enterprises build interfaces to do the digging: the iRobot Roomba vacuum cleaner, the Nest thermostat, geo-targeting, Android location services, wearable technologies, interactive fabrics, fitness trackers, healthcare monitoring, and facial recognition. Surveillance tags along wherever we travel with our mobile phones and internet-connected devices. Rendition increasingly capitalizes on voice recognition (orality), after its earlier days as a primarily text-based enterprise. Conversation with our digital devices can be collected as behavioral data. "Casual talk helps blur the boundaries between 'it'—the apparatus saturated with commercial agents—and us. In conversation we imagine friendship (p. 259)." Google Assistant (formerly Google Now), Microsoft Cortana, Amazon Alexa, Apple Siri, Facebook M: all these personalized digital assistants are merging into an extended conversation with technology.
Alexa, as one example, links to other smart devices in the home. Amazon Lex, the engine powering Alexa, is a service for building conversational interfaces into any application using voice and text; it enables any company to integrate Alexa's brain into its products (aws.amazon.com/lex/). I routinely use voice recognition to talk to my television. One wonders what Walter Ong would say about the power and character of secondary orality in the age of pandemics. Perhaps he would remind us that "Oral discourse has been commonly thought of even in oral milieus as weaving or stitching—rhapsōidein, to rhapsodize (Ong, p. 13)." Hal Varian, Google's chief economist, claimed the moon in 2014 when he announced that Google Now "should know what you want and tell you before you ask the question…[it] has to know a lot about you and your environment to provide these services (Varian, 2014, pp. 28-29; qtd. Zuboff, p. 255)." There was a time when we searched Google and Amazon; now, they search and render us, stitching together the lives we are weaving, one query at a time. On the horizon, then, is machine rendition of the unconscious. In 2015, Realeyes received European Commission funding to develop SEWA (Automatic Sentiment Analysis in the Wild), an "automated technology…able to read a person's emotion when they view content and then establish how this relates to how much they liked the content (sewaproject.eu/)." Paul Ekman's Facial Action Coding System (FACS) builds on his earlier work on the facial expression of universal human emotions and the detection of lying. At MIT, Rosalind Picard continues ground-breaking work on affective computing with Affectiva (web.media.mit.edu/~picard/), emotion recognition software concerned with prediction and rendition of emotion data. Now, "computers can be given the ability to recognize emotions as well as a third-person human observer (qtd. Zuboff, p. 285)." Picard's pioneering work was hijacked, claims Zuboff, by surveillance capitalists who deviated from her original intentions (p. 291).

Economies of Action: Intelligence or Consciousness?

After identification and analysis, data extraction requires actuation, when real-time analytics translate into real-time action. Discovering "economies of action" is one of the goals of behavior modification and involves a variety of approaches designed to encourage compliance: tuning, nudging, priming, herding, conditioning, rewarding, and punishing. Zuboff is not sanguine about this ascending ladder of coercion: "In this phase of the prediction imperative, surveillance capitalists declare their right to modify others' behavior for profit according to methods that bypass human awareness, individual decision rights, and the complex of self-regulatory processes that we summarize with terms such as autonomy and self-determination (p. 297, italics in original)." Many of surveillance capitalism's "innovations" are retrieved from earlier discoveries and insights. Emotional contagion was certainly an interest of Gustave Le Bon in his 1895 classic The Crowd: A Study of the Popular Mind, as it was for Walter Lippmann in Public Opinion (1922) and for Edward Bernays in Propaganda (1928). In the 21st century, much research has been conducted to understand the role of mirror neurons, the neurotransmitter oxytocin, and empathy in conditioning affiliation, receptivity, and decision-making (Kahneman, 2011; Zak, 2012; Damasio, 2010). Empathy contributes to emotional contagion, observes Zuboff: "[A] person's susceptibility to subliminal cues and his or her vulnerability to a 'contagion' effect is largely dependent upon empathy: the ability to understand and share in the mental and emotional state of another person.…[E]mpathy is a 'risky strength' because it predisposes us to experience others' happiness but also their pain (p. 301)." Priming emotional contagion, especially in online environments, is more effective when accomplished below conscious awareness. For the humanist Zuboff, self-awareness is critical to self-determination: "Every threat to human autonomy begins with an assault on awareness (p. 307)."


Human awareness is thus a threat to surveillance revenues, as awareness endangers the larger project of behavior modification. “The competitive necessity of economies of action means that surveillance capitalists must use all means available to supplant autonomous action with heteronomous action (p. 307).” In Zuboff’s ideal society, autonomy (regulation by the self) surpasses heteronomy (regulation by others). People are less susceptible to persuasion when they can “premeditate”: “harness self-awareness to think through the consequences of their actions (p. 307).”

Individual autonomy provides a moral compass for liberal humanists. In Homo Deus (2015), the Israeli historian Yuval Harari argues that human autonomy is a story humanists tell themselves to anchor their beliefs about freedom, independence of thought, and agency. Predicting the future, Harari identifies what he sees as a growing disjunction between Humanists and Dataists:

Humanism holds that experiences occur inside us, and that we ought to find within ourselves the meaning of all that happens, thereby infusing the universe with meaning. Dataists believe that experiences are valueless if they are not shared and we need not—indeed cannot—find meaning within ourselves. We need only record and connect our experiences to the great dataflow, and the algorithms will discover their meaning and tell us what to do (p. 450).

Thinking like a Dataist, Harari ironically argues that "according to current biological dogma, emotions and intelligence are just algorithms…if we could create a data-processing system that can assimilate even more data than a human being, and process it even more efficiently, wouldn't that system be superior to a human in exactly the same way that a human is superior to a chicken (p. 445)." If data reveals that we should wear masks to prevent the spread of Covid-19, shouldn't we do what we're told to be responsible, interconnected citizens? Or does the story we tell ourselves about our individual freedoms take precedence over social well-being? Harari asks, "What's more valuable—intelligence or consciousness? What will happen to society, politics and daily life when non-conscious but highly intelligent algorithms know us better than we know ourselves (p. 462)." I have mixed feelings about Harari's binaries, but I appreciate his challenge to humanist dogma about individual freedom and agency. Evolutionary biology and the hidden life of trees place the interdependence of life forms at centre stage. Human agency is notoriously difficult to calculate, so numerous are the influences acting upon an individual life. While I do not always share Zuboff's passion for individual autonomy and agency over interdependence, I do agree that self-reflection and discernment are necessary antidotes to manipulation by persuasion. And I am equally disturbed by surveillance capitalism's obfuscation of its contract with users, and by how it repurposes our behavioral data for profit. It seems evident that Big Other's business model depends to some extent on secrecy and misdirection, as with tobacco (health) or gas and oil (climate change). Contracts are "islands of predictability" based on trust and promises to perform. "When we join our wills and our promises, we create the possibility of collective action toward a shared future (p. 332)." For Zuboff, collective action is not so much a humanist sentiment as a promise kept. What is Zuboff's advice in this important text? Naming and taming are inextricable.

Productive Uncertainty and Black Swans

It is important to acknowledge that surveillance capitalism—while it can behave in authoritarian ways—does not necessarily retrieve totalitarianism. Totalitarian regimes use the threat of violence to secure territory and power, while surveillance capitalism gains superiority with knowledge, ubiquity, and prediction. Instrumentarian power chooses to operate

…remotely and move in stealth. It does not grow through terror, murder, the suspension of democratic institutions, massacre, or expulsion. Instead, it grows through declaration, self-authorization, rhetorical misdirection, euphemism, and the quiet, audacious backstage moves specifically crafted to elude awareness as it replaces society with certainty. It does not confront democracy but rather erodes it from within, eating away at the human capabilities and self-understanding required to sustain a democratic life. (p. 381)

However, for those seeking political power and wealth, the stratagems of surveillance capitalism, in concert with unprecedented events, can wreak havoc on civil discourse, social coherence, and citizen safety. The Covid-19 pandemic throws a wrench into the social engine; immediate solutions are required, and exceptional remedies proposed. Unscrupulous leaders can exploit these conditions, fostering chaos, fear, and anger to tear apart the social weaving. Recall Steve Bannon's plans for Cambridge Analytica, with its appeal to the dark triad of personality traits. We can witness these strategies at work in the US, Britain, Russia, Brazil, Nigeria, Myanmar, and China, among other locations. Purposeful chaos provides opportunities for regime change, a phenomenon described by Naomi Klein (2007) as "disaster capitalism." Astonishment is a necessary alarm in unprecedented times. We are standing at a crossroads, astonished, indecisive, with a choice between democratic self-determination or instrumentarian prediction and loss of sanctuary. "The steady drumbeat of Big Other's manifest destiny, its breathtaking velocities, and the obscurity of its aims and purpose are intended to disarm, disorient, bewilder (pp. 394-5)."

Isn’t it astonishing, then, to experience at the same moment in history such unprecedented events as the Covid-19 pandemic and successive revelations of surveillance capitalism’s operations across the global membrane? Passing through airport security from Hanoi to Taipei to Vancouver on March 1, 2020 was both disarming and astonishing. Our familiar world was turned inside out by shocking statistics and incredible human responses, both troubling and inspiring. In the aftermath, our world will have learned much about autonomy and cooperation, privacy and transparency, observation and belief.


It will not be easy to forego the addictions and benefits of social networking and digital surveillance, or to overcome the fear of being out of the loop; our decisions will not be easy ones. But we have a better chance of preserving what we honour about human civilization if we go forward more richly informed by evidence. That is Zuboff's hope, and it is mine in writing this review of her remarkable research. I believe everyone who uses digital social media should, to some degree, become familiar with the principles of behavioral data extraction; how it is monetized; how we contribute to that enterprise; and how we can protect ourselves from being misled when entering the digital vortex. To escape this maelstrom of prediction and disenfranchisement, Zuboff joins Thomas Piketty in calling for the protection of democracy. According to Piketty, democracy is the only tool societies have to counter rapacious capitalism: "If we are to regain control of capital, we must bet everything on democracy (Piketty, p. 573; Zuboff, p. 519)."

Who will answer the questions: Who knows? Who decides? Who decides who decides? As we are learning from our unprecedented opportunity to see into the future with the Covid-19 pandemic, "Freedom from uncertainty is no freedom (pp. 522-3)." Black swans have come home to roost.

References

Damasio, A. (2010). Self comes to mind: Constructing the conscious brain. New York, NY: Pantheon Books.

Deibert, R. (2020). Reset: Reclaiming the internet for civil society. Toronto, ON: House of Anansi.

Doctoroff, D. (2016). "Google city: How the tech juggernaut is reimagining cities—faster than you realize." Presentation. https://www.bisnow.com/south-florida/news/technology/sidewalk-toronto-dan-doctoroff-82334

Ekman, P. (Accessed July 19, 2020). Facial Action Coding System (FACS). paulekman.com/facial-action-coding-system/

Gladwell, M. (2019). Talking to strangers: What we should know about the people we don't know. New York, NY: Little, Brown.

Grant, G. (1965/1997). Lament for a nation. Toronto, Canada: Carleton University Press.

Grant, G. (1969). Technology and empire: Perspectives on North America. Toronto, Canada: House of Anansi.

Harari, Y. (2015). Homo deus: A brief history of tomorrow. Toronto, ON: Signal.

Humanyze: Analytics for Better Performance. (Accessed July 19, 2020). www.humanyze.com/

Iyengar, S., Peters, M. & Kinder, D. (1982). Experimental demonstrations of the 'not-so-minimal' consequences of television news programs. The American Political Science Review, 76(4), 848–858.


Kahneman, D. (2011) Thinking, fast and slow. Toronto, Canada: Doubleday.

Klein, N. (2007). The shock doctrine: The rise of disaster capitalism. Toronto, Canada: Alfred Knopf.

Laudrain, A. (August 7, 2019). Smart-city technologies, government surveillance and privacy: Assessing the potential for chilling effects and existing safeguards in the ECHR. Leiden Law School Working Papers Series.

Le Bon, G. (1895/2006). The crowd: A study of the popular mind. West Valley City, UT: Waking Lion Press.

Lippmann, W. (1922/1997). Public opinion. New York, NY: Free Press Paperbacks.

McLuhan, M. & McLuhan, E. (1988). Laws of media: The new science. Toronto, Canada: University of Toronto Press.

Nyhan, B. & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303–330.

Ong, W. (1982). Orality and literacy: Technologizing the word. London, UK: Routledge.

Pentland, A. (2014). Social physics: How social networks can make us smarter. London, UK: Penguin.

Pentland, A. (October 2011). Society's nervous system: Building effective government, energy, and public health systems. MIT Open Access Articles. dspace.mit.edu/handle/1721.1/66256

Picard, R. (Accessed July 2020). Affectiva. web.media.mit.edu/~picard/

Piketty, T. (2014). Capital in the twenty-first century. Cambridge, MA: Belknap Press.

Powe, B.W. (2019). The charge in the global membrane. Blaine, WA: NeoPoiesis Press.

Realeyes. (Accessed July 2020). www.realeyesit.com/

SEWA Project. (Accessed July 2020). sewaproject.eu/

Taleb, N. (2007). The black swan: The impact of the highly improbable. New York, NY: Random House.

Thaler, R. & Sunstein, C. (2009). Nudge: Improving decisions about health, wealth and happiness. London, UK: Penguin.

Varian, H. (2014). Beyond big data. Business Economics, 49(1), 27–31.

Wylie, C. (2019). Mindf*ck: Cambridge Analytica and the plot to break America. New York, NY: Penguin/Random House.


Zak, P. (2012). The moral molecule: The source of love and prosperity. London, UK: Bantam Press.

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: Hachette Book Group.

Notes

In Fall 2020, the Massey Lectures by Ron Deibert directly addressed the disruptions of surveillance capitalism:

Once, it was conventional wisdom to assume that digital technologies would enable greater access to information, facilitate collective organizing, and empower civil society. Rather than facilitating unity and the emergence of a common ideology based on science, the internet and social media have proven to be vehicles used to spread falsehoods, pollute the public sphere, and subject populations to wholesale surveillance. People are also spending an unhealthy amount of time staring at their devices, “socializing” while in fact living in isolation and detached from nature. As a consequence, there are pushes to regulate social media and to encourage tech giants to be better stewards of their platforms, respect privacy, and acknowledge the role of human rights. A prerequisite of any such regulation, however, is a complete understanding of the precise nature and depth of the problems. (houseofanansi.com/products/reset)
