7 Political and ethical perspectives on data obfuscation

Finn Brunton and Helen Nissenbaum*

Asymmetries of data gathering and means of redress: the warrant for obfuscation

Our chapter, like all the others gathered in this volume, is written in light of the fact that computer-enabled data collection, aggregation and mining dramatically change the nature of contemporary surveillance. Innocuous traces of everyday life submitted to sophisticated analytics tools developed for commerce and governance can become the keys for stitching disparate databases together into unprecedented new wholes. This data is often gathered under conditions of profound power imbalance. What can we do when faced with these demands, which are often trivial but whose implications are profound, and which we may not be in a position to refuse?

Being profiled is the condition of many essential transactions, from connecting with friends in online social networks to shopping, travelling and engaging with institutions both public and private. Nor, as we shall discuss below, can we rely on law, technology or the scruples of the data gatherers. What we propose is an alternative strategy of informational self-defence, a method that acts as informational resistance, disobedience, protest or even covert sabotage – a form of redress in the absence of any other protection and defence, and one which disproportionately aids the weak against the strong. We call this method obfuscation and, in this chapter, we will argue for the political and ethical philosophy it expresses and embodies.

Obfuscation is the production of misleading, ambiguous and plausible but confusing information as an act of concealment or evasion. It is a term we use to capture key commonalities in systems ranging from chaff, which fills radar's sweep with potential targets; to the circulating exchanges of supermarket loyalty cards that muddle the record of purchases; to peer-to-peer file sharing systems such as BitTorrent, protecting their users from legal action by producing records of many IP addresses, only a few of which may be engaged in file sharing. Through these and other cases we can begin to clarify obfuscation among the other forms of resistance to surveillance, whether that surveillance takes the form of consumer data aggregation (for supermarkets, or by companies such as Acxiom), monitoring for intellectual property violations (at the behest of the Recording Industry Association of America (RIAA) and the Motion Picture Association of America (MPAA)), targeted advertising (by sites such as Google and Facebook) or police actions by repressive governments (which we will see addressed by obfuscation tactics within platforms for secure private conversation such as Tor).
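To make the shared mechanism concrete, consider a minimal, hypothetical sketch in Python of a decoy-based search obfuscator in the spirit of the systems above: one genuine query is issued among plausible but misleading ones, at irregular intervals, so that the record an observer collects is ambiguous. The decoy list, the send_query stand-in and the timing parameters are invented for illustration and do not describe any particular deployed tool.

    # Illustrative sketch only: a hypothetical decoy-query obfuscator.
    # The decoy list and the send_query() stand-in are assumptions made
    # for this example, not features of any system named in the chapter.
    import random
    import time

    DECOY_QUERIES = [
        "weather tomorrow", "cheap flights to lisbon", "pasta recipes",
        "local hardware store hours", "how to tie a bow tie", "used bicycles",
    ]

    def send_query(query):
        """Stand-in for issuing a real search request over the network."""
        print("query sent:", query)

    def obfuscated_search(real_query, n_decoys=5):
        """Issue the genuine query mixed, in random order and at irregular
        intervals, with plausible decoys, so an observer of the traffic
        cannot easily tell which query reflects the user's actual interest."""
        queries = random.sample(DECOY_QUERIES, k=n_decoys) + [real_query]
        random.shuffle(queries)
        for q in queries:
            send_query(q)
            time.sleep(random.uniform(0.5, 3.0))  # irregular timing resists simple pattern analysis

    if __name__ == "__main__":
        obfuscated_search("support groups for chronic illness")

The particulars matter less than the pattern this toy example shares with chaff, loyalty-card swapping and BitTorrent's crowded address records: the genuine signal is not hidden but buried in plausible noise, raising the cost and lowering the confidence of any analysis performed on the data.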
We distinguish and evaluate different modes of obfuscation as well as motivations and power topologies of key actors: are obfuscation tactics typically the response of the weak against the strong, adopted by those outside circles of power and influence, or vice versa? Our political analysis of obfuscation also addresses normative questions of legitimacy, asking whether such 'smokescreens' to avoid monitoring are morally defensible – ever, never or sometimes? Under what conditions in the political landscape of surveillance are obfuscation's deceptive tactics acceptable? These can be deemed legitimate assertions of autonomy or problematic instances of economic free ridership (relying on others to be less conscientious in muddying their tracks and therefore better targets); they can be hailed as resistance to inappropriate monitoring or damned as the poisoning of wells of collective data. Obfuscation, as a tactic both personal and political, offers a platform for studying legitimate and problematic aspects of both surveillance and its opposition in an age of ubiquitous data capture.

In the context of this volume, we do not need to go out of our way to describe the problematic state of contemporary data gathering and analysis, but we do need to highlight the specific problems of asymmetry these practices, as a matter of fact, often involve. The most mundane points of contact with contemporary life implicate the involuntary production of data on our part: passing security cameras, withdrawing cash, making credit card purchases, making phone calls, using transit (with electronic ticketing systems such as MetroCards, FasTrak tags, Oyster, Octopus, Suica or E-ZPass) – to say nothing of using the internet, where every click and page may be logged and analysed, explicitly providing data to the organisations on whose systems we interact, as well as their associates. This data can be repackaged and sold, collected, sorted and acquired by a variety of means, and reused for purposes of which we, the monitored, know nothing, much less endorse (Gonzalez Fuster 2009). The unreliability of the businesses and public-private partnerships in the information industry gives data mobility still more sinister dimensions, as materials are stolen, leaked, sold improperly or turned to very problematic ends by governments – ChoicePoint's sale of 145,000 records to identity thieves¹ being one particularly egregious example. The nature of these businesses, acquiring new data sets to add to their existing collections, points to a final area of concern. Multiple databases consolidated and cross-referenced, with incidental details linking previously disconnected bodies of information, produce a far more significant whole than any one part would suggest: identities, tendencies, groups and patterns with both historically revelatory and predictive power.²

The asymmetry problems to which we alluded above are, first, an asymmetry of power: rarely do we get to choose whether or not we are monitored, what happens to information about us and what happens to us because of this information. We have little or no say when monitoring takes place in inappropriate contexts and is shared inappropriately with inappropriate others. The second asymmetry, equally important, is epistemic: we are often not fully aware of the monitoring, and do not know what will become of the information produced by that monitoring, nor where it will go and what will be done with it.
Your data is not accumulated in neutral circumstances, whether surveillance occurs at the level of infrastructure with which you must participate, through forms that must be completed to receive essential resources, or onerous terms of service to which you must consent before you can use an online product that has become vital to doing business. The context is often one of major power imbalance, between individual consumers and major corporations, or citizens and governments. Obviously there is nothing inherently wrong with gathering and aggregating data on individuals – it is the lifeblood of the work of the epidemiologist, for example, and the starting point for many benefits of the networked society. It is in the combination of data gathering with authority and its arbitrary interests where problems may begin.

These problems continue once our data has been collected: we do not know whether whoever gathers it will repackage and resell it, whether it will become part of a schedule of assets after a bankruptcy or whether it will be collated by a private party such as ChoicePoint with public records for reassembly and used in a different context from the original point of provision. Data mining and related disciplines are complex and intellectually demanding; they often require resources of expertise, software and hardware that people outside large institutions do not possess. We do not have access to the other databases, nor the techniques and the training in mathematics and computer science, to comprehend what can be done with seemingly trivial details from our lives and activities, and how they can provide more powerful, total and revealing analyses than we could have anticipated (Reiman 1995; Solove 2008). The inconsequential and even benign can quickly become the problematic and sinister.

Furthermore, we do not know what future techniques and databases will enable. Opportunities for the correlation of information tend to increase with time. Institutions very rarely voluntarily destroy materials with as much potential as a rich database and, as Templeton (2009) points out, the mechanisms to extract value from databases are only going to get better. Materials from very different contexts, created in conditions of many different norms – telephone call logs, geolocative data, purchase records whether in person or online, demographic and personally identifying information, products of the data generating machines that are social networking sites – can be combined, correlated and cross-referenced with less and less effort.

Finally, the gravity of the potential consequences of this mass of data is not easily perceived in the many small moments when we are faced with a decision about whether or not to comply, and give up information.