
APC Issue Papers

Big Data and Sexual Surveillance

By Nicole Shephard

Keywords: surveillance, gender, sexuality, race, big data, LGBTIQ, women human rights defenders, privacy, anonymity, consent

Summary

Surveillance has historically functioned as an oppressive tool to control women's bodies and is closely related to colonial modes of managing populations. Big data, metadata and the technologies used to collect, store and analyse them are by no means neutral, but come with their own exclusions and biases. This paper highlights the gendered and racialised effects of data practices; outlines the overlapping nature of state, commercial and peer surveillance; and maps the challenges and opportunities women and queers encounter on the nexus between data, surveillance, gender and sexuality. Vulnerable communities as well as sexual rights activists are at heightened risk of data-driven modes of surveillance. In addition to exposing and addressing algorithmic discriminations, feminist data practices oppose the non-consensual collection of data, amplify participatory data projects that empower women and sexual minorities, and protect the data, privacy and anonymity of activists and the communities they work with.

Main concepts

Big data: Vast datasets containing social media data, machine data or transactional data that can be analysed computationally to reveal patterns, trends and predictions about human behaviour and social life.

Dr Nicole Shephard is an independent researcher and writer working on the intersections between gender, sexuality and technology. She holds a PhD in Gender (LSE) and an MSc in International Development (University of Bristol).

Metadata: Information describing other data; in the context of email, telephone calls or text messages, metadata includes the sender, receiver, devices, locations, service providers, IP addresses, time, length or size of a message.

Dataveillance: Combines data and surveillance to describe systematic data-based surveillance practices that involve sorting and aggregating large quantities of data to monitor, track and regulate people and populations.

Assemblage: Rather than individual technologies or datasets, this theoretical concept emphasises the intersecting nature of institutions and processes at work in abstracting and reassembling bodies through data, and the fluid socio-technical, economic and political contexts that data and surveillance are embedded in.

Key facts

• Whistleblowers including Edward Snowden,1 Thomas Drake,2 Bill Binney and Russell Tice3 disclosed that contemporary data-driven surveillance practices carried out by the US National Security Agency (NSA), the United Kingdom Government Communications Headquarters (GCHQ) and their allies globally are ubiquitous and pervasive.

• With state surveillance at the forefront of public debates, it is important to not lose sight of a wider range of sexual surveillance practices historically functioning as a "tool of patriarchy, used to control and restrict women's bodies, speech, and activism."4

• The recognition that the data and metadata at stake in surveillance are never neutral5 is central in relation to gender, sexuality and race.

• Women, people with disabilities, refugees, sexual minorities, people receiving state benefits, or incarcerated populations, among others, can testify to myriad ways in which their privacy has habitually been invaded by surveillance practices in the private as well as in the public sphere.6

• Big data is generated in many places – social media, global positioning system (GPS) data, radio frequency identification (RFID) data, health data, financial data or phone records. It extends to sexual and reproductive health, as evidenced by recent research on the spread of menstruation, pregnancy and fertility apps.7 Experts project 40% annual growth in data generated globally.8,9

• Big data has found its way into gender and development discourse, as evidenced by ongoing initiatives that seek to harness data for development10 and strive to close the gender gap by closing the data gap.11

• Large-scale social media data is mined for operative and marketing purposes by corporations.12 Consent is blurred when social media data is then collected by governments and aggregated with other available databases.

• Women's rights and sexual rights activists do not always hold sufficient technological privilege to mitigate against the risks inherent in their work and to minimise potential adverse effects of surveillance (exposure, harassment, violence) for themselves and vulnerable groups they work with.

• Big data and dataveillance run the risk of algorithmically reproducing past discriminations,13 creating new exclusions and digital discriminations,14 and exacerbating epistemic violence that marginalised communities are subject to.15

1 https://edwardsnowden.com/revelations
2 Radack, J., Drake, T., & Binney, W. (2012). Enemies of the State: What Happens When Telling the Truth about Secret US Government Power Becomes a Crime. Presentation at Chaos Communication Congress. https://media.ccc.de/v/29c3-5338-en-enemies_of_the_state_h264
3 Democracy Now! (2006, 3 January). National Security Agency Whistleblower Warns Domestic Spying Program Is Sign the U.S. is Decaying Into a "Police State". www.democracynow.org/2006/1/3/exclusive_national_security_agency_whistleblower_warns
4 Association for Progressive Communications (APC). (2016, August). Feminist Principles of the Internet 2.0. www.apc.org/en/pubs/feminist-principles-internet-version-20
5 Kitchin, R., & Lauriault, T. P. (2014). Towards critical data studies: Charting and unpacking data assemblages and their work. The Programmable City Working Paper 2, SSRN. papers.ssrn.com/sol3/papers.cfm?abstract_id=2474112
6 Dubrofsky, R. E., & Magnet, S. A. (2015). Feminist Surveillance Studies. Durham: Duke University Press.
7 Rizk, V., & Othman, D. (2016). Quantifying Fertility and Reproduction through Mobile Apps: A Critical Overview. Arrow for Change, 22(1). www.arrow.org.my/wp-content/uploads/2016/08/AFC22.1-2016.pdf
8 Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011). Big data: The next frontier for innovation, competition, and productivity. www.mckinsey.com/business-functions/business-technology/our-insights/big-data-the-next-frontier-for-innovation
9 https://e27.co/worlds-data-volume-to-grow-40-per-year-50-times-by-2020-aureus-20150115-2/
10 UN Global Pulse. (2012). Big Data for Development: Challenges & Opportunities. New York: United Nations.
11 Plan International. (2016). Counting the Invisible: Using Data to Transform the Lives of Girls and Women by 2030. Woking: Plan International.
12 Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures & their Consequences. London: Sage.
13 Conrad, K. (2009). Surveillance, Gender, and the Virtual Body in the Information Age. Surveillance & Society, 6(4), 380–387.
14 Lyon, D. (2003). Surveillance as social sorting: Computer codes and mobile bodies. In D. Lyon (Ed.), Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination (pp. 13–30). London: Routledge.
15 Gurumurthy, A., & Chami, N. (2016, 31 May). Data: The New Four-Letter Word for Feminism. GenderIT.org. www.genderit.org/articles/data-new-four-letter-word-feminism


Introduction

This issue paper aims at mapping sexual surveillance by exploring the key issues at stake on the nexus between surveillance, gender and sexuality. Taking the recognition that surveillance has historically functioned as a "tool of patriarchy, used to control and restrict women's bodies, speech, and activism"16 as a starting point, it explores the implications of data-driven modes of surveillance for women and queers. It does so from an intersectional17 perspective, as gender and sexuality never take place in isolation but interact with race, religion, class, geo-political location, health or bodily diversity.

Big data is generated in many places – social media, global positioning system (GPS) data, radio frequency identification (RFID) data, the internet of things, health data, financial data or phone records are just a few examples of data sources. Some of these data are knowingly created, for example by updating a status, posting an image or writing a tweet. Others are the side product of using services and their features, for example swiping a card or using/carrying a phone.

Big data evangelists have boldly declared "the end of theory"18 as vast quantities of numbers are taken to speak for themselves. Critics have convincingly demonstrated that data, no matter how big, are only ever a representation and a sample, never a population,19 and that so-called "raw data" is in fact never quite raw:20 all methods of collecting, organising and analysing data are cooked up by somebody. Similarly, "metadata" such as the sender, receiver, devices, locations, time and length of a communication (but not its content), often constructed as unproblematic in terms of the right to privacy, can provide "insight into an individual's behaviour, social relationships, private preferences and identity that go beyond even that conveyed by accessing the content of a private communication."21 The recognition that the data and metadata at stake in surveillance are never neutral is central in relation to gender, sexuality and race.
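To make the metadata point concrete, here is a minimal sketch (not drawn from the paper; the records and names are entirely hypothetical) of how a handful of bare call records – who contacted whom, and when, with no content at all – already expose relationships and sensitive contacts:

```python
# Hypothetical call-detail records: (caller, callee, hour_of_day) - no content.
from collections import Counter

records = [
    ("alice", "clinic", 9), ("alice", "bob", 23), ("alice", "bob", 1),
    ("alice", "bob", 0), ("alice", "helpline", 22), ("alice", "work", 10),
]

# Frequency alone reveals alice's closest contact.
contacts = Counter(callee for caller, callee, hour in records if caller == "alice")
# Timing alone reveals a late-night communication pattern.
late_night = {callee for caller, callee, hour in records
              if caller == "alice" and (hour >= 22 or hour <= 5)}

print("most frequent contact:", contacts.most_common(1))  # [('bob', 3)]
print("late-night contacts:", late_night)                 # {'bob', 'helpline'}
# A logged call to a clinic or helpline is sensitive in itself,
# regardless of what was said.
```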

This paper aligns itself with a growing body of scholarship taking a feminist approach to surveillance studies, as well as with emerging work that draws attention to the colonial continuities and racial implications of data and surveillance. It explores the relationship between big data and sexual surveillance, as well as the challenges and opportunities that arise for women, queers and their advocates where data and surveillance meet gender and sexuality. Empowering uses of data such as community mapping projects take place alongside efforts to push back against, subvert and resist illegitimate surveillance and its adverse effects.

Ultimately, the paper argues for feminist data practices that are attentive to the agency and consent of those involved. Such practices:

• Oppose and resist the non-consensual collection of data.
• Amplify data projects that empower women and sexual minorities.
• Take adequate care of protecting the data, privacy and anonymity of activists and the communities they engage with.
• Work to expose and level algorithmic discriminations.

16 Association for Progressive Communications (APC). (2016). Feminist Principles of the Internet 2.0. www.apc.org/en/pubs/feminist-principles-internet-version-20
17 Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989, 139–167.
18 Anderson, C. (2008, 23 June). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. Wired. www.wired.com/2008/06/pb-theory/
19 Kitchin, R. (2014). Big Data, new epistemologies and paradigm shifts. Big Data & Society, 1(1), 1–12.
20 Gitelman, L. (Ed.). (2013). "Raw Data" Is an Oxymoron. Cambridge, MA: MIT Press.
21 United Nations, OHCHR. (2014). The Right to Privacy in the Digital Age. undocs.org/A/HRC/27/37


Big data <-> Surveillance

Big data have been defined in many ways. Kitchin22 characterises them as high in volume and velocity, diverse in variety, exhaustive in scope, fine-grained in resolution, relational, and flexible in scalability. Recent years have witnessed an exponential growth in data volume, and experts project 40% annual growth in data generated globally.23

Yet, hyperbolic statements like declaring the death of social scientific analysis24 because now "n=all"25 form part of big data myths26 that have close ties with the spectre of objectivity so well known to feminist critics of knowledge production. Here it appears in the form that bigger data are considered somehow more true, accurate and objective than other modes of producing knowledge.

Using the example of Twitter data, boyd and Crawford27 explain succinctly why n, no matter how large, never equals all:

Twitter does not represent "all people", and it is an error to assume "people" and "Twitter users" are synonymous: they are a very particular sub-set. Neither is the population using Twitter representative of the global population. Nor can we assume that accounts and users are equivalent. Some users have multiple accounts, while some accounts are used by multiple people. Some people never establish an account, and simply access Twitter via the web. Some accounts are "bots" that produce automated content without directly involving a person.

Twitter data illustrates the partiality of n well, but it is worth noting that every trove of data comes with its own exclusions.

Just as big data cannot represent entire populations, aggregating "all" data about a single person never fully represents but partially profiles particular individuals. In addition to never being comprehensive, data are also never neutral. Data are "situated, contingent, relational, and framed, and used contextually to try and achieve certain aims and goals."28 On Twitter individuals purposefully create (mostly) public data. In other instances, they actively consent to the collection of their data for specific purposes such as advertising or research, and in yet others they are unaware of becoming data points.

Consent is blurred when social media data, however willingly created and shared, is collected in bulk by governments and aggregated with other available databases. Neither are the databases and repositories that hold the data neutral. They are "complex socio-technical systems that are embedded within a larger institutional landscape"29 that includes research institutions, corporations, as well as government agencies concerned with security, citizenship or public health, and is imbued with power relations.

Scholars have observed quantitative and qualitative shifts in surveillance practices, as the so-called "war on terror" goes hand in hand with the increased availability of big data. Lyon30 notes how data are not only "captured differently, they are also processed, combined, and analysed in new ways. Social media that appeared on the scene at roughly the same time as responses to 9/11 boosted the 'surveillance state', are now the source of much data, used not only for commercial but also for 'security' purposes."

22 Kitchin, R. (2014). Op. cit.
23 Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011). Big data: The next frontier for innovation, competition, and productivity. www.mckinsey.com/business-functions/business-technology/our-insights/big-data-the-next-frontier-for-innovation
24 Anderson, C. (2008). Op. cit.
25 Mayer-Schönberger, V., & Cukier, K. (2013). Op. cit.
26 boyd, d., & Crawford, K. (2012). Critical Questions for Big Data. Information, Communication & Society, 15(5), 662–679.
27 Ibid.
28 Kitchin, R., & Lauriault, T. P. (2014). Towards critical data studies: Charting and unpacking data assemblages and their work. The Programmable City Working Paper 2, SSRN. papers.ssrn.com/sol3/papers.cfm?abstract_id=2474112
29 Kitchin, R., & Lauriault, T. P. (2014). Op. cit.
30 Lyon, D. (2014). Surveillance, Snowden, and Big Data: Capacities, consequences, critique. Big Data & Society, 1(2), 1–13.


The US Customs and Border Protection Agency, for instance, currently proposes collecting information on visitors' social media accounts upon entry to the country. The measure aims at providing "greater clarity and visibility to possible nefarious activity and connections" in the assessment of "potential risks to national security and the determination of admissibility."31

The concept of "dataveillance" expresses these shifts and describes practices of sorting and aggregating vast datasets to track and regulate populations: "Dataveillance in the present moment is not simply descriptive (monitoring) but also predictive (conjecture) and prescriptive (enactment)."32 It arguably differs from targeted surveillance: "Whereas surveillance presumes monitoring for specific purposes, dataveillance entails the continuous tracking of (meta)data for unstated preset purposes."33 Targeted surveillance thus requires a suspect to monitor for a purpose, while dataveillance generalises suspicion and algorithmically produces suspects, thus turning the assumption of innocence until proven guilty on its head.

Marx34 details this shift to new modes of surveillance along a staggering 28 dimensions. Most pertinent to a discussion of sexual surveillance are the following:

• New surveillance often lacks consent, with higher proportions of involuntary production/collection of data.
• The location of data and its collectors/analysts is often remote and less visible.
• After collection the data is stored remotely and migrated often.
• The temporality of new surveillance is continuous, omnipresent, and covers past, present and future occurrences of data.
• It is acontextual.
• Whole populations rather than individuals are surveilled.

An understanding that data never emerge in isolation, and are always contingent on context, technologies, humans and their algorithms that collect, sort and analyse them, as well as on the power relations that all of the above are imbued with, is the basis for conceptualising data and its entanglements with surveillance as assemblage.35

Haggerty and Ericson36 theorise a surveillant assemblage where information, technology and the human body interact to create a somewhat deterritorialised "data double" to be tracked, commodified, managed and controlled. Kitchin describes a data assemblage made up of systems of thought, forms of knowledge, finance, political economy, governmental and legal frameworks, infrastructure, materiality, institutions, places, subjectivities, communities, and markets that all intersect with one another.37 Such a conceptual approach leaves room for highly localised data practices as well as attention to the ways in which big data extend to "the global, through inter-regional and worldwide data sets, data sharing arrangements and infrastructures, and the formulation of protocols, standards and legal frameworks."38

Aradau and Blanke see modes of knowledge production, technological devices, institutions, and methods as part of a data-security assemblage with stakes in civil liberties, security, privacy, and data protection.39

31 Dept. of Homeland Security, U.S. Customs and Border Protection. (2016). Agency Information Collection Activities: Arrival and Departure Record (Forms I-94 and I-94W) and Electronic System for Travel Authorization. www.federalregister.gov/documents/2016/06/23/2016-14848/agency-information-collection-activities-arrival-and-departure-record-forms-i-94-and-i-94w-and#h-11
32 Raley, R. (2013). Dataveillance and Countervailance. In L. Gitelman (Ed.), "Raw Data" Is an Oxymoron. Cambridge, MA: MIT Press.
33 van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.
34 Marx, G. T. (2002). What's new about the "new surveillance"?: Classifying for change and continuity. Surveillance & Society, 1(1), 9–29.
35 The notion of assemblages is based on the work of philosophers Gilles Deleuze and Félix Guattari, who in A Thousand Plateaus (1987, pp. 102-103) define it along two axes: "On a first, horizontal, axis, an assemblage comprises two segments, one of content and one of expression. On the one hand it is a machinic assemblage of bodies, of actions and passions, an intermingling of bodies reacting to one another; on the other hand it is a collective assemblage of enunciation, of acts and statements, of incorporeal transformations attributed to bodies. Then on a vertical axis, the assemblage has both territorial sides, or reterritorialized sides, which stabilize it, and cutting edges of deterritorialization, which carry it away."
36 Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605–622.
37 Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures & their Consequences. London: Sage.
38 Kitchin, R., & Lauriault, T. P. (2014). Op. cit.
39 Aradau, C., & Blanke, T. (2015). The (Big) Data-security assemblage: Knowledge and critique. Big Data & Society, 2(2), 1–12.


In a nutshell, thinking of big data and (sexual) surveillance in terms of an assemblage highlights how much more than mere data is at stake, and that the data cannot be thought independently of its wide and rather fluid context. The relationship between technology and humans, as well as the discourse around security and big data that justifies ever more surveillance, form an integral part of this assemblage,40 as do everyday practices that produce digital traces, practices of resistance against dataveillance, and efforts to use big data for positive change.

Surveillance <-> Sexual surveillance

Gender and sexualities have only recently found their way into the study and discourse of surveillance. This section, however, maps ways in which women and sexual minorities have long been highly visible to technologies of surveillance. The racialised, sexualised and gendered outcomes of sexual surveillance form part of what Lyon describes as mechanisms of social sorting: "Surveillance today sorts people into categories, assigning worth or risk, in ways that have real effects on their life-chances. Deep discrimination occurs, thus making surveillance not merely a matter of personal privacy but of social justice."41

Data have historically been used to categorise and manage populations. Big data are but the latest trend in a long tradition of quantification with roots in modernity's fetishisation of taxonomy in the service of its institutional order. Big data and machine learning may take these practices to unprecedented levels, but it is important to not lose sight of the historical continuities the data/surveillance assemblage is embedded in, particularly where gender, race and sexuality are concerned.

Gender, sexuality and surveillance

Browne describes surveillance as "a technology of social control" that produces racial norms and has the power to "define what is in or out of place." She argues that this racialised ordering is fluid and contextual but "most often upholds negating strategies that first accompanied European colonial expansion and transatlantic slavery that sought to structure social relations and institutions in ways that privilege whiteness."42

Smith notes how contemporary surveillance, so concerned with seeing as much as possible, is equally about "not-seeing" its heteropatriarchally structured colonial legacy.43 The interdependencies between colonial rule, its particular modes of looking/seeing, collecting, recording and managing, as well as the complex ways in which gender, race, class and sexuality shaped colonialism, are well documented.44,45

Small highlights the continuities between antebellum surveillance of slaves and contemporary racial profiling that relies on surveillance mechanisms disproportionately targeting poor people of colour who lack the technological privilege to opt out.46 Post-9/11 modes of profiling non-white people, and Muslims in particular, are unthinkable without considering the racist and orientalist colonial logics long predating 9/11 that enable them.

Tamale furthermore shows how colonial legacies, alongside capitalism and globalisation, continue to shape contemporary regimes of sexual surveillance in Africa.47 She maps a complicated patriarchal landscape where state-supported religion (Christianity and Islam with their emphasis on morality, purity and sin) intersects with cultural taboos and the law to inform and control African sexualities. Sexuality is often regulated by colonially inherited penal codes that retain the criminalisation of adultery, prostitution, abortion, sodomy and elopement while protecting marital rape and allowing for defences based on "mistaken belief" or the victim's supposed immorality.48

40 Ibid.
41 Lyon, D. (2003). Introduction. In D. Lyon (Ed.), Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination (pp. 13–30). London: Routledge.
42 Browne, S. (2015). Dark Matters: On the Surveillance of Blackness. Durham: Duke University Press.
43 Smith, A. (2015). Not-Seeing: State Surveillance, Settler Colonialism, and Gender Violence. In R. E. Dubrofsky & S. A. Magnet (Eds.), Feminist Surveillance Studies (pp. 21–38). Durham: Duke University Press.
44 Stoler, L. A. (2010). Carnal Knowledge and Imperial Power: Race and the Intimate in Colonial Rule. Berkeley: University of California Press.
45 McClintock, A. (1995). Imperial Leather: Race, Gender and Sexuality in the Colonial Contest. New York: Routledge.
46 Small, D. (2014, 8 October). Feeling Some Kind of Way About Surveillance. Model View Culture. www.modelviewculture.com/pieces/feeling-some-kind-of-way-about-surveillance
47 Tamale, S. (2014). Exploring the contours of African sexualities: Religion, law and power. African Human Rights Law Journal, 14(1), 150–177.
48 Ibid.


As a result, African sexual rights movements operate in environments where some choose to submit to sexual surveillance practices – Tamale cites women agreeing to virginity tests for respectability's sake as an example. Conversely, those who cannot conform (sex workers, queers, rape survivors, or those with HIV, amongst others) find themselves further marginalised. The vilification, surveillance and control of the "sexual other" is politically instrumentalised, for example when sexuality is constructed as today's "key moral issue on the continent" to distract from the mismanagement and corruption at the root of unemployment and other social problems.49

While Tamale's analysis relies on the similarities in sexual politics across countries, Hodes'50 work on the South African context – where law and public policy were largely re-written post-apartheid and abortion has been legally available since 1997 – presents further historical continuities in regimes of sexual surveillance. She finds that the proportion of women seeking illegal and unsafe abortions has barely changed since the implementation of the "Choice Act" legalising abortion in 1997. Women fear the social stigma attached to abortion as well as punitive treatment by healthcare professionals, many of whom condemn abortion and only reluctantly comply with the Choice Act.51 Rather than submitting to the breaches of confidentiality and privacy built into the legal provision of abortions (crowded facilities, thin walls) and the culture of disapproval among healthcare professionals, many women evade sexual surveillance and protect their privacy by resorting to risky illegal abortions, thus upholding public "norms of silence and secrecy."52 Hodes concludes that post-apartheid abortion culture, with abortion legal but publicly condemned, shows significant continuities with the past, when abortion was illegal but privately sanctioned. Not only does the state remain as invested in sexual surveillance as during the apartheid era, women continue to seek clandestine abortions in large numbers, and the state faces similar limitations in achieving reproductive control.53

In addition to highlighting the gendered and racialised implications of surveillance and its colonial and patriarchal continuities, a feminist lens extends attention to practices that conventionally have not been studied as surveillance proper, to reveal the asymmetrical exercise of power along gendered, racialised and sexualised lines. Drone footage, CCTV monitoring, wiretapping or the bulk collection of communications data apply to everyone – if, as Small points out, to diverging effects: "We can apparently kiss old assumptions about privacy goodbye. This is especially true for groups without access to technological privilege, who must also deal with race, ethnicity, class, gender, and lifestyle biases against them."54

Harry highlights, for instance, how commonplace the exploitative monitoring of black women by law enforcement, media commentators and online communities is. Stemming from cultural values long predating surveillance cameras, it continues to extend pervasively to contemporary digital spaces and at times includes white feminists' encounters with black women online.55

As relevant as conventional surveillance technologies, however, are public health measures, fertility screenings, birth certificates, social media postings, and other everyday practices that incur data outside of our personal control.56 Andrejevic argues that the study of surveillance needs to include all those "interests, pressures, prejudices, and agendas" and "forms of control that operate in the name of security, efficiency, risk management, and so on, while simultaneously obscuring the forms of gendered, raced, classed, and sexualised discrimination they advance in the name of an allegedly general interest."57 The contemporary debate tends to frame this general interest in terms of national security, but the examples discussed here illustrate that public health or population control can be equally complicit.

In a case brought to APC's attention by the Bangladeshi NGO BFES,58 for instance, their breast cancer project faced government pressure to hand over non-anonymised data on affected women they had been in contact with, prompting the NGO to develop a system to safeguard their identity. In an environment where women risk being left by their husbands seeking to avoid liability for medical bills, this data in the wrong hands would have exacerbated the already severe social and familial stigma for women seeking cancer care and information in rural areas.59

49 Ibid.
50 Hodes, R. (2016). The culture of illegal abortion in South Africa. Journal of Southern African Studies, 42(1), 79–93.
51 Ibid.
52 Ibid.
53 Ibid.
54 Small, D. (2014, 8 October). Op. cit.
55 Harry, S. (2014, 6 October). Everyone Watches, Nobody Sees: How Black Women Disrupt Surveillance Theory. Model View Culture. www.modelviewculture.com/pieces/everyone-watches-nobody-sees-how-black-women-disrupt-surveillance-theory
56 Dubrofsky, R. E., & Magnet, S. A. (2015). Op. cit.
57 Andrejevic, M. (2015). Foreword. In R. E. Dubrofsky & S. A. Magnet (Eds.), Feminist Surveillance Studies. Durham: Duke University Press.
58 www.amadergram.org
59 Communication between BFES and APC.


Drawing attention to continuities between past and present forms of sexual surveillance, highlighting small-scale and/or analogue sexual surveillance practices that a sole focus on big data might obscure, and attending to the ways in which data is partial and processed by potentially sexist, racist and Islamophobic algorithms at work in dataveillance are pressing feminist tasks.

The algorithmic methods and categorisations that dataveillance relies on by definition operate by proximity to a norm. More often than not, the norm all data bodies are measured against turns out to be white, male, cisgendered and heterosexual. As Conrad notes, "predictive models fed by surveillance data necessarily reproduce past patterns. They cannot take into effective consideration randomness, 'noise', mutation, parody, or disruption unless those effects coalesce into another pattern."60

Dataveillance thus by definition runs the risk of reproducing past discriminations and marginalisations through new modes of algorithmic discrimination. As public health, development and state security as well as private industries increasingly move online and embrace big data, feminist attention to the data practices involved (as well as to those left behind due to a lack of access) is timely and warranted.
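Conrad's point about predictive models can be made concrete with a deliberately small sketch. The code below is illustrative only (synthetic data and hypothetical variable names, not drawn from the paper): a classifier is trained on past decisions that penalised one group, with the protected attribute itself withheld from training, and it still reproduces the discrimination because a correlated proxy feature leaks it.

```python
# Toy demonstration of algorithmic reproduction of past discrimination.
# All data is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)            # protected attribute (0/1)
merit = rng.normal(0, 1, n)              # legitimate signal
proxy = group + rng.normal(0, 0.3, n)    # e.g. postcode: correlated with group

# Historical decisions penalised group 1 regardless of merit.
past_decision = (merit - 1.2 * group + rng.normal(0, 0.5, n)) > 0

# Train only on seemingly neutral features; the protected attribute is excluded.
model = LogisticRegression(max_iter=1000).fit(
    np.column_stack([merit, proxy]), past_decision)

# At identical merit, the model still scores the two groups differently:
# it has learned the past bias through the proxy.
test = np.column_stack([np.zeros(2), np.array([0.0, 1.0])])
print(model.predict_proba(test)[:, 1])   # lower probability for proxy = 1
```

Simply dropping the protected attribute, in other words, does not "de-bias" a model trained on a biased history – hence the call here for scrutiny of the data practices themselves.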

The body under surveillance

Monahan61 conceptualises three overlapping gendered dimensions of surveillance:

• Body discrimination
• Context or use discrimination
• Discrimination by abstraction.

Often entangled in practice, they offer guidance in thinking through sexual surveillance beyond using gender and sex to disaggregate (big) data.

Body discrimination refers to "technologies that simply are not designed with a full range of bodies in mind. These technologies privilege certain bodies – usually male, young, White, and able ones – over others."62 Voice recognition software that struggles with non-male voices, facial recognition software that struggles with non-white faces, or full body scanners singling out non-normative bodies as suspicious are all instances of body discrimination.

By contrast, context/use discrimination refers less to those surveilled than to those doing the surveilling and the context the surveillance takes place in. Reminiscent of Mulvey's "male gaze",63 it describes the masculinised remote monitoring of feminised spaces. Monahan notes that "when social contexts are already marked by sexist relations, then surveillance (and other) technologies tend to amplify those tensions and inequalities."64

Discrimination by abstraction refers to the ways in which context does not translate into the representation of data, "leaving a disembodied and highly abstract depiction of the world and what matters in it."65 This poses problems when inequalities are not represented in the data, further obscuring existing discriminations.

Van der Ploeg66 furthermore describes the "informatisation of the body", which has the capacity to affect embodiment and bodily experience, and questions the possibility of a neat distinction between the body itself and its digital representation. Along with "the body as data" come numerous ways in which "bodies can be monitored, assessed, analysed, categorised, and, ultimately, managed."67

60 Conrad, K. (2009). Surveillance, Gender, and the Virtual Body in the Information Age. Surveillance & Society, 6(4), 380–387.
61 Monahan, T. (2009). Dreams of Control at a Distance: Gender, Surveillance, and Social Control. Cultural Studies <-> Critical Methodologies, 9(2), 286–305.
62 Ibid.
63 Mulvey, L. (1975). Visual Pleasure and Narrative Cinema. Screen, 16(3), 6–18.
64 Monahan, T. (2009). Op. cit.
65 Ibid.
66 van der Ploeg, I. (2003). Biometrics and the body as information. In D. Lyon (Ed.), Surveillance as Social Sorting (pp. 57–73). London: Routledge.
67 van der Ploeg, I. (2012). The body as data in the age of information. In K. Ball, K. D. Haggerty, & D. Lyon (Eds.), Routledge Handbook of Surveillance Studies (pp. 176–183). Oxon: Routledge.


Hailed as progressive by some, Facebook's inclusion of more than 50 genders in 2014,68 for instance, paid little more than lip service to transgender and queer critiques of the previously binary gender options, as neither targeted advertising on Facebook nor its application programming interface (API) reflects users' "custom" gender. Having these options affords those outside the male/female binary the opportunity to see the gender identity they embody represented on screen and thus digitally represent themselves. At the same time, the options draw attention to critical questions about the coding of gender (and sexuality) in data. The inclusion of diverse gender options can be read as a step towards overcoming gender binaries, but it simultaneously counteracts feminist notions of blurring gendered categorisations and promoting views of gender and sexuality as not fixed. The sorting and managing of bodies, particularly vulnerable or marginalised ones, is always already implicated in gendered, sexualised and racialised ways of seeing, and by extension, of coding and categorising.

While the "custom" gender option is information the user intentionally shares with friends, a recent study shows how an analysis of Facebook "likes" can reveal a user's gender with 93% accuracy, sexuality with 75%-88% accuracy, race with 95% accuracy and relationship status with 67% accuracy.69 Big data emerges as a rare instance where less inclusion may make for a stronger feminist argument, particularly considering that more detailed data goes hand in hand with increased surveillance and further marketisation.
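The mechanics behind such predictions are not exotic: Kosinski and colleagues reduced a large user-by-like matrix with singular value decomposition and fitted regression models on top. The sketch below is purely illustrative (synthetic data, no dimensionality reduction; the accuracy figures above come from tens of thousands of real profiles, not from this toy) and shows the basic idea of inferring a private trait from binary like-data:

```python
# Illustrative sketch: predicting a hidden trait from a binary "likes" matrix.
# Synthetic data only; a hypothetical setup, not the study's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_users, n_likes = 2000, 300

trait = rng.integers(0, 2, n_users)      # private attribute to be inferred
base = rng.random(n_likes) * 0.2         # background like-rate per page
skew = rng.random(n_likes) * 0.2         # extra like-rate correlated with trait
likes = rng.random((n_users, n_likes)) < (base + skew * trait[:, None])

X_tr, X_te, y_tr, y_te = train_test_split(likes, trait, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```

The point for sexual surveillance is that no single like needs to be explicit: weak correlations across hundreds of innocuous signals are enough for the aggregate to disclose what was never intentionally shared.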

The quantified-self trend provides an instance of bodily surveillance that points to the fluid boundaries between the commercial and lateral modes of surveillance discussed in the next section of this issue paper, here joined by a form of "self-surveillance". The recording, tracking and sharing of health, fitness and nutrition measures extends to sexual and reproductive health, as evidenced by recent research on the spread of menstruation, pregnancy and fertility apps.

Rizk and Othman conclude that the quantification of women's bodies takes place on an unprecedented scale, with large data sets and metadata collected and shared with the applications' platforms and third parties. The algorithms managing the data remain untransparent, with potential implications for "the creation of new normals, of new standards for reproductive and gynaecological indicators based only on those women who have access to these apps, and those who bother to use them."70

In addition to binary categories long under feminist scrutiny such as male/female, non/heterosexual or nature/culture, the data body facilitates additional categorisations, for example known/unknown, wanted/unwanted, normal/abnormal and so on.71 The body in its virtual iteration has the potential to be re-constituted, controlled, marketised, and quite literally sold to the highest bidder. Taking place from a distance and often unbeknownst to the user, these mechanisms have material consequences that extend beyond data back to the material body, for instance by means of technological barriers to non-male or non-white bodies, increased (border) policing of genderqueer bodies and those with disabilities, or the formation of normative ideas around health and reproduction that affect women's bodies.

Sites of sexual surveillance

Across the literature related to big data and surveillance, two defining moments emerge which conjointly shape the contemporary landscape.

First, the ongoing securitisation of borders, policing, education, development and many other areas, including everyday life, under the banner of the "war on terror" post 9/11. Gender, race and sexuality are implicated in these processes. Initiatives carried out in the name of the "war on terror" have at times assumed a feminist guise, for instance when women's rights were mobilised to justify the invasion of Afghanistan based on discourse around the "saving" of Afghan women from the oppressive Taliban.

68 Following further critique, in 2015 the gender options were additionally expanded by a free text field.
69 Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. PNAS, 110(15), 5802–5805.
70 Rizk, V., & Othman, D. (2016). Quantifying Fertility and Reproduction through Mobile Apps: A Critical Overview. Arrow for Change, 22(1). www.arrow.org.my/wp-content/uploads/2016/08/AFC22.1-2016.pdf
71 van der Ploeg, I. (2012). Op. cit.


And the second are recent disclosures revealing the scope of pervasive and secretive state-driven dataveillance carried out by the US National Security Agency (NSA), the UK Government Communications Headquarters (GCHQ) and their allies globally. Edward Snowden, whose 2013 disclosures on mass surveillance are the most comprehensive and prominent to date, was not the first to disclose details about mass surveillance, as evidenced by the actions of previous whistleblowers including Thomas Drake, Russ Tice and Bill Binney, nor was he the last. 2014 saw an anonymous source disclose the extent of US tracking of "terror suspects", many of whom appear to have no known connection to terrorism at all.72 Most recently, former Yahoo employees revealed that the email provider secretly scans all customers' emails for intelligence services and/or law enforcement.73

The sheer mass of data gathered to feed a wide range of surveillance programmes suggests that data originally retained by social media, phone or internet providers for commercial purposes, as well as all the traces the simple use of online services leaves, potentially seep into mechanisms of state-driven mass surveillance governed by a "collect it all" logic.

Thus, while it is analytically useful and politically sensible to distinguish between the surveillance practices of states, commercial entities and individuals, in practice the boundaries between these categories are fluid. It is nevertheless important to note that commercial surveillance, for example when large-scale social media datasets are mined for marketing purposes, and lateral surveillance, where peers draw on similar mechanisms as states and corporations to monitor one another,74 form part and parcel of dataveillance alongside state surveillance.

Often discussed in relation to gendered algorithmic advertising practices or social media platforms' role in (tackling) technologically mediated violence, the commercial realm of surveillance extends beyond marketing and social media.

While the case of privacy risks on online dating platforms75 carries obvious implications for sexual surveillance, the example of app-based car-for-hire company Uber shows that sexualised data practices can also be found in seemingly mundane business, such as figuring out when and where to dispatch drivers. Uber's data scientists not only correlated rides to and from prostitution-prone areas with the habitual paydays of benefits recipients, but rebranded the so-called "walk of shame" a "ride of glory" after discovering increased demand for their service based on patterns they associated with one-night stands.76 While Uber published these insights in a (humorous) blog post that was later deleted, they illustrate the potential for sexual, and in this case classed, surveillance in data collected for commercial purposes.

Andrejevic describes instances where peers like family members, friends or acquaintances keep track of one another as "lateral surveillance".77 Lateral surveillance involves technologies such as online background screenings, portable cameras, keystroke loggers, spyware or lie detectors, but can equally consist of repeatedly googling someone, intensively following their social media presence, or excessive and/or threatening commenting, amongst other mundane pursuits.

Sexualised modes of lateral surveillance arise when these activities intrude on women's freedom of expression and right to privacy, or when they overlap with online harassment and stalking. Revenge porn websites or smartphone apps specifically designed to track spouses and family members come to mind foremost as tools for lateral sexual surveillance. However, in addition to those, and to the surveillance technologies listed above, perpetrators can abuse mainstream tools – apps that use location data, messaging services, even sleep monitoring apps – to track women's movements.

72 Scahill, J., & Devereaux, R. (2014, 5 August). Watch Commander. The Intercept. https://theintercept.com/2014/08/05/watch-commander/
73 Menn, J. (2016, 4 October). Exclusive: Yahoo secretly scanned customer emails for U.S. intelligence – sources. Reuters. www.reuters.com/article/us-yahoo-nsa-exclusive-idUSKCN1241YT
74 Andrejevic, M. (2005). The work of watching one another: Lateral surveillance, risk, and governance. Surveillance & Society, 2(4), 479–497.
75 Reitman, R. (2012, 10 February). Six Heartbreaking Truths about Online Dating Privacy. EFF. https://www.eff.org/deeplinks/2012/02/six-heartbreaking-truths-about-online-dating-privacy
76 Dataconomy. (2014, 12 June). Uber: Mapping Prostitution and "The Walk of Shame". www.dataconomy.com/uber-mapping-prostitution-and-the-walk-of-shame/
77 Andrejevic, M. (2005). Op. cit.


Andrejevic furthermore highlights ways in which governments enrol the public in surveillance practices, for example when enlisting the public to report anything "suspicious" or "unusual" to the authorities. He concludes that the spread of new media, rather than having a democratising effect, has facilitated "the injunction to embrace the strategies of law enforcement and marketing at a micro-level. (…) The result has not been a diminution of either government or corporate surveillance, as evidenced by their converging role in the 'war on terror', but rather their amplification and replication through peer monitoring networks."78

The shooting of John Crawford III by US police forces illustrates the interplay of lateral surveillance "with historical discrimination. The police who ultimately ended his life were responding to a report, via citizen surveillance, that he had been observed with a gun."79 Shot in a supermarket as he was holding a toy gun, he is one of a growing number of young black men killed by US law enforcement. Imagery and videos of these deaths circulate widely, turning them into posthumous spectacles of lateral surveillance. The surveillance footage of his shooting, on the other hand, did not lead to the indictment of the police officer responsible for his death.80

The focus of this paper is on data-driven modes of sexual surveillance and their impact on women, sexual minorities, and anyone else who may have a stake in the intersection between bodily integrity, sexuality, sexual rights, freedom of expression, health and the myriad ways in which potentially mineable data traces occur. Therefore, a further important site of sexual surveillance – the private home, intimate relationships and the family – is left underexplored.

Women, people of colour, queers and other marginalised groups have always been under sexual surveillance, in the private as well as in the public sphere. It is perhaps instructive that increased public and media attention to technologies of surveillance coincides with revelations about the scope and scale of the mass surveillance of everyone, while longstanding gendered and racialised modes of surveillance failed to grasp the public imagination to remotely similar extents.

Challenges <-> Opportunities

Debates on surveillance rightfully emphasise the right to privacy, which remains a feminist issue in more ways than one.81 It has, however, never been equally extended to all. Women, people with disabilities, refugees, sexual minorities, people receiving state benefits, or incarcerated populations, amongst others, can testify to myriad ways in which their privacy has habitually been invaded.82

Challenges: digital exclusions, algorithmic discriminations, and ambiguous visibilities

Whether any transformative potential of the informatisation of the body can be realised depends on who controls the information: "When the control of a person's information is out of that person's hands, so too is the nature of the potential transformation."83

Manovich identifies three emerging classes in data-driven societies: "those who create data (both consciously and by leaving digital footprints), those who have the means to collect it, and those who have expertise to analyse it. The first group includes pretty much everybody in the world who is using the web and/or mobile phones; the second group is smaller; and the third group is much smaller still."84

78 Ibid.
79 Harry, S. (2014, 6 October). Op. cit.
80 Ibid.
81 The Feminist Principles of the Internet, for instance, advocate the right to privacy and control over one's personal information online and defend the right to anonymity. At the same time, feminism's relationship with privacy is a complicated one that rests on decades' worth of critique of the public/private distinction, not least due to the abuse against women perpetrated in the "privacy" of their homes. The private remains as political as ever.
82 Dubrofsky, R. E., & Magnet, S. A. (2015). Op. cit.
83 Conrad, K. (2009). Op. cit.
84 Manovich, L. (2011). Trending: The Promises and the Challenges of Big Social Data. In M. K. Gold (Ed.), Debates in the Digital Humanities (pp. 1–17). Minneapolis: University of Minnesota Press.


The control of data bodies predominantly lies with the latter two groups. Those consist of the governments, corporations, research institutions and international organisations holding the infrastructure to collect and store large amounts of data, and of the subset of analysts who have both the required skills and access to the data.

Juxtaposed with the gendered and racialised power relations underpinning the surveillance assemblage, inclusion in tech industries, persisting divides in access to the internet,85,86 and technology-mediated violence against women, this means that the control over sexual surveillance lies outside of the hands of the women and queers, particularly those of colour, whom it concerns.

Women and queers are at risk of tech-related violence, that is, acts of gender-based violence "committed, abetted or aggravated, in part or fully, by the use of information and communication technologies (ICTs), such as phones, the internet, social media platforms and email."87

Tech-related violence "infringes on women's right to self-determination and bodily integrity. It impacts on women's capacity to move freely, without fear of surveillance" and "often what are seen as 'merely virtual' threats soon translate into physical violence."88 Contextually, the same threats extend to trans* people, intersex people and ethnic minority groups.89

In terms of big data and sexual surveillance, tech-related violence not only affects the data generated (use of pseudonyms, self-censorship and other blind spots), but also limits access and participation and hinders freedom of expression online. Tech-related violence is clearly situated within the broader landscape of gender-based violence rather than caused or determined by the technologies mediating it. However, it amplifies heteronormative and patriarchal modes of sexual surveillance and further marginalises women and queers in data, code and life. Research on female internet users in Mumbai, for instance, found that young women feel uncomfortable when exposed to male observers in internet cafes and are concerned about lateral surveillance (by family members and peers) of their online activities.90

Digital exclusions, be they due to lack of access to the internet or due to the violent silencing of those present, further skew the data in favour of those able to participate freely and safely. Exclusions from big data are exacerbated when joined by algorithmic discriminations and racist technologies. Examples include Google search results returning images of black women with natural hair for "unprofessional hair", or normatively attractive white people for "men" and "women";91 or the history of colour film technology that until recently was unable to process darker skin tones in good quality.92 A technology in itself may be amoral and not created with discriminatory intent, but can nevertheless have sexist and racist outcomes that reproduce social bias.

Magnet's work on (the failings of) biometric technologies93 documents how they fail considerably more often on women, people of colour and differently abled bodies, all while serving states to track and control marginalised groups such as refugees or recipients of benefits. "Othered bodies" become subject to heightened surveillance, as Magnet and Rodgers' work on full body imaging technology shows.

85 APC. (2016). Ending digital exclusion: Why the access divide persists and how to close it. www.apc.org/en/system/files/APC_EndingDigitalExclusion.pdf
86 Othman, D. (2016). Access, Legislation, and Online Freedom of Expression: A Data Overview. Arrow for Change, 22(1), 49–55. www.arrow.org.my/wp-content/uploads/2016/08/AFC22.1-2016.pdf
87 www.genderit.org/onlinevaw/
88 Malhotra, N. (2014). Good Questions on Technology-Related Violence. End violence: Women's rights and safety online, APC. www.apc.org/en/pubs/good-questions-technology-related-violence
89 Ibid.
90 Bhattacharjya, M., & Ganesh, M. I. (2011). Negotiating intimacy and harm: Female internet users in Mumbai. In ://EROTICS: Sex, Rights and the Internet. APC. www.apc.org/en/system/files/EROTICS.pdf
91 Alexander, L. (2016, 8 April). Do Google's 'unprofessional hair' results show it is racist? The Guardian. https://www.theguardian.com/technology/2016/apr/08/does-google-unprofessional-hair-results-prove-algorithms-racist-
92 Caswell, E. (2015, 18 September). Color film was built for white people. Here's what it did to dark skin. Vox. www.vox.com/2015/9/18/9348821/photography-race-bias
93 Magnet, S. A. (2011). When Biometrics Fail: Gender, Race, and the Technology of Identity. Durham: Duke University Press.


Particularly the "intersections of transgendered, disabled, fat, religious, female and racialised" bodies are disproportionally affected and singled out for increased surveillance and harassment.94

The expectation of transparency (of citizens rather than governments) is often accompanied by "nothing to hide – nothing to fear" narratives, with far-reaching feminist implications. Governed by heteronormative institutions within borders that come with their own racialised and sexualised technologies of control, how much is safe to reveal and what must remain hidden is not equal for all. Beauchamp95 discusses what "nothing to hide" means for trans* people, for whom questions of stealth versus visibility take on multiple dimensions. Some negotiate a desire or need to remain hidden with medical and legal records that were never afforded privacy, while others' compliant visibility risks complicity with national security discourse around who can "pass" as a "safe" citizen or traveller (white, conclusively gendered) and who becomes subject to policing. The blame for systemic discrimination is transferred onto the victims; if they suffer negative consequences, it must be because they were hiding what is wrong with them.96

The heightened visibility and transparency that accompany increased inclusion in data can pose challenges in terms of sexual surveillance. South African research illustrates that online counter-publics play a significant role in the negotiation of transgender and lesbian identities, sexualities and politics. The authors note, however, that black lesbians, the group facing the most violent response to their sexualities, were reluctant to participate in the research.97

Participating in queer or feminist activism online, or navigating social media as a member of a sexual minority, particularly when additionally racialised, are ambiguous instances of visibility that can come at great cost, ranging from involuntary outing to harassment, social stigma and persecution. Research with Kenyan queers highlights strategies developed to negotiate their online presence: "Nearly all respondents maintain two accounts on Facebook because of high levels of lateral surveillance: a 'straight' account using their real name, where they connect with their family, straight friends and church community; and a queer account under an adopted name where they connect with others in the LGBTQ community. (…) Nevertheless, the two accounts can result in dangerous situations; respondents described being accidentally outed to their families after being tagged under their real name in a photo attending a LGBTQ gathering. Cases of surveillance also include family members actively seeking out queer accounts to expose users."98

Such strategies serve the safety of sexual minorities and enable their participation online, but come into conflict with Facebook's real name policy, which has come under critique for its negative impact on trans* and queer individuals and activists. MacAulay and Moldes characterise Facebook's insistence on nominal transparency as penalising non-normative identities when they disrupt the collection of personal data. They found that "Facebook prioritised threats that had a market value, and that profiles that fail to generate useful/marketable data are seen as a greater liability than abusive users."99

In conjunction with questionable content-moderation practices,100 the politics of real-name policies affect big data and sexual surveillance in several ways:

• Taking down "infringing" content and profiles leads to further exclusion of already marginalised users.
• Enforcing real names and normative identities reinforces heteronormative logics in social data.
• It enables the involuntary outing and direct sexual surveillance of queer individuals.
• It simultaneously makes users more transparent to dataveillance.

94 Magnet, S., & Rodgers, T. (2012). Op. cit.
95 Beauchamp, T. (2009). Artful Concealment and Strategic Visibility: Transgender Bodies and U.S. State Surveillance After 9/11. Surveillance & Society, 6(4), 356–366.
96 Andrejevic, M. (2015). Op. cit.
97 Prinsloo, J., & McLean, N. C. (2011). The internet and sexual identities: Exploring transgender and lesbian use of the internet in South Africa. In ://EROTICS: Sex, Rights and the Internet. APC. www.apc.org/en/system/files/EROTICS.pdf
98 Ganesh, M. I., Deutch, J., & Schulte, J. (2016). Privacy, anonymity, visibility: Dilemmas in tech use by marginalised communities. opendocs.ids.ac.uk/opendocs/handle/123456789/12110
99 MacAulay, M., & Moldes, M. D. (2016). Queen don't compute: Reading and casting shade on Facebook's real names policy. Critical Studies in Media Communication, 33(1).
100 York, J. (2016, 20 September). Facebook's Nudity Ban Affects All Kinds of Users. Electronic Frontier Foundation. www.eff.org/deeplinks/2016/09/facebooks-nudity-ban-affects-all-kinds-users
association for progressive communications 14 / issue papers

• It enables the involuntary outing and direct sexual Opportunities: Between surveillance of queer individuals. resistance, sousveillance • It simultaneously makes users more transparent to and community data dataveillance.

Social media data in itself is highly curated – be that due to users’ choice, lack of access, self-censorship, or other reasons. Those data are then marketised (for example, in targeted advertising), potentially tracked by peers, and aggregated with other data available to law enforcement and security agencies.

For activists advocating women’s and sexual rights online, the overlapping nature of these modes of surveillance poses further questions. Imagine, for example, an unsuspecting activist who has gained a following based on her expertise on topics like abortion, gender-based violence, HIV/AIDS or LGBT issues, resulting in an increasingly public profile. What she might perceive as successful advocacy work, however, places her wider network as well as those her work concerns at heightened risk.

The surveillance of networks and relationships by adversaries is commonplace, and activists are important nodes in those networks. They become targets themselves, as evidenced for instance by police surveillance of protest movements such as Black Lives Matter101 or by the international trade in surveillance technologies used to track activists102 or to expose links to other activists and members of the communities they work with. As Ball notes, “when one is exposed, one is exposed to something.”103

However, under the contemporary surveillance assemblage, where lateral, commercial and state surveillance have become closely entangled, it is often not intelligible who precisely that is. What may present as benign online collaboration and exchange amongst feminist activists and those involved in their projects may in fact be closely watched by hostile governments, groups opposing the cause in question, employers, law enforcement, families or peers of those at risk, and so on. Under these circumstances, the consideration of adequate operational security is crucial for feminist activists to protect themselves, as well as those they work with/for, from becoming targets.

Opportunities: Between resistance, sousveillance and community data

The potential in the collection and analysis of large quantities of data has not gone unnoticed by activists and researchers, as well as by international organisations. The UN, for instance, runs a number of big data projects to support development and humanitarian action. It notes that big data are “no modern panacea for age-old development challenges”104 and advocates an approach to big data that recognises context as key.

An ongoing project, for instance, works on the real-time monitoring of the implementation of Option B+, a treatment programme to prevent mother-to-child HIV transmission available to HIV-positive mothers in Uganda. Pulse Lab Kampala has developed an application that tracks metrics such as the number of patients receiving the treatment, the number of antenatal care visits, or the number of HIV/AIDS cases, to monitor the performance of clinics in areas where the programme is in place.105

A past project analysed the impact of the advocacy movement Every Woman Every Child, using four years’ worth of tweets and machine learning to identify 14 million relevant tweets about women’s and children’s health. The analysis showed an increase in the conversation about maternal and child health, as well as demographic trends and peaks that could be linked to events internal and external to the campaign.106
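The description of the Every Woman Every Child analysis implies a supervised text-classification step, but UN Global Pulse’s actual pipeline is not reproduced in this paper. The sketch below is therefore only a minimal, hypothetical illustration of that general approach, written in Python with scikit-learn; every tweet and label in it is a made-up placeholder, not project data.

    # Minimal sketch of tweet relevance classification, assuming a
    # supervised TF-IDF + logistic regression approach. All "tweets"
    # below are hypothetical placeholders, not EWEC project data.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hand-labelled seed set: 1 = relevant to maternal/child health.
    seed_tweets = [
        ("More midwives needed to cut maternal mortality", 1),
        ("New vaccination drive reaches rural clinics", 1),
        ("Traffic is terrible downtown this morning", 0),
        ("Great match last night, what a goal", 0),
    ]
    texts, labels = zip(*seed_tweets)

    # TF-IDF turns each tweet into weighted word frequencies; the
    # classifier then learns which terms signal relevance.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(texts, labels)

    # Score unlabelled tweets and keep the likely relevant ones.
    incoming = ["Clinic reports drop in newborn deaths", "Cheap flights to the coast"]
    for tweet, relevant in zip(incoming, model.predict(incoming)):
        if relevant:
            print("relevant:", tweet)

At the scale the project describes, such a model would be trained on a far larger labelled sample and validated against human coding; and, as argued throughout this paper, the consent and surveillance questions raised above apply to the underlying tweet collection itself.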

101 Ozer, N. (2016, 22 September). Police Use of Social Media Surveillance Software is Escalating, and Activists Are in the Digital Crosshairs. ACLU. www.aclu.org/blog/free-future/police-use-social-media-surveillance-software-escalating-and-activists-are-digital
102 Privacy International. (2015, 23 March). Ethiopia expands surveillance capacity with German tech via Lebanon. www.privacyinternational.org/?q=node/546
103 Ball, K. (2009). Exposure: Exploring the subject of surveillance. Information, Communication & Society, 12(5), 639–657.
104 UN Global Pulse. (2012). Big Data for Development: Challenges & Opportunities.
105 UN Global Pulse. (2016). Monitoring in Real Time the Implementation of HIV Mother-to-Child Prevention Programme. Retrieved from http://www.unglobalpulse.org/projects/monitoring-hiv-mother-child-prevention-programme on 20.09.2016
106 UN Global Pulse. (2013). Advocacy Monitoring through Social Data: Women’s and Children’s Health. www.unglobalpulse.org/projects/EWEC-social-data-analysis
107 cf. www.data4sdgs.org/


While such projects are indicative of data-driven opportunities for positive change in terms of sexual and reproductive health, an overly celebratory stance risks masking the epistemic dangers that riddle the data/development terrain. Gurumurthy and Chami liken “data for development”107 narratives to a “techno-solutionism” that is complicit in epistemic violence and plays into the hands of a neo-liberal capitalist brand of development.108 This tension between the potential public health benefits of big data and its adverse effects in terms of surveillance is tangible in the UN’s recognition that “while metadata can provide benefits”, they can also be aggregated to “reveal personal information and … give an insight into an individual’s behaviour, social relationships, private preferences and identity.”109

The remainder of this section turns to civil society-based data projects on gender, sexuality and sexual rights. Sousveillance,110 i.e. practices of inverse surveillance that aim at watching from below – watching the watchers by observing and recording surveillance practices, and teaching and sharing knowledge and technologies to defend vulnerable groups and activists against the adverse effects of surveillance – has grown in significance. Such practices have been spurred by movements like the so-called Arab Spring, Occupy and Black Lives Matter, where citizen reporting on violent authorities has played a central role, as well as by whistleblowers such as Chelsea Manning or Edward Snowden, whose sousveillance of the military and security agencies contributed to wider public awareness of previously hidden wrongdoings. Often taking place on a grassroots scale, sousveillance practices related to gender and sexuality are emerging. Where there is power, there is resistance,111 and efforts to subvert technologies – be that creative resistance against surveillance or efforts to use surveillance technologies for good – deserve amplification in the face of a discursive landscape of terror and national security panics.

Data practices need to be viewed in light of the “agency and reflexivity of individual actors as well as the variable ways in which power and participation are constructed and enacted”112 rather than algorithmic power alone. Harry describes how hashtags or street recordings are “ways of using tech to push back against surveillance.” Along with other Twitter users and participants in an online community of black women, she used the hashtag #Yourslipisshowing “to expose 4chan board members who declared ‘war’ on black feminists by tracking and attempting to infiltrate their ‘ranks’.” She notes, however, that such public sousveillance strategies hinge on the social capital and relative safety of those involved.113

The initiatives cited here differ in context, scale and aims, but share participatory and situated approaches to resisting sexual surveillance, fighting its adverse effects, or initiating data practices for positive change.

HarassMap114 is an initiative founded in 2010 in Egypt that maps and calls out sexual harassment by crowdsourcing text messages and online reports of sexual harassment and assault, and mapping the instances online. The community organisers and volunteers behind HarassMap then base communication campaigns and programmes, designed to make schools, universities, public places and workplaces safer for women, on the collected data. Initially Cairo-based, the project has resonated internationally, and the Egyptian team has coached organisers in 28 countries, including Algeria, Canada, India, Japan, Kenya, Lebanon, Pakistan, Palestine, South Africa, Syria and the US.115 It is noteworthy that HarassMap builds on Ushahidi,116 an open source crowdmapping platform originally developed in Kenya to report instances of post-election violence in 2008.

The Utunzi Rainbow Security Network serves as another example of sousveillance by means of community mapping. The Kenyan collaboration between three LGBTQ organisations seeks to crowdmap violence against queers in Kenya. Individuals as well as organisations can report, and request assistance, when faced with or witnessing human rights violations and other violent abuses based on sexual orientation and/or gender identity and expression.117 Utunzi struggled to gain the trust and support of its potential users: being web-based rather than app-based made it difficult for many potential users to access, there was a lack of trust in an unknown platform, and the LGBTQ community did not see any tangible benefit in mapping incidents.118

108 Gurumurthy, A., & Chami, N. (2016, 31 May). Data: The New Four-Letter Word for Feminism. GenderIT.org. www.genderit.org/articles/data-new-four-letter-word-feminism
109 undocs.org/en/A/RES/69/166
110 Mann, S. (2004). “Sousveillance”: Inverse Surveillance in Multimedia Imaging. In Proceedings of the 12th annual ACM international conference on Multimedia (pp. 620–627).
111 Foucault, M. (1978). The History of Sexuality: An Introduction. New York: Pantheon.
112 Couldry, N., & Powell, A. (2014). Big Data from the bottom up. Big Data & Society, (July-December), 1–5.
113 Harry, S. (2014, 6 October). Op. cit.
114 www.harassmap.org/en/what-we-do/
115 www.harassmap.org/en/what-we-do/around-the-world
116 www.harassmap.org/en/who-we-are/our-partners
117 https://utunzi.com/about.php
118 Ganesh, M. I., Deutch, J., & Schulte, J. (2016). Privacy, anonymity, visibility: dilemmas in tech use by marginalised communities. opendocs.ids.ac.uk/opendocs/handle/123456789/12110


In addition to strategic opposition to surveillance systems, and the development of mechanisms to evaluate and contest such systems, Monahan calls for democratising surveillance practices.119 If we are to engage in data practices that are empowering, that have the potential to further sexual rights and improve the quality of life of women and sexual minorities, those practices have to contribute to shifting data power from corporations and governments into the hands of individual women, queers and their communities.

As the hypothetical example of the somewhat oblivious feminist activist at the end of the previous section suggests, women’s rights and sexual rights advocates can face similar challenges as the communities they work with in terms of safety and visibility, with the added danger of placing others at risk. Feminist work thus often makes it necessary to resist surveillance and to protect the privacy and anonymity of marginalised women, queers and activists. Activists, researchers and journalists working on sexual rights might benefit from closer ties with digital rights movements to improve safety and security in their own activism, to mitigate against the risks of increased visibilities, and to reap the benefits of data activism for sexual rights causes.120

A wide range of free open source tools offer ways to protect data, anonymity and privacy, and thus to resist the surveillance of activist work. The accessible and fairly comprehensive manuals aimed at women, sexual minorities and activists, such as A DIY Guide to Feminist Cybersecurity,121 Zen and the Art of Making Tech Work for You122 or the Take Back the Tech! digital safety roadmaps,123 introduce a wide range of safety strategies, practices and tools. While 100% security is an illusion, particularly when faced with powerful state adversaries, using open source encryption, anonymity and privacy tools goes a long way towards keeping feminist and sexual rights activists, their sensitive data, and the vulnerable communities they work with as safe as possible.124
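The manuals listed above recommend audited, real-world tools rather than homegrown code, and this paper follows them in that. Purely as a minimal sketch of one building block such tools rely on, authenticated symmetric encryption, the Python snippet below uses the third-party cryptography library’s Fernet recipe; the encrypted note is a hypothetical example, and safe key storage and exchange, the hard part the manuals address, is deliberately left out.

    # Minimal sketch of authenticated symmetric encryption with the
    # third-party "cryptography" library (pip install cryptography).
    # Illustrative only; not a substitute for audited privacy tools.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # must itself be stored and shared securely
    fernet = Fernet(key)

    token = fernet.encrypt(b"draft notes from an interview with a source")
    print(token)                  # ciphertext, reasonably safe to store remotely
    print(fernet.decrypt(token))  # original bytes, recoverable only with the key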


The Feminist Principles of the Internet, developed by APC, call for an internet that empowers women and queers in all their diversity, and recognise that the internet is not free-floating but an extension of other social spaces that are often “shaped by patriarchy and heteronormativity.”125 UN resolution A/HRC/32/L.20 on “the promotion, protection and enjoyment of human rights on the Internet” was recently passed as an addition to Article 19 of the Universal Declaration of Human Rights126 and reaffirms the importance of access to a free and open internet for all.127 While this resolution does not oblige governments to provide the infrastructure to bring all citizens online, it represents a welcome affirmation that human rights extend to digital spaces. In contrast to previous resolutions on human rights and the internet, it explicitly stresses the importance of access to information technology and digital literacy for the empowerment of women and girls.

Feminist scholarship and activism on/against surveillance show clear continuities between discriminatory practices offline and online, as well as historical continuities between the patriarchal surveillance and control of women’s, non-normative and racialised bodies and contemporary modes of sexual surveillance. Thinking of sexual surveillance and the data it operates through, no matter how big or small, in terms of situated knowledges128 places the technologies involved in collecting, storing and analysing those data in their social context. While data-driven sexual surveillance often takes place on the level of abstraction, it produces embodied consequences and meanings, as the examples discussed in this paper show. Affording these localised meanings participation and voice in data practices has the potential to drive the data, but also the systems and social relations they are embedded in, towards more social justice,129 not least for women and sexual minorities.

The Feminist Principles of the Internet highlight the importance of “an ethics and politics of consent” that affords women and sexual minorities “informed decisions on what aspects of their public or private lives to share online.”130 The notion of consent in the context of big data and sexual surveillance is complicated by the overlapping nature of lateral, commercial and state surveillance this paper has outlined. Consent to the collection of particular data for particular purposes (say sexual health services, the use of a social media platform, research, or advocacy) of course does not equal consent to the bulk collection of that same data by government agencies. Data gathered in the course of advocacy work and sexual health projects, sensitive data in the hands of sexual rights activists, or data on activists themselves as well as their wider networks form part and parcel of the data/surveillance assemblage.

Conclusion

The picture that emerges on the nexus between big data and sexual surveillance is an ambiguous one. Calls for better representation of women and queers, for reaping data’s benefits in terms of development, gender equality and sexual health, and for better recognition of gender-based and sexual violence have the potential to improve the lives of marginalised groups. They have to be leveraged, however, against concerns about the politics of the collection and analysis of big data, the discriminations coded into the collection and algorithmic analysis of data, its colonial legacies, and the complicated politics of visibility that go along with presence in data.

Whether data practices are transformative depends on agency and consent, on how data is collected, by whom, and to what ends. Does the data serve the powerful to further marginalise women and queers, or does it empower them to make informed sexual health decisions, resist harmful power structures, exercise their sexual rights and enjoy bodily integrity? A feminist lens on surveillance contributes depth, perspective, a focus on power relations, and attention to the quirks and outliers so often ignored in big data.

While the internet remains an important sphere for the advancement of sexual rights and the empowerment of women, advocacy work requires a situated and contextual assessment of (and mitigation against) the risks its data practices may expose activists, women and sexual minorities to. Community-based open source projects that generate data to be used in the struggle against gender-based and sexual violence, and towards the empowerment of women and queers, are promising and indicative of democratising data practices.131

Given the pervasive yet unaccountable nature of dataveillance practices, the protection of information privacy and anonymity remains a prerequisite for any transformative use of data. A feminist practice thus opposes the non-consensual collection of data and plays its part in preventing and/or subverting the non-consensual use of data already collected. When developing or participating in data practices that aim at furthering social justice goals, levelling algorithmic discriminations, or empowering women and sexual minorities, it is attentive to consent and participation, and takes adequate care to safeguard the data of vulnerable groups involved, as well as of activists themselves at risk of surveillance.

119 Monahan, T. (2006). Questioning Surveillance and Security. In T. Monahan (Ed.), Surveillance and Security: Technological Politics and Power in Everyday Life. New York: Routledge.
120 Ibid.
121 tech.safehubcollective.org/cybersecurity
122 gendersec.tacticaltech.org/wiki/index.php/Complete_manual
123 https://www.takebackthetech.net/be-safe
124 See also DATNAV, a guide to working with digital data for human rights researchers, as many of the strategies it lists apply equally to women’s and sexual rights work. www.theengineroom.org/wp-content/uploads/2016/09/datnav.pdf
125 APC. (2016). Op. cit.
126 www.un.org/en/universal-declaration-human-rights

127 undocs.org/A/HRC/32/L.20
128 Haraway, D. J. (1988). Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies, 14(3), 575–599.
129 Monahan, T. (2009). Op. cit.
130 APC. (2016). Op. cit.
131 Monahan, T. (2006). Op. cit.

association for progressive communications

Internet and ICTs for social justice and development

APC is an international network of civil society organisations founded in 1990, dedicated to empowering and supporting people working for peace, human rights, development and protection of the environment, through the strategic use of information and communication technologies (ICTs).

We work to build a world in which all people have easy, equal and affordable access to the creative potential of ICTs to improve their lives and create more democratic and egalitarian societies.

www.apc.org
[email protected]
Graphic Design: monocromo

This issue paper forms part of APC’s EROTICS network building in India, Nepal and Sri Lanka, and was produced with the support of AmplifyChange.

Big Data and Sexual Surveillance DECEMBER 2016

ISBN 978-92-95102-71-2 APC-201612-WRP-I-EN-DIGITAL-261

Creative Commons Attribution-ShareAlike 3.0. [email protected]