CLEARLY OPAQUE
PRIVACY RISKS OF THE INTERNET OF THINGS

The Internet of Things Privacy Forum
May 2018

EXECUTIVE SUMMARY

Key IoT privacy risks and issues

The IoT will expand the data collection practices of the online world to the offline world.

— The IoT will enable and normalize preference and behavior tracking in the offline world. This is a significant qualitative shift, and a key reason to evaluate these technologies for their social impact and effect on historical methods of privacy preservation. The very notion of an offline world may begin to decline.

Given the likelihood of ubiquitous data collection throughout the human environment, the notion of privacy invasion may decompose; more so as people's expectation of being monitored increases.

— Much of consumer IoT is predicated on inviting these devices into our lives. The ability to know who is observing us in our private spaces may cease to exist. The IoT will hasten the disintegration of the 'reasonable expectation of privacy' standard as people become more generally aware of smart devices in their environments.

The IoT portends a diminishment of private spaces.

— The scale and proximity of sensors being introduced will make it harder to find reserve and solitude. The IoT will make it easier to identify people in public and private spaces.

When IoT devices fade into the background or look like familiar things, we can be duped by them, and lulled into revealing more information than we might otherwise. Connected devices are designed to be unobtrusive, so people can forget that there are monitoring devices in their environment.

The IoT will encroach upon emotional and bodily privacy.

— The proximity of IoT technologies will allow third parties to collect our emotional states over long periods of time. Our emotional and inner life will become more transparent to data collecting organizations.

IoT devices challenge, cross and destabilize boundaries, as well as people's ability to manage them.

— The home is in danger of becoming a 'glass house,' transparent to the makers of smart home products. And, IoT devices blur regulatory boundaries – sectoral privacy governance becomes muddled as a result.

The IoT makes gaining meaningful consent more difficult.

The IoT is in tension with the principle of Transparency.

The IoT threatens the Participation rights embedded in the US Fair Information Practice Principles and the EU General Data Protection Regulation.

As more and more products are released with IoT-like features, there will be an "erosion of choice" for consumers – less of an ability to not have Things in their environment monitor them.

Market shifts towards 'smart' features that are intentionally unobtrusive lead to less understanding of data collection, and less ability to decline those features.

The IoT retrenches the surveillance society, further commodifies people, and exposes them to manipulation.

IoT devices are not neutral; they are constructed with a commercial logic encouraging us to share. The IoT embraces and extends the logic of social media – intentional disclosure, social participation, and continued investment in interaction.

The IoT will have an impact on children, and therefore give parents additional privacy management duties.

— Children today will become adults in a world where ubiquitous monitoring by an unknown number of parties will be business as usual.

Emerging Frameworks and Strategies to address IoT Privacy

Having broad non-specialist social conversations about data (use, collection, effects, socioeconomic dimensions) is essential to help the populace understand the technological changes around them. Privacy norms must evolve alongside connected devices – discussion is essential for this.

Human-Computer Interaction (HCI) and Identity Management (IDM) are two of the most promising fields for privacy strategies for IoT.

A useful design strategy is the 'least surprise principle' – don't surprise users with data collection and use practices. Analyze the informational norms of personal data collection, use and sharing in given contexts.

Give people the ability to do fine-grained selective sharing of the data collected by IoT devices.

Three major headings for emerging frameworks and strategies to address IoT privacy:

— User Control and Management
— Notification
— Governance

User Control and Management Strategies

— Pre-Collection
• Data Minimization – only collect data for current, needed uses; do not collect for future as-yet-unknown uses
• Build in Do Not Collect 'Switches' (e.g., mute buttons or software toggles)
• Build in wake words and manual activation for data collection, versus the truly always-on
• Perform Privacy Impact Assessments to holistically understand what your company is collecting and what would happen if there was a breach

— Post-Collection
• Make it easy for people to delete their data
• Make it easy to withdraw consent
• Encrypt everything to the maximum degree possible
• IoT data should not be published on social media or indexed by search engines by default – users must review and decide before publishing
• Raw data should exist for the shortest time possible

— Identity Management
• Design strategies:
> Unlinkability – build systems that can sever the links between users' activities on different devices or apps
> Unobservability – build or use intermediary systems that are blind to user activity
• Give people the option for pseudonymous or anonymous use
• Design systems that reflect the sensitivity of being able to identify people
• Use selective sharing as a design principle
> Design for fine-grained control of data use and sharing
> Make it easy to "Share with this person but not that person"
• Create dashboards for users to see, understand and control the data that's been collected about them
• Design easy ways to separate different people's use of devices from one another

— Notification Strategies
• Timing has an impact on privacy notice effectiveness.
• Emerging privacy notice types:
> Just-in-time
> Periodic
> Context-dependent
> Layered
• Test people's comprehension of privacy policies
• Researchers are exploring privacy notification automation:
> Automated learning and setting of privacy preferences
> Nudges to encourage users to think about their privacy settings
> IoT devices advertising their presence when users enter a space

— Governance Strategies
• Creation of baseline, omnibus privacy laws for US
• Regulations restricting IoT data from certain uses
• Regulator guidance on acceptability of privacy policy language and innovation
• Requirement to test privacy policies for user comprehension
• Expansion of "personally-identifiable information" to include sensor data in the US
• Policymaker discussions of the collapse of the 'reasonable expectation of privacy' standard
• Greater use of the 'precautionary principle' in IoT privacy regulation
• More technologists embedded with policymakers
• Trusted IoT labels and certification schemes
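To make the first two Pre-Collection bullets concrete, here is a minimal, hypothetical sketch of a device loop that combines a 'Do Not Collect' switch with wake-word-gated (rather than always-on) capture. All names (SmartSpeaker, detect_wake_word, and so on) are illustrative and do not describe any real product's API:

```python
# A minimal, hypothetical sketch of two Pre-Collection strategies:
# a "Do Not Collect" switch and wake-word-gated capture.
# All names here are illustrative, not a real product's API.

import time

class SmartSpeaker:
    def __init__(self) -> None:
        # Privacy-protective default: the mute switch starts engaged.
        self.do_not_collect = True

    def capture_frame(self) -> bytes:
        """Placeholder for reading one short audio frame from the mic."""
        return b""

    def detect_wake_word(self, frame: bytes) -> bool:
        """Placeholder for an on-device wake-word model; running it
        locally means no audio leaves the device before activation."""
        return False

    def handle_request(self) -> None:
        """Placeholder: record, process and answer one user request.
        Only this path transmits any data off the device."""

    def run(self) -> None:
        while True:
            if self.do_not_collect:
                time.sleep(0.5)  # switch engaged: nothing is recorded
                continue
            frame = self.capture_frame()
            if self.detect_wake_word(frame):  # explicit, manual trigger
                self.handle_request()
```

The design point, under these assumptions, is that the privacy-protective state is the default and there is no always-on path: audio frames are examined locally and discarded unless the user has both disengaged the switch and spoken the wake word.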

ABOUT THE AUTHORS / ACKNOWLEDGEMENTS

DR. GILAD ROSNER

Dr. Gilad Rosner is a privacy and information policy researcher and the founder of the non-profit Internet of Things Privacy Forum, whose mission is to produce guidance, analysis and best practices to help industry and government reduce privacy risk and innovate responsibly in the domain of connected devices. Gilad's broader work focuses on identity management, US & EU privacy and data protection regimes, consumer protection, and public policy. His research has been used by the UK House of Commons Science and Technology Committee report on the Responsible Use of Data and he has contributed directly to US state legislation on law enforcement access to location data, access to digital assets upon death, and the collection of student biometrics. Gilad has consulted on trust issues for the UK government's identity assurance program, Verify.gov, and is the author of Privacy and the Internet of Things (O'Reilly Media, 2017).

Gilad has a 20-year career in IT, having worked with identity technologies, digital media, automation, and telecommunications. Prior to becoming a researcher, he helped design, prototype and manufacture the world's only robotic video migration system, known as SAMMA, which won an Emmy Award for technical and engineering excellence in 2011. Gilad is a member of the UK Cabinet Office Privacy and Consumer Advisory Group, which provides independent analysis and guidance on Government digital initiatives, and a member of the IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems. He is a Visiting Scholar at the UC Berkeley School of Information and a Visiting Researcher at the Horizon Digital Economy Research Institute. Gilad is a graduate of the University of Chicago, NYU and the University of Nottingham.

ERIN KENNEALLY, J.D.

Erin Kenneally is currently a Program Manager in the Cyber Security Division within the U.S. Department of Homeland Security Science & Technology Directorate. Her portfolio comprises trusted data sharing and research infrastructure (IMPACT – Information Marketplace for Policy and Analysis of Cyber-risk and Trust), Data Privacy, cyber risk economics (CYRIE – Cyber Risk Economics), and information & communications technology ethics. Kenneally is Founder and CEO of Elchemy, Inc., and served as Technology-Law Specialist at the International Computer

Science Institute (ICSI), the Center for Internet Data Analysis (CAIDA) and Center for Evidence-based Security Research (CESR) at the University of California, San Diego.

Erin is a licensed Attorney specializing in strategy, research and execution of challenging and emergent IT legal risk solutions. This includes translating, transitioning and implementing applied R&D and ops solutions at the crossroads of IT and advanced analytics, law, and ethics. She holds Juris Doctorate and Masters of Forensic Sciences degrees, and is a graduate of Syracuse University and The George Washington University.

ACKNOWLEDGEMENTS

We gratefully acknowledge the generous financial support for this project provided by The William and Flora Hewlett Foundation.

We are deeply grateful to the colleagues, friends and institutions that supported the creation of this work, and the forty experts, practitioners and researchers who graciously gave us their time to be interviewed and participate in our workshops. We would like to thank particularly Eli Sugarman, our Program Officer at the William and Flora Hewlett Foundation, and his colleagues for being willing to fund this report, and for providing an excellent venue for our first workshop. Also, we are grateful to Steve Weber, Betsy Cooper, Kristin Lin, Chuck Kapelke and Allison Davenport of the Center for Long-Term Cybersecurity at UC Berkeley for their encouragement, advice and administrative support, and for publishing a concise white paper based on this report. Thanks as well to Ian Wallace and New America for hosting our DC workshop in their beautiful offices. We'd like to thank the Advisory Board of the IoT Privacy Forum: Michelle Dennedy, Jim Adler, Sharon Anolik and Scott David. We're extremely grateful to our friends and colleagues who reviewed sections of this report for their time, encouragement and thoughtful critiques: Dr. Ewa Luger, Jo Breeze, Nik Snarey, Garreth Niblett, Steve Wilson, Dr. Ben Zevenbergen, Ian Skerrett, Steve Olshansky, Claire Milne, Dr. William Lehr, Dr. Tyler Moore, Shazade Jameson, Robin Wilton, Kasey Chappelle, Liz Coll, Emma Lindley, Dr. Chris Pounder, Sudha Jamthe, Josh Rubin, Dr. Richard Gomer, Rose Keen, Aoife Hagerty, Carlos Ruiz-Spohn, Elliotte Bowerman, Kaitlin Mara, Henrik Chulu, and Danielle Pacheco.

TABLE OF CONTENTS

Executive Summary
About the Authors and Acknowledgements
Table of Contents
Introduction
Definitions
  Defining the Internet of Things
  Defining Privacy
Boundaries of Privacy
  The Home
  The Body and Emotional Life
  Sensor Fusion
  The Impact of Destabilized Boundaries
Governance
  Old Wine in New Bottles?
  Governance and Innovation Philosophies
  Blurred Regulatory Boundaries
  The Need for Enhanced Social Discourse
Privacy Management
  The IoT's Impact on Transparency
  The IoT's Impact on Notice and Disclosure
  Forgetting Data-Collecting Devices are Nearby
  People's Changing Relationships with Their Devices
Incentives Framework and Mechanisms
  Countering Market Failure
Nature of the IoT Ecosystem
  Uncertain Costs and Benefits
Risks of Harm
  Technical Underpinning
  Privacy Threats
  Privacy Vulnerabilities
  IoT Privacy Harms
    Personal Information Breaches and Identity Theft
    Autonomy Harm
    Diminished User Participation
    Violation of Expectations of Privacy
    Encroachment on Emotional Privacy
    Diminishment of Private Spaces
    Chilling Effects
    Normalization of Surveillance and Manipulation

Emerging Frameworks and Strategies
  User Control and Management Strategies
    Pre-collection
    Post-collection
    Identity Management
  Notification Strategies
    Notice Timing
    Notice Comprehension
    Notice Content
    Notices Contributing to Norms
  Governance Strategies
    EU General Data Protection Regulation
    Baseline Privacy Law in the US
    Deletion Rights
    Use Regulations
    Risk Standards
    Sector-specific protections: Wearables
    Certifications
Recommendations
  Research
    Emotion Detection
    Children's Issues
    Norms
    Discrimination
    Insurance
    Governance
    Funding
  Fostering Dialogue
  Governance
Research Participants
Bibliography
Appendix 1: IoT Privacy-Relevant Standards
Appendix 2: IoT Privacy-Relevant Academic Centers
Appendix 3: IoT Privacy-Relevant Conferences

INTRODUCTION

"Modern civilization is naturalistic, mechanistic, its rhythm the tempo of machines, each one of which is a creature of problem-solving intelligence. It is an unstable equilibrium of forces, the shifting patterns of which require of mankind ever more insight and calculation."
— Everett Dean Martin, 19281

Scarcely a week goes by without a significant privacy event, data breach or legal decision commanding the attention of journalists and the public. Perhaps this is unsurprising: privacy debate has historically gone hand in hand with new technical developments, and given the rapidity of change all around us it makes sense that the stories are flying fast and furious. Given this increased rate of technological change and media interest, it is more crucial than ever to ensure depth and nuance in our privacy discourse. It is this belief in the value of nuanced discussion that has animated the research for this report about privacy and the Internet of Things (IoT).

There have been many names for the IoT over time: ubiquitous computing, ambient intelligence, machine-to-machine communications, pervasive computing, and, most recently, cyber-physical systems. The terms emerged from various disciplines, but they all point in the same direction. These persistent attempts to find a suitable term for the phenomenon reveal an awareness that the world is in rapid transition towards more comprehensive monitoring and connectivity, that this will likely have a profound impact on our lives, and that it is important to start anticipating the potential consequences. Our physical and informational world is evolving, and with it, the concept of privacy as we know it. The potential ramifications of the advances of the Internet of Things are a matter of profound concern to many people.2

1 Martin, 1928, p. 364
2 E.g., see the Pew Research Center's "The state of privacy in post-Snowden America: What we learned," available at https://pewrsr.ch/2HY6sps, and findings from the EU-funded CONSENT project, "What consumers think," available at http://bit.ly/EUConsent

This report is the culmination of eighteen months of research on how the Internet of Things affects and will affect privacy, and vice versa. It is exploratory in nature, attempting to test preexisting ideas and surface new ones. It presents new empirical data gathered through seventeen interviews and two workshops with a total of forty experts, scholars and practitioners, plus an extensive literature review. The gathered data was coded and analyzed through thematic content analysis techniques.3 Throughout the report, we include many quotes from our interview and workshop participants to convey the richness of their views and voices.

Our intended audience for this report encompasses the industry, grant funding, technologist, privacy, policy, and academic communities. Writing for such a broad audience is a tremendous challenge, and we hope to have struck a balance between detail and brevity, flow and terminology. The title of this report reflects two aspects of the IoT that we highlight throughout the report: the transparency of people and places that the IoT engenders, and the opacity of the devices themselves and the data flows and organizations behind them.

This report is divided into the major themes that emerged from our interviews, workshops and literature. They are: Boundaries, Governance, Privacy Management, Incentives, and Risks of Harm. These sections are followed by an overview of Emerging Frameworks and Strategies to improve IoT privacy, and then our Recommendations, which augment the prior section. To facilitate further research and exploration of IoT privacy topics, the appendices contain lists of relevant academic departments and research institutes, conferences, and protocol and standards efforts.

3 Aberbach and Rockman, 2002; Braun and Clarke, 2006; Saldaña, 2009; Vaismoradi et al., 2016

DEFINITIONS

There is no strict definition of the Internet of Things. The term is a catch-all for the proliferation of objects in our homes and workplaces and cities that are acquiring varying degrees of networked intelligence. Devices that sense and communicate are not new, but technological developments have made sensing and connectivity inexpensive, unobtrusive and ubiquitous.

IoT refers to the increasing emergence of devices that are 'smarter' than they've historically been: a thermostat that knows a homeowner's preferred temperature, a watch that tracks fitness and can locate its wearer via GPS, a car that drives itself.

DEFINING THE INTERNET OF THINGS

The IoT is characterized by several converging trends: ubiquitous network access, inexpensive sensors, computation and storage, miniaturization, location positioning technology, and the advent of smartphones as a platform for device interfaces. In this way, the IoT should be seen as evolution in product development.

What unites them is that they are not full-fledged computers, but rather purpose-built devices, most of which are familiar, e.g., a vacuum cleaner or an ingestible pill. But these familiar objects have been upgraded with an ability to gather, process and transmit information, thus becoming new actors in the informational world.

For the purposes of this paper, the IoT is defined as the collection of devices that have the ability to sense, amass, and analyze data and to communicate through networks. These devices might be found in the home, such as smart lighting or virtual assistants or TVs with cameras and microphones; outdoors in public, such as smart electricity grids, adaptive traffic signals and street lighting with gunshot detectors; in particular industries, such as remote monitors for health conditions or personally tailored advertising; in retail environments, detecting who and where people are in shops, observing where they look or linger; or on a user's person, such as wearable fitness trackers or head-mounted, networked cameras.

Given the high diversity of connected devices, separating them into categories can assist in conceiving of and analyzing privacy implications. It should be noted that these categories are not absolute and overlap in many ways:

Ecosystem: This approach arranges the IoT by technical elements: Operating Systems, Platforms, Device Types, Infrastructure, Protocols, Processors, Interoperability, Standards4

Data Flow: The IoT here is viewed from the perspective of directional data flows. Each layer comes with its own stakeholders, commercial arrangements, and interoperability issues: Things >> Communications >> Gateway >> Data >> Integration >> Consuming Applications5

4 Cloud Security Alliance Mobile Working Group, 2015
5 Ovum, 2015

Technical Layers: The IoT can be separated into different layers from the perspective of security: network layer, application layer, device level, physical layer, human layer6

Industry Verticals: This describes the IoT according to markets for goods or services. Common arrangements include: Healthcare, Logistics, Energy & Utilities, Public Infrastructure & Services, Construction, Transportation, Retail, Media, Insurance, Entertainment, Telecommunications, Education, Banking, Law Enforcement, Agriculture, Consumer Goods

Contexts: This alignment considers the IoT according to its deployment settings: home, workplace, retail environments, personal vehicle, public transportation, in public, borders, government interactions, law enforcement interactions

The all-encompassing nature of the categories above shows that the Internet of Things will in the not-too-distant future merely be Things. That is, the IoT is today's label for the current step in the evolution of technology products. We believe the term IoT will, sooner rather than later, go away, much as 'mobile computing' did.

DEFINING PRIVACY

It is clear that such devices will have an impact on privacy, but, as we shall show, there is debate as to whether they create new privacy issues or whether their effects on privacy merely mirror preexisting concerns. Exactly what is meant by privacy varies widely in different legal contexts, different cultures, and from person to person.

We find value in Alan Westin's classic definition of privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others."7 And, though it may be worn, possibly to the point of being threadbare, Warren and Brandeis' conception of privacy as a 'right to be let alone'8 is still useful to bear in mind, especially as they envisioned this right to encompass thoughts, emotions and sentiments, which is particularly germane to the IoT. We also find useful Westin's view that privacy protects four 'states': solitude, intimacy, anonymity, and reserve.9 That said, these views are predicated in part on harms resulting from invasion.

6 Cloud Security Alliance Mobile Working Group, 2015
7 Westin, A., 1967, p. 7
8 Warren and Brandeis, 1890
9 Westin, 1967, p. 31

We argue at different points in this report that the IoT threatens to decompose the notion of privacy invasion because of increasingly omnipresent sensors and because many IoT devices will be invited into our lives.

While the United States' legal and, arguably, cultural approach to privacy is largely organized around individualistic harms, other vital aspects of privacy appear both in the US and, to a greater degree, in Europe. The power and breadth of the right to privacy is one of the major differences between Europe and the US. Europe's somewhat more precautionary privacy laws hold that privacy and data protection are fundamental rights, whereas in the US a clear showing of harm after a privacy violation is the main trigger for legal intervention. Another essential dimension to consider is expectation, or how people expect that their information will be collected or used. In the US, the 'reasonable expectation of privacy' is a principal standard used to judge whether someone's privacy has been violated. The obvious problem with this standard is that expectations can change over time, and powerful actors can deliberately shift them. Nevertheless, one important privacy theory argues that when informational norms about the context in which information is disclosed or used are breached, as when data leaks from one context to another (e.g., from home to the workplace), people experience this unexpected boundary crossing as a privacy violation.10 Related to this are theories of boundary management, which claim that privacy is part of people's ability and need to negotiate the boundaries between themselves and others, and with society at large. We discuss both the contextual and boundary dimensions of IoT privacy at length in the Boundaries section.

PRIVACY, THE SELF, AND DEMOCRACY

Broader yet is the idea that privacy is vital to the construction of the self and the health of a democracy. In 1998, the early days of the commercial internet, scholar Phil Agre wrote, "control over personal information is control over an aspect of the identity one projects to the world, and the right to privacy is the freedom from unreasonable constraints on the construction of one's own identity."11 Classic formulations of liberal democracy venerate autonomy and self-determination as core aspects of what it means to be free. Agre links people's control over their data impression to their ability to freely, autonomously construct themselves, in conjunction with the idea that privacy protects people's ability to make decisions free from unwanted interference. In other words, privacy assists freedom of thought and of action.

10 Nissenbaum, 2009
11 Agre and Rotenberg, 1997, p. 7

These freedoms, along with the ability to dissent, to protect deliberation from undue commercial or government influence, to speak freely and sometimes anonymously, and to think and behave in ways that may deviate from community norms are essential elements of participating in democracy. Consider the simplicity of voting in secret behind a curtain: privacy protects people from retribution for their political choices. It preserves the belief that those choices and the thoughts that led to them are their business and theirs alone.

Privacy is a social and collective value; a vital counterweight to earlier views that privacy is concerned with harms to and capabilities of individuals. If privacy protects people's capacities to participate in democracy, then it confers benefits on society as a whole. Scholar Priscilla Regan argued that privacy is a common value: "although different people exercise the right to free conscience differently, believe in different things, and belong to different religions, all individuals have a common interest in this right. The same is arguably true for privacy."12 The preservation of privacy must therefore be enacted at social levels and not left exclusively to the domain of individual people and how they experience it. It is "hard for any one person to have privacy without all persons having a similar minimum level of privacy."13 Other scholars argue that privacy is constitutive of society,14 integrally tied to its health, and that privacy is a public good.15 In this way, privacy regimes can be seen as social policy, encouraging beneficial societal qualities, discouraging harmful ones, and safeguarding democracy.16 Market actors like device manufacturers and service providers are essential contributors to how privacy manifests, both in the sense of mechanisms (settings, controls, switches) and norms (what is and is not acceptable practice, when to show notices, sharing defaults). But these manifestations occur largely according to commercial logic; business is concerned with sales. The long-term health of democratic society, embodied in the preservation of social values like fairness, freedom of thought, and protection of the vulnerable, is decided in political and policy realms.

The way we discuss privacy, the way we employ it to govern information and the power it holds, and the way we encode it into formal and informal policy instruments have direct bearing on the kind of society we collectively create.

12 Regan, 1995, p. 221
13 Regan, 1995, p. 213
14 Schwartz, 1999
15 Fairfield and Engel, 2015
16 Bennett and Raab, 2003, Part 1

BOUNDARIES

One of the main themes to emerge from our interviews, workshops, and literature is that connected devices challenge, cross and destabilize boundaries. These can be physical boundaries, such as the walls of the home or the skin; data-type boundaries, such as personally-identifiable versus non-identifiable; or regulatory boundaries, such as telecommunications, FTC jurisdiction, or airspace regulation. Related to this is the collapsing of social contexts: family life, work life, the inner life, health care, education, offline vs. online, public vs. intimate.

This section explores how the IoT affects boundaries and people's ability to manage them, especially as we are often inviting these devices into private spaces rather than being invaded by them. Further, the scale and proximity of sensors being deployed in the human environment will begin to challenge historical notions of bodily and emotional privacy, especially given an increase in commercial interest in emotion data. We discuss the boundaries of the home and the body, sensor fusion, and the commercial drive to cross contexts. We close by presenting two theories raised by our participants that are pertinent to how people perceive and regulate privacy: contextual integrity and boundary management.

THE HOME

Our interview and workshop participants observed that emerging connected devices challenge the physical and contextual boundaries of the home. Commenting on the impact of the IoT on the perceived sanctity of the home, Heather Patterson of Intel remarked, "I've been thinking lately about how the IoT has the potential to really shift… the home from a black box, what used to be a protective, safe space, to more of a glass house where everything that we do is now readily apparent to people who are willing to look for it."17 One security researcher agreed that the IoT is redefining the boundaries of the home, noting:

Our mental boundaries in our homes and private lives change with the presence of IoT. We have a legal framework based around our home, geographic boundaries on data territoriality, but the IoT traverses these boundaries.18

A key technology crossing these boundaries is the virtual assistant/smart speaker: Amazon's Echo with Alexa, Microsoft's Cortana, Google Home, and the recently released Apple HomePod with Siri. These devices introduce a combination of microphones, artificial intelligence, voice recognition, and the melding of personal profile information gleaned from the use of other services.

17 Interview
18 Workshop comment

These devices are qualitatively different than, say, televisions with voice recognition because of the degree of AI combined with extensive profile information. There's no doubt this new class of technologies brings pleasure, convenience, entertainment and a platform for new voice interactive applications. The issues worth exploring, rather, relate to the placement of a general-purpose microphone – and increasingly cameras as well – into the home, a context classically seen as the quintessential private space.

Privacy in the home is an embodiment of privacy of location, "the right of an individual to be present in a space without being tracked or monitored or without anyone knowing where he or she is."19 The home also embodies spatial privacy: "the protection of the privacy of people in relation to the places where they enact their private life. Classically, this is the dwelling or house, but it can stretch to other 'places of private life'… private places with discernable boundaries."20

Regarding the governance of privacy in the home, Justin Brookman, formerly of the FTC's Office of Technology and Investigation and now Director for Privacy and Technology at the Consumers Union, noted, "Privacy in the home is a very bipartisan thing. Think about Scalia: the man's home is his castle."21 Brookman is referring to Justice Antonin Scalia's opinion in the seminal US Supreme Court case, Kyllo v. US, where police used a thermal imager without a warrant to detect heat patterns emanating from the defendant's house due to marijuana cultivation. Justice Scalia opined:

In the home… all details are intimate details, because the entire area is held safe from prying government eyes… We have said that the Fourth Amendment draws "a firm line at the entrance to the house."22

The 2001 case specifically concerns law enforcement's ability to breach the home boundary, but it is illustrative of the legal and cultural sanctity of the walls of the home. In Europe, this sanctity is embodied in Article 7 of the Charter of Fundamental Rights of the EU: "Everyone has the right to respect for his or her private and family life, home and communications."23 Two US states, Arizona and Washington, guarantee privacy in their constitutions,

19 Wright and Raab, 2014, p. 7
20 Koops et al., 2016, p. 28
21 Interview
22 Kyllo v. U.S., 533 U.S. 27 (2001)
23 European Union, 2012, Art. 7

both citing the home context: "No person shall be disturbed in his private affairs, or his home invaded, without authority of law."24 Further, psychological, sociological, anthropological and other literatures have argued that the home is a foundational aspect of the human experience; "a physical space that is lived – a space that is an 'expression of social meanings and identities.'"25

Much of the legal focus of privacy of the home centers around invasion, and arguably so do the location and spatial privacy conceptions above. But the introduction of virtual assistants, networked toys and interactive televisions is based on invitation – people willingly purchasing and installing these products at home. To describe some of the new challenges posed by inviting IoT devices into private spaces, Bastian Könings, Florian Schaub and their colleagues have been expanding upon 'territorial privacy,' a concept that overlaps with both spatial and location privacy:

Whereas in a traditional scenario the physical boundaries of a room would also mark the boundaries of a user's private territory, this situation will drastically change in future ubiquitous environments. Invisible embedded sensors, actuators and in particular communications could widen the boundaries of a private territory far beyond its physical boundaries. As a consequence the ability to perceive and control who is observing or disturbing a user in her private territory will decrease or even cease to exist.26

This view of privacy helpfully moves discussion from intrusion to control. The authors write: "The goal of territorial privacy is to control all physical or virtual entities which are present in the user's virtual extended territory in order to mitigate undesired observations and disturbances, and to exclude undesired entities from the private territory."27

Given the likelihood of ubiquitous data collection throughout the human environment in the near future, the notion of invasion may decompose; all the more so as people's expectation of being monitored increases. It is therefore crucial to continue to introduce privacy approaches that assert control over devices and data flows. The Emerging Frameworks and Strategies section explores practical efforts to enhance user awareness and control.

24 AZ Const., Sec. 8; WA Const., Sec. 7
25 Mallet, 2004, p. 80, citing Wardaugh
26 Könings and Schaub, 2011, p. 105, emph. added
27 Könings et al., 2016, p. 136, orig. emph.

The IoT’s increased monitoring of human THE BODY AND activity is fueled by scale – a greater EMOTIONAL LIFE number of sensing devices and sensor types – and also by the proximity of those IoT devices are slowly beginning to devices to people’s bodies. The examples breach another sensitive boundary: are obvious: mobile phones with their the skin, in the sense of the boundary cameras and microphones are constantly between people’s inner and outer lives within arm’s reach; fitness trackers have – the domain of bodily privacy. As with diversifying biometric sensors; Nest has the home, legal and cultural sensitivities expanded from networked thermostats exist around the sanctity of the body to indoor surveillance cameras;30 toys and mind. Consider prohibitions on are listening to children.31 The trend is forced medical procedures or the clear: the commercial market is offering 28 surrendering of blood samples and ever more devices to monitor people’s restrictions on administering lie detector activities, environments, and, importantly, 29 tests. The Internet of Things will likely their physical bodies and emotions. start to test the sensitivities around bodily privacy and the revered private Cameras and microphones are general- nature of one’s thoughts. purpose sensors, and the platforms they connect to continue to expand their capabilities. Computer vision can noninvasively detect facial blood flow and other biological artifacts.32 Wearables go a step further, employing a variety of both biometric and non-biometric sensors like GPS, accelerometers, altimeters, and gyroscopes. Sensor component costs continue to fall so that technology once constrained to medical and industrial settings is now proliferating in consumer

28 Sarnacki, 1984
29 Regan, 1995, p. 144-169
30 See https://nest.com/cameras/
31 See Hello Barbie, http://hellobarbiefaq.mattel.com/
32 Sikdar et al., 2016

These technologies and trends have contributed to a rise in emotion detection and 'affective computing' – computing that relates to, arises from, or influences emotions.33

Emotion detection is the capture and analysis of facial data (surface and below), biometrics and voice, plus sentiment analysis which is drawn from textual and graphic expressions. Prof. Andrew McStay, author of the book Emotional AI: The Rise of Empathic Media, explained that in the wearables domain, sensors can measure everything from respiration to blood flow, skin conductivity to EEG rates. The data patterns generated by these measurements are then indexed according to the emotion they ostensibly represent. This presupposes that the collected data can be mapped reliably to emotions, but McStay notes that such claims should be viewed cautiously. Despite the scientific research and findings touted by these companies, neuroscientific, social psychology and humanities literatures have yet to come to a consensus as to what emotions actually are. Nonetheless, development of and commercial investment in emotion detection technology continues unabated:

We're seeing a net-rise of interest in sentiment and emotion capture. The industries are wide ranging: automobiles, insurance, health, recruitment, media… basically anywhere it's useful to understand emotions. In terms of industries that are taking the lead on this, advertising and marketing are the obvious ones. Increasingly we're seeing retail move in to that area as well, plus all sorts of different sectors, ranging literally from sex toys all the way up to national security agencies, and all the marketing and organizational stuff in-between.34

In late 2017, a consumer advocacy group published research on a range of patents secured by Google and Amazon relating to potential future functions of their digital home assistant products.35 In one of these, Amazon patented a method for extracting keywords from ambient speech which would then trigger targeted advertising. The method is to listen for specific kinds of keywords – verbs expressing interest or desire: "like," "love," "prefer," "hate," "enjoy," and so on.36 A person may say, "I love skiing," and then be served relevant ads, even though the speech was spoken to another person, not the virtual assistant.

33 Picard, 1995, p. 1
34 McStay, Interview, emph. added
35 Consumer Watchdog, 2017; see also Maheshwari, 2018
36 Edara, 2014
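The mechanism the patent describes is simple enough to sketch. The fragment below is hypothetical; only the keyword list is drawn from the description above, and the function name and tokenization are invented for illustration:

```python
# Hypothetical sketch of the keyword-spotting approach described above:
# scan a transcript of ambient speech for verbs of interest or desire
# and turn them into ad-targeting signals. Not code from the patent.

DESIRE_VERBS = {"like", "love", "prefer", "hate", "enjoy"}

def extract_ad_signals(transcript: str) -> list:
    """Return (verb, object) pairs such as ('love', 'skiing')."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    signals = []
    for i, word in enumerate(words):
        if word in DESIRE_VERBS and i + 1 < len(words):
            signals.append((word, words[i + 1]))
    return signals

# "I love skiing," said to another person in the room, still yields a
# signal -- the microphone cannot tell who was being addressed.
print(extract_ad_signals("I love skiing"))  # [('love', 'skiing')]
```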

In another patent, Google describes a smart home in which "mischief may be inferred based upon observable activities" by children.37

It seems highly likely that companies will continue to expand into emotion and sentiment observation, gaining ever more access to what lies below our public behavior and speech. We further explore McStay's link between marketing, emotions and consumerism in the Risks of Harm section. For the moment, it suffices to say that emotion capture raises questions of bodily privacy.

Privacy scholar Gary T. Marx argued that a privacy intrusion occurs when the physical barrier of the skin is crossed:

Informational privacy encompasses physical privacy. The latter can refer to insulation resulting from natural conditions such as walls, darkness, distance, skin, clothes and facial expression. These can block or limit outputs and inputs. Bodily privacy is one form of this. This is seen in crossing the borders of the body to implant something such as a chip or birth control device or to take something from it such as tissue, fluid or a bullet.38

However, professor of philosophy Judith DeCew, referring to Warren & Brandeis' landmark "Right to Privacy" article,39 points out that even as early as 1890 a less corporeal, less physically transgressive reading of the bodily privacy right was appropriate. She noted that Warren and Brandeis had proposed a "more general right to privacy which would protect the extent to which one's thoughts, sentiments, and emotions could be shared with others."40

Until recently, the boundaries of the skin and the inner life have arguably been less of an information privacy concern.41 However, the scale and proximity of IoT sensors is reinvigorating this 130-year-old privacy interest. While we believe that the IoT mainly represents an amplification of existing data collection trends, large-scale emotion detection is a significant shift. We expand on the particular risks of emotion detection in the Risks of Harm section.

37 Fadell et al., 2014
38 Marx, 2012, p. x
39 Warren and Brandeis, 1890
40 DeCew, 2013
41 An exception to this is the privacy concerns of biometrics collection

SENSOR FUSION

The combination of multiple kinds of sensor data into a more revealing picture has been termed 'sensor fusion.' We view sensor fusion as a weakening of boundaries between data types and data sets; the amalgamation of data from different contexts into a more revelatory picture. Prof. Scott Peppet of the University of Colorado Law School writes:

Just as two eyes generate depth of field that neither eye alone can perceive, two Internet of Things sensors may reveal unexpected inferences. For example, a fitness monitor's separate measurements of heart rate and respiration can in combination reveal not only a user's exercise routine, but also cocaine, heroin, tobacco, and alcohol use, each of which produces unique biometric signatures… [E]ach type of consumer sensor… can be used for many purposes beyond that particular sensor's original use or context, particularly in combination with data from other Internet of Things devices.42

The quality and quantity of IoT sensor data enhances the inferences that can be drawn from that data, while also increasing the chances that people can be identified from such data, even when it's analyzed in aggregate.43 Lee Tien, Senior Staff Attorney at the Electronic Frontier Foundation, observed that not only is the fusion of data from different devices poorly disclosed by companies, it is also poorly understood by people.44 As such, the use of connected devices can lead to unexpected revelations to the myriad companies in the data supply chain of IoT information and their business partners. We turn now to the impacts of violated expectations.

42 Peppet, 2014, emph. added, references omitted
43 Kohnstamm and Madhub, 2014; Peppet, 2014, p. 129-131
44 Interview
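Mechanically, the fusion Peppet describes can begin with something as simple as joining independently collected streams on a shared timestamp; the joined records then support inferences that neither stream supports alone. The following is a toy, hypothetical sketch with invented streams and values:

```python
# Toy sketch of sensor fusion: two streams that are unremarkable in
# isolation become a joint feature vector once joined on timestamps.

heart_rate = {"09:00": 62, "09:01": 64, "09:02": 110}   # bpm, e.g. a fitness band
respiration = {"09:00": 14, "09:01": 15, "09:02": 30}   # breaths/min, e.g. a sleep sensor

def fuse(*streams: dict) -> dict:
    """Join streams on their shared timestamps into joint feature vectors."""
    shared = set.intersection(*(set(s) for s in streams))
    return {t: tuple(s[t] for s in streams) for t in sorted(shared)}

fused = fuse(heart_rate, respiration)
print(fused)  # {'09:00': (62, 14), '09:01': (64, 15), '09:02': (110, 30)}
# A model over these joint vectors could attempt the lifestyle inferences
# Peppet describes; the fusion step itself is trivial once the data
# co-exists in one supply chain.
```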

THE IMPACT OF DESTABILIZED BOUNDARIES

There are important privacy implications when devices cross boundaries in unexpected ways. In particular, this emerging characteristic of IoT devices challenges contextual integrity and boundary management. Contextual integrity is a theoretical framework that tries to explain why people feel privacy violations.45 It describes a set of informational norms that govern the varying contexts of social life: norms concerning the transmission or distribution of personal data, the type of data, and the actors involved. The framework explores the way we manage different social contexts, such as home life, work life, medical care, education and commerce:

[P]eople act and transact in society not simply as individuals in an undifferentiated social world, but as individuals in certain capacities (roles), in distinctive social contexts, such as health care, education, employment, the marketplace, and so on. These contexts should be understood as structured settings whose features have evolved over time — sometimes long periods of time — subject to a host of contingencies of place, culture, historical events, and more.46

When the norms governing information exchange within particular contexts are violated or contextual boundaries are unexpectedly crossed, people experience it as a privacy violation. For example, a person expects her doctor to disclose information to other medical staff and her insurance company, but not her family; a person expects that family members in the home know which television programs she watches, but does not expect her boss to know. In her 2009 book, Privacy in Context, sociolegal scholar Helen Nissenbaum summarized, "What people care most about is not simply restricting the flow of information but ensuring that it flows appropriately…"47

45 Barth et al., 2006; Bruening and Patterson, 2016; Nissenbaum, 2009
46 Barth et al., 2006, p. 184
47 Nissenbaum, 2009, p. 2

Contextual integrity concerns itself with the boundaries of social contexts. In this way, it bears kinship to the theories of 'boundary management.' In 1976, social psychologist Irving Altman conceived of privacy as "an interpersonal boundary control process" and "selective control of access to the self or one's group."48 In contrast to traditional ideas of privacy as the act of retreating or hiding information, Altman called privacy a "dynamic dialectical process":

Privacy is a boundary control process whereby people sometimes make themselves open and accessible to others and sometimes close themselves off from others… [T]he issue centers around the ability of a person or a group to satisfactorily regulate contact with others.49

Importantly, Altman distinguishes his privacy framework from others that rely primarily upon withdrawal, solitude, anonymity and secrecy, arguing instead that selective sharing of the self co-occurs with withdrawal and reserve. In this sense, Altman's theory of boundary control ties into Nissenbaum's assertion that people are more concerned with controlling what gets shared than by eliminating the sharing process altogether. Altman summarized, "Privacy mechanisms serve to define the limits and boundaries of the self. By being able to change the permeability of the boundaries around oneself, a sense of individuality develops – sometimes incorporating others and the world, sometimes keeping them out."50 In this, Altman argues that privacy serves the goal of promoting autonomy.51

Later privacy and human-computer interaction (HCI) scholars incorporated and expanded on Altman's views. A seminal HCI paper by Palen and Dourish observed, "Privacy management is not about setting rules and enforcing them; rather, it is the continual management of boundaries between different spheres of action and degrees of disclosure within those spheres."52 By blurring boundaries – between contexts, self and other, the home and public, expected islands of reserve and visibility to third parties – IoT devices challenge people's ability to negotiate those boundaries.

Some of our respondents identified contextual integrity and boundary management as issues particularly problematic to IoT:

48 Altman, 1976, p. 7-8
49 Altman, 1977, p. 67-68, emph. added
50 Altman, 1976, p. 26
51 Similar to Agre; see the Definitions section
52 Palen and Dourish, 2003, p. 131

If you have sensors integrated into buildings, into homes, if you have smart devices like smart speakers that are basically listening to you all day long, it becomes really difficult to know what information's being collected, when it is being collected, and it's very opaque where that information flows and which entities might get access to [it]… It's no longer sufficient to close a door if you want to have a private conversation – there might be other sensors that are picking up who's in the room, what they are talking about, what is the mood of these people.53

… You're putting your smart lights into your office building, but that also lets you track employee productivity and usefulness. Creating boundaries between a productive work environment and an incredibly hyper surveillance work environment is really tough, and probably only getting worse.54

Boundary work is managing those different aspects of life, of tasks, of kids and work and environments and social context… When we're surrounded by other peoples' stuff that's collecting who knows what, it makes that boundary work really challenging.55

Similar concerns arise in recent published work. Margot Kaminski cites the contextual integrity and boundary management issues with household robots, raising concerns about people suppressing their own speech and conforming their behavior:

When information revealed in the home is shared and used outside of the home, people may stop trusting that the home is a private location, and may stop sharing information and conform their behavior to majority norms even within the home.56

Household robots threaten the ability of individuals to conduct… 'boundary management' because in addition to crossing physical boundaries, or being able to 'sense' through physical boundaries… robots' social features may elicit trust where trust is not deserved…57

Challenges to boundary management, however, affect not only personal privacy, but society at large. The European Commission's IoT Expert Group worried about how the breakdown of contexts could affect democracy:

53 Schaub, Interview, emph. added
54 Jerome, Interview
55 Jones, Interview
56 Kaminski, 2015, p. 664, emph. added
57 Ibid.

The perimeter of a context, keeping certain information or actions restricted to the boundaries of a particular restricted type of interaction, may silently disappear by technology that is as ubiquitous and interconnective as IoT. Such de-perimeterisation associated with converging technologies challenges the checks and balances associated with the separation of powers in our democracy.58

Prof. Julie Cohen cited the necessity for functional boundary management in order for people to determine the shape and course of their lives:

Privacy shelters dynamic, emergent subjectivity from the efforts of commercial and government actors to render individuals and communities fixed, transparent, and predictable. It protects the situated practices of boundary management through which the capacity for self-determination develops.59

Law scholar Anne Uteck related the degradation of boundary management to the breakdown of the reasonable expectation of privacy standard, and the threat of amplified, unwanted exposure:

Embedded surveillance networks undermine the pretense that we control our environment or our boundaries within it, a pretense fundamental to the traditional construct of privacy. It is no longer just a question of further eroding the distinction between private and public spaces, or even simply blurring the boundaries of this distinction, but the extent to which [ubiquitous] technologies have almost limitless potential to contravene any reasonable expectations of privacy in private and in public.60

58 van den Hoven, 2016, p. 13
59 Cohen, 2013, p. 1905
60 Uteck, 2013, p. 82-83, emph. added; see also Reidenberg, 2014, p. 145

Contextual integrity and boundary management are two useful frameworks for exploring both the personal and the social implications of the context-crossing qualities of IoT devices. People's ability to negotiate the boundaries of their social and physical contexts affects their ability to manage their privacy. As that ability weakens, people may trust objects when they should not, and lose more of the ability to know who is gathering data about them. At the societal level, the porousness of context boundaries can have an effect on autonomy and democracy itself. How people behave is influenced by their ability to negotiate boundaries and know who is 'inside' and who is not.

CONCLUSION

There is undoubted value in digital devices and services 'following' people as they move through the different spheres of their lives, predicting behaviors and offering suggestions and timely information. Indeed, one could argue that this is the 'natural' direction of personalizing user experiences. However, there are signs that the home is gradually losing its sanctity as a place of reserve because of the monitoring devices people are inviting in. Similarly, while content tailored to people's emotional states is attractive for gaming, entertainment and other uses, sensor scale and proximity fuels concerns over emotional and bodily privacy. Adding to this is sensor fusion, which, by combining data from disparate collections, can cause greater, and possibly unexpected, revelation of personal traits and activities. The boundary-spanning quality of IoT devices has a privacy impact on both individuals and society at large.

GOVERNANCE

Governance of the IoT is intertwined with questions about how privacy rights and risks are managed by public, private and social institutions. The IoT commingles new and existing components and platforms, so unsurprisingly there is debate over whether it is an evolutionary or revolutionary development. As such, the extent to which the IoT warrants new laws, policies, and regulations is a key question.

Addressing privacy regulations for connected devices surfaces discussions about: 1) whether privacy issues are different in degree or kind from those covered by our existing Internet regulations; 2) how the IoT challenges the boundaries and enforcement of existing governance instruments; 3) political debates pertinent to the relationship between market innovation and government regulation; and 4) the level of maturity of social discourse about the interrelationship of technology and privacy.

OLD WINE IN NEW BOTTLES?

Is the Internet of Things different from the internet? Is there a sufficient difference to warrant changes in existing policies that govern informational privacy and data protection? The answers to these questions are pivotal in considering how to govern the IoT. Our research participants, literature, and stakeholders in the public policy process have a range of responses.

On one hand are those who consider the IoT a different beast. Joris van Hoboken of the Free University of Brussels, for example, sees IoT policy considerations in a broad sense, encompassing not just physical devices but society as a whole.61 This view echoes that of Dr. Gerald Santucci, an expert in IoT and related policy, who observed, "The IoT does not concern objects only; it is about the relations between the everyday objects surrounding humans and humans themselves."62 Florian Schaub of the University of Michigan noted the importance of the IoT's ability to affect the physical environment: "It's not just about information being collected about specific individuals, but it also becomes the question of technology becoming active in the user's environment – actuation."63 A 2014 report by the Aspen Institute also highlighted how data sensors and autonomous actuators give the IoT unprecedented power in the physical, versus purely online, world:

61 Interview
62 Santucci, 2011, p. 84
63 Interview

To some extent, the IoT merely enlarges – vastly – the existing Internet. It dramatically increases the number of inputs into the systems of data collection and analytics that drive the digital world. In other ways, the IoT is something entirely new. It empowers our physical world to make decisions about how it interacts with us, without any direct human intervention.64

Prof. L. Jean Camp and her colleagues cited a similar view:

The challenges and opportunities arising from the IoT are fundamentally new, owing to the unprecedented combination of ubiquity, diversity, and connectivity among IoT devices, and the ability for many IoT devices to observe and actuate real world events without any explicit human interaction.65

One security researcher also noted the issue of ubiquity and a lack of direct user interaction, seeing the IoT as more ubiquitous than mobile phones and embedded in the environment: “We wear them on our bodies, put them on our toys, they collect data in novel ways we didn’t envision before. It’s more of a passive question of sensors collecting information without our interaction with them.” This is not to suggest that a regulatory response is required, he explained: “You could imagine, for example, proactive steps like an opt-in consent when a mobile device collects location information. That’s not a regulatory requirement, but a self-regulatory norm that companies have adopted due to recognition of the potential harms.”66

In 2017, the US Department of Commerce released its report, “Fostering the Advancement of the Internet of Things,” gathering comments from industry, civil society organizations, advocates, trade bodies, academia and other regulatory agencies. Regarding privacy, it stated, “Several commenters argued that there are no new privacy issues related to IoT, that it is too early to craft regulatory responses, or that current regulation is sufficient.”67

64 Goodman, 2015, p. 6-7
65 Camp et al., 2016, emph. added
66 Interview
67 US Dep’t of Commerce, 2017, p. 31

On the subject of the necessity of crafting new regulations specific to the IoT, one senior government official responded:

I feel like every certain number of years, there’s a new technology, whether it’s the internet or mobile or IoT – we all of a sudden say, “Oh, we need to rethink everything in this context.” I think that’s just the evolution of technology.68

Ian Glazer, Vice President for Identity Product Management at Salesforce, also opined about the need for new policies:

People tend to lose their ever-loving mind when they start talking about IoT. Everyone chill out. We have practice and principles that should be guiding us consistently, because if you have a good principle, it doesn’t need to be changed for different modalities of technology. I get really nervous when people say, “Oh, we need a new X for this expression of technology.” Well, if we had fundamental good practice, good principle, and good regulation, then it shouldn’t have to get updated for a specific kind of technology.69

Given this diversity of views about the novelty of the IoT, is it possible to reach an agreement as to whether it requires new kinds of policy instruments to govern it? Radical reconsiderations of policy can occur but are uncommon. Public policy theory and related areas of scholarship often argue that incrementalism is a dominant mode of policymaking.70 Any realistic consideration of policy development must be seen in the larger context of existing policies, institutional arrangements, legislative appetite, and the politics of the day, which we discuss in the next section. These prior factors often give rise to ‘path dependency’ – a stability and sometimes irreversibility in the current state of policy choices.

We find that although this is largely the case with IoT policy, there are some instances of new policy choices (‘path creation’) within individual technology domains, such as the automotive sector.

68 Interview
69 Interview
70 Lindblom, 1979; North, 1991; Scott, 2003

For example, a Senate version of a federal autonomous vehicle policy bill currently being debated in the US Congress calls for the creation of an online, searchable ‘motor vehicle privacy database’ which would list all the personal data that vehicles collect, how that data and the conclusions drawn from it are used and disclosed, how long they will be retained, and when they will be destroyed.71 In particular, the inclusion of language about disclosing the use of conclusions (read: inferences) drawn from interactions with vehicles is highly unusual in US information policy, and can be seen as a promising new path.

With regard to actuation, the legal and insurance communities are slowly beginning to grapple with questions of IoT liability. Given the recent first death from a self-driving car72 and the world’s largest distributed denial-of-service attack coming from compromised IoT devices in 2016,73 issues of product liability and insurance are on the cutting edge of IoT governance.74

GOVERNANCE AND INNOVATION PHILOSOPHIES

Public, commercial and policy discourse surrounding the IoT is suffused with talk of innovation; the need to encourage it, to prevent it from being stifled, to ensure society and its posterity benefit from it:

The Internet of Things has already led to important technological breakthroughs, and as it expands its reach, it has the potential to spur tremendous innovation. Our challenge is to find the proper balance between promoting this innovation and ensuring that our security and our privacy are protected as this valuable technology continues to grow.75

[B]ecause the Internet of Things… offers so many important economic and social benefits, countries should develop national strategies to promote its adoption and use.76

71 S. 1885, 2017
72 Economist, 2018
73 Woolf, 2016
74 See Dean, 2018
75 Internet of Things, 114th Cong., p. 4, 2015 (Introduction by Issa)
76 Castro and New, 2016, p. 1

One strand of discourse that encourages innovation while minimizing factors that could stifle it is dubbed ‘permissionless innovation.’ Adam Thierer of George Mason University, a central proponent of this idea, defines it variously:

The notion that experimentation with new technologies and business models should generally be permitted by default.77

The general freedom to experiment and learn through ongoing trial-and-error experimentation.78

Permissionless innovation is about the creativity of the human mind to run wild in its inherent curiosity and inventiveness. In other words, permissionless innovation is about freedom.79

Thierer contrasts this with (his version of) the ‘precautionary principle,’ which he criticizes for its constraining effect on innovation:

The precautionary principle holds that since every technology and technological advance could pose some theoretical danger or risk, public policies should prevent people from using innovations until their developers can prove that they won’t cause any harms.80

Many critics adopt a mindset that views the future as something that is to be feared, avoided, or at least carefully planned. This is known as the ‘stasis mentality’ and it is, at its root, what motivates precautionary principle-based thinking and policymaking.81

77 Thierer, 2016, p. 1
78 Ibid., p. 8
79 Ibid., p. 9
80 Thierer, 2013, p. 353
81 Thierer, 2016, p. 23

This is a politically charged, narrow reading of the precautionary principle, portraying it as a force for prohibition. However, in a review of the precautionary principle in environmental regulation, legal scholar Richard Stewart laid out four categories of the precautionary principle:

“PP1. Scientific uncertainty should not automatically preclude regulation of activities that pose a potential risk of significant harm (Non-Preclusion PP).

PP2. Regulatory controls should incorporate a margin of safety; activities should be limited below the level at which no adverse effect has been observed or predicted (Margin of Safety PP).

PP3. Activities that present an uncertain potential for significant harm should be subject to best technology available requirements to minimize the risk of harm unless the proponent of the activity shows that they present no appreciable risk of harm (BAT PP).

PP4. Activities that present an uncertain potential for significant harm should be prohibited unless the proponent of the activity shows that it presents no appreciable risk of harm (Prohibitory PP).”82

Thierer is clearly aligning his view with the fourth category, protesting against the idea that developers should be required to prove a technology will do no harm before it is allowed to flourish. His response to this is inversion – arguing that regulation should not proceed without evidence of a potential harm:

The burden of proof should be on those who favor preemptive, precautionary controls to explain why ongoing trial-and-error experimentation with new technologies or business models must be disallowed.83

82 Stewart, 2002, p. 76; see also Sunstein, 2003
83 Thierer, 2016, p. 4

Industry positions on IoT governance are replete with this viewpoint:

US Chamber of Commerce: “Without evidence of heightened privacy concerns or consumer harm, there is no reason not to allow the IoT market to mature under the frameworks that exist for protecting consumers’ legitimate privacy interests.”84

T-Mobile: “NTIA and other federal agencies should only adopt cybersecurity and privacy regulations based on clear evidence of actual harm after balancing the need for regulation against the deterrent effects regulation will have on innovation and investment.”85

Consumer Technology Association: “As a general matter, the increasing number of devices should not automatically trigger new regulations – before acting, there should be evidence of real harms.”86

The very term ‘permissionless’ creates an antagonist, casting the government and concerned parties as the bête noire of industry and the market, whose noble goal of producing social good and consumer welfare through innovation must be defended. The discourse of permissionless innovation fuels a general distrust in regulation by falsely pitting ‘freedom’ against reasonable efforts to govern technology and protect current and future citizens from economic, social and democratic harms. However, as illustrated by Stewart, precautionary approaches exist that would allow for innovation while still mandating strong IoT privacy characteristics. Without a doubt, governance and legislation are imperfect mechanisms, and attempts to craft forward-facing consumer protection and privacy rules will inevitably be flawed. However, “the precautionary principle benefits privacy protection insofar as it emphasizes the normative values of prudence and transparency.”87 Prudence is warranted in particular when concerned with democratic harms, because they are diffuse and take a long time to manifest and detect.

84 Eversole et al., 2016
85 Massey et al., 2016
86 Kearney and Reynolds, 2016
87 Costa, 2016, p. 23

Despite what Thierer would have us believe, outright champions of the prohibitory version of the precautionary principle are rare. Only a small number of technologies are regulated in this fashion, the most obvious being the development of new drugs and medical devices, and in transportation. Setting aside questions of regulatory efficiency, the reason for such prohibitory precaution is the potential threat to human life. But to imagine that such prohibition exists commonly in other domains of technology regulation – in particular, consumer devices and internet technologies – is a straw man argument. In both the United States and Europe, there are almost no examples of consumer technology regulation employing a prohibitory version of the precautionary principle.

Luiz Costa of the University of Namur sees the precautionary principle as useful to privacy and data protection law because it addresses issues of information and power asymmetry, spurs public debate, recognizes that monetary damages for future harms may be inadequate, and takes account of the uncertainty of scientific evaluation.88 On the first two points, Costa notes that legislation protects citizens by counterbalancing the power of governments and industry. In part this is achieved through applying the precautionary principle “in order to avoid risk-taking without a larger public discussion. The promotion of this principle is therefore a way to involve citizens in decision-making.”89

Costa cites the United Nations Rio Declaration on Environment and Development, which employs the precautionary principle to protect the environment: “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”90 This formulation aligns with Stewart’s PP1 and PP2 above. As to the inadequacy of monetary damages, Costa writes that:

Precaution… considers that some damages cannot be repaired or compensated with money because not everything can be converted into money. Considerable oil leaks can cause damages to the environment that are irreparable; lives lost in an accident are irreversible. Instead of compensating damage, precaution urges the need to avoid some damages.91

88 Costa, 2016
89 Ibid., p. 15, emph. added
90 UN General Assembly, 1992
91 Costa, 2016, p. 17, emph. added

This is particularly salient to privacy and data protection, where the potential dangers to individuals are so often non-economic: embarrassment, stigmatization, loss of dignity, intrusion, and decisional interference.92 This is also true for the longer-term, difficult-to-detect threats to autonomy, freedom of thought, private spaces, and a liberal democratic order. As with the environment, some injuries to the health of a democracy are potentially irreparable. Or, as Paul Ohm writes: “If we worry about the entire population being dragged irreversibly to the brink of harm, we must regulate in advance because hoping to regulate after the fact is the same as not regulating at all.”93

Taken together, these arguments form a counter-narrative to the proponents of permissionless innovation, who are largely silent on threats to the health of a democracy, or the danger of a general diminishment of private spaces. We argue that the discourse of permissionless innovation is ultimately a deregulatory thrust, often libertarian in character, that serves the interests of economic actors while denying society the vital role of government regulation in protecting a broader range of social and democratic interests.

BLURRED REGULATORY BOUNDARIES

Overlapping the thematic areas of Boundaries and Governance is the view that the Internet of Things muddles the boundaries between regulatory agencies. Further, the US’s current sectoral approach is seen by some stakeholders as being insufficient to deal with the many gray areas that arise in IoT governance policy.

92 See generally Solove, 2006
93 Ohm, 2010, p. 1751

There are a number of examples of regulatory blurring. Drones (of the non-hobbyist type) in the US are regulated by the Federal Aviation Administration, but it does not have the authority to regulate their privacy aspects. The National Highway Traffic Safety Administration (NHTSA) released guidance on automated vehicles in September 2016 with privacy guidelines94 that included a recommendation for data minimization, but one year later removed all references to it in an update, saying instead that “the FTC is the chief Federal Agency charged with protecting consumers’ privacy and personal information.”95

The health sector is emblematic of muddled regulatory boundaries. Michelle De Mooy of the Center for Democracy & Technology (CDT) described its regulatory confusion:

If you’ve created a mobile health app that is spitting out outcomes based on the data you’re generating, what does that mean? Where are the lines for what should be regulated? Either because it’s creating health recommendations that would affect your health and safety, or because it’s promising things that it can’t deliver – that would be the difference between the Food & Drug Administration or the FTC being involved. The Department of Health and Human Services, the FTC, the FDA have all recognized that this is rising: this blurring of the uses for sensors and IoT and health-related care are blending over, at least in a regulation sense.96

Lee Tien of the Electronic Frontier Foundation sees US federal policy being hampered by agency competition, a lack of regulatory cohesion, overlapping regulatory jurisdictions, and a lack of familiarity with emerging technology: “There is a lot of jockeying… that’s exacerbated by the fact that a lot of these issues are new, so they haven’t had to deal with privacy before.”97 Relatedly, the US Department of Health and Human Services noted how both mobile health developers and the citizenry could be misled by the absence or overlap of regulatory authorities:

94 NHTSA, 2016
95 NHTSA, n.d.
96 Interview
97 Interview

It would not be surprising if individuals are confused, and do not understand, that HIPAA [the Health Insurance Portability and Accountability Act] may not protect the privacy and security of their health information collected by equipment or an app if that collection of information is not offered by the individual’s provider or on its behalf… [S]ome mHealth developers themselves may not be aware of the regulatory requirements that attach to their work and have requested additional guidance.98

In the domain of corporate wellness programs, privacy and identity expert Anna Slomovic cited the issue of sensor fusion and the general promiscuity of data, observing that the US’s sectoral approach to privacy may be insufficient. She argued in favor of a broader approach to privacy regulations:

Wellness programs give us a glimpse of a future where data collected from multiple devices and multiple sources will be combined and analyzed, resulting in a level of surveillance that is much greater than the sum of its parts… While US privacy protections are sectoral, data flows in the real world are not. As more objects get connected to the Internet, it will be more and more difficult to confine their data within a single regulatory silo. Current tools will be utterly inadequate in the new connected world.99

In IoT discussions, several stakeholders and researchers seek the creation of an omnibus regime with baseline privacy rules.100 When asked about the efficacy and validity of the reasonable expectation of privacy standard, a fundamental element of US privacy governance, Michelle De Mooy responded:

I think it’s poor. It’s definitely not ideal, but at the same time, when you don’t have a baseline privacy law it makes it incredibly difficult to not have that standard. The expectations are set in some ways by the laws that are sector-specific, and so they perhaps have shaped public perception of what they consider sensitive, but I don’t think there’s any way around that. I wish that we had some sort of baseline legislation or law that would give people a different set of common, everyday expectations for their privacy.101

98 US Dep’t of Health and Human Services, 2016, p. 9-10
99 Slomovic, 2015, emph. added
100 Comparisons of the merits of sectoral versus omnibus privacy rules have been debated by numerous other scholars. See, e.g., Schwartz, 2013
101 Interview

Heather Patterson of Intel echoed Anna Slomovic’s concerns for the inadequacy of a sectoral regime in the face of context blurring:

When it all becomes one broader context and there are vast treasure troves of information about us that can be assembled and reassembled, then we need to be mindful about whether omnibus privacy protections are now warranted and whether we want to up-level privacy regulations so that it follows people [everywhere] rather than the spheres that the people happen to be functioning within when that data becomes collected from them.102

Law professor Paul Schwartz observed that the United States is an anomaly in its absence of baseline privacy rules: “The United States’ unique path as a matter of form (no omnibus law) and substance (a limited set of [fair information principles]) has made it an outlier in relation to the global community.”103

In its 2015 IoT report, the FTC reiterated its call for “broad-based (as opposed to IoT-specific) privacy legislation.”104 The Department of Commerce concurred, stating that it “continues to address privacy concerns in a range of contexts, from support for baseline privacy legislation that would include IoT services, to work to promote the availability of strong encryption (including in IoT devices).”105

Many industry stakeholders, however, oppose changing the policy landscape. Comments submitted for the aforementioned Dep’t of Commerce IoT paper by commercial companies and trade bodies are illustrative:

Although IoT enables the collection of data through inter-connected devices that previously may not have been feasible, the underlying data collection is not a new policy issue that would warrant a new framework for addressing privacy… The FTC’s existing privacy framework, which incorporates many of the Fair Information Practice Principles (FIPPs), is well suited to address the IoT environment.106

102 Interview, emph. added
103 Schwartz, 2013, p. 1636
104 Federal Trade Commission, 2015, p. viii
105 US Dep’t of Commerce, 2017, p. 42
106 Epps, 2016

Self-regulation is nimble, and can be more easily updated to address changes in the marketplace and technology… In fact, self-regulatory codes may be the best way to effectuate consumer adoption of the IoT. As a backstop with respect to consumer privacy, the FTC can utilize its Section 5 authority to protect against any privacy-related practices that are unfair or deceptive.107

Such views comport with the commercial logic that increases in regulation cause increases in costs and limitations on product development.

Despite the US’s top privacy regulator and the agency tasked with multistakeholder engagement on IoT publicly supporting baseline privacy regulation, prospects seem dim. One prior legislative attempt, the 2015 Consumer Privacy Bill of Rights Act, was stillborn,108 failing to galvanize consumer protection groups109 or gain bipartisan support. Omnibus federal privacy legislation has yet to reach an advanced stage in Congress, and the deregulatory character of the current Administration, in combination with a Republican-led Congress, points to a reduction in the already low chances of such legislation occurring.

The European General Data Protection Regulation (GDPR), coming into force in May 2018, is a substantial upgrade to the EU’s existing omnibus data protection rules. It will affect American companies, since the Regulation applies to entities processing Europeans’ data irrespective of a company’s geographic location. Compared to the existing US privacy regime, the GDPR requires far more internal assessment of data practices and compliance effort, and it threatens sanctions for non-compliance. It will test the argument that increased regulatory burdens can stifle innovation.

107 Kearney and Reynolds, 2016
108 See Singer, 2016, for one account of the causes
109 See, e.g., an open letter from numerous groups to the White House, available at http://www.consumerwatchdog.org/resources/ltrobamagroups030315.pdf

THE NEED FOR ENHANCED SOCIAL DISCOURSE

A major theme to emerge from interviews and our two workshops was a perceived need for more social conversations about the collection and use of personal data, in tandem with more transparency on the part of data collectors. This theme reflects an awareness that public understanding of the coming waves of IoT devices, essential to informing governance and the evolution of norms, is currently inadequate. Lee Tien of the EFF linked the need for transparency on the part of data collectors to society’s ability to hold this conversation:

If we have the policy knowledge about what is happening to our data – how well it’s being stored, how well is it being de-identified, what sense and meaning can be derived from it at this present level of de-identification – we’d be able to get more of the information that we need to make a social, public decision about what the ideal is… That’s one area that I think the law is very important; why [advocates] often focus on transparency and disclosure. Consumers should be able to know where their data goes. They should be able to know, if you claim you are de-identifying data that you collect, what is the protocol that you use for de-identifying? What data is that FitBit actually collecting? Where does it actually go? Who are your partners?110

Dawn Nafus, an anthropologist at Intel, argued for society’s need for a greater capacity to critically understand data: “If more people had a clear idea of what data actually meant and what it doesn’t mean… then you can start to deepen the critique and the resistance against the bad uses, as well as elaborate the good ones.”111

110 Interview
111 Interview

Tien expressed optimism about the evolving state of IoT discourse, remarking that interest in privacy and security has gained mileage in public discussion over the last decade or two.112 However, Heather Patterson had a less sanguine view on the state of discourse on IoT data and privacy:

We need more social discourse about what is happening right now, and I’m just not seeing it there and [instead I’m] seeing lots of conversations about shareholder value. Frankly, I just think the world is bigger than that.113

These comments are not restricted to the IoT – they fit comfortably in conversations about big data, or indeed about the generally datafied world we now live in. Fortunately, privacy remains a popular topic among the general public and specialists alike. As Nafus and Tien note above, deepening the capability for critique is a part of reaching a better ideal of the ‘information society.’

The converse of this need to have a social conversation about data use is a general lack of understanding about connected devices and technology. Interview and workshop participants noted that ignorance pervades the offices of policymakers, politicians, their lawyers, and the public. Rafi Rich, a consultant and former director of buildings and licensing for Israel, remarked on the privacy issues surrounding the deployment of smart city technology:

Most mayors don’t have a clue whatsoever about the dangers of data going around, so they basically don’t see it as a problem, because although there are regulations around for privacy, most cities that I have been working with don’t really understand that when they give services to private companies there is a risk. The local authority… I won’t say they don’t care, but they just don’t have enough knowledge to care. I think that local authorities in most places around the world are still in the 19th century. Technology moves much too fast for them.114

112 Interview
113 Interview
114 Rich, Interview

Speaking of his experiences while principal advisor on technology, data and privacy in the California State Attorney General’s office, Justin Erlich lamented the lack of in-house technical expertise and availability of external help:

I saw our lawyers wanting to do the right thing with consumer protection and the internet, but the problem is: 1) there is no clear law on emerging tech and lawyers can be risk averse, 2) there’s a need to get creative about legal theories, but we don’t completely understand the underlying technology and how it works so it’s hard to develop them… It feels like bringing a knife to a gunfight.115

Lee Tien expressed his view of government policymakers even more bluntly:

My view is that they don’t know technology well enough to know when they don’t know… I have engineers who brief me on technology I don’t know; I probably have more technical support than legislators and policymakers. That’s wrong. Why do I have more access to more cryptographers, machine learning experts, and technologists than people who actually make decisions?116

Greater social comprehension of data use is a strategy that nearly all IoT governance participants seek. Transparency in the form of privacy policies is encouraged by industry and the privacy community alike, and both sides see benefit in people being much more informed about how data is collected, analyzed, and shared. We discuss different ways to encourage greater awareness in the Emerging Frameworks and Recommendations sections.

115 Workshop comment
116 Interview, emph. added

CONCLUSION

The question as to whether the IoT is different enough from prior internet technologies to warrant its own special regulations is in no way settled. This is partly because there are stakeholders on both sides of the discussion with much to lose – an increase in regulation will likely result in higher compliance costs, but an absence of appropriate regulation could further injure the state of privacy. As the Dep’t of Commerce paper cited above reports, policy stakeholders have yet to come to a consensus that there is a privacy problem.

IoT governance will proceed in much the way that policy proceeds generally: incrementally, sometimes haphazardly, with greater and lesser degrees of order and coordination. The difficulty of creating adequate privacy rules is exacerbated by the multiplicity of agencies who claim (and refuse) jurisdiction. US state legislation will continue to press onward based on its own institutional dynamics, sometimes in tension with federal prerogatives. IoT policy is inseparable from politics, history, institutional arrangements, and power, and so the choice to regulate different aspects of the IoT ecosystem – data gathering and use, security, liability – will inevitably be influenced by existing rules, the power of stakeholders, dominant economic philosophies, discourse, and perhaps the infrequent, unexpected ‘shock’ to the system, such as a massive denial-of-service attack.

The GDPR represents the most ambitious information policy project on the globe. It envisions affecting Europeans and non-Europeans alike, and is an attempt to export European privacy and data protection values to a wide range of countries, some of whose companies (read: America’s) will inevitably be resentful of it. It will test whether sectoral or omnibus regulations are better suited to address the privacy challenges of connected devices, and as such will provide an analytically rich counterpoint to US IoT policy evolution.

The mechanics of public policy are a rarefied subject, but public discourse remains an important input to policymaking. As our research respondents have highlighted, there is value in arming the populace and policy actors with more information to consider IoT privacy issues. For the general public, enhancing the transparency of data collection and use practices is one important method, as are the adversarial and educational approaches of advocacy groups and journalists. In policymaking itself, there is a need for more technologist interaction with policy actors, which will only occur if the labor economics are well-aligned.

PRIVACY MANAGEMENT

The Internet of Things heralds a qualitative shift in the way privacy is managed, both by people and by the organizations that create, sell and operate connected devices. It amplifies prior privacy challenges like the opacity of data flows and actors, and it creates new issues, such as a false sense of trust arising from the familiarity and unobtrusiveness of devices.

This section explores the IoT’s effects on individuals’ ability to manage their own privacy – issues of awareness, understanding, and control. In particular, we discuss the IoT’s impact on transparency, notice and disclosure, choice and trust. We explore the significant privacy impact of forgetting that data-collecting objects are present in our environment, and delve into people’s changing relationships with their devices, in particular the impact of the IoT on children.

THE IoT’S IMPACT ON TRANSPARENCY

Any attempt to manage privacy presupposes that people are aware of the collection, use and disclosure of their personal data by others. Actors in privacy-sensitive arenas are generally expected to practice transparency in order to facilitate this awareness. The 1973 US government report “Records, Computers and the Rights of Citizens” envisioned five Fair Information Practices, two of which state:

— There must be no personal data record-keeping systems whose very existence is secret.

— There must be a way for an individual to find out what information about him is in a record and how it is used.117

This report directly informed the US Privacy Act of 1974, America’s only omnibus privacy legislation;118 the modern Fair Information Practice Principles (FIPPs); and the OECD’s 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, which in turn directly informed the EU’s 1995 Data Protection Directive and the recently enacted General Data Protection Regulation (GDPR).

The US FIPPs call for Transparency: “[An organization] should be transparent and provide notice to the individual regarding its collection, use, dissemination, and maintenance of personally identifiable information (PII).”119 In the GDPR, transparency is a key part of ensuring fairness, intrinsic to consent, intended to go beyond privacy notices, and is intended to be “user-centric.”120

117 US Dep’t of Health, Education and Welfare, 1973
118 Though it only applies to Federal entities.
119 US Dep’t of Homeland Security, 2008
120 Center for Information Policy Leadership, 2017

The Transparency principle and its antecedents stem from the democratic principles of autonomy and self-determination: a person must be aware of what data is collected about them and how it is used in order to fully construct themselves and act freely. Transparency about data collection and use enables people to make informed choices about the direction of their informational lives. It also underpins other privacy practices, such as the ability to act upon, correct or delete data, and the ability to exercise a measure of control over the information central to one’s life.

Connected devices are intended to be unobtrusive, to blend in with the environment; and in many cases they take the form of familiar objects that have been upgraded with networking, sensors, and other new functions. Wearables look like watches, jewelry and fitness gear. Amazon Echoes look like speakers. Hello Barbie looks like mute and deaf Barbie. Voice-activated televisions look like… televisions. In late 2017, Amazon released the Echo Spot, which looks like a clock radio with a large screen and, based on its marketing, is intended to be put on the nightstand in a similar way.121 The difference is that it has a small camera at its top. And, while clock radios do not gather data, this one explicitly does, combining it with artificial intelligence and sending it back to Amazon to assist in the compilation of detailed profiles about people’s preferences. This camouflaged monitoring of our activities puts the IoT in tension with Transparency. A 2016 report by Canada’s Office of the Privacy Commissioner (OPC) summarized the issue:

As these technologies become ubiquitous, we may have little or no warning or awareness that they are even in place; they simply become part of the backdrop of our daily lives. How, then, can citizens who may or may not want to use this technology ensure that someone is held accountable for its use? How will they be able to challenge how the information is used, and how will they be able to give any kind of meaningful consent?122

121 See Introducing Echo Spot: https://www.amazon.co.uk/Introducing-Amazon-Echo-Spot-Black/dp/B01J2BK6CO
122 Office of the Privacy Commissioner of Canada, 2016, p. 23, emph. added

The unobtrusiveness of connected devices means that data collection activities are invisible. To be sure, invisible data collection is the hallmark of the internet era – from the logs of the first computer networks, to tracking on the web of the 1990s, to modern, mature online profiling, to the proliferating IoT devices of today, data collection has happened in the background, silently. Discussing the ethics of online behavioral tracking for advertising ends, privacy expert Omer Tene noted in 2011 that:

Users are seldom aware of the data collection processes, prospective data uses, and identity of the myriad actors involved, including not only advertisers and publishers, but also ad networks, ad exchanges, analytics services, affiliate marketers, market researchers, and search engine optimizers… As a result, behavioural targeting clearly challenges the [OECD’s and EU Data Protection Directive’s] principles of transparency and individual choice.123

The IoT amplifies these challenges in its expansion of data collection from the online to the offline world. This is all the more true when one considers data collection by other people’s devices and IoT devices in public spaces.

THE IoT’S IMPACT ON NOTICE AND DISCLOSURE

Online data gathering is subject to norms and laws of disclosure. Privacy policies for websites and applications are institutionalized and often mandated, though the efficacy of providing notice about data practices has been widely debated.124 However, in these early years of IoT development and device sales, research reveals a diminishing adherence to these norms. The issue is two-fold: 1) the reduction or elimination of screens from IoT devices makes the display of privacy notices problematic, and 2) some device manufacturers are being lax in their duty (legal or norm-derived) to disclose data collection and use practices.

123 Tene, 2011, p. 5
124 See, e.g., Calo, 2012; Sloan and Warner, 2013

The loss of screen space and the resulting impediment to displaying privacy policies has been discussed by a number of scholars and stakeholders. The FTC stated in its 2015 IoT report:

Many IoT devices – such as home appliances or medical devices – have no screen or other interface to communicate with the consumer, thereby making notice on the device itself difficult, if not impossible. For those devices that do have screens, the screens may be smaller than even the screens on mobile devices, where providing notice is already a challenge. Finally, even if a device has screens, IoT sensors may collect data at times when the consumer may not be able to read a notice (for example, while driving).125

However, Michelle De Mooy of the Center for Democracy & Technology was optimistic about notifications improving in the IoT space: “When you view transparency as integral to the functioning of a device or product or service, the opportunities for engaging with users become more broad and innovative.”126 Meg Leta Jones of Georgetown also sees opportunity for a re-think of notification in the IoT: “The IoT just strips away the screen and strips away the personal computing aspect of privacy and offers us this new platform to reconsider what privacy is and how we want to manage it.”127

As to IoT manufacturers being lax in disclosures, a pattern of poorly informing users is emerging. Prof. Scott Peppet surveyed twenty popular IoT devices in 2014, including the Nest Thermostat, the FitBit, health products, and home monitoring systems, in an attempt to gauge the depth and degree of their privacy disclosures.128 He found them to be shockingly inadequate:

None of the twenty devices included privacy- or data-related information in the box. None even referred in the packaging materials or user guides to the existence of a privacy policy on the manufacturer’s website… Some policies seem to apply to both website use and sensor-device use. Other policies limit their application to website use, not sensor-device use, but provide no means to locate a device-related privacy policy. This leaves unanswered whether any privacy-related policy applies to the data generated by these devices.129
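One commonly proposed direction for screenless devices – sketched below as a purely hypothetical illustration of our own, not a standard or any manufacturer’s actual practice – is to have the device publish a machine-readable privacy notice on the local network, so that a companion app or home router can render the notice the device itself cannot display:

    # Hypothetical sketch: a screenless device serving a machine-readable
    # privacy notice over local HTTP. The endpoint path and the notice
    # schema are invented for illustration; neither is an existing standard.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    PRIVACY_NOTICE = {
        "device": "example-smart-speaker",                    # hypothetical
        "data_collected": ["audio_after_wakeword", "usage_metrics"],
        "retention_days": 90,
        "shared_with": ["manufacturer_cloud", "analytics_vendor"],
        "full_policy_url": "https://example.com/privacy",     # placeholder
    }

    class NoticeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/privacy-notice":
                body = json.dumps(PRIVACY_NOTICE).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    # A companion phone app or router on the same network could fetch
    # http://<device-ip>:8080/privacy-notice and render it for the user.
    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), NoticeHandler).serve_forever()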

125 Federal Trade Commission, 2015, p. 22
126 De Mooy, Interview, emph. added
127 Interview
128 Peppet, 2014
129 Peppet, 2014, p. 140-142, orig. emph.

Europe’s Article 29 Data Protection Working Party (Art29WP) has taken a strong position against vague or general notices: “Information and consent policies must focus on information which is understandable by the user and should not be confined to a general privacy policy on the controllers’ website.”130 But Peppet found the language in the devices’ policies to be exceedingly ambiguous:

These policies are often confusing about whether sensor or biometric data count as ‘personal information’ and thus unclear about how such data can be shared with or sold to third parties… privacy policies for consumer sensor devices often do not mention ownership of sensor data. Of the twenty products covered… only four discussed data ownership explicitly… only three provided clear information on exactly what sensors the product included or what sensor data the product collected… many of these Internet of Things privacy policies provided no information on what sensor data their device generated.131

In the same vein, a 2013 study of 23 paid and 20 free mobile health and fitness apps found that:

— 26% of the free and 40% of the paid apps had no privacy policy

— 39% of the free and 30% of the paid apps sent data to someone not disclosed in the app or the privacy policy132

And, in 2016 a study of over 300 devices by 25 of the world’s data protection authorities found:

— 59% of devices failed to adequately explain to customers how their personal information was collected, used and disclosed

— 68% failed to properly explain how information was stored

— 72% failed to explain how customers could delete their information off the device

— 38% failed to include easily identifiable contact details if customers had privacy concerns133

130 Article 29 Data Protection Working Party, 2014, p. 22
131 Peppet, 2014, p. 142-144, orig. emph.
132 Ackerman, 2013, p. 19-20
133 UK Information Commissioner’s Office, 2016

Some of our respondents believed that companies were intentionally opaque about their data gathering and use practices. Michelle De Mooy remarked:

Everything is stuffed into a privacy policy that of course no one reads, and they know that. I think it illustrates how important trust truly is and the fact that companies do understand that, because they hide so much of what they’re doing in either doublespeak or in these lengthy privacy policies or terms of service that they know that their users aren’t likely to look at. So, I think it’s a deliberate opacity on the part of the industry.

Lee Tien of the EFF shared this view:

It’s all designed to create more of a Potemkin Village of disclosure, but nothing that actually allows you to say, “Wow, that’s all going to Company X? Maybe I should be worried that Company X is getting all of this information from the 79 different companies that I do business with.”134

However, it is difficult to know whether this opacity is deliberate or whether it is simply impossible to write intelligible privacy policies that are legally cognizable and adhere to regulation, an issue we address in the Governance section. Strong research shows that the general level of privacy policy comprehension is low,135 and the device surveys we cite above illustrate an alarming trend of diminishing notification quality in the IoT. But it is still early days, and there are serious research attempts to address the above shortcomings, which we detail in the Emerging Frameworks and Strategies section.

134 Interview
135 McDonald and Cranor, 2008

FORGETTING DATA-COLLECTING DEVICES ARE NEARBY

The combination of diminished screens and lax disclosure of privacy information makes it hard for purchasers of IoT products to understand what these devices see, hear and know, and how their manufacturers and other parties will use the collected data. This issue is at the heart of the Canadian OPC’s observation above: “We may have little or no warning or awareness that they are even in place; they simply become part of the backdrop of our daily lives.” Ergo, people will forget about the presence of data-collecting devices, and there are privacy management risks from forgetting devices are around. Elaborating on the relationship between behavior and IoT monitoring technology, Florian Schaub of the University of Michigan remarked:

You think you’re being observed, so you behave differently, but there are also habituation aspects. You might behave differently at first, and then you become attuned to the technology being there, so you become maybe a bit more laissez-faire in terms of how you behave. You forget that these devices are active, but that doesn’t mean that you’re not leaving digital traces of mundane behavior anymore, and this data can reveal information about your preferences, what you like, what you don’t like, or your health, your family situation, your financial situation – all kinds of different things that people prefer to keep private.136

Meg Leta Jones saw such habituation as partly a result of anthropomorphizing our devices:

We can be duped a little bit by our objects. We can get really comfortable with them really fast; anthropomorphize them in ways that cause us to divulge information beyond what we would intend.137

136 Interview, emph. added
137 Interview, emph. added

Law scholar Margot Kaminski remarks upon the same problem in her analysis of the privacy implications of home robots:

There is evidence that people treat anthropomorphic robots with increased compassion and trust. A robot that lulls people into revealing more than they intend to may be viewed as deceptive technology; or it may be treated similarly to false human friends.138

In her anthropological research on IoT device users, Heather Patterson of Intel hears of discomfort occurring at the boundary of forgetting and awareness:

We forget that they’re on until they wake up and surprise us, or let us know that they’ve been tracking us all along by telling us how long it will take us to get to work before we’ve even asked. We realize, “Oh, my phone completely knows… Google knows my pattern and I wasn’t aware of that.” They’re intrusive.139

This discomfort serves as a reminder that, most of the time, people are simply not fully cognizant of the level of surveillance going on around them. Addressing this, L. Jean Camp and her colleagues argue for the “engineering design principle of ‘least surprise’: The IoT should behave in a manner that is both expected by and clearly communicated to every stakeholder with which it interacts.”140

Patterson notes how the problems of forgetting about the devices are likely to be exacerbated by their further integration in human environments:

It’s not just about IoT anymore: it’s IoT plus artificial intelligence plus machine learning. We’re now facing the prospect of using devices that are always on, always watching, always listening and always talking to one another. Particularly in closed spaces like the home or the workplace, I think we’re looking at a future where there’s really a strong possibility of those systems being embedded in our built environments.141
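Camp’s ‘least surprise’ principle can be expressed as a concrete firmware rule: no capture without a prior, user-perceivable announcement. The sketch below is our own minimal illustration of that idea; the function names and signal channel are hypothetical, not code from Camp et al.:

    # Minimal sketch of a 'least surprise' collection gate: the device may
    # not record unless it has first announced the recording through some
    # user-perceivable channel (an LED, a chime, a companion-app push).
    import time

    def announce(event):
        # Stand-in for a real signal such as blinking an LED or chiming.
        print(f"[{time.strftime('%H:%M:%S')}] DEVICE SIGNAL: {event}")

    def capture(sensor, announced):
        # The gate: refuse to collect data that was never announced.
        if not announced:
            return None
        return f"sample from {sensor}"

    # Usage: the announcement always precedes collection, keeping device
    # behavior 'expected by and clearly communicated to' its users.
    announce("microphone capture starting")
    sample = capture("microphone", announced=True)
    print(sample)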

138 Kaminski, 2015, pp. 666 and 671
139 Interview
140 Camp et al., 2016
141 Interview

Relatedly, workshop participants raised the idea that connected device owners may have poor mental models of how they gather data. Jen King, Director of Privacy at Stanford’s Center for Internet and Society, noted that her research on RFID technology revealed that users universally expected some kind of feedback when data was being collected.142 More generally, King saw two kinds of products emerging: “new products that we will have never encountered, and everyday products that have IoT functionality slapped on. We already have well-established mental models about the objects around us, so expecting users to make that kind of leap or understand the tradeoffs is unproductive because they have these set mental models.”143 Ergo, everyday products that acquire IoT functionality put an increased strain on people’s already nebulous understanding of data harvesting.

Reduced knowledge about personal, intimate data collection; decreasing attention to the presence of monitoring devices; and forgetting devices are around while being lulled into revealing ever more personal information all contribute to a diminishment of private spaces, a topic we discuss in greater detail in the Risks of Harm section.

Reduction of awareness and understanding of privacy risks affects another aspect of privacy management: Choice, the ability to knowingly accept or decline a data-gathering product or service, which is a mainstay of privacy regimes. Florian Schaub noted that our freedom to choose is increasingly limited to just the initial purchase of a device, with little control over what happens afterward:

In the home environment you don’t really have that much control over your privacy with IoT devices. Your biggest control element is deciding which devices you place in your home and vetting them for good privacy practices.144

Schaub also notes, “It’s often difficult to find this information for consumer devices and take it into account in any kind of purchasing decisions,” which is supported by the research we discuss above regarding the inadequacies of IoT privacy notices.

142 Workshop comment
143 Workshop comment
144 Interview

Importantly, the Canadian OPC argues that the ability to choose erodes as IoT functions become more prevalent and ‘non-smart’ devices are less available:

As smart devices and appliances become more and more normalized, there is an increasing ‘erosion of choice’ for individuals who would have preferred their ‘non-smart’ versions.145

These are serious challenges to reliance upon Choice as a fair information principle. Market shifts towards ‘smart’ features that are intentionally unobtrusive lead to less understanding of data collection, and less ability to decline those features. Choice is further eroded as sensors, cameras and microphones become standard features. If all televisions have cameras, how can one choose not to have a camera staring into the faces of one’s family when watching a movie? The problem is further exacerbated by the presence of devices we do not own or control – the ‘Internet of Other People’s Things,’ a term used by Meg Leta Jones146 to describe the growing number of devices that not only capture sensitive data about their owners, but also the people around them.

PEOPLE’S CHANGING RELATIONSHIPS WITH THEIR DEVICES

Slowly but surely, smart devices are becoming entire smart environments, homes and public spaces. Things that were inert are becoming aware, fundamentally changing the way people interact with and relate to their physical environment. A TV is no longer just a broadcast device – it is a transmitter/receiver/observer. A toy is not simply a canvas for a child’s imagination – it is a puppet whose strings are pulled by far-off processes, seeking self-improvement via the data it gathers. A speaker is a virtual assistant with, as technology law scholar Ryan Calo puts it, a “tiny salesperson”147 inside, fusing your real-time lived experience with past purchases, preferences, and predictions.

145 Office of the Privacy Commissioner of Canada, 2016, p. 21, emph. added
146 See Jones, 2014
147 Calo, 2013

These devices are not neutral; they are constructed with a commercial logic encouraging people to share. The IoT embraces and extends the logic of social media – intentional disclosure, social participation, and continued investment in interaction. There is no doubt that disclosing data in new ways is fun and valuable to the purchasers of devices, but networked, sensing consumer products have special characteristics that must be investigated and critiqued.

Children’s use of the IoT is an instructive case. One of the most visible and known examples of internet-connected toys is Hello Barbie.148 Introduced in 2015, this upgraded version of the nearly 60-year-old Barbie doll connects to the cloud and is able to parse children’s language and talk back. Before this modern automaton was even in the stores it stoked controversy, drawing the ire of a group called Campaign for a Commercial-Free Childhood (CCFC), who campaigned against its sale under the banner “Hell No Barbie.”149 CCFC opposed children’s conversations with the doll being shared with corporations who have a commercial interest in them, believed that parents should not have access to the recordings, and argued that Hello Barbie undermines creative play. Hello Barbie’s technical advancements – speech recognition, cloud-based processing, and easy Wifi access – have given rise to new relationship arrangements between children, objects, their parents, and the myriad stakeholders who bring these new toys into existence.

The new relationship dynamics introduced by Hello Barbie’s features raise serious trust issues, especially as its young users are probably unaware that Hello Barbie is recording everything they say. Jones and Meurer observed:

148 See http://hellobarbiefaq.mattel.com/
149 Campaign for a Commercial-Free Childhood, n.d.

Hello Barbie frequently requests the trust of both parents and their children, yet it is simultaneously built to undermine those relationships of trust. It is superficially designed to act as a child’s best friend, but the doll records and shares every conversation with a child’s parents. Parents, on the other hand, are given the perception of complete control over the stored conversations, yet their child’s data has the potential to be shared with numerous third parties… If Hello Barbie were a real person, she would be more likely to be known as the Gossip Queen than a trusted friend.150

Internet-connected toys are a useful case by which to illustrate how IoT devices affect relations between people. Particularly, the burden of managing children’s privacy has historically fallen upon parents. The IoT increases the weight of that burden.

CONCLUSION

The Internet of Things is catapulting us into a new era of surveillance. IoT devices can blend into the background, take on the form of a child’s best friend, or encourage people to willingly share their data on social networks. Over time, connected devices will become so well-integrated into our lives that we forget they are there. Meanwhile, the nature of IoT devices makes it difficult to practice transparency and implement easily accessible privacy policies, especially in the case of screenless IoT products, rendering it very difficult to make informed choices regarding these devices. Early evidence suggests that IoT manufacturers are making this problem worse by being lax about the breadth and availability of privacy notices.

Still, the new relationships created by technology evolution can bring new benefits, new forms of entertainment, and new, valuable ways for families to interact. Rather than reject these new devices and relationships outright, we wish to highlight their privacy deficiencies and poorly understood data-gathering qualities in service of better policies, product designs and norms.

150 Jones and Meurer, 2016, p. 4

INCENTIVES FRAMEWORK AND MECHANISMS

Understanding organizational incentives is key to identifying governance mechanisms that can address IoT privacy issues such as choice and personal data management, and informing policies that encourage appropriate privacy protections.

These incentives are shaped by the economic dynamics of IoT privacy. Some of the factors that contribute to the current lack of market- and regulatory-based incentives for industry stakeholders to embrace IoT privacy management include:

— Insufficient information about the costs and benefits of stronger or weaker privacy protection to different IoT stakeholders;

— Confusion about data ownership and responsibility across IoT data flows and supply chains, including challenges with tracking data provenance from collection to use in order to monitor and enforce privacy in the IoT ecosystem;

— Lack of technical understanding and awareness, impeding overseeing entities from asking the right questions and seeking effective solutions;

— Uncertainty about IoT liability exposure, and awareness of supply chain vulnerabilities;

— Inadequacy of the collective-action mechanisms operating as forcing functions for privacy management – for example, data breaches have become normalized and consumer breach fatigue has set in, so industry motivation (e.g., fear of consumer lawsuits, reputation damage, regulatory action) has diminished;

— Lack of incentives for manufacturers to invest in adequate security measures, given the low cost and/or low margin nature of IoT devices;

— Diminished accountability, as the majority of relationships in the IoT ecosystem are business-to-business and not business-to-consumer.

One framework that can serve as a model to understand and improve incentives for IoT privacy is the ensemble of methods which have been established to combat environmental pollution, as the two issues share some similar features.151 152 In this section, we analogize the forces that induce a reduction in the release of harmful pollutants to the mechanisms that incentivize the abatement of collection, use or disclosure of privacy-sensitive information (‘personal data emissions’) that infringe on privacy (‘privacy pollution’).153

151 See, e.g., Hirsch, 2006; Froomkin, 2015; Foulon et al., 2002
152 Note that the range of potential incentive mechanisms we offer generalizes to any resource management system. Here we offer up privacy-inducing information as the resource to be managed and encourage people to think expansively about the application of these incentives to IoT privacy.

In general, mechanisms that aim to manage the consumption and production of data that compromises privacy rights and interests fall into two broad categories. The first encompasses regulatory approaches that set standards for collection, use and/or disclosure of privacy-sensitive data, while the second comprises pure market-based approaches that rely on economic forces to correct for producer and consumer behavior. In general, regulatory approaches provide organizations with more certainty at the expense of flexibility, while market-based approaches do the reverse. There are also several variations of these approaches, including hybrid approaches that combine elements of the two.

A regulatory standards approach modeled after environmental regulations could set personal data emissions limits in order to reduce externalized costs to the IoT ecosystem and broader internet, but may impose potentially large costs on polluters via fines and litigation, as well as enforcement costs on victims and oversight entities.

In contrast to regulatory standards models, market-based approaches can be price- or quantity-based and include mechanisms such as permit/credit/cap-and-trade systems; taxes, fees, and charges; subsidies; tax-subsidy combinations; self-regulation; insurance; and revenue/remuneration. Privacy Reduction Credits would set limits on rates of personal data emissions, whereby companies could earn credits by reducing privacy pollution emissions. A cap-and-trade variant would set a maximum allowable cap on total personal data emissions. Both would allow companies the choice between reducing their privacy pollution emissions or purchasing allowances from low privacy polluters. A tax-based approach would place a per-unit monetary fee on privacy pollution; its disadvantage is that it would not guarantee a specific amount of reduction in privacy pollution, as it limits itself to penalizing violators after the fact. A subsidies mechanism would reward privacy polluters for reducing personal data emissions and could take the form of grants, low-interest loans, favorable tax treatment, or procurement mandates/preferences. Through this scheme, the accrual of benefits would encourage companies to enter the market.

153 US Environmental Protection Agency, n.d.
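The trade-offs among these instruments can be made concrete with a toy model. The sketch below compares what a per-unit tax and a cap-and-trade scheme would cost two hypothetical firms; every fee, cap, allowance price, and emissions figure is an invented assumption, not a proposal from this report.

```python
# Toy comparison of two incentive mechanisms discussed above, applied to
# hypothetical 'personal data emissions'. All names and numbers are
# illustrative assumptions, not measurements or proposals.

FEE_PER_UNIT = 2.0     # assumed per-unit tax on personal data emissions
CAP_PER_FIRM = 100     # assumed emissions allowance under cap-and-trade
ALLOWANCE_PRICE = 1.5  # assumed market price of one tradable allowance

firms = {"high_polluter": 140, "low_polluter": 60}  # emissions, arbitrary units

def tax_cost(emissions: float) -> float:
    """Per-unit fee on every emitted unit: penalizes after the fact and
    guarantees no particular total reduction."""
    return emissions * FEE_PER_UNIT

def cap_and_trade_cost(emissions: float) -> float:
    """Firms above the cap must buy allowances; firms below it can sell
    surplus allowances (a negative cost, i.e., revenue)."""
    return (emissions - CAP_PER_FIRM) * ALLOWANCE_PRICE

for name, units in firms.items():
    print(f"{name}: tax={tax_cost(units):.1f}, "
          f"cap_and_trade={cap_and_trade_cost(units):.1f}")
# high_polluter: tax=280.0, cap_and_trade=60.0
# low_polluter:  tax=120.0, cap_and_trade=-60.0 (sells 40 spare allowances)
```

Note how the low emitter’s negative cap-and-trade cost captures the choice described above: a high emitter may purchase allowances from low privacy polluters rather than abate.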

Hybrid approaches create regulation using a mix of standards and pricing combinations, liability rules, and information disclosure. A standards-pricing mechanism would create a personal data emission standard that imposes a tax for emissions that exceed the standard. Liability law (negligence, strict liability, breach of warranty) is one of the logical candidates for application to the IoT, but it is unclear to what extent it could support harmful IoT personal data emissions claims, if at all.154 Disclosure regimes include both mandatory reporting, such as state data breach laws, and voluntary reporting, as in labeling schemes like the proposed EC Trusted IoT Label155 or the IoT Security Foundation’s Best Practice User Mark.156

Voluntary market-based approaches are currently the prevailing force at play with regard to IoT privacy in the US, although government regulation is gaining ground in specific sectors such as connected vehicles, unmanned aerial systems, and medical devices.157 Market-driven approaches are predicated on the promise that the market will reward companies who reduce or eliminate personal data emissions by providing them with a competitive and/or economic advantage. The purported benefits of voluntary approaches for industry would be largely reputational (improved public image, resulting in customer loyalty) and/or preemptive, in that self-regulation can inform and ease transition to formal law by amortizing the costs associated with becoming compliant when law does go into effect.

154 Modern cyber-risk contexts characterized by software defects and insecure devices expose loopholes in the traditional product liability regime – strict liability, negligence, breach of warranty. This is primarily due to the economic loss doctrine which bars recovery for productivity loss, business disruption, and other common damages caused by software defects. As well, the application of design defects principles to software is difficult given the complexity of the devices and recent tort reform trends that have limited liability. Further, the intervening cause of damage from insecure software is typically a criminal or tortious act by a third party, so principles of causation might limit liability for manufacturers. See, e.g., Dean, 2018.
155 European Commission, 2016b
156 IoT Security Foundation, n.d.
157 See, e.g., AV START Act, S.1885; Federal Aviation Administration, Unmanned Aircraft Systems, JO 7200.23A

COUNTERING MARKET FAILURE OF PRIVACY

It remains to be seen whether privacy in the IoT will become a market failure that results in costs to society and necessitates stronger government intervention.158 The question thus arises: left to its own devices, would the IoT market fail to find an appropriate degree of privacy protection for consumers? In cases of market failure due to information asymmetry, transparency and disclosure of privacy practices may be of use in informing more optimal decisions for industry and consumers when attempting to align incentives to counter a possible market failure of IoT privacy. Market-based or hybrid approaches that force companies to internalize the costs of personal data emissions would be possible strategies that could help correct for negative externalities.

It is generally agreed that there is a lack of incentive for transparency of data flows. Lee Tien of the EFF noted:

We know that the companies and the law enforcements and governments [don’t] have a strong institutional incentive to tell everybody what they’re doing. Neither does Google… That’s a gap that hopefully law can fill.159

One example of a disclosure and transparency mechanism would be a rating system to correct information asymmetries, foster the IoT privacy controls market, and socialize the value of privacy attributes of the IoT. Consumer Reports’ Digital Standard is one possible vision of this.160

If privacy is to evolve into a market differentiator, there must be clear agreement as to when personal data collection, use and disclosure are required in order to deliver services and ensure core functionality without unduly impeding the utility of the IoT.

Another possible market-correcting mechanism along the transparency continuum would involve presenting the trade/sale/exchange of personal data to consumers as an explicit and transparent bargain.

158 See, e.g., President’s Council of Advisors on Science and Technology, 2014; Camp and Johnson, 2012; Schneier, 2017; Vila, et al., 2003
159 Interview
160 See The Digital Standard: https://www.thedigitalstandard.org/

Currently the bilateral value exchange, where users create profiles and provide data in exchange for free or low-cost products and services, is assumed without any real dialogue or conscious choice by users. Tel Aviv’s DigiTel Resident Card serves as an example of explicit exchange, wherein users give consent for the government to leverage existing IoT infrastructure.161 Companies then tap into the existing behavioral sensors (e.g., location data on mobile devices), ultimately resulting in cost savings that are passed on to users.

Regulation is not inherently at odds with market competitiveness and innovation: markets do not and cannot function without rules. On the one hand, technology-neutral regulation has the advantage of being more generalizable and future-proof. However, there is the risk it will fail to serve as a forcing function due to the discretionary latitude in enforcement. On the other hand, technology-specific regulation, while providing more certainty to the market ex ante, lacks the flexibility that is needed when dealing with developing technology that does not yet have accurate cost-benefit assessments. The IoT market represents a confluence of burgeoning social and political factors.

The GDPR may serve as a prototype regulatory incentive mechanism due to its strong monetary penalties for failure to properly manage personal data.162 As well, it may produce positive externalities for the regulated entities, such as economic benefits to business operations by way of enhanced efficiency. For example, it could provide homogenous data governance between a device and a smartphone service.

A variant of the regulatory approach would involve stronger mechanisms to help victims of data breaches recover monetary damages. While the GDPR affords private rights of action, most current US privacy laws do not, relying instead on government enforcement actions or preempting litigation opportunities with mandatory arbitration. Neither approach gives people any incentive to hold firms to account. As well, regulation or common law could focus on placing more responsibility on intermediary platforms, either by mandating some form of privacy protection or by enforcing transparency in the ways data is collected and used. It could also take the form of clarifying data breach notification requirements to cover breaches of customer IoT data. These solutions, while also imperfect, have the benefit of sidestepping the issue of quantifying harm, which both taxes and private rights of action would probably require.163

161 See Resident’s Card: https://www.tel-aviv.gov.il/en/Live/ResidentsCard/Pages/default.aspx
162 The GDPR can sanction up to 4% of global revenue or up to €20 million.
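The sanction ceiling in footnote 162 is mechanical enough to compute directly: under Article 83(5) of the GDPR, the maximum administrative fine for the most serious infringements is €20 million or 4% of total worldwide annual turnover, whichever is higher. A minimal sketch (the turnover figure is hypothetical):

```python
def max_gdpr_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    """Ceiling for the most serious infringements under GDPR Art. 83(5):
    EUR 20 million or 4% of total worldwide annual turnover,
    whichever is higher."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A hypothetical firm with EUR 2 billion turnover faces a ceiling of EUR 80 million:
print(max_gdpr_fine_eur(2_000_000_000))  # 80000000.0
```

Under the ‘whichever is higher’ clause, the flat €20 million floor binds only for firms with turnover below €500 million; above that, the percentage term scales the incentive with firm size.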

Another way to address negative externalities is by directly taxing companies that produce privacy pollution (i.e., that do not comply with privacy standards or that produce personal data emissions). Rather than banning privacy pollution outright, each privacy polluter could choose their preferred means of compliance, as in cases of firms choosing whether to comply with the 1990 Clean Air Act or pay a carbon tax. Companies could then raise prices, implement privacy protections, reduce the production of privacy-sensitive data, or choose to pay for their violations. The challenge to this approach lies in the difficulty of quantifying the size and cost of privacy harms. In both cases, there is a behavioral economics aspect of IoT privacy risk: considering consumers’ tendency to underestimate the severity, likelihood and timeframe of IoT privacy risk, regulations would need to be scrupulously designed in order to help consumers better gauge these risks.164 For example, users’ sense of privacy risk from willingly sharing seemingly benign location or sentiment data may present as short-term, small severity and low likelihood risks of harm. Users are unlikely to realize that collection over time and aggregation with other such data may result in privacy violations that are high likelihood and large severity. Regulation would do well to address these risk factors in prescribing industry practices.

The transfer of privacy risk via insurance is another mechanism to incentivize companies to address IoT privacy risk, although it can carry moral hazard concerns when companies wholly disregard their privacy pollution under the mistaken belief that insurance will unequivocally address privacy liability. US commercial liability insurance for privacy harm has heretofore evolved largely on the heels of the various state data breach laws. Similarly, personal coverage is nascent, existing largely in the form of vendor-offered identity theft protection services; but if it were to gain ground, companies could arguably lose any incentive to lower their personal data emissions. The challenge with both personal and commercial liability insurance lies in the uncertain coverages and pricing due to the dynamic and interrelated nature of IoT privacy threats, not unlike the situation that exists currently with cyber insurance for security failures.

163 Tyler Moore, personal correspondence, December 2017.
164 See, e.g., Federal Trade Commission, 2017

NATURE OF THE IoT ECOSYSTEM

If personal data emissions can be narrowed to their point sources165 – emissions from identifiable and specific locations – then market approaches are a better strategy, because they are more distinguishable and more controllable compared to other distributed approaches. Non-point sources of personal data emissions, which will likely be the case with the IoT, are challenging to monitor and enforce and are therefore not conducive to regulatory mechanisms. When both data emission sources are present, a tax-based approach is more aligned when data handlers are identifiable, and subsidies or permits work better when data control is dispersed and unknown. Direct regulation is a favorable incentives approach where personal data is sticky and is retained by ecosystem actors.

As for incentives, subsidies and disclosure approaches transfer the costs and burden of proof of compliance to regulated companies rather than the government, which can be appropriate when firms are in a better position to monitor and report their emissions. Tax or regulatory prohibitions are likely to fail because of costly enforcement and widespread noncompliance.

165 US Environmental Protection Agency, n.d.

UNCERTAIN COSTS AND BENEFITS

If the cost of abating personal data emissions is uncertain and it is important to prevent polluters (e.g., large web platforms, consumer data brokers, credit reporting agencies) from bearing potentially high costs, then price-based market approaches are appealing. If there is more uncertainty with regard to the benefits of controlling personal data emissions and the objective is to prevent high environmental costs (negative externalities), then a quantity-based market instrument is more prudent (e.g., cap and trade).

One suggested course of action for incentivizing privacy risk management by both the supply and demand sides166 is to grant legal property interests in personal data, such that individuals could be compensated when their data is used for commercial purposes.167 Criticisms of this approach include: markets cannot function efficiently if property rights accrue to data; personal data is an extension of one’s person and individuals should not be able to alienate a basic right; only the wealthy would benefit in a data-as-property regime; US property law would have to undergo a revolution to account for personal data; regulation would be needed because markets do not respond to individual privacy needs; there would be significant increased costs to businesses, including administrative costs and impediments to innovation that is predicated on the potential to benefit from data; and individual privacy rights will lead to market fraud.

Counterarguments, however, include the following: currently, individuals can already choose to alienate basic rights like freedom of speech in order to receive social benefits; disparate treatment toward the poor is not likely, since they would possess their identity as much as the wealthy;

166 The supply side consists of the manufacturers, developers, integrators, and retailers. The demand side consists of the users and entities seeking privacy protection.
167 Sholtz, 2001

data property rights would simply be an extension of the existing right to publicity (e.g., celebrities’ property rights in photographs); businesses currently engage in a tremendous amount of ‘wasted’ advertising and marketing that produces no return on their investment, and consent-based exchange of data would eliminate that guesswork; and there are non-economic benefits to privacy, such as political freedom and liberty, that comprise social democracy and welfare.168

Another potential source of incentives for companies to control personal data pollution lies with the investment supply chain. The investment community is an untapped mechanism that could potentially act as a forcing function to incentivize corporate social responsibility amidst a market that is lacking in incentives to embrace IoT privacy. Commenting on the positive influence that investors could have on IoT startups, Tien advocated:

I think it’s really disingenuous in a way to hold up the college student innovators and say, “Oh, poor them,” when you’ve got [these] experienced repeat players in the money business who have gone through the privacy issues with a LinkedIn or a Google or a Facebook, and already have a huge installed base and deep investment in those issues. It’s not a new thing for them even if it’s a new thing for the two guys who wrote an app.169

Implicit in Tien’s statement is that investors are a natural choice for holding companies accountable for privacy pollution, because they have historical views into recurring privacy issues and can leverage their financial support to alter behavior. The reasons why investors should be interested in addressing privacy pollution are beyond the scope of this report.

168 Schwartz, 2004
169 Interview

In order to facilitate the supply side of IoT privacy management, it is essential to have a clear understanding of what constitutes ‘sensitive data’ and evokes a privacy risk. Additionally, companies interested in embracing privacy controls to reduce pollution need to be able to grasp the benefits of maintaining lower data identifiability. The norms and standards that define privacy pollution need to go beyond conspicuous first-order identifiers to include the accrued privacy harm from derivative data fusion and linking (behavioral profiling, altered behavior, reduced autonomy). This will bolster the role of privacy as a factor in industry stakeholders’ design and development models alongside considerations of cost, security, safety, accuracy, and reliability.

Finally, incentives for dealing with the disparate impacts of privacy management on innovation and market competition can include a combination of market instruments. These may be appropriate in helping correct market inefficiencies that result when privacy regulation disrupts companies that have market power, as in the health sector where innovation relies on privacy-sensitive data. Even if cost burdens were more equalized, some market instruments might favor incumbents, allowing them to control prices and creating barriers to entry. A permit system that sets aside a certain number of permits for new entrants might correct this problem.

In light of the current lack of market- and regulatory-based incentives for industry stakeholders to embrace IoT privacy management, we have proposed a path forward by modeling our recommendations for addressing personal data emissions by industry on the various incentive mechanisms currently used to address environmental pollution.

RISKS OF HARM

The IoT is the latest technical evolution to expose the divide between people’s desire for privacy and the law’s treatment of privacy harms. IoT privacy management, governance, incentives, and solutions are all bound by the notion of harm.

We use risk as a way to understand the likelihood that a harm will come to pass. Our framing of IoT privacy harms in terms of risk reflects the fact that, in the technical environment of the IoT, some canonical harms170 may not have manifested themselves yet, and some unforeseen harms may not yet be considered under current legal regimes.171 Before describing those privacy harms, we first apply the principles of risk to the IoT environment in order to have a better grounding to understand the privacy harms that emerge in this new context.

Risk is typically defined as the severity and likelihood of occurrence of a harm to something of value (an asset) when a threat exploits a vulnerability. Our model for analyzing risk of IoT privacy harm frames the issue as a confluence of threats and vulnerabilities to privacy rights and interests – the assets in our application of risk.172 This ‘privacy risk equation’ considers the potential for harm to both individuals and society as a result of threats that exploit IoT vulnerabilities. While this framing of risks of harm and its particular naming conventions may be unfamiliar to some of the privacy community, it mirrors core concepts from privacy frameworks such as the privacy impact assessment (PIA) – e.g., threats to personal data, potential risks, protection and mitigation strategies, controls, etc.173

170 E.g., economic loss from identity theft, fraud, loss of liberty from inaccurate information or improper use or exposure of information; and more traditional privacy torts: 1. Intrusion upon the plaintiff’s seclusion or solitude, or into his private affairs. 2. Public disclosure of embarrassing private facts about the plaintiff. 3. Publicity which places the plaintiff in a false light in the public eye. 4. Appropriation, for the defendant’s advantage, of the plaintiff’s name or likeness. See Prosser, 1960
171 E.g., information collection by surveillance; aggregation of information; insecurity of information; and disclosure, exposure, distortion, and increased accessibility of information. See Solove, 2006.
172 See, e.g., NIST SP 800-30: http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf; The CNSS Instruction No. 4009: https://www.ecs.csus.edu/csc/iac/cnssi_4009.pdf
173 See, e.g., DHS PIA Template: https://www.dhs.gov/xlibrary/assets/privacy/privacy_pia_template.pdf; EU Privacy Impact Assessment Framework: http://www.vub.ac.be/LSTS/pub/Dehert/507.pdf
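One way to render this ‘privacy risk equation’ explicitly, as a gloss in the style of the risk frameworks cited in footnote 172 rather than a formula of record, is:

\[
\text{risk to an asset} \;=\; \underbrace{P(\text{threat exploits vulnerability})}_{\text{likelihood}} \times \underbrace{S(\text{resulting harm})}_{\text{severity}}
\]

where the assets are privacy rights and interests, and the threats and vulnerabilities are those described in the subsections that follow.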

TECHNICAL UNDERPINNING

The technical characteristics of the IoT underpin and are the precursor for the privacy threats, vulnerabilities, and harms described subsequently. The architecture of the IoT reduces friction in the collection, processing, disclosure and actuation of data. The effect is a blurring of the temporal, spatial and organizational boundaries that have heretofore separated the physical, digital, biological, and social spheres. IoT sensors act as vectors for digitizing anything that can be sensed, capturing communications and visual, auditory, physical, and biological information that can then be managed, interconnected, and controlled.

The scope and degree of precision of IoT data capture moves the resulting information beyond fuzzy snapshots of human activities towards high-resolution data and inferences. The scale and opacity of data collection and flows will influence the relationship between individuals and organizations regarding the collection, use and disclosure of information, in ways that have important privacy implications. In sum, the technical drive to optimize and reduce friction in information flows results in increased friction for individuals attempting to maintain privacy through control of their information. We expand upon this further in the Privacy Management section.

PRIVACY THREATS

Privacy threats in the IoT are characterized by access, collection, use (analysis, actuation), and disclosure of personal data in violation of people’s rights and expectations. As we discuss further in the Boundaries section, the scope of threats associated with the IoT is more expansive than in the classic online realm, raising the probabilities of privacy risks. Broadly speaking, the threat landscape of IoT privacy should include the omnipresent attack vectors available to malicious actors.

We focus on the more tacit and understated portrayal of the threat model: the IoT drives equity conflicts between legitimate actors. In other words, the IoT may elicit a comparatively greater privacy threat by other, non-malicious ecosystem stakeholders – industry, government, and other people – as a function of competing rights and interests that arise from IoT capabilities. Consider smart cities, for example, where cities collect, analyze, and share data from light pole sensors that monitor vehicle and pedestrian traffic, parking, and local transportation: one person’s expectation of privacy (to not be monitored or targeted) may conflict with the government’s interest in enhanced public services, which may clash with a fellow citizen’s expectation of safety, which may collide with industry’s claim to commercial free speech (product and service offerings based on travel logistics).

These tensions represent another threat posed by the IoT in the form of a potential power imbalance. Governments are motivated to leverage personal information to secure infrastructure, interact with citizens, and serve other national interests. Industry is incentivized to exploit this same information to sell goods and services, and to protect the enterprise.

In opposition to these two actors, citizens have an interest in limiting data collection and ensuring their privacy, in allowing data to be aggregated to promote anonymity, security, safety, and social democracy, and in regaining control of their economic interests. How these tensions resolve with regard to privacy in the IoT will be determined in part by the extent to which the IoT is leveraged to address power imbalances and associated information asymmetries unfavorable to consumers.

If one form of power is the ability to collect, process, and actuate data to exert control over individuals in ways that negatively impact their self-determination, the IoT threatens to shift the power landscape by exacerbating disproportionate control of personal information by companies and perpetuating a lack of the transparency which is so essential to consumers’ accruing appropriate control. More equitable power – the chance for consumers to have a say in how their data is handled – would serve as a social and democratic check and balance. Power inequity, on the other hand, is a barrier to meaningful negotiations, competition and bargaining over rights and interests. In the IoT, power inequity will be a threat to privacy to the extent that data control is unchecked and consolidated by owners of platforms and services upon which consumers depend. If these IoT platforms are fueled by data from users, the users’ lack of control over that data will threaten self-determination and ultimately create a self-perpetuating power imbalance.

The scope of the IoT threats is further complicated by a general lack of understanding of how threats will manifest. This makes it difficult to ascertain which precautions and mitigation measures to put in place to avoid or minimize adverse impacts. The Mirai botnet in the Fall of 2016, which commandeered hundreds of thousands of unwitting IoT devices to impose millions of dollars in damage from business interruption, fraud, and loss of data and customer loyalty,174 revealed the significant harm potential of using IoT devices to wreak financial, operational, and physical harm.175

174 Antonakakis, et al. 2017
175 Cogeco, 2017; Altman Vilandrie & Company, 2017

PRIVACY VULNERABILITIES

Another element in the privacy risk equation involves understanding the weaknesses or gaps in protections that can be exploited by the aforementioned threats to cause privacy harm. The scale and volume of data available for collection and use expands the range of opportunities to exploit data that implicates privacy and therefore increases the probability of realized harm. As discussed above in the Technical Underpinning, the digitization of anything that leaves a trace or is subject to sensing – biometrics, emotions, behaviors – introduces a privacy exposure point.

The traditional boundaries by which society has constructed privacy expectations are blurring, as we discuss in the Boundaries section. It is hard for individuals to know whether the physical features that have assured a sense of solitude, permitted people to act anonymously, and supported control over identities are becoming ineffective. This is the case when data flows in an opaque, unobtrusive, automatic, regularized manner – all promised features of the IoT.

Even when users are aware of data flows, privacy vulnerability is increased by inadequate security of IoT devices. As well, even when deficient security is not the cause of privacy vulnerabilities, context-shifting and blurring between data collection for commercial and social settings creates another type of privacy vulnerability. Personal and social transactions and activities that are mediated by commercial information products and services are increasingly subject to commercial pressures to generate revenue. While the revenue model for the IoT is still emerging, it seems likely that many services will be predicated upon users trading personal data for them.

Consumers are being asked to provide and link more information to avail themselves of IoT functionality, yet so far have been given limited tools to control that personalization. Even when an individual is not the direct target of sensing, incidental data captured by other people’s devices and the interconnectedness of large volumes of data increase privacy vulnerability.

In addition to the degree of vulnerability in the previous examples, the IoT introduces a relative difference in the kind of vulnerability that can enhance privacy risk. For example, sensing and digitizing sentiments and emotions yields a new path to measuring intimate features of people in ways not seen before.

To summarize, thus far we have applied the principles of risk to the context of IoT privacy to furnish a model to better understand, articulate and justify what constitutes privacy harms in the IoT that may not be sufficiently considered or foreseen by existing legal and technical controls.

IoT PRIVACY HARMS

Understanding the adverse impacts on individual and societal privacy rights and interests (the assets) comprises the third consideration in this framing of IoT privacy risk. We focus here on potential distinct differences in degree or kind of harms in the context of the IoT. Notably, this includes harms that the law has not codified but which the IoT evinces, such as social harms.

PERSONAL INFORMATION BREACHES AND IDENTITY THEFT

Conspicuous risks of harm in the IoT are those that ensue from inadequate security, such as breaches of personal information and identity theft.176 If an IoT company loses data about users’ personal behaviors gathered in their homes or in activities in public, and their identity is linked to this data, this could cause measurable harm to consumers. Given the large numbers of data-collecting devices the IoT portends – a larger attack surface – breaches may increase. The resultant harms in such cases may be the easiest to quantify, relative to the other harms discussed below.

AUTONOMY HARM

The value of autonomy is inextricably bound up in the law’s treatment of privacy harms. Indeed, transgressions of autonomy can be said to underlie the majority of privacy harms, whether psychological (embarrassment, stigmatization, loss of trust, chilling effects on ordinary behavior, discrimination, intrusion on seclusion), economic (discrimination in employment, credit, education, and insurance) or physical (4th Amendment prohibition on unlawful search and seizure in the US). We consider anything that impedes people’s self-determination while directly or indirectly engaging with information systems to constitute autonomy harm.

Collective autonomy harms can have disparate, far-reaching impact on the economic, physical, and psychological well-being of individuals and groups. Inadequate management of the personal data that informs the models produced by machine learning algorithms can result in public health and civil services disparities. Similarly, environmental sensor data can fail to aid vulnerable populations based on race or socioeconomic conditions. Collective autonomy harm can manifest as unequal access to and control of data. This risks engendering mistrust between individuals and institutions, resulting in impediments to or disengagement from social, political and economic activities that define individual and collective identities. Collective autonomy harms in the IoT warrant attention because, if left unabated, these power imbalances get technologically embedded and institutionalized. They become hard to repeal and impact the entire fabric of social relationships within which privacy interests reside.

176 For example, in 2018 nearly 150 million users’ personal details collected by the UnderArmour/MyFitnessPal app, including usernames, email addresses and passwords, were leaked in a data breach.

DIMINISHED USER PARTICIPATION

Both literature and our interviews point to diminished user participation as a potential harm in the IoT ecosystem. The 1980 OECD guidelines contained an Individual Participation Principle. It sought to give people the right to obtain personal data held about them by data collectors, to be given a reason if their requests were denied and an ability to challenge the denial, and to correct or erase incorrect or incomplete data. The US Fair Information Practice Principles also call for Individual Participation along nearly identical lines.177 Further, the GDPR specifies a range of participation principles, including rights of access, rectification, erasure, data portability, and the ability to object to data processing and marketing.

Participation means having some say in the treatment of data about you. If privacy is “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others,”178 then Participation is an enactment of that claim. Naturally, one must know who has data about oneself to participate in its use, so Participation is necessarily reliant on transparency. Participation also requires the ability to act upon data about oneself – to affect the disposition of data, ergo the active verbs: access, rectify, erase, correct, object.

In part, Participation is diminished by a weakening of people’s ability to knowingly consent to IoT data collection and use. A report by the US President’s Council of Advisors on Science and Technology summarized the issue:

Cameras, sensors, and other observational or mobile technologies raise new privacy concerns. Individuals often do not knowingly consent to providing data. These devices naturally pull in data unrelated to their primary purpose. Their data collection is often invisible. Analysis technology (such as facial, scene, speech, and voice recognition technology) is improving rapidly. Mobile devices provide location information that might not be otherwise volunteered. The combination of data from those sources can yield privacy threatening information unbeknownst to the affected individuals.179

177 US Dep’t of Homeland Security, 2008
178 Westin, A., 1967, p. 7
179 President’s Council of Advisors on Science and Technology, 2014

A concern over ambient data collection by the people near you also connects to consent – if a person doesn’t know who is collecting information near them, there’s no way to consent to that collection.

VIOLATION OF EXPECTATIONS OF PRIVACY

The law sets formal expectations of privacy rights. When there is incongruity between what society believes privacy harm to be and what can be remedied via the law, the result is fractured expectations of privacy. This incongruity in defining privacy harms emerges as a result of changes wrought by technology. Courts in the US are at least partially addressing the blurring between the concrete manifestations and abstract concepts of privacy injury when information is the proxy, but the general trend is slow to make the leap towards admitting these abstract concepts of privacy injury. Specifically, case law anchors on financial or physical harm that has provably already occurred and generally does not recognize the risk of future harm. Nor does it recognize negative impacts that are cumulative and collective.180 For example, the threshold standard to bring about a lawsuit or achieve a favorable verdict demands ‘concrete injury in fact.’ Identity theft claims that rest on increased risk rather than actual fraudulent use are often deemed too speculative,181 as are damage claims related to fear and emotional distress that result from increased vulnerability to a future attack. The difficulty calculating damages is one basis for dismissing such claims, as seen in June of 2017 when a federal judge in California dismissed a class action lawsuit against Facebook for tracking its logged-out users’ internet activity, in part because of the absence of sufficiently ‘particularized and concrete’ harm.182

Signals of fractured expectations of privacy are also evidenced in regulatory actions. However, at least as far as enforcement actions by the leading US consumer protection agency go, the gap between principles and implementation is smaller. The Federal Trade Commission has negotiated consent decrees based on non-monetary, abstract, autonomy and dignity-based harms.183

180 Solove, 2006
181 At the time of this writing, the 1st and 3rd US Circuit Courts of Appeals do not recognize such claims; the 7th and 9th Circuits recognize an allegation of future harm if there is “danger of sustaining some direct injury” that is “both real and immediate” – such as identity theft.
182 Facebook Internet Tracking Litigation, US District Court, Northern District of California, No. 12-md-02314

In what is now an infamous, long-standing case, the agency declared ‘substantial injury’ to include intangible but concrete harm caused by the disclosure of sensitive medical information, declaring that “public disclosure of private information is by itself an actual concrete harm, even absent tangible effects or emotional injury” and “a practice may be unfair if the magnitude of the potential injury is large, even if the likelihood of the injury occurring is low.”184

ENCROACHMENT ON EMOTIONAL PRIVACY

As we discuss in the Boundaries section, the IoT is bringing devices ever closer to people’s bodies. The increasing scale and proximity of sensors is beginning to open new frontiers in detecting people’s emotional states and inner life. The potential resultant harm is an encroachment upon people’s emotional privacy – a relatively new phenomenon being amplified by connected devices.

Prof. Andrew McStay cites experimentation with emotion detection in robots, in-home virtual assistants, children’s toys, video games, and sex toys. Key reasons for the use of emotion detection are the ability to gauge reactions to content or advertising; affective behavior by a product, reacting to a child’s or adult’s emotions and dynamically changing the product’s behavior to deepen and diversify its interactions; and nudging or manipulation in commercial contexts. “With insight into emotions, advertisers and retailers have a higher chance of influencing [behavior] and nudging us to spend,”185 McStay writes. One of his interview subjects, representing a global trade association for the market research industry, expanded upon this, saying that:

The future of advertising and marketing lies in passive and always-on data collection, and that the Holy Grail is real-time information about customer needs and emotions… Today this is dependent on advances in mobile and wearable technology, and correlation of geo-location with contextual and behavioural information. The value of passive data collection is instant access to transactions and conversations… Seen this way, biosensors and biometric data promise additional real-time understanding as people move throughout everyday life, the city and retail spaces.186

183 See, e.g., In the matter of DesignerWare, LLC, No. 112-3151 (Apr. 15, 2013).
184 LabMD, Inc. v. FTC, Case No. 16-16270
185 McStay, 2018, Ch. 8
186 McStay, 2016, p. 4, emph. added

Another of McStay’s respondents also related emerging devices to emotion detection. Dubious of traditional marketing surveys and focus groups, this Chief Insight Officer at an ad firm sought better methods to understand customers, “including biometric data about emotions to understand ‘brand levers,’ or how to get people to act, click, buy, investigate, feel or believe… [T]his objective involves ‘understanding people through all of their devices and interaction points, i.e. wearable devices, mobile and IoT.’ In general, the aim… is to ‘collect it all’ so to build more meaningful interaction with brands.”187

McStay notes this executive’s “interest in emotional transparency, or the unfolding of the body to reveal reactions, indications of emotions, feelings about brands, tracing of customer journeys and information that will help create ‘meaningful brands.’”188

Given the advertising industry’s growing hunger for the fullest range of human data, the incorporation of emotion detection across a range of industries, the broadening capability of sensors, and the penetration of those sensors deeper into private and public spaces, it seems inevitable that the IoT will begin to challenge our emotional privacy. The Internet of Things, bringing its sensors closer to bodies and thereby nearer the ability to, in Josh Cohen’s terms, “externalize your inner self,”189 brings to the fore a privacy risk that has been largely dormant. As to the harm, McStay succinctly asks, “if the body, emotions, and feelings aren’t valuable in and of themselves, what is?”190 There is a need for a social conversation about data, and the time to discuss the privacy impacts of emotion detection technology is now, while its use is not yet commonplace.

DIMINISHMENT OF PRIVATE SPACES

All of the privacy-challenging IoT characteristics we have discussed – proximity, scale, increased monitoring, boundary crossing, reduced ability to opt out of collection – add up to an increasing diminishment of private spaces. This harms people’s ability to achieve solitude and reserve, both from others and in their thoughts. We are concerned about a reduction in the availability of spaces for individuals to be able to retreat to and not be observed – spaces where one can control who is present, who is listening, who is watching; places of seclusion.

187 Ibid.
188 McStay, 2018, Ch. 8, emph. added
189 Cohen, 2014, p. 30
190 Interview

This diminishment of private spaces translates to a reduced ability to withhold data on lifestyle preferences, family dynamics, hobbies, etc. from third parties. A central site of this concern is the home. Heather Patterson of Intel commented that:

The bringing in of Alexas and Nest thermostats and DropCam cameras and Hue lights and everything being connected up, there’s a lot of convenience associated with it, but I’m seeing in the research a lot of pervasive discomfort that’s just arising from this feeling of being watched or never truly being alone.191

Meg Leta Jones worried about the impact of diminishing private spaces on children:

I think we should continue to find quiet, closed spaces that don’t have any type of surveillance in them, and I think that kids should be really, really free from those spaces. That is part of them developing their own generation of an information society.192

Michelle De Mooy of the CDT put the diminishment of private spaces in the context of corporate actors:

You have companies that are enormous like Google and Facebook that control huge amounts of the economy, and it’s arguable that their influence might be even more than the government’s – they have access to vast amounts of personal data and therefore surveillance by [these companies] could be more damaging to privacy and ultimately more destructive to democratic principles.193

Privacy law scholar Joel Reidenberg discussed how the proliferation of technology that can identify people in both public and non-public spaces harms a long-standing expectation of privacy by obscurity:

Society can no longer claim any expectation of anonymity in crowds… The ubiquity of image-capture devices in private hands also means that individuals lose an expectation of privacy in non-public places… what might have been a previously obscure, anonymous presence in a private place becomes an identified act.194

191 Interview
192 Interview
193 Interview
194 Reidenberg, 2014, p. 149, emph. added

Elegantly driving home Prof. Reidenberg’s point, the New York Times reported in March of 2018 that Madison Square Garden, New York’s famed event space, had been surreptitiously using face-scanning technology for security. A senior vice president of the company who manages the technology for the venue remarked, “The days of having 40,000 to 60,000 people in the stadium and not knowing who they are, I think those days are going to disappear.”195

Heather Patterson noted that people act differently when they feel watched:

People do watch those things in their homes that they are really concerned will expose them to judgment, to exclusion and isolation, and to feeling shame and feeling dirty and feeling lazy, feeling less than other people in some ways.196

Similarly, Brookman and Hans write:

In order to remain a vibrant and innovative society, citizens need room for the expression of controversial — and occasionally wrong — ideas without worry that the ideas will be attributable to them in perpetuity. In a world where increasingly every action is monitored, stored, and analyzed, people have a substantial interest in finding some way to preserve a zone of personal privacy that cannot be observed by others.197

CHILLING EFFECTS

Lack of control over constant data collection can result in psychological and behavioral chilling effects contrary to consumers’ intentions. These effects may manifest as a reluctance to engage or trepidation when encountering IoT devices. The psychological insecurity about unwanted interference and manipulation may further manifest in people’s actions. Feelings of malaise, resignation, or helplessness are subjectively real, but people may be unable to articulate legally cognizable harm because they lack information about whether or how data is actually being used.

NORMALIZATION OF SURVEILLANCE AND MANIPULATION

In 2014, Vint Cerf remarked in a report imagining life in 2025 that “continuous monitoring is likely to be a powerful element in our lives.”198 Here again is the common theme that the IoT is an evolution of prior historical trends – in this case, technology further normalizing surveillance, retrenching the existence of a surveillance society.

195 Draper, 2018
196 Interview
197 Brookman and Hans, 2013, p. 4-5
198 Pew Research Center, 2014, p. 16

‘Surveillance society’ is a loaded term, of course, and requires disentanglement from its ominous connotations. Surveillance studies scholar David Lyon writes:

Surveillance is part of the way we run the world in the twenty-first century. Conventionally, to speak of surveillance society is to [evoke] something maybe sinister, smacking of dictators and totalitarianism. But the surveillance society is better thought of as the outcome of modern organizational trends in workplaces, businesses and the military than as a covert conspiracy.199

It is in this light that Cerf’s comments should be read. Alongside – or despite – the utopian predictions of the IoT’s near-term contributions to social welfare, connected devices must be recognized as sense organs in the surveillant assemblage200 of the digital age. As Lyon explains further, “getting surveillance into proper perspective as the outcome of bureaucratic organizational practices and the desire for efficiency, speed, control and coordination does not mean that all is well. All it means is that we have to be careful identifying the key issues and vigilant in calling attention to them.”201

De Mooy noted the tension between the Internet of Things and the democratic values of freedom from surveillance and from government interference:

With devices in our homes that are surreptitiously recording us or collecting and sharing information that we’re unaware of, or even just the fact that they’re in our homes recording, is that a violation of the idea that we are free from surveillance? From government interference? That in our homes, it is a private space? I think once that idea is challenged, then you have questions of is it possible to ever find a private space, and how necessary is private space to freedom of thought? I would answer that it’s very necessary. It’s at the core.202

Just as the relentless, granular tracking of online activity has been normalized, the Internet of Things will enable and normalize preference and behavior tracking in the offline world. Indeed, the characteristics of the IoT force one to ask if the very notion of an offline world will exist in the near future.

199 Lyon, 2008, emph. added
200 Haggerty and Ericson, 2000
201 Lyon, 2008
202 Interview

The emotional privacy concerns detailed above are undergirded by two emerging trends: the datafication of people’s emotions so as to make everyday emotional life machine-readable, and the subjection of emotion data to commodity logic. McStay observed, “In terms of where this is going, it’s about attention, emotion, intention, and ultimately it’s about commodification.”203

Or, as scholar Shoshana Zuboff puts it, the machine-readable human life represents “the migration of everydayness as a commercialization strategy… [reorienting] the nature of the firm and its relation to populations.”204

In a 2013 paper about self-generated health data, Heather Patterson wrote:

Encouraging… rigorous, whole body quantification of the self not only habituates individuals to the concept and practice of scanning and cataloguing activity and consumption habits, it results in corporations holding vast treasure troves of highly personal health data about tens of thousands of users, health and wellness libraries with unprecedented and complete entries of incalculable value to business associates, employers, and insurance companies.205

We asked her what was the problem with this:

I feel like there’s something kind of fundamental [quality] that can get lost… when we live in a society where… in the eyes of [some] others [we] exist just to be stripmined and have our individual particles re-assembled and converted to cash.206

203 Interview
204 Zuboff, 2015, p. 76
205 Patterson, 2013, p. 9
206 Interview

Julia Powles echoed Patterson, titling a 2015 article, “We are citizens, not mere physical masses of data for harvesting.”207 Relatedly, one other researcher warned about how the information people contribute to their own commodification could be used against them:

If you think about products like Alexa… where sensing is being used to increasingly commercialize a home, that is inherently dangerous. It takes a space that nominally is not commodified, and commodifies it. So effectively, anything you say now can be used against your own economic interests.208

When asked what is at stake with the collection of intimate information and the datafication of everyday emotional life, McStay pointed out that this data allows corporations to manipulate and control human behavior with the aim of increasing consumption:

In terms of what is at stake, at a minimum, nobody could disagree that there’s a better than average chance of raising the capacity to manipulate human behavior, typically in a consumer setting.209

Prof. Ryan Calo puts such concerns in the context of the ‘mediated consumer,’ raising an alarm about the nature and timing of manipulating behavior:

[F]irms can increasingly choose when to approach consumers, rather than wait until the consumer has decided to enter a market context… In an age of constant ‘screen time,’ however, in which consumers carry or even wear devices that connect them to one or more companies, an offer is always an algorithm away. This trend of firms initiating the interaction with the consumer will only accelerate as our thermometers, appliances, glasses, watches, and other artifacts become networked into an ‘Internet of Things.’210

207 Powles, 2015
208 Interview
209 Interview, emph. added
210 Calo, 2013, p. 1002-1005, emph. added

Worryingly, Calo notes that the American consumer protection regime is ill-prepared to address the potential harms of manipulating human behavior, stating that “market manipulation has had only a modest impact on regulatory policy.”211 This concern was brought into stark relief in 2017 and early 2018, when it was shown that Russian disinformation campaigns had been conducted through Facebook to affect US voting behavior,212 and that extensive profile data on over 87 million users had been inappropriately collected by Cambridge Analytica, a political consulting firm, for use in its psychological profiling and targeting of US voters in the 2016 election.213

211 Ibid., p. 1002
212 Karpf, 2017
213 Chang, 2018

CONCLUSION

The IoT is an evolutionary step in the development of industrial, enterprise, and consumer technologies. The scale and proximity of sensor technology, ubiquity, opaqueness, and unobtrusiveness amplify power imbalances between citizens and data collectors, making them more vulnerable to a range of privacy harms. Data breaches and the loss of personal information will likely increase as the IoT creates a larger attack surface. People’s expectations about the private nature of their activities, interactions, and behaviors are likely to be further challenged. As the human environment acquires more monitoring devices, private spaces into which people can retreat and find reserve may diminish. And, if people feel they are being watched, they might chill their speech and conform their behavior in response. The closeness of sensors to people’s bodies implicates emotional privacy, especially considering there is a commercial interest in consumers’ emotions as a way to better understand and sell to them. Broadly, the IoT portends a greater ability to render society machine-readable, enhancing surveillance and commodification. The tracking of behavior and preferences that is now a standard part of living online will migrate to the offline world. However, all is not lost – the next two sections, Emerging Frameworks and Strategies and Recommendations, discuss a range of ways to address the above risks and harms.

EMERGING FRAMEWORKS AND STRATEGIES

This section assembles the principles, practices, techniques, and frameworks that emerged from our interviews, workshops, and literature review to address the privacy risks examined in prior sections. They fall into three categories: user control and management, notification, and governance.


USER CONTROL AND MANAGEMENT STRATEGIES

To enhance people’s control over and ability to manage personal data, we find that there are pre-collection and post-collection strategies for organizations to implement, as well as identity management (IDM) strategies, which we treat separately because they cut across both pre- and post-collection.

PRE-COLLECTION

Data minimization
One strategy to improve the privacy posture of IoT devices is to not collect certain information at all. As the FTC has stated: “Thieves cannot steal data that has been deleted after serving its purpose; nor can thieves steal data that was not collected in the first place.”214 Avoiding data collection not only reduces the possibility of data loss, but it may reduce cost or computational overhead if that data is truly not needed for features to function. A privacy impact assessment (discussed below) can surface which data elements could be eliminated. Some data protection laws use the term ‘purpose limitation’ to refer to this practice of restraint in data collection: if the data does not serve an immediate business or essential functional purpose, do not collect it.215

214 FTC, 2015, p. 35
215 See Art29WP Opinion 03/2013 on Purpose Limitation: http://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf

An illustrative case of preventable automatic data collection is vehicle infotainment systems. Plugging a phone into a car’s USB port can sometimes trigger an automatic data transfer, pulling across people’s contacts, GPS information, phone number, and other personal data, often without appropriate or granular consent.216 Many infotainment systems do not provide obvious or user-friendly ways to delete such data. In the case of rental cars, this raises obvious privacy concerns, as it leaves a very detailed set of personal information lying unprotected in a quasi-public space. In 2016, the FTC issued a warning on this issue, suggesting car USB ports should not be used at all for charging.217 The design imperative here is clear: always ask for data-specific permission before downloading phone information, do not collect irrelevant information, and make it easy for people to delete all personal data. For example, in addition to limiting the data gathered by the infotainment system, a privacy-by-design approach would suggest that rental car companies have a standard policy of wiping all personal data from cars right after they are returned.

Do Not Collect ‘Switches’
Devices can and often should be designed with easy ways to shut down collection sensors. Research on RFID privacy in the early 2000s suggested that users have a right to disconnect from their networked environment, and so should be able to easily deactivate the tracking functions of RFIDs. French Internet expert Bernard Benhamou coined the term the ‘silence of the chips’ to capture this notion,218 and the Art29WP219 has recently updated the idea, stating: “Similarly to the ‘Do Not Disturb’ feature on smartphones, IoT devices should offer a ‘Do Not Collect’ option to schedule or quickly disable sensors.” This could be interpreted to mean actual physical switches, or obvious toggles within an interface.
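To make the ‘Do Not Collect’ idea concrete, here is a minimal sketch of how device firmware might combine a hard off-switch with scheduled quiet hours. The class and method names are hypothetical, not drawn from any shipping product:

```python
from datetime import datetime, time

class DoNotCollect:
    """Illustrative 'Do Not Collect' switch: a hard off toggle plus
    daily quiet hours during which sensing is suspended. (Quiet hours
    that cross midnight are not handled in this sketch.)"""

    def __init__(self):
        self.hard_off = False    # mirrors a physical switch or obvious UI toggle
        self.quiet_hours = []    # list of (start, end) datetime.time pairs

    def schedule(self, start, end):
        self.quiet_hours.append((start, end))

    def collection_allowed(self, now=None):
        if self.hard_off:
            return False         # the hard switch overrides everything
        t = (now or datetime.now()).time()
        return not any(start <= t < end for start, end in self.quiet_hours)

dnc = DoNotCollect()
dnc.schedule(time(22, 0), time(23, 59))   # suppress collection in the evening
if dnc.collection_allowed():
    pass  # a sensor-sampling routine would run here
```

A physical switch maps naturally onto the hard toggle, giving users a guarantee that no software schedule can override their choice.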

216 Akalu et al., 2016
217 Schifferle, 2016
218 Santucci, 2013
219 Art29WP, 2014, p. 22

Wake Words and Manual Activation
While a common defining characteristic of the IoT is that devices are always-on, from a design perspective there is much more nuance to consider. Stacey Gray of the Future of Privacy Forum helpfully illustrates the granularity of ‘always-on,’ proposing three categories of microphone-enabled devices: manually activated (e.g., by a switch or button), speech-activated (via a ‘wake word’), and truly always-on (constantly transmitting data).220 The strongest privacy choice is one where users are aware that a device is listening or watching, such as via a red light or other indicator, and must take a clear action to activate it. In line with the ‘Do Not Collect’ switches described above, Gray notes that a hard ‘mute’ button on devices with wake words is a privacy-enhancing choice, curbing unintended audio capture and alleviating concerns about surveillance and intrusion.221
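Gray’s three categories can be read as a simple transmission gate. The sketch below is illustrative only – the enum values and function are hypothetical – but it shows how a hard mute overrides every mode, including ‘always-on’:

```python
from enum import Enum

class MicMode(Enum):
    MANUAL = "manually-activated"    # button or switch required
    WAKE_WORD = "speech-activated"   # local wake-word detection
    ALWAYS_ON = "always-on"          # continuously transmitting

def should_stream_audio(mode: MicMode, button_pressed: bool,
                        wake_word_heard: bool, hard_muted: bool) -> bool:
    """Gray's categories as a transmission gate: a hard mute overrides
    every mode, mirroring the 'mute button' design choice in the text."""
    if hard_muted:
        return False
    if mode is MicMode.MANUAL:
        return button_pressed
    if mode is MicMode.WAKE_WORD:
        return wake_word_heard
    return True  # ALWAYS_ON
```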

220 Gray, 2016
221 Ibid., p. 9
222 See Wright and DeHert, 2011

Privacy Impact Assessments
One way to support all of the above design considerations is to perform a privacy impact assessment (PIA), what the GDPR calls a Data Protection Impact Assessment (DPIA). PIAs are systematic processes to evaluate the impact and risks of collecting, using, and disseminating personally identifiable information in a project, product, service, or system. The goals are to identify privacy risks; ensure compliance with national or local laws, contractual requirements, or company policy; and put risk mitigation strategies in place. PIAs help organizations get a better sense of the personal data they handle, as well as understand the associated risks and how to manage issues in case of a data breach or accident.222 Basic elements comprising a PIA include:

• Data sources
• Data flows through the product/service lifecycle
• Data quality management plan
• Data use purpose
• Data access inventory – who inside and outside the organization can access the data
• Data storage locations
• Data retention length
• Applicable privacy laws, regulations, and principles
• Identification of privacy risks to users and the organization, and their severity level (e.g., High, Medium, Low)
• Privacy breach incident response strategy

In the US, PIAs have been mandatory for federal agencies for some time, but they remain uncommon in industry. The GDPR requires data controllers223 to perform DPIAs when data processing is “likely to result in a high risk to the rights and freedoms of natural persons.”224 The Art29WP has released guidance on how to interpret this requirement. The list of activities that would trigger a DPIA is somewhat complex, but the following are particularly germane to the IoT:

• Systematic monitoring
• Sensitive data or data of a highly personal nature
• Innovative use or applying new technological or organizational solutions225

223 In GDPR terms, the primary entity that collects, stores, and directs the processing of personal data. External entities that process data on behalf of the data controller are called ‘data processors.’
224 GDPR Art. 35
225 Art29WP, 2017, pp. 9-10

The GDPR applies to all companies that process Europeans’ personal data, including American and other non-EU companies that serve customers in the European Union, whether or not they have a physical or legal presence in a European member state. For the first time, the GDPR puts in place a sanction regime for companies that do not carry out an assessment when they should: up to 2% of annual turnover or €10 million, whichever is greater.226

POST-COLLECTION

Data Deletion
Continuing from Data Minimization above, deleting data is a very good risk-reduction strategy. This could mean:

• deleting data as close to the sensor as possible

• deleting data after aggregating it

• deleting data after a period of time has elapsed and the data has become ‘stale’

The GDPR creates a right to erasure. As with many aspects of the GDPR, this right is not straightforward and has counterbalancing aspects, but in general European Union data subjects have the right to cause data controllers to delete data about them if they withdraw consent or when the data is no longer needed for the purpose for which it was collected or processed.227 There is no substantive regulatory equivalent in the US,228 but it remains a principle that enhances a user’s participation rights.229 Giving users the ability to easily delete data supports their ability to make substantive choices about their personal information.
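As a toy illustration of these deletion strategies, the function below (all names hypothetical) drops stale raw readings and keeps only an aggregate, so detailed records never accumulate:

```python
import time

STALE_AFTER_SECONDS = 30 * 24 * 3600   # illustrative 30-day cutoff

def retain(records):
    """Apply the deletion strategies above: discard stale readings,
    aggregate what remains, and keep only the aggregate."""
    now = time.time()
    fresh = [r for r in records if now - r["ts"] < STALE_AFTER_SECONDS]
    summary = {
        "count": len(fresh),
        "mean": sum(r["value"] for r in fresh) / max(len(fresh), 1),
    }
    return summary  # raw records are discarded after aggregation
```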

226 Ibid.
227 See Maldoff, 2016 for a brief overview
228 The exception is if a company promises in its privacy policy to give users the ability to delete their data; it must then fulfill this promise or be exposed to FTC action for deceptive practices.
229 See Risks of Harm section

Consent withdrawal
Requiring consent before data is collected or processed is a key feature of privacy and data protection regimes, but withdrawing that consent is problematic. Arguably, consent is meaningless without an ability to withdraw it, to say nothing of granular consent possibilities (‘yes for this purpose, no for that purpose,’ etc.). The GDPR allows for the possibility of preventing further processing by mandating that “it shall be as easy to withdraw consent as to give it.”230 Such withdrawal would then be a prelude to triggering one’s right of erasure. Ease of consent withdrawal and deletion are both supportive strategies for user choice, control, and participation.

Encryption
Traditional encryption schemes were designed with greater resources in mind. As a result, many of them will not work well with IoT devices, which challenge the basic strategy of encrypting data at rest and in transport because they are ‘resource-constrained’: low power, small form factor, low memory, low processing power, low cost, limited network bandwidth, and low storage capacity. This has spurred much research and development on lightweight cryptography, which takes IoT devices’ constraints into consideration. NIST’s 2017 report on the subject is a useful starting point for exploring lightweight encryption as an IoT privacy and security strategy.231
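As one concrete possibility, authenticated ciphers such as ChaCha20-Poly1305 are often suggested for constrained devices because they perform well without hardware AES support. The sketch below uses the Python cryptography package and is an illustration of the general approach, not a recommendation drawn from the NIST report:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

key = ChaCha20Poly1305.generate_key()   # 256-bit key, provisioned at manufacture
aead = ChaCha20Poly1305(key)

nonce = os.urandom(12)                  # must never repeat for a given key
reading = b'{"temp_c": 21.5}'
# The device ID is authenticated (tamper-evident) but not encrypted.
ciphertext = aead.encrypt(nonce, reading, b"device-42")

assert aead.decrypt(nonce, ciphertext, b"device-42") == reading
```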

230 GDPR Art. 7(3)
231 McKay et al., 2017
232 See, e.g., ICPP and SNG, 2003, “Identity Management Systems: Identification and Comparison Study”: https://www.genghinieassociati.it/wp-content/uploads/2011/11/ICPP_SNG_IMS-Study_Summary.pdf

IDENTITY MANAGEMENT

Since the early 2000s, the identity management (IDM) community has actively explored many modern privacy issues: the separation of informational contexts, cross-correlation of online activities, the value of pseudonymity, the sensitivity of identification, uncontrolled profiling, and the need for personal data management systems to be designed for and controlled by users.232

IDM research and commercial work has yielded valuable language and concepts, such as: unlinkability – the intentional separation of data events and their sources, breaking the ‘links’ that form between the different places users go online or between different devices – and unobservability – building systems that are blind to user activity. Both of these can be employed in the design of IoT devices and their host systems. For example, unlinkability in an IoT context can be restated as: my car/fitbit/voice-assistant (and therefore the manufacturers and intermediaries) does not need to know which websites I visit or which other devices I use (a minimal sketch appears after the list below). Unobservability is a design principle that can be applied to intermediaries and IoT platforms, blinding them to user activity. These two concepts are a form of data minimization, limiting which parties can see personal data.

The identity dimensions of the IoT are just beginning to take shape, and there are two key considerations: machine identity and human identity. Machine identity concerns the ways that devices are authenticated and kept track of within an IoT management system. This encompasses security, machine ‘trust,’ and the provenance of the data coming off a device. Human identity, in the IoT context, concerns questions which all have privacy implications:

• Who is the device owner?
• Are there additional users?
• Is there an option for unidentified Guest Users?
• Is there an option for pseudonymous use?
• How are users’ system rights managed?
• Can users be given delegated/partial control rights? (E.g., change the heater temperature but not turn it off)
• Who can access data stored on a device or direct sensor feeds?
• Can device owners see other users’ data?
• Can two users see one another’s data or change each other’s settings?
• Can data from a device be verifiably associated with a particular user?
• How do users disconnect/disassociate themselves?
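The unlinkability sketch promised above: deriving a different, stable pseudonym per service from a device-held secret means two services cannot correlate the same user without that secret. The function and names are hypothetical:

```python
import hashlib
import hmac

def pseudonym(device_secret: bytes, context: str) -> str:
    """Derive a stable identifier that differs per context (per service,
    per partner), so recipients cannot link the same device or user
    across contexts without the device-held secret."""
    return hmac.new(device_secret, context.encode(), hashlib.sha256).hexdigest()[:16]

secret = b"k3y-held-only-on-device"
print(pseudonym(secret, "heating-service"))   # differs from the ID below
print(pseudonym(secret, "fitness-service"))   # unlinkable without the secret
```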

Identity management is a useful lens through which to view IoT privacy, as it moves the discussion away from intrusion, unexpected practices, breaches of confidentiality, and threats to seclusion, and into a discussion of privacy as control, access management, and selective sharing. That is, IDM concerns itself with architectures of permission. IDM expert Eve Maler observed:

IoT data is available for you to share, but you may not wish to share it with everybody in the world. It’s selective sharing. Privacy has historically meant, “Stay away from me and don’t take my data,” but it’s starting to appear to me in a model where if it’s user-controlled, person/individual-controlled, data-subject controlled, then ‘controllable’ means there’s an ebb and flow. ‘Controllable’ means you get to release it when you want to.233

Maler uses the example of the Google Docs Share button as a model for delegated access and selective sharing of both data and system functions. The Share button allows a document owner to grant others the ability to edit, comment on, or only view its contents. Maler asks, “Is that opt-in? Is that opt-out? No. But it’s permissioning of a very powerful sort.”234

In her work, Maler is seeking to “shift the consent concept to permissioning.”235 She illustrates what she calls “relationship-driven permissions” through the example of a family home and an Airbnb rental. The home contains the home owner and family members; it has many ‘smart’ devices in it, and each resident can have different permission rights. When the family rents out their entire home on Airbnb, the guests are given a narrow set of rights to use the smart lock, the air conditioner, the speakers, internet access, smart lighting, and so on. When the guests leave, their access and other rights are revoked.
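A minimal sketch of what time-boxed, relationship-driven permissions might look like for the Airbnb scenario; the rights and class names are hypothetical:

```python
from datetime import datetime, timedelta

# Guests get a narrow, time-boxed set of rights that lapse at checkout.
GUEST_RIGHTS = {"smart_lock", "air_conditioner", "speakers", "wifi", "lighting"}

class Grant:
    def __init__(self, user, rights, expires_at):
        self.user, self.rights, self.expires_at = user, rights, expires_at

    def permits(self, action, now=None):
        now = now or datetime.now()
        return now < self.expires_at and action in self.rights

checkout = datetime.now() + timedelta(days=3)
guest = Grant("guest-1", GUEST_RIGHTS, expires_at=checkout)
assert guest.permits("smart_lock")
assert not guest.permits("security_cameras")   # never granted to guests
```

Revocation here is automatic: once the grant expires, every check fails without any further action by the home owner.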

233 Interview
234 Maler, 2017
235 Ibid.

As an example of work that aims to change the non-granular, binary conception of consent into multi-modal, granular permissioning arrangements, Maler points to the emergence of ‘consent tech.’ These are particular families of standards and technology that build upon each other to yield frameworks addressing different pieces of Maler’s vision of control and selective sharing:

• OAuth 2.0 – a protocol for authorization flows for applications and devices
• OpenID Connect – an identity authentication layer built upon OAuth 2.0
• User-Managed Access (UMA) – a protocol designed to give people a unified control point for authorizing who and what can get access to their digital data, content, and services, irrespective of their location
• Health Relationship Trust (HEART) – patient-centric health data sharing from heterogeneous sources; based on the three above technologies236
• Consent receipts – records that track a user’s consent, linking it to a privacy policy and making it available across organizational boundaries237 (sketched below)

According to Maler, these technology families help systems meet high privacy implementation standards such as those required in the GDPR for unambiguous consent, proof of that consent, data minimization, accuracy, and quality.238 IDM technology also directly addresses boundary management concerns by providing tools to control how we let people and organizations in and keep them out. More broadly, all of the above identity management technologies, concepts, and perspectives are helpful discussion topics when attempting to envision a world where people have fine-grained control over the sharing of data collected by their devices. If the IoT does indeed herald a much more monitored environment, then strategies and technologies that help people shape their informational life through selective sharing are essential.
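To make the consent receipt idea concrete, here is an illustrative receipt generator. The field names are loosely inspired by the Kantara specification cited below, but they are not the normative schema:

```python
import json
import time
import uuid

def consent_receipt(subject, controller, purposes, policy_url):
    """A portable, timestamped record linking a consent event to a
    privacy policy and a set of purposes. Illustrative fields only."""
    return json.dumps({
        "receipt_id": str(uuid.uuid4()),
        "timestamp": int(time.time()),
        "data_subject": subject,
        "data_controller": controller,
        "purposes": purposes,
        "privacy_policy": policy_url,
    }, indent=2)

print(consent_receipt("user-17", "Example Thermostats Inc.",
                      ["heating_optimization"], "https://example.com/privacy"))
```

Because the receipt is a self-contained document, the user can retain a copy and present it later as proof of what was – and was not – consented to.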

236 See http://openid.net/wg/heart/
237 See https://kantarainitiative.org/confluence/display/infosharing/Consent+Receipt+Specification
238 Maler, 2017

NOTIFICATION STRATEGIES

Both the US and Europe are heavily invested in notification as a privacy and data protection strategy. While there is vigorous debate as to its effectiveness,239 it remains a bulwark of modern privacy regimes. Fortunately, there is active research and policy discussion in the following areas:

NOTICE TIMING

Research has shown that the timing of notices has an impact on their effectiveness. Notifications are most commonly displayed to users at setup, when programs, services, or devices are first used. But notifying users at the time when data is being actively collected or used is also a good way to get a user’s attention. These ‘just-in-time’ notices support people’s ability to make a decision about whether to accept or reject a data practice. Crucially:

Just-in-time notices and obtaining express consent are particularly relevant for data practices considered sensitive or unexpected. For instance, in the case of mobile apps, access to sensitive information such as the user’s location, contacts, photos, calendars, or the ability to record audio and video should be accompanied by just-in-time notices.240
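A just-in-time notice can be as simple as gating each sensitive call on a fresh prompt. The decorator below is a hypothetical sketch of that pattern, not any platform’s actual permission API:

```python
def just_in_time(resource):
    """Surface a notice and ask for consent at the moment a sensitive
    resource is used, rather than only at setup time."""
    def wrap(fn):
        def inner(*args, **kwargs):
            print(f"Notice: this action will access your {resource}.")
            if input("Allow? [y/N] ").strip().lower() != "y":
                raise PermissionError(f"{resource} access declined")
            return fn(*args, **kwargs)
        return inner
    return wrap

@just_in_time("location")
def log_location():
    return (52.37, 4.90)   # placeholder coordinates
```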

239 See, e.g., Calo, 2012; Sloan and Warner, 2013
240 Schaub et al., 2015, pp. 6-7

The California State Attorney General,241 the UK Information Commissioner’s Office,242 and the Art29WP243 have all endorsed this strategy, and developers have been implementing it for some time. A common example is when a browser or mobile app asks for permission to know your location or to access your camera or contacts.

Researchers also endorse three other timing-oriented strategies: periodic, context-dependent, and persistent.244 Periodic notifications remind users of ongoing data practices, giving them the opportunity to reaffirm or adjust their consent over time. Schaub et al. write, “Periodic reminders of data practices can further help users maintain awareness of privacy-sensitive information flows. Reminders are especially appropriate if data practices are largely invisible.”245

[Figure: Example of a periodic notification]

Context-dependent notices appear based on changes in a user’s or system’s context, such as when a user enters or leaves certain locations or, in the case of social media, changes the intended audience of a post. In theory, an IoT device could display privacy choices when it moves from inside the home to outside it, or from when a user is alone to when others are present. An advanced form of this notice could inform the user about potential inferences that could be made based on data gathered in differing contexts.246

Persistent notices provide awareness of ongoing data practices, such as icons in the status bar of mobile phones to indicate that location data is being used, or indicator lights when a device is recording. Since users can quickly become inured to their presence, though, a “system should only provide a small set of persistent indicators to indicate activity of especially critical data practices.”247

Nor is it efficient or effective to display everything about a system’s data practices in one notice at one time. Instead, privacy notices should be layered: “notices shown at different times, using different modalities and interfaces, and varying in terms of content and granularity in a structured approach.”248

241 California Dep’t of Justice, 2013
242 UK Information Commissioner’s Office, n.d.
243 Art29WP, 2017, p. 28
244 Schaub et al., 2015
245 Ibid., p. 7, emph. added
246 Schaub, Interview
247 Schaub et al., 2015, p. 7
248 Ibid., p. 5

NOTICE COMPREHENSION

It’s critical, of course, that notices be understood by their audiences. The opacity and density of most existing privacy policies are central complaints and hindrances; they are ‘written by lawyers, for lawyers.’ Each of the strategies listed above attempts to improve comprehension in one way or another, but the question remains: how can companies be encouraged to make their privacy policies understandable and useful? A key answer is that they must test their notices for comprehensibility:

There needs to be testing that shows that this notice is actually effective and, first of all, being noticed. That people see that there’s a notice, that they can understand the notice, and that they can act on the notice.249

Florian Schaub further noted that, within industry, there is often a fear that moving away from one defined privacy policy to something more integrated into the user experience could create duplicate messages that might not provide the comprehensive picture of a data practice that a privacy policy would. This may be perceived as a legal or compliance problem. The solution is for regulators to release guidance on the type of notice that is permissible and complies with existing regulations, and on what level of innovation or leeway is acceptable.250

In response to these issues, the FTC held a workshop on disclosure evaluation in September of 2016. This important, far-ranging workshop explored comprehension, attention, cognitive models, measurement, decision-making, and lowering the cost of notice testing. The resultant Staff Report251 is an essential primer for both industry and regulators.

One burgeoning area of notification research is the attempt to automate different parts of privacy notification, preference determination, and choice. Interdisciplinary researchers have been exploring ways to allow software to learn about users’ privacy preferences by observing their choices and behavior over time.252 Based on what these software agents learn, they could automatically configure a user’s privacy settings within a mobile phone or IoT device, or offer suggestions.

249 Schaub, Interview
250 Ibid.
251 Federal Trade Commission, 2016
252 Sadeh, 2017

These agents could also periodically ‘nudge’ users to examine or revisit privacy options. Early research shows that these nudges are effective in motivating users to engage more with their settings.253 Automation is also being employed to help users be more aware of IoT devices in their environment. Researchers from Carnegie Mellon University have developed a ‘privacy-aware notification infrastructure,’ where IoT devices broadcast their existence and their sensing capabilities, which are then collected by an IoT resource registry, and users are notified by an IoT Assistant on their mobile device254 (a minimal sketch of this pattern appears at the end of this subsection). Such a design engages directly with the privacy problems of device invisibility and devices’ lack of substantive interfaces.

NOTICE CONTENT

As we discussed in the Privacy Management section, there is a troubling pattern emerging of manufacturers not sufficiently disclosing how devices collect data. Regulators should issue guidance on privacy policy disclosures about the nature and use of sensor data: what sensors are on the device, what data is gathered, where it is stored, and whether it is encrypted or de-identified.255 The same is true of the types of inferences that can be derived from sensor data and the types of analysis applied to the datasets, without necessarily delving into trade secrets. Further, when claiming that personal data has been de-identified, the standards used to do so should be disclosed.

NOTICES CONTRIBUTING TO NORMS

Finally, IoT privacy policies should commit companies to the principle that consumers own the sensor data generated by their bodies, cars, homes, phones, and other connected devices.256 While creating proprietary interests in personal data is an extremely complicated proposal,257 the norm created by such a commitment would be highly advantageous, amplifying citizens’ expectations about the value of their personal data and how it should be stewarded.

Taken together, the above notification enhancement strategies represent some of the best research and development available to increase people’s awareness of how data about them is gathered and used. The strategies provide meaningful ways for people to affect the data practices going on around them, while also reducing the cognitive burdens involved in doing so.
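The broadcast-and-registry sketch promised above: devices announce their sensing capabilities to a registry, and an assistant app surfaces nearby devices to the user. This illustrates the pattern only; the actual CMU infrastructure differs in protocol and detail:

```python
REGISTRY = {}   # stands in for the networked IoT resource registry

def broadcast(device_id, sensors, location):
    """A device announces itself and its sensing capabilities."""
    REGISTRY[device_id] = {"sensors": sensors, "location": location}

def nearby_notices(user_location):
    """What an 'IoT Assistant' app might show for the user's location."""
    return [f"{d}: senses {', '.join(info['sensors'])}"
            for d, info in REGISTRY.items()
            if info["location"] == user_location]

broadcast("lobby-cam-1", ["video", "audio"], "lobby")
broadcast("thermo-2", ["temperature"], "office")
print(nearby_notices("lobby"))   # -> ['lobby-cam-1: senses video, audio']
```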

253 Ibid.
254 Ibid.
255 Peppet, 2014, p. 162
256 Ibid.
257 Prins, 2006

GOVERNANCE STRATEGIES

This section covers legislative strategies, as well as public-private and market-driven governance strategies. We discuss emerging law, suggested legal changes and rights, risk standards, and certification regimes.

EU GENERAL DATA PROTECTION REGULATION

The most significant governance development pertinent to the IoT is the EU’s GDPR. While we have highlighted specific elements of the GDPR in prior sections (such as the PIA section above), a comprehensive review of this expansive law is beyond the scope of this report; but it will clearly affect how companies collect data and notify users about its usage. These are early days for the law, and there is sure to be much ambiguity in the months and years ahead. Still, with its global focus and expensive sanctioning power, the GDPR’s impact on the privacy posture of devices and the services behind them will potentially be wide. Europe’s proposed ePrivacy Regulation, which is still being negotiated, is expected to play a similarly strong role. It specifically cites the IoT, noting that the principle of confidentiality applies to machine-to-machine communications and that additional specific safeguards could be adopted under sectoral regulation.258

258 European Commission, 2017b, Recital 12

BASELINE PRIVACY LAW IN THE US

As we mentioned in the main Governance section, one oft-discussed strategy is the enactment of a baseline, omnibus federal privacy law for the United States. This would help to clarify privacy expectations for the populace and improve the state of privacy that results from a porous, sectoral regulatory regime. The European model need not be absorbed wholesale; it serves as an instructive example. If sectors and social contexts are indeed collapsing, an omnibus, baseline federal privacy law is one method to strengthen privacy as the number and diversity of IoT devices increase.

DELETION RIGHTS

The GDPR stipulates a right to erasure in cases when data is no longer needed for processing, when people withdraw consent, and for other reasons.259 It’s not an absolute right, and it may prove difficult to exercise in some cases, but it establishes the core principle that people have a right to cause organizations to delete data about them. This is a new and important privacy right, speaking directly to issues of autonomy, choice, and strong user control. Sometimes called ‘the right to be forgotten,’ this right has been the source of much contention. Detractors see it as a step towards erasing historical memory or enabling the hiding of past misdeeds. The GDPR goes out of its way to address such criticisms, carving out exceptions for public interest, public health, and scientific or historical inquiry.260 Still, the power of this new right should not be overlooked, and non-European countries should also implement it in service of progressive policy. As the IoT is likely to collect enormous amounts of personal and sensitive data, and since people may not be aware of the scope of it, a right to erase data is a valuable way to give people enhanced control over that data.

259 GDPR, Art. 17
260 Ibid.

California has implemented a limited form of deletion rights for children, requiring websites, apps, and other service providers to make their content or information invisible upon request (though it may stay resident on the providers’ servers).261

The chance that deletion rights will emerge in US national legislation is very low. That said, the norm is important – it creates an expectation that organizations should delete data when a person wishes them to. Businesses often grant rights, powers, options, and protections that the law does not mandate, such as the periodic notification examples we cite above. There is nothing preventing US businesses from offering people the right to have their data deleted when they stop using a service, and they should be urged to do so by the privacy community and the public. Deletion rights are especially valuable for people whose data was collected when they were children, but we argue that the principle has general application. It supports the view that people are allowed to change their minds about consent and data sharing, and it can help rectify data collection that users were unaware of.

USE REGULATIONS

Given the view that Notice and Consent may be failing strategies in the face of ever-increasing data collection, some have suggested regulating data by use, rather than trying to declare all expected data uses up front at the time of consent.262 In other words, after collection, some data uses would simply be disallowed. One of the world’s first data protection laws, the US Fair Credit Reporting Act, is a use regulation: it specifies that consumer credit data may only be used in creditworthiness, employment, and housing decisions.263 With regard to the IoT, we’ve identified the problems with data collected in one context leaking into others. Not only can this violate users’ expectations, but there is the danger of discrimination when unwanted parties learn about personal characteristics, such as the case of an employer learning about an employee’s personal or political activities. Prof. Scott Peppet argues that privacy advocates and consumer groups should focus on keeping IoT data from violating contextual boundaries. For example, IoT health and fitness data, or details learned from in-home devices, should be restricted from use by insurers, lenders, and employers:

261 CA SB-568
262 Cate, et al., 2014
263 Fair Credit Reporting Act, 15 U.S.C. § 1681

A woman tracking her fertility should not fear that a potential employer could access such information and deny her employment; a senior employee monitoring his fitness regime should not worry that his irregular heart rate or lack of exercise will lead to demotion or termination; a potential homeowner seeking a new mortgage should not be concerned that in order to apply for a loan she will have to reveal her fitness data to a bank as an indicator of character, diligence, or personality.264

Similarly, people should neither be forced nor economically coerced into revealing data collected by IoT devices. Peppet notes:

One can easily imagine health and life insurers demanding or seeking access to fitness and health sensor data, or home insurers demanding access to home-monitoring system data. As such data become more detailed, sensitive, and revealing, states might consider prohibiting insurers from conditioning coverage on their revelation.265

Such prohibitions are already being enacted to prevent car insurers from conditioning the sale of an insurance policy, or the payment of a claim, upon gaining access to a vehicle’s event data recorder (‘black box’).266 A central point of contention in the use regulations debate is whether it should be businesses or regulators that specify which uses violate expectations.267 The kinds of prohibitions Peppet suggests would be legislatively driven, but there is also an incentive for businesses to consider restricting their data uses in service of consumer acceptance of IoT technologies. For example, businesses have wide latitude in which defaults they choose to enable in their products. Starting from the idea that certain data is sensitive, such as health and fitness information, manufacturers can choose to deploy devices with sharing and publicizing turned off. This comports with the Art29WP’s view: “Information published by IoT devices on social platforms should, by default, not become public or be indexed by search engines.”268
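As an illustration of such privacy-protective defaults, a manufacturer’s configuration object could ship with all sharing disabled, requiring explicit user action to loosen it. The fields below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SharingDefaults:
    """Hypothetical ship-time defaults reflecting the Art29WP view:
    nothing is shared or publicized unless the user opts in."""
    share_with_social_platforms: bool = False
    publicly_indexable: bool = False
    share_health_data_with_partners: bool = False
    local_processing_only: bool = True

settings = SharingDefaults()                  # privacy-protective out of the box
settings.share_with_social_platforms = True   # requires an explicit user action
```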

264 Peppet, 2014, p. 151
265 Ibid., p. 151
266 See Rosner, 2016a, pp. 3-5; Peppet, 2014, pp. 152-155
267 See Rosner, 2016b
268 Art29WP, 2014, p. 23

RISK STANDARDS

Standards that address risk management in the IoT are nascent at best. There is an opportunity to leverage guidance from other domains whose institutionalized risk standards have proven capable. Examples include:

• The FTC’s standard for Fairness: [1] the act or practice causes or is likely to cause substantial injury to consumers, [2] which is not reasonably avoidable by consumers themselves and [3] not outweighed by countervailing benefits to consumers or to competition.269

• The US Dep’t of Health and Human Services Risk of Harm Threshold for Breach Notification under the ‘HITECH’ Act: a breach is a use or disclosure that “compromises the security or privacy of the protected health information” and “poses a significant risk of financial, reputational, or other harm to the individual.” A breach is presumed to have occurred unless the business proves that there is a low probability that protected health information has been compromised.

• De-Identification: The de-identification (de-ID) of data is maturely practiced in the health field, and there are several methodologies that facilitate the management of re-identification risk. NIST’s Internal Report 8053270 and the HITRUST De-Identification Framework271 are valuable collections of de-ID methods and frameworks. The Future of Privacy Forum has helpfully assembled “A Visual Guide to Practical De-Identification,” which displays the spectrum from fully identifiable personal data to aggregated anonymous data.272

269 FTC, 1980
270 Garfinkel, 2015
271 HITRUST, n.d.
272 Finch, 2016

SECTOR-SPECIFIC PROTECTIONS: WEARABLES

In their comprehensive report, “Health Wearable Devices in the Big Data Era,” the authors lay out a set of actionable requirements for the health and fitness wearable sector:

• All data collected from a health or wellness wearable device should be considered sensitive, and thus require an affirmative and effective consent process before it can be collected and used.

• Clear, enforceable standards should be established for both the collection and use of information on wearables and other Internet-connected devices, with allowances for consumers to place limits on the data collected by and about them.

• Companies should be required to explain fully and in clear language what their data practices are, and there should be standardization of terminology so that comparisons are possible. They should also be required to make public disclosures about how they analyze data and use the derived knowledge.

• Wearable and other connected-health companies should not share user information with any third parties where advertising, marketing, or the promotion of other services is involved.

• Companies should comply with a person’s request for her own data as soon as possible and at the lowest cost.

• The metrics used to determine how de-identification is most effectively accomplished should be disclosed and subject to independent verification.

• Wearable devices and apps should be tested to determine that consumers will be able to understand their privacy choices and terms of service.

• Self-regulatory organizations should develop standards that apply to all sectors of the consumer connected-health industry, along with a process for independent auditing.

• The various participants in the digital health sector, including the wearable and mobile apps industry, should develop a set of fair marketing practices for using health-related data.273

CERTIFICATIONS

Various government and private sector bodies have begun publicly discussing what an IoT certification scheme might look like. In 2016, the European Commission (EC) expressed support for a Trusted IoT Label as part of its larger Digital Single Market program, to help consumers better understand varying levels of privacy and security in IoT products. The EC likened this idea to energy efficiency labeling requirements, in place since 2010.274 In September of 2017, the EC released a proposal for new regulations to bolster cybersecurity in the European Union, in which it discusses the role of cybersecurity certification in relation to the IoT: “The digital single market, and particularly the data economy and the Internet of Things, can only thrive if there is general public trust that such products and services provide a certain level of cybersecurity assurance.”275

273 Montgomery et al., 2017, pp. 6-7
274 European Commission, 2016a, footnote 88
275 European Commission, 2017a, p. 31

The private sector has also begun to propose certification schemes. The UK nonprofit IoT Security Foundation has launched a voluntary, self-assessed Best Practice User Mark “intended to help users communicate publically that they take IoT security seriously, are IoT security aware and are conscious of their responsibilities as a supplier of IoT products or services.”276 Also in the UK, work has begun on an ‘IoT Mark’ that would be formalized as a consumer-facing certification mark under British law. Growing out of the London IoT Meetup community, this mark scheme has attracted the attention of a wide range of stakeholders and will cover:277

• Data security
• Customer and consumer privacy
• Data governance
• Hardware & software security
• Interoperability
• Provenance
• Lifecycle
• Accountability in the supply chain

276 IoT Security Foundation, n.d.
277 See https://iotmark.wordpress.com/

In the US, Consumer Reports has partnered with Ranking Digital Rights, a nonprofit research project that reviews commercial privacy policies, and the Cyber Independent Testing Lab, a nonprofit software security testing organization founded by security expert Peiter ‘Mudge’ Zatko and others, to create The Digital Standard: “an ambitious, open, and collaborative effort to create a digital privacy and security standard to help guide the future design of consumer software, digital platforms and services, and Internet-connected products.”278 The Standard uses a wide-ranging set of ‘tests’ to evaluate IoT products and companies for characteristics such as safety, product stability, required password strength, security lifecycle, willingness to disclose vulnerabilities, degree of data control by users, data retention and deletion, data minimization, transparency, and corporate social responsibility.279

The schemes listed above are potential components of IoT governance, norm setting, consumer purchasing behavior, government procurement, and product liability. The Digital Standard, in particular, is the most comprehensive attempt to unify product features with value-laden characteristics, such as commitment to human rights, under a single certification banner.

278 See https://www.thedigitalstandard.org/
279 See https://www.thedigitalstandard.org/the-standard

RECOMMENDATIONS

In this section we make recommendations to improve the state of IoT privacy, based on areas we find to be either unaddressed or weakly addressed by current efforts, as well as areas where existing strategies should be amplified. The recommendations fall into four categories: research, funding, fostering dialogue, and governance.


RESEARCH

EMOTION DETECTION
Much more research is needed on the subject of emotion detection, in particular:

• People’s expectations about the gathering and use of emotion data, both from adults and children

• Policy weaknesses around the collection of emotion data
> US: how it fits within the PII framework, and the adequacy of this approach; whether the Children’s Online Privacy Protection Act (COPPA) is adequate for the collection of children’s emotional data
> EU: the evolving interpretation of emotion data under the GDPR
> How policy and the law deal with ‘intimate’ but not yet identifiable personal data
> Consumer protection dimensions of stockpiles of emotion data

• Security and privacy of existing emotion detection products (analyses should be performed by researchers and commercial firms)

CHILDREN’S ISSUES
Most privacy research is focused on adults. Given that there will be connected toys marketed specifically for children, not to mention household and public IoT products that also capture children’s data, more research on children’s privacy is essential, especially concerning:

• Emotion detection and sentiment analysis of children

• Child development in relation to IoT and AI toys

• A broadened exploration of children’s privacy rights in light of new technology’s ability to reveal more to parents and commercial actors

NORMS
Social norms are one of the most important factors in determining how privacy manifests in product development and how the public reacts to these products. Norms can either develop organically or be fostered. More research is needed on ‘norm entrepreneurship’ – the intentional shaping of norms by different actors. Research can help illuminate how:

• Governance actors such as State Attorneys General affect the privacy landscape280

• Commercial actors affect people’s perception of privacy

• Journalism can affect people’s privacy expectations

• Certification regimes, like the EU’s suggestion of a ‘Trusted IoT Label,’ can affect privacy views and purchasing habits

• The increased presence of microphones and cameras in the home affects behavior

DISCRIMINATION
There is already a body of research emerging on the discriminatory effects of big data.281 What’s needed is additional research on how IoT devices are part of the supply chain of information that can lead to discriminatory practices, such as:

• Legal, commercial, and technical research on data collected by wearables, by cars, or in the home leading to discriminatory insurance offerings

• Leakage of personal data from private contexts into the work environment leading to discriminatory firing or retribution, or effects on hiring practices

• The melding of IoT data with credit reports

280 Citron, 2016
281 See, e.g., Gangadharan, et al., 2014, “Data and Discrimination: Collected Essays”: https://newamerica.org/documents/945/data-and-discrimination.pdf

INSURANCE
The cyber insurance market is young and usually focused on security issues, such as ransom and data breaches, but insurance policies for privacy-related failures are beginning to emerge. More research is needed on the intersection of privacy, insurance, governance, norms, and firm behavior. Research into privacy insurance for harms to individuals beyond identity theft, such as reputational damage from a data breach, would be valuable.

GOVERNANCE
Research into governance activities will help to show which methods work and which need improvement. Critical areas include:

• Further comparative analysis and interdisciplinary research on the lessons learned from food and drug safety, environmental harm, and product liability as models for governmental and private sector approaches to IoT privacy

• Further research into which private or government departments are best positioned to make privacy risk management an element of merger reviews

• Further research into the privacy risk calculus performed by shareholders during mergers and acquisitions

• Research into how Cost-Benefit Analysis can align with long-term social harms, such as the diminishment of private spaces or chilling effects on behavior and speech

• Continued legal research on IoT privacy in the context of product liability; this is especially important in light of the IoT’s abilities to affect the physical environment

• Research into the effects of the GDPR’s class action rights

• Bridging identity management into other fields – IDM is barely represented in the academic community

• Projects that help consumers better understand the privacy and security aspects of IoT products, such as Consumer Reports’ Digital Standard

• Creating visualizations and graphic communications of:
> data flows for non-specialist audiences
> representations of privacy harms and values

FUNDING

We recommend allocating more funding to the following areas in order to enhance grantmakers’ portfolios of privacy interventions and programs:

• Programs to embed more technologists with policymakers, such as the Congressional Innovation Fellowship program and the Ford-Mozilla Open Web Fellows program

• Privacy advocates and consumer protection groups, with an emphasis on technology issues for the latter

• Journalists, both to ensure better training on new technologies, and to sustain adversarial journalistic organizations like ProPublica

FOSTERING DIALOGUE

Governments, nonprofits, and advocacy organizations have a continuing role to play in fostering dialogue, both in professional communities and in society at large. Dedicated educational outreach, public service announcements, and discussions at relevant conferences should be conducted on the following topics:

• The state of public and private surveillance

• The discourse of privacy values rather than harms: healthy democracy, full development of one’s personality, decision-making free from manipulation

In addition, we encourage:

• More dialogue between the risk management community and the academic privacy community

• More dialogue between the insurance community and the privacy community

• Active public campaigns to enhance people’s views of the value of their data

• More global dialogue to learn from different privacy cultures and governance models

• Professional dialogue to harmonize international risk frameworks and best practices

• Professional dialogue to further develop metrics, models, data sets, and testbeds to test and evaluate IoT products and best practices

• Publishing of case studies of IoT product privacy development

GOVERNANCE

The following are recommendations for IoT governance, broadly construed:

• Apply pressure on sector-specific legislation to include strong privacy and fairness elements

• Help create enhanced methods for American users to report privacy infringements, similar to Europeans’ ability to report companies to data protection authorities

• Facilitate cyber insurance markets

• Actively enhance privacy norms at the government level through progressive procurement policies

• Mandate data disclosure, transparency, and reporting to understand residual privacy risk from the IoT and manage this risk appropriately; e.g., expand laws to cover entities not covered by existing breach disclosure laws, require disclosure of breach events irrespective of whether a harm has manifested, require more detailed and standardized breach reports, develop repositories of privacy breach incidents to help companies with situational awareness, and require breach reporting within 72 hours

• Champion policies that require IoT manufacturers to indicate how long they will provide security updates for products

• Experiment with tax, subsidy, and permit-based mechanisms to incentivize companies to make more socially desirable choices about privacy ‘pollution’

RESEARCH PARTICIPANTS

The following people graciously lent their time and voices to this research. Their titles are the ones they had at the time of their participation.

Interviews

Justin Brookman, Director, Privacy and Technology Policy, Consumers Union
Michelle De Mooy, Director, Privacy & Data Project, Center for Democracy & Technology
Ben Dean, Ford/Media Democracy Fund Technology Exchange Fellow, Center for Democracy & Technology
Ian Glazer, VP Identity Product Management, Salesforce
Noah Harlan, Board Member, EdgeX Foundry
Joe Jerome, Policy Counsel, Privacy & Data Project, Center for Democracy & Technology
Meg Leta Jones, Assistant Professor, Communication, Culture & Technology, Georgetown University
Eve Maler, VP Innovation & Emerging Technology, ForgeRock
Andrew McStay, Professor of Digital Life, Bangor University
Dawn Nafus, Anthropologist, Intel
Heather Patterson, Senior Research Scientist, Privacy & Ethics, Intel
Rafi Rich, CEO, S.U.iT.S. (Smarter Urban iT & Strategies)
Florian Schaub, Assistant Professor, School of Information, University of Michigan
Lee Tien, Senior Staff Attorney, Electronic Frontier Foundation
Joris van Hoboken, Professor of Law, Vrije Universiteit Brussel
Representative #1, IoT Industry Company
Representative #2, IoT Industry Company
Senior US Government Official

Workshop Participants

Jim Adler, VP, Toyota Research Institute
Stephen Balkam, CEO, Family Online Safety Institute
Justin Brookman, Director, Privacy and Technology Policy, Consumers Union
Dan Caprio, Chairman, Providence Group

Betsy Cooper, Executive Director, Center for Long-Term Cyber Security, UC Berkeley
Ben Dean, Ford/Media Democracy Fund Technology Exchange Fellow, Center for Democracy & Technology
Laura DeNardis, Professor, School of Communication, American University
Justin Erlich, former Principal Advisor on technology, data, and privacy to the California Attorney General, California Department of Justice
Andrea Glorioso, Counselor (Digital Economy/Cyber), Delegation of the European Union to the United States
Stacey Gray, Policy Counsel, Future of Privacy Forum
Joe Jerome, Policy Counsel, Privacy & Data Project, Center for Democracy & Technology
Jennifer King, Co-Director, Center for Technology, Society & Policy, UC Berkeley
Emily McReynolds, Program Director, Tech Policy Lab, University of Washington
Melanie Millar-Chapman, Manager, Research, Office of the Privacy Commissioner of Canada
Maria Rerecich, Director, Electronics Testing Team, Consumers Union
Jatinder Singh, Senior Research Associate, University of Cambridge
Adam Thierer, Senior Research Fellow, Mercatus Center, George Mason University
Lee Tien, Senior Staff Attorney, Electronic Frontier Foundation
Ian Wallace, Senior Fellow in the International Security Program & Co-Director of the Cybersecurity Initiative, New America
Danny Weiss, Regional Director, Washington DC Area, Common Sense Media
Timothy Yim, Director of Data & Privacy, Startup Policy Lab
Meg Young, Ph.D. Candidate, Information School and Tech Policy Lab, University of Washington
Advocate Technologist
Representative, IoT Industry Company
Representative, Social Media Company
Senior US Government Official
Senior Security Researcher

BIBLIOGRAPHY

Aberbach, J. and Rockman, B. (2002). Conducting and Coding Elite Interviews. PS: Political Science and Politics, 35(4): 673-676.

Ackerman, L. (2013). Mobile Health and Fitness Applications and Information Privacy. Privacy Rights Clearinghouse. Retrieved from https://www.privacyrights.org/sites/default/files/mobile-medical-apps-privacy-consumer-report.pdf

Agre, P. and Rotenberg, M. (1997). Technology and Privacy: The New Landscape. Cambridge: MIT Press.

Akalu, R., et al. (2016). Paving the way for Intelligent Transport Systems (ITS): The Privacy Implications of Vehicle Infotainment Systems. Proceedings of the 6th ACM Symposium on Development and Analysis of Intelligent Vehicular Networks and Applications (pp. 25-31). New York: ACM.

Altman, I. (1976). Privacy: A Conceptual Analysis. Environment and Behavior, 8(1): 7-30.

Altman, I. (1977). Privacy Regulation: Culturally Universal or Culturally Specific? Journal of Social Issues, 33(3): 66-84.

Altman Vilandrie & Company. (2017). Are your company’s IoT devices secure? Internet of Things Breaches are Common, Costly for U.S. Firms. Retrieved from http://www.altvil.com/wp-content/uploads/2017/06/AVCo.-IoT-Security-White-Paper-June-2017.pdf

Antonakakis, M., et al. (2017). Understanding the Mirai Botnet. Proceedings of the 26th USENIX Security Symposium. Retrieved from https://www.usenix.org/system/files/conference/usenixsecurity17/sec17-antonakakis.pdf

Article 29 Data Protection Working Party. (2014). Opinion 8/2014 on Recent Developments on the Internet of Things. Retrieved from http://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf

Article 29 Data Protection Working Party. (2017). Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679. Retrieved from http://ec.europa.eu/newsroom/document.cfm?doc_id=44137

Article 29 Data Protection Working Party. (2018). Guidelines on transparency under Regulation 2016/679. Retrieved from http://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51025

Barth, A., Datta, A., Mitchell, J., and Nissenbaum, H. (2006). Privacy and Contextual Integrity: Framework and Applications. Proceedings of the IEEE Symposium on Security and Privacy. Retrieved from https://ssrn.com/abstract=2567438

Bennett, C. and Raab, C. (2003). The Governance of Privacy: Policy Instruments in Global Perspective. Burlington: Ashgate Publishing.

Braun, V. and Clarke, V. (2006). Using thematic Castro D. and New, J. (2016). Everything the analysis in psychology. Qualitative Research in U.S. Government Is Doing to Help the Private Psychology, 3(2), 77-101. Sector Build the Internet of Things. Center for Data Innovation. Retrieved from http://www2. Brookman, J. and Hans, G.S. (2013). “Why datainnovation.org/2016-federal-support-iot.pdf Collection Matters: Surveillance as a De Facto Harm,” Center for Democracy and Technology, Cate, F., Cullen, P. and Mayer-Schönberger, V. available at (2014). Data Protection Principles for the 21st http://www.futureofprivacy.org/wp-content/ Century: Revising the 1980 OECD Guidelines. uploads/Brookman-Why-Collection-Matters.pdf Retrieved from https://www.oii.ox.ac.uk/archive/ downloads/publications/Data_Protection_ Bruening, P. and Patterson, H. (2016). A Context- Principles_for_the_21st_Century.pdf Driven Rethink of the Fair Information Practice Principles. Retrieved from https://ssrn.com/ Center for Information Policy Leadership. abstract=2843315 (2017). Comments by the Centre for Information Policy Leadership on the California Department of Justice. (2013). Privacy Article 29 Data Protection Working Party’s on the Go: Recommendations for the Mobile “Guidelines on Transparency”. Retrieved from Ecosystem. Retrieved from https://oag.ca.gov/ https://www.informationpolicycentre.com/ sites/all/files/agweb/pdfs/privacy/privacy_on_ uploads/5/7/1/0/57104281/cipl_response_to_ the_go.pdf wp29_guidelines_on_transparency-c.pdf

Calo, R. (2012). Against Notice Skepticism in Chang, A. (2018). The Facebook and Cambridge Privacy (And Elsewhere). Notre Dame Law Analytica scandal, explained with a simple Review, 87(3): 1027-1072. diagram. Vox. Retrieved from https://www.vox. com/policy-and-politics/2018/3/23/17151916/ Calo, R. (2013). Digital Market Manipulation. facebook-cambridge-analytica-trump-diagram George Washington Law Review, 82(4): 995-1051. Citron, D. (2016). The Privacy Policymaking of Camp, J., Henry, R., Meyers, S., and Russo, G. (2016). State Attorneys General. Notre Dame Law Review Comment in response to Dept. of Commerce 92(2): 747-816. NTIA “Request for Comments on the Benefits, Challenges, and Potential Roles for the Government Cloud Security Alliance Mobile Working Group. in Fostering the Advancement of the Internet of (2015). Security Guidance for Early Adopters Things”. Retrieved from https://www.ntia.doc.gov/ of the Internet of Things. Retrieved from files/ntia/publications/camp_et_al.pdf https://downloads.cloudsecurityalliance.org/ whitepapers/Security_Guidance_for_Early_ Camp, L. and Johnson, M. (2012). The Economics Adopters_of_the_Internet_of_Things.pdf of Financial and Medical Identity Theft. New York: Springer. Cogeco. (2017). The Cost of DDoS Attacks and Building the Business Case Campaign for a Commercial-Free Childhood. for Protection. Retrieved from https://www. (n.d.). Hell No Barbie: 8 reasons to leave Hello cogecopeer1.com/wp-content/uploads/2017/03/ Barbie on the shelf. Retrieved from http://www. Counting-the-Costs-of-DDoS-Attacks-DDoS- commercialfreechildhood.org/action/hell-no- Services-Whitepaper.pdf barbie-8-reasons-leave-hello-barbie-shelf Cohen, J. (2013). What Privacy is For. Harvard Law Review, 126(7): 1904-1933. BIBLIOGRAPHY Clearly Opaque: Privacy Risks of the Internet of Things 135

Cohen, J. (2014). The Private Life: Why We European Commission. (2017a). Cybersecurity Act. Remain in the Dark. London: Retrieved from https://ec.europa.eu/transparency/ Granta Books. regdoc/rep/1/2017/EN/COM-2017-477-F1-EN-MAIN- PART-1.PDF Consumer Watchdog. (2017). Google, Amazon Patent Filings Reveal Digital Home Assistant European Commission. (2017b). Regulation on Privacy Privacy Problems. Retrieved from http://www. and Electronic Communications. Retrieved from http:// consumerwatchdog.org/report/home-invasion- ec.europa.eu/newsroom/dae/document.cfm?doc_ google-amazon-patent-filings-reveal-digital- id=41241 home-assistant-privacy-problems European Union. (2012). Charter of Fundamental Costa, L. (2016). Privacy and the Precautionary Rights of the European Union 2012/C 326/02 Principle. Computer Law & Security Review 28(1): 14-24. Eversole, A., Day, T. and Brown, M. (2016). Comments of the US Chamber of Commerce Center for Advanced Dean, B. (2018). Strict Product Liability and Technology and Innovation. Retrieved from https:// the Internet of Things. Center for Democracy www.ntia.doc.gov/files/ntia/publications/cati. & Technology. Retrieved from https://cdt.org/ iotcommentsfinal.pdf files/2018/04/2018-04-16-IoT-Strict-Products- Liability-FNL.pdf Fadell, A., Matsuoka, Y., Sloo, D. and Veron, M. (2016). Smart System that Suggests or DeCew, J. (2013). Privacy. In E. Zalta (ed.). Automatically Implements Selected Household Policies Stanford Encyclopedia of Philosophy. Stanford: Based on Sensed Observations. US2016/0259308A1 The Metaphysics Research Lab. Fairfield, J. and Engel, C. (2015). Privacy as a Public Draper, K. (2018, Mar 13). Madison Square Good. Duke Law Journal, 65(3), 385-457. Garden Has Used Face-Scanning Technology on Customers. New York Times. Retrieved from Federal Trade Commission. (1980). FTC Policy https://www.nytimes.com/2018/03/13/sports/ Statement on Unfairness. Retrieved from https:// facial-recognition-madison-square-garden.html www.ftc.gov/public-statements/1980/12/ftc-policy- statement-unfairness Edara, K. (2014). Keyword Determinations from Voice Data. US2014/0337131A1 Federal Trade Commission. (2015). Internet of Things: Privacy & Security in a Connected World. Retrieeved Epps, D., Saunders, K., and Tye, M. (2016). from https://www.ftc.gov/system/files/documents/ Verizon’s Comments. Retrieved from https:// reports/federal-trade-commission-staff-report- www.ntia.doc.gov/files/ntia/publications/vz_ november-2013-workshop-entitled-internet-things- comments_re_ntia_iot_notice_6-2.pdf privacy/150127iotrpt.pdf

European Commission. (2016a). Advancing Federal Trade Commission. (2016a). Putting the Internet of Things in Europe. Retrieved Disclosures to the Test: Staff Summary. Retrieved from from http://eur-lex.europa.eu/legal-content/ https://www.ftc.gov/system/files/documents/reports/ TXT/?uri=CELEX:52016SC0110 putting-disclosures-test/disclosures-workshop-staff- summary-update.pdf European Commission. (2016b). Digital Single Market – Digitising European Industry Questions Federal Trade Commission. (2017). Informational Injury and Answers. Retrieved from http://europa.eu/ Workshop. [slides] Retrieved from https://www.ftc.gov/ rapid/press-release_MEMO-16-1409_en.htm system/files/documents/public_events/1256463/ informational_injury_workshop_slides.pdf 136 Clearly Opaque: Privacy Risks of the Internet of Things BIBLIOGRAPHY

Finch, K. (2016). A Visual Guide to Practical Jones, M. (2014). Privacy Without Screens & the Data De-Identification. Future of Privacy Forum. Internet of Other People’s Things. Idaho Law Retrieved from https://fpf.org/2016/04/25/a- Review, 51(3): 639-660. visual-guide-to-practical-data-de-identification/ Froomkin, A.M. (2015). Regulating Mass Jones, M. and Meurer, K. (2016). Can (and Surveillance as Privacy Pollution: Learning from Should) Hello Barbie Keep a Secret? 2016 IEEE Environmental Impact Statements. University of International Symposium on Ethics in Engineering, Illinois Law Review 2015(5): 1713-1790. Science and Technology. Retrieved from https:// papers.ssrn.com/sol3/papers.cfm?abstract_ Foulon, J., Lanoie, P., and Laplante, B. (2002). id=2768507 Incentives for Pollution Control: Regulation or Information?, Journal of Environmental Economics Kaminsky, M. (2015). Robots in the Home: What and Management, 44(1): 169-187. Will We Have Agreed To? Idaho Law Review 51(3): 661-667. Garfinkel, S. (2015). NISTIR 8053: De- Identification of Personal Information. Karpf, D. (2017). Sorting through the Facebook- Retrieved from https://nvlpubs.nist.gov/nistpubs/ Russia-Trump Advertising Story. Retrieved ir/2015/NIST.IR.8053.pdf from https://medium.com/@davekarpf/sorting- through-the-facebook-russia-trump-advertising- Goodman, E. (2015). The Atomic Age of Data story-d096e3df3edb Policies for the Internet of Things. Queenstown: Aspen Institute. Kearney J., and Reynolds, A. (2016). Comments of the Consumer Technology Association. Gray, S. (2016). Always On: Privacy Implications Retrieved from: https://www.ntia.doc.gov/files/ of Microphone-Enabled Devices. Future of ntia/publications/cta_comments_re_ntia_iot_rfc- Privacy Forum. Retrieved from https://fpf.org/ final-060216_2.pdf wp-content/uploads/2016/04/FPF_Always_On_ WP.pdf Kohnstamm, J. and Madhub, D. (2014). Mauritius Declaration on the Internet of Things. 36th Haggerty K., and Ericson, R. (2000). British International Conference of Data Protection Journal of Sociology, 51(4): 605–622. and Privacy Commissioners. Retrieved from https://edps.europa.eu/sites/edp/files/ Hirsch, D., (2006). Protecting the Inner publication/14-10-14_mauritius_declaration_ Environment: What Privacy Regulation Can Learn en.pdf from Environmental Law. Georgia Law Review, 4(1): 1-63. Könings, B. and Schaub, F. (2011). Territorial Privacy in Ubiquitous Computing. Eighth HITRUST. (n.d.). HITRUST De-Identification International Conference on Wireless On-Demand Framework. Retrieved from https://hitrustalliance. Network Systems and Services. (pp. 104-108). net/de-identification/ New York: IEEE

IoT Security Foundation (n.d.) Best Practice User Konings, B., Schaub, F., and Weber, M. (2016). Mark FAQ and Terms of Use. Retrieved from Privacy and Trust in Ambient Intelligent https://www.iotsecurityfoundation.org/best- Environments. In S. Ultes, F. Nothdurft, T. practice-user-mark/ Heinroth, and W. Minker (Eds.). Next Generation Intelligent Environments. Dordrecht: Springer. BIBLIOGRAPHY Clearly Opaque: Privacy Risks of the Internet of Things 137

Koops, B-J, Newell, B., Timan, T., Škorvánek, I., Massey, C., Sharkey, S., Roland, J., and Crawford, Chokrevski, T., and Galič, M. (2017). A Typology D. (2016). Comments of T-Mobile USA, Inc. of Privacy. University of Pennsylvania Journal of Retrieved from https://www.ntia.doc.gov/files/ International Law 38(2): 483-575. ntia/publications/t-mobile_ntia_internet_of_ things_comments_vfinal.pdf Kyllo v. U.S., 533 U.S. 27 (2001). McDonald, A. and Cranor, L. (2008). The Cost of Lindblom, C. (1979). Still Muddling: Not Yet Reading Privacy Policies. I/S: A Journal of Law Through. Public Administration Review and Policy for the Information Society 4(3): 543- 39(6): 517-526. 565.

Lyon,D. (2008). Surveillance Society. Talk for McKay, K., Bassham, L., Turan, M., and Mouha Festival del Diritto, Piacenza, Italy. Retrieved N. (2017). NISTIR 8114: Report on Lightweight from http://www.festivaldeldiritto.it/2008/pdf/ Cryptography. Retrieved from https://nvlpubs. interventi/david_lyon.pdf nist.gov/nistpubs/ir/2017/NIST.IR.8114.pdf

Maheshwari, A. (2018, March 31). Hey, Alexa, McStay, A. (2016). Empathic media and What Can You Hear? And What Will You Do advertising: Industry, policy, legal and citizen With It? New York Times. Retrieved from https:// perspectives (the case for intimacy). Big Data & www.nytimes.com/2018/03/31/business/media/ Society 3(2): 1-11. amazon-google-privacy-digital-assistants.html McStay, A. (2018). Emotional AI: The Rise of Maldoff, G. (2016). Top 10 Operational Impacts Empathic Media. London: Sage. of the GDPR: Part 6 – RTBF and data portability. IAPP: The Privacy Advisor. [blog]. Retrieved from Montgomery, K., Chester, J., and Kopp, K. (2017). https://iapp.org/news/a/top-10-operational- Health Wearable Devices in the impacts-of-the-gdpr-part-6-rtbf-and-data- Big Data Era: Ensuring Privacy, Security, and portability/ Consumer Protection. Center for Digital Democracy. Retrieved from https://www.democraticmedia. Maler, E. (2017). Designing a New Consent org/sites/default/files/field/public/2016/aucdd_ Strategy for Digital Transformation. RSA wearablesreport_final121516.pdf Conference. Retrieved from https://www. rsaconference.com/events/us17/agenda/ National Highway Traffic Safety Administration. sessions/6826-designing-a-new-consent- (n.d.) Automated Driving Systems: FAQ: strategy-for-digital Voluntary Guidance. Retrieved from https://www. nhtsa.gov/manufacturers/automated-driving- Mallet, S. (2004). Understanding home: a critical systems#automated-driving-systems-faq review of the literature. The Sociological Review 52(1), 62-89. National Highway Traffic Safety Administration. (2016). Federal Automated Vehicles Policy. Martin, E.D. (1928). Education. In C. Beard Retrieved from https://www.transportation. (ed.), Whither Mankind: A Panorama of Modern gov/sites/dot.gov/files/docs/AV%20policy%20 Civilization. New York: Longman, Green and Co. guidance%20PDF.pdf Marx, G. (2012). Privacy Is Not Quite Like the Weather. In D. Wright and P. De Hert (Eds.). Privacy Impact Assessment. Dordrecht: Springer. 138 Clearly Opaque: Privacy Risks of the Internet of Things BIBLIOGRAPHY

Nissenbaum, H. (2009) Privacy in Context. Picard, R. (1995). Affective Computing. M.I.T Stanford: Stanford University Press. Media Laboratory Perceptual Computing Section Technical Report No. 321. Retrieved from https:// North, D. (1991). Institutions. The Journal of affect.media.mit.edu/pdfs/95.picard.pdf Economic Perspectives, 5(1): 97-112. Powles, J. (2015, Mar 11). We are citizens, not Office of the Privacy Commissioner of Canada. mere physical masses of data for harvesting. (2016). The Internet of Things: An introduction to The Guardian. Retrieved from https://www. privacy issues with a focus on the retail and home theguardian.com/technology/2015/mar/11/we- environments. Retrieved from https://www.priv. are-citizens-not-mere-physical-masses-of-data- gc.ca/en/opc-actions-and-decisions/research/ for-harvesting explore-privacy-research/2016/iot_201602/ President’s Council of Advisors on Science and Ohm, P. (2010). Broken Promises of Privacy: Technology. (2014). Big Data and Privacy: A Responding to the Surprising Failure of Technical Perspective. Retrieved from https:// Anonymization. UCLA Law Review, Vol. 57(6): obamawhitehouse.archives.gov/sites/default/ 1701-1777. files/microsites/ostp/PCAST/pcast_big_data_ and_privacy_-_may_2014.pdf Ovum. (2015). Understanding the IoT Opportunity: An Industry Perspective – Use cases Prins, C. (2006). When personal data, behavior and opportunities in different verticals. Retrieved and virtual identities become a commodity: from https://www.linkedin.com/pulse/ovum- Would a property rights approach matter? report-delivers-eight-clear-messages-internet- SCRIPT-ed, 3(4): 270-303. things-niall-hunt/ Prosser, W. (1960). Privacy. California Law Review Palen, L. and Dourish, P. (2003). Unpacking 48(3): 383-423. “privacy” for a networked world. Proceedings of the SIGCHI Conference on Human Factors in Regan, P. (1995). Legislating Privacy. Technology, Computing Systems (pp. 129-136). New York: ACM. Social Values, and Public Policy. Chapel Hill: University of North Carolina Press. Patterson, H. (2013). Contextual Expectations of Privacy in Self-Generated Health Information Reidenberg, J. (2014). Privacy in Public. University Flows. TPRC 41: The 41st Research Conference on of Miami Law Review 69(1): 141-160. Communication, Information and Internet Policy. Retrieved from https://papers.ssrn.com/sol3/ Rosner, G. (2016a). Internet of Things Privacy papers.cfm?abstract_id=2242144 Forum Comments Submitted to the National Telecommunications and Information Peppet, S. (2014). Regulating the Internet Administration. Retrieved from https://www. of Things: First Steps Toward Managing ntia.doc.gov/files/ntia/publications/rosner_ Discrimination, Privacy, Security, and Consent. response_-_iot_privacy_forum.pdf Texas Law Review 93(1): 85-176. Rosner, G. (2016b). In the age of connected devices, Pew Research Center. (2014). The Internet of will our privacy regulations be good enough? Things Will Thrive By 2025. Retrieved from http:// O’Reilly. [blog]. Retrieved from https://www.oreilly. www.pewinternet.org/2014/05/14/internet-of- com/ideas/in-the-age-of-connected-devices-will- things/ our-privacy-regulations-be-good-enough BIBLIOGRAPHY Clearly Opaque: Privacy Risks of the Internet of Things 139

Sadeh, N. (2017). Privacy in the Age of IoT: Schwartz, P. (1999). Internet Privacy and the Technologies to Help Users, Developers, IoT State. Connecticut Law Review, 32: 815-859. Vendors and Regulators. Presentation at Privacy Engineering Research and the GDPR: A Trans- Schwartz, P. (2004). Property, Privacy, and Personal Atlantic Initiative. Retrieved from https:// data. Harvard Law Review, 117(7): 2056-2128. www.cylab.cmu.edu/_files/pdfs/partners/ conference2017/Sadeh.pdf Schwartz, P. (2013). Information Privacy in the Cloud. University of Pennsylvania Law Review Saldaña, J. (2009). The Coding Manual for 161(6): 1623-1662. Qualitative Researchers. London: Sage. Scott, W. (2003). Institutional carriers: reviewing Santucci, G. (2011). The Internet of Things: The modes of transporting ideas over time and space Way ahead. In O. Vermesan and P. Friess (eds.) and considering their consequences. Industrial Internet of Things - Global Technological and and Corporate Change, 12(4): 879-894. Societal Trends From Smart Environments and Spaces to Green ICT. Aalborg: River Publishers. Sholtz, P. (2001). Transactions Costs and the Social Costs of Online Privacy. First Monday, 6(5). Santucci, G. (2013). Privacy in the Digital Retrieved from http://www.ojphi.org/ojs/index. Economy: Requiem or Renaissance? An Essay on php/fm/article/view/859/768 the The Future of Privacy. The Privacy Surgeon. Retrieved from http://www.privacysurgeon.org/ Sikdar, A., Behera, S., and Dogra, D. (2016). blog/wp-content/uploads/2013/09/Privacy-in- Computer-Vision-Guided Human Pulse Rate the-Digital-Economy-final.pdf Estimation: A Review. IEEE Reviews in Biomedical Engineering, 9, 91-105. Sarnacki, D. (1984). Analyzing the Reasonableness of Bodily Intrusions. Marquette Law Review, 68(1), Slomovic, A. (2015). Workplace Wellness, Privacy 130-153. and the Internet of Things. [blog post]. Retrieved from https://www.annaslomovic.com/single- Schaub, F., Balabako, R., Durity, A., and Cranor post/2015/01/31/Workplace-Wellness-Privacy- L. (2015). A Design Space for Effective Privacy and-the-Internet-of-Things Notices. Proceedings of the Eleventh Symposium on Usable Privacy and Security (SOUPS 2015) Singer, N. (2016 Feb 28). Why a Push for Online Retrieved from https://www.usenix.org/system/ Privacy Is Bogged Down in Washington. New files/conference/soups2015/soups15-paper- York Times. Retrieved from https://www.nytimes. schaub.pdf com/2016/02/29/technology/obamas-effort-on- consumer-privacy-falls-short-critics-say.html Schifferle, L. (2016). What is your phone telling your rental car? FTC Consumer Information. Sloan, R. and Warner, R. (2013). Beyond Notice [blog]. Retrieved from https://www.consumer.ftc. and Choice: Privacy, Norms, and Consent. Journal gov/blog/2016/08/what-your-phone-telling-your- of High Technology Law 14(2): 370-412. rental-car Solove, D. (2006). A Taxonomy of Privacy. Schneier, B. (2016, Jan 26). IoT Security: University of Pennsylvania Law Review, 154(3), The Market Has Failed. Schneier on Security. 477-560. Retrieved from https://www.schneier.com/news/ archives/2017/01/iot_security_the_mar.html 140 Clearly Opaque: Privacy Risks of the Internet of Things BIBLIOGRAPHY

Stewart, R. (2002). Environmental regulatory UN General Assembly. (1992). Rio Declaration decision making under uncertainty. In T. Swanson on Environment and Development. Retrieved (Ed.). An Introduction to the Law and Economics from http://www.un.org/documents/ga/conf151/ of Environmental Policy: Issues in Institutional aconf15126-1annex1.htm Design. Bingley: Emerald Publishing US Department of Commerce. (2017). Fostering Sunstein, C. (2003). Beyond the Precautionary the Advancement of the Internet of Things. Principle. University of Pennsylvania Law Review, Retrieved from https://www.ntia.doc.gov/files/ 151(3): 1003-1058. ntia/publications/iot_green_paper_01122017.pdf

Tene, O. (2011). Privacy: The New Generations. US Department of Health and Human Services. International Data Privacy Law, 1(1): 15–27. (2016). Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not The Economist. (2018, Mar 22). A pedestrian has Regulated by HIPAA. Retrieved from https://www. been killed by a self-driving car. The Economist. healthit.gov/sites/default/files/non-covered_ Retrieved from https://www.economist.com/ entities_report_june_17_2016.pdf news/science-and-technology/21739149- driverless-tragedy-pedestrian-has-been-killed- US Department of Health, Education and Welfare. self-driving-car (1973). Records, Computers and the Rights of Citizens. Retrieved from https://epic.org/privacy/ Thierer, A. (2013). Technopanics, Threat Inflation, hew1973report/default.html and the Danger of an Information Technology Precautionary Principle. Minnesota Journal of US Department of Homeland Security. (2008). Law, Science & Technology, 14(1): 309-386. The Fair Information Practice Principles. Memorandum Number 2008-01. Retrieved Thierer, A. (2016). Permissionless Innovation: from https://www.dhs.gov/sites/default/files/ The Continuing Case for Comprehensive publications/privacy_policyguide_2008-01_0.pdf Technological Freedom. Arlington: Mercatus Center at George Mason University. US Environmental Protection Agency. (n.d.). Economic Incentives. Retrieved from UK Information Commissioner’s Office. (n.d.) https://www.epa.gov/environmental-economics/ Where should you deliver privacy information to economic-incentives. individuals? Retrieved from https://ico.org.uk/for- organisations/guide-to-data-protection/privacy- Uteck, A. (2013). Reconceptualizing Spatial notices-transparency-and-control/where-should- Privacy for the Internet of Everything. you-deliver-privacy-information-to-individuals/ (Unpublished doctoral dissertation). University of Ottawa, Ottawa, Ontario. UK Information Commissioner’s Office. (2016). Privacy regulators study finds Internet of Things shortfalls. Retrieved from https://ico.org.uk/ about-the-ico/news-and-events/news-and- blogs/2016/09/privacy-regulators-study-finds- internet-of-things-shortfalls/ BIBLIOGRAPHY Clearly Opaque: Privacy Risks of the Internet of Things 141

Vaismoradi, M., Jones, J., Turunen, H., and Snelgrove, S. (2016). Theme development in qualitative content analysis and thematic analysis. Journal of Nursing Education and Practice, 6(5). 100-110.

Van den Hoven, J., Guimarães Pereira, Â., Dechesne, F., Timmermans, J., and Vom Lehn, H. (2016). Fact sheet – Ethics Subgroup IoT – version 4.0. Retrieved from http://ec.europa.eu/ information_society/newsroom/cf/dae/document. cfm?doc_id=1751

Vila, T., Greenstadt, R., and Molnar, D. (2003). Why We Can’t be Bothered to Read Privacy Policies: Models of Privacy Economics as Lemons Market. Proceedings of the 5th International Conference on Electronic Commerce. Retrieved from http://www.dmolnar.com/papers/ econprivacy.pdf

Warren, S. and Brandeis, L. (1890). The Right to Privacy. Harvard Law Review, 4(5): 193-220.

Westin, A. 1967. Privacy and Freedom. New York: Atheneum.

Woolf, N. (2016 Oct 26). DDoS attack that disrupted internet was largest of its kind in history, experts say. The Guardian. Retrieved from https://www.theguardian.com/technology/2016/ oct/26/ddos-attack-dyn-mirai-botnet

Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology 30(1): 75–89. 142 Clearly Opaque: Privacy Risks of the Internet of Things APPENDIX 1

APPENDIX 1 IOT PRIVACY-RELEVANT STANDARDS AND CONSORTIA

MULTI-LAYER CONSORTIA

IPSO Alliance
https://www.ipso-alliance.org/

Eclipse Foundation IoT
https://iot.eclipse.org/

oneM2M
http://www.onem2m.org/

EdgeX Foundry
https://www.edgexfoundry.org/

TeleHash
http://telehash.org/

IoTivity
https://www.iotivity.org/

OMA LightweightM2M
http://www.openmobilealliance.org/wp/overviews/lightweightm2m_overview.html

W3C
https://www.w3.org/WoT/

IDENTITY MANAGEMENT

User-Managed Access Protocol
https://kantarainitiative.org/confluence/display/uma/Home

Consent Receipts
https://kantarainitiative.org/confluence/display/infosharing/Consent+Receipt+Specification

OAuth 2.0
https://oauth.net/2/

Health Relationship Trust (HEART)
https://openid.net/wg/heart/

Authorization and Authentication for Constrained Environments
https://tools.ietf.org/wg/ace/charters

STANDARDS DEVELOPMENT EFFORTS

ITU-T SG20 Internet of Things and Smart Cities and Communities
https://www.itu.int/en/ITU-T/studygroups/2013-2016/20/Pages/default.aspx

IEEE P1912 WG - Privacy and Security Architecture for Consumer Wireless Devices Working Group
https://standards.ieee.org/develop/wg/P1912_WG.html

ISO 29100 - Privacy Framework
https://www.iso.org/standard/45123.html

CONNECTED BODY

Personal Connected Health Alliance
http://www.pchalliance.org/

Wireless Life Sciences Alliance
http://wirelesslifesciences.org/

CONNECTED HOME

Thread Group
https://www.threadgroup.org/

Z-Wave
http://www.z-wave.com/

Insteon
https://www.insteon.com/

HomePlug
http://www.homeplug.org/

Home Gateway Initiative
http://www.homegatewayinitiative.org/

Weave
https://nest.com/weave/

CONNECTED BUILDINGS

Smart Buildings Alliance
http://www.smartbuildingsalliance.org/

EnOcean Alliance
https://www.enocean-alliance.org/

Lonworks
http://www.lonmark.org/

CONNECTED CARS AND TRANSPORTATION

Genivi
https://www.genivi.org/

Open Automotive Alliance
https://www.openautoalliance.net/#about

Automotive-grade Linux
https://www.automotivelinux.org/

IEEE 1609 - Wireless Access in Vehicular Environments
https://standards.ieee.org/develop/wg/1609.html

Cellular V2X
https://www.qualcomm.com/invention/technologies/lte/advanced-pro/cellular-v2x

SAE J2945/1 - On-Board System Requirements for V2V Safety Communications
https://www.sae.org/standards/content/j2945/1_201603/

IP Wireless Access in Vehicular Environments
https://tools.ietf.org/wg/ipwave/charters

APPENDIX 2 IOT PRIVACY-RELEVANT ACADEMIC CENTERS

SOCIAM: The Theory and Practice of Social Machines http://sociam.org

The Information Society Project, Yale Law School https://law.yale.edu/isp

Privacy Lab, Yale Law School https://privacylab.yale.edu

Department of Science, Technology, Engineering and Public Policy, University College London http://www.ucl.ac.uk/steapp

Center on Privacy & Technology, Georgetown Law  http://www.law.georgetown.edu/academics/centers-institutes/privacy-technology/index.cfm

Center for Information Technology Policy, Princeton University https://citp.princeton.edu/

Berkman Klein Center for Internet & Society, Harvard University https://cyber.harvard.edu

Oxford Internet Institute, Oxford University https://www.oii.ox.ac.uk

Digital Ethics Lab, Oxford University http://digitalethicslab.oii.ox.ac.uk

Cyber Security Oxford, University of Oxford https://www.cybersecurity.ox.ac.uk

Tech Policy Lab, University of Washington http://techpolicylab.org

Connected & Open Research Ethics, UC San Diego http://thecore.ucsd.edu

Center for Internet & Society, Stanford University http://cyberlaw.stanford.edu

Pervasive Data Ethics for Computational Research, University of Maryland https://pervade.umd.edu

Cyber Technology Institute, De Montfort University  http://www.dmu.ac.uk/research/research-faculties-and-institutes/technology/cyber- technology-institute/cti-home.aspx

Internet Ethics program at the Markkula Center for Applied Ethics, Santa Clara University https://www.scu.edu/ethics/focus-areas/internet-ethics/

Security and Privacy Research Group, University of Birmingham http://sec.cs.bham.ac.uk

CyLab Security & Privacy Institute, Carnegie Mellon University https://www.cylab.cmu.edu

Security and Privacy Research Lab, University of Washington https://seclab.cs.washington.edu

Digital Policy Institute, Ball State University http://www.digitalpolicyinstitute.org

Laboratory of Pervasive Computing, Tampere University http://twitter.com/tampereunitech

Semaphore research cluster at the iSchool, University of Toronto http://semaphore.utoronto.ca

Pervasive Interaction Technology Lab, IT University of Copenhagen http://pitlab.itu.dk

AI Now Institute, New York University https://ainowinstitute.org

Data and Society, London School of Economics http://www.lse.ac.uk/study-at-lse/Graduate/Degree-programmes-2018/MSc-Media-and- Communications-Data-and-Society

Trustworthy Technologies Strategic Research Initiative, Cambridge University https://www.trusttech.cam.ac.uk

Tilburg Institute for Law, Technology, and Society  http://www.tilburguniversity.edu/research/institutes-and-research-groups/tilt/

Centre for Information Rights, University of Winchester https://www.winchester.ac.uk/research/building-a-sustainable-and-responsible-future/ centre-for-information-rights/

Centre for Research into Information, Surveillance and Privacy (CRISP) http://www.crisp-surveillance.com

The Nordic Centre for Internet and Society, Norwegian Business School  https://www.bi.edu/research/find-departments-and-research-centres/research-centres/ nordic-centre-for-internet-and-society/

Meaningful Consent in the Digital Economy Project, University of Southampton http://www.meaningfulconsent.org

Cyber-Physical Lab, Newcastle University https://research.ncl.ac.uk/cplab/

Resilient Cyber-Physical Systems Lab, University of Maryland http://www.ece.umd.edu

Cyber Security Centre, University of Warwick https://warwick.ac.uk/fac/sci/wmg/research/csc/

Cyber Security Research Center, Ben-Gurion University https://cyber.bgu.ac.il

Interdisciplinary Research Centre in Cyber Security, University of Kent https://cyber.kent.ac.uk

Centre for Cyber Security, University of Surrey https://www.surrey.ac.uk/surrey-centre-cyber-security

Internet of Things and People Research Center, Malmö University http://iotap.mah.se

Horizon Digital Economy Research Institute, University of Nottingham https://www.horizon.ac.uk

Center for Long Term Cybersecurity, UC Berkeley https://cltc.berkeley.edu

Data & Society Research Institute https://datasociety.net

Sensor City, Liverpool John Moores University https://www.ljmu.ac.uk/about-us/sensor-city

International Computer Science Institute, UC Berkeley http://www.icsi.berkeley.edu/icsi/

New America Open Technology Institute https://www.newamerica.org

APPENDIX 3 IOT PRIVACY-RELEVANT CONFERENCES

STRIVE 2018: First Intl. Workshop On Safety, Security, And Privacy In Automotive Systems Sep 18, 2018, Västerås, Sweden http://www.iit.cnr.it/strive2018

PST 2018: The Sixteenth International Conference on Privacy, Security and Trust Aug 28-30, 2018, Belfast, UK http://pstnet.ca/pst2018/

CENTRIC 2018: The Eleventh International Conference on Advances in Human-oriented and Personalized Mechanisms, Tech and Services Oct 14-18, 2018, Nice, France http://iaria.org/conferences2018/CENTRIC18.html

IEEE ISC2 2018: IEEE International Smart Cities Conference Sep 16-19, 2018, Kansas City, MO http://sites.ieee.org/isc2-2018/

TELERISE 2018: Fourth Intl. Workshop On Technical And Legal Aspects Of Data Privacy And Security Sep 2, 2018 Budapest, Hungary http://www.iit.cnr.it/telerise2018/

CPDP2019: Computers, Privacy and Data Protection Jan 30 – Feb 1, 2019, Brussels, Belgium http://www.cpdpconferences.org

The 6th International Symposium on Security and Privacy on Internet of Things Dec 12 - 15, 2017, Guangzhou, China http://trust.gzhu.edu.cn/conference/SPIoT2017/

1st IEEE Intl Workshop on Intelligent Multimedia Applications and Design for Quality Living Dec 11 - 13, 2017, Taichung, Taiwan http://ism2017.asia.edu.tw/imad-2017/

SITN 2018 The 2nd International Workshop on Securing IoT Networks Jul 30 - Aug 3, 2018 Halifax, Canada http://cse.stfx.ca/~iThings2018/download/SITN-2018%20CFP_ZT_v3.pdf

ICI 2018 The 5th International Symposium on Intercloud and IoT Aug 6 - 8, 2018 Barcelona http://www.ficloud.org/ici2018/

IoT-SECFOR 2018 2nd International Workshop on Security and Forensics of IoT Aug 27 - 30, 2018, Hamburg, Germany https://www.ares-conference.eu/workshops/iot-secfor-2018/

FTC PrivacyCon Feb 28, 2018, Washington DC https://www.ftc.gov/news-events/events-calendar/2018/02/privacycon-2018

Amsterdam Privacy Conference Oct 5 - 8, 2018 http://apc2018.com

Cybersecurity 2017: The 6th International Workshop on Cyber Security and Privacy Oct 12 - 14, 2017, Nanjing, China http://cyberc.org/Program/Security

IEEE Symposium on Security and Privacy May 20 - 24, 2018, San Francisco, CA http://www.ieee-security.org/TC/SP2018/

International Conference on Intelligent Information Technologies Dec 20 - 22, 2017, Chennai, India https://www.iciit.in/

ACM Symposium on Applied Computing: Special Track on Internet of Things Apr 9 - 13, 2018 Pau, France http://infohost.nmt.edu/~zheng/doku.php?id=sac2018

3rd International Conference on Internet of Things, Big Data and Security Mar 19 - 21, 2018, Funchal, Madeira, Portugal http://iotbds.org/

14th EAI International Conference on Security and Privacy in Communication Networks August 8-10, 2018, Singapore http://securecomm.org

IoT Tech Expo Global April 18 - 19, 2018, London, UK https://www.iottechexpo.com/global/

AmI 2018 European Conference on Ambient Intelligence Nov 12 - 14, 2018, Golden Bay Beach Hotel, Larnaca, Cyprus http://www.cyprusconferences.org/ami2018

The Eighth International Conference on Ambient Computing, Applications, Services and Technologies Nov 18 - 22, 2018 Athens, Greece http://iaria.org/conferences2018/AMBIENT18.html

ANT 2018 9th International Conference on Ambient Systems, Networks and Technologies May 8 - 11, 2018 Porto, Portugal http://cs-conferences.acadiau.ca/ant-18/

ICMU 2018 11th International Conference on Mobile Computing and Ubiquitous Networking Oct 5 - 8, 2018, Auckland, New Zealand http://www.icmu.org/icmu2018/

ISBA 2018 IEEE International Conference on Identity, Security and Behavior Analysis Jan 10 - 12, 2018 Singapore http://www3.ntu.edu.sg/conference/ISBA2018/home.htm

FiUSE 2018 International Workshop on Fog Computing in Internet of Things and Ubiquitous Systems Engineering Jun 11 - 12, 2018, Tallinn, Estonia https://fiuse2018.cs.ut.ee/

MMEDIA 2018 The Tenth International Conference on Advances in Multimedia Apr 22 - 26, 2018 Athens, Greece http://iaria.org/conferences2018/MMEDIA18.html

WONS 2018 14th Wireless On-demand Network systems and Services Conference Feb 6 - 8, 2018 Isola, France http://2018.wons-conference.org/

UBICOMM 2018 The Twelfth International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies Nov 18 - Nov 22, 2018 Athens, Greece http://iaria.org/conferences2018/UBICOMM18.html

MUE 2018 The 12th International Conference on Multimedia and Ubiquitous Engineering Apr 23 - 25, 2018, Salerno, Italy http://www.mue-conference.org/2018

WristSense 2018: Workshop on Sensing Systems and Applications Using Wrist Worn Smart Devices Mar 19, 2018, Athens, Greece https://sites.google.com/view/wristsense2018

EUSPN 2018 9th International Conference on Emerging Ubiquitous Systems and Pervasive Networks Nov 5 - 8, 2018, Leuven, Belgium http://cs-conferences.acadiau.ca/euspn-18/

PICom 2018 16th IEEE International Conference on Pervasive Intelligence and Computing Aug 12 - 15, 2018, Athens, Greece http://cyber-science.org/2018/picom/

CoWPER 2018 Toward A City-Wide Pervasive EnviRonment Jun 11, 2018, Hong Kong http://secon2018.ieee-secon.org/the-cowper-workshop/

IUPT 2018 8th International Symposium on Internet of Ubiquitous and Pervasive Things May 8 - 11, 2018, Porto, Portugal https://hud-cs-research.github.io/iupt2018/

8th ACM MobiHoc 2018 Workshop on Pervasive Wireless Healthcare Workshop Jun 25, 2018, Los Angeles, USA https://www.sigmobile.org/mobihoc/2018/workshop-mobile-health.html

PerLS 2018 The Second International Workshop on Pervasive Smart Living Spaces Mar 19 - 23, 2018, Athens, Greece http://iotap.mah.se/perls2018/

PerIoT 2018 Second International Workshop on Mobile and Pervasive Internet of Things Mar 19 - 23, 2018, Athens, Greece https://periot.github.io/2018/

PerCom 2019 IEEE International Conference on Pervasive Computing and Communications Mar 11 - 15, 2019, Kyoto, Japan http://www.percom.org/

12th EAI International Conference on Pervasive Computing Technologies for Healthcare May 21 - 24, 2018, New York, NY http://pervasivehealth.org/

©2018 The Internet of Things Privacy Forum

This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivatives 4.0 International License.

Layout & Design: Ian Estevens, [email protected]