IEEE SYMPOSIUM ON SECURITY AND PRIVACY

Privacy Engineering: Shaping an Emerging Field of Research and Practice

Seda Gürses | Princeton University
Jose M. del Alamo | Universidad Politécnica de Madrid

The emerging field of privacy engineering responds to the gap between research and practice, systematizing and evaluating approaches to capture and address privacy issues while engineering information systems.

Privacy engineering is an emerging research framework that focuses on designing, implementing, adapting, and evaluating theories, methods, techniques, and tools to systematically capture and address privacy issues in the development of sociotechnical systems. We primarily situate the field in software engineering, yet expect it to build on an intradisciplinary foundation, leveraging techniques and tools from various computer science subdisciplines, such as security engineering, human–computer interaction, and machine learning. Because law, societal norms, ethical conceptualizations, and technological advances inform privacy, the field is also inevitably interdisciplinary. Furthermore, developing a robust practice will benefit from knowledge of existing business practices as well as organizational studies and psychology. Finally, we expect legislative, policy, and organizational schemes to play a role in incentivizing the development and adoption of privacy engineering in practice;1 these also require evaluation through an engineering-centric lens.

Attention to privacy engineering as a research topic increased dramatically after 2012 (see Figure 1). To facilitate the development of this emerging field, we organized the First International Workshop on Privacy Engineering (IWPE), co-located with the 36th IEEE Symposium on Security and Privacy. IWPE provides a forum for those interested in tackling the gaps and challenges in privacy engineering. With its explicit focus on engineering techniques and its interdisciplinary program committee with members from computer science, law, policy, social sciences, humanities, and design, the workshop complements existing venues that focus mainly on presenting privacy solutions, like the Symposium on Usable Privacy and Security (https://cups.cs.cmu.edu/soups), or that treat privacy as a subfield of security engineering, like the Privacy Enhancing Technologies Symposium (https://petsymposium.org).

The first iteration of the workshop attracted 47 delegates from academia, industry, government, and civil society. The presentations introduced different models and frameworks for understanding privacy; illustrated several methods, techniques, and tools; and provided case studies of privacy-engineering practice in enterprise systems. The programs and presentations can be found at the workshop website, http://ieee-security.org/TC/SPW2015/IWPE.

The Need for Privacy Engineering

Privacy research in computer science has produced a rich array of privacy solutions; however, the integration of these into everyday engineering practice has been slow. In recent years, reports of privacy violations and technology companies’ failure to fulfill basic data protection requirements have become commonplace, suggesting that we’re far from applying privacy design know-how in practice. The consequences are most evident in the exorbitant number of data breaches: in the US alone, 4,700 breaches have been made public since 2005.2

Figure 1. The growing number of published privacy-engineering articles.

But when it comes to privacy, a data breach is only one concern among many. Subtle engineering decisions that ignore users’ privacy needs might have far-reaching consequences. Recent highlights include Snapchat violating user expectations and privacy by not deleting users’ messages; Firefox extension NoScript’s defaults leading to deanonymization attacks on users; and Facebook apps allowing the sharing of users’ friend networks with advertisers. Moreover, when these design decisions concern global infrastructures, such as cloud services, grids, and mobile networks, privacy protections applied at higher layers might be rendered moot. Past reports of Apple, Google, and Microsoft collecting location information gathered by their respective mobile devices from Wi-Fi hotspots—even when users turn off location tracking—and Snowden’s revelations about the US National Security Agency and the UK Government Communications Headquarters surveillance programs illustrate such domino effects.3

The different examples underscore that addressing privacy is relevant when engineering technical infrastructures, implementing organizational controls, and designing user experience (UX). They also imply that privacy solutions are potentially unknown to engineering teams, not practical to integrate into engineering activities, not of interest to the organizations, or nonexistent. Which of these cases hold and when? And what would it take to facilitate an engineering practice that addresses these issues? These remain open questions.

Researchers and practitioners who engage in the topic have made groundbreaking contributions to the field but have rarely attended to the development of a privacy-engineering practice. Their contributions include a wide array of technical solutions that help protect users’ privacy from diverse adversaries and in different social contexts.4 These solutions are informed by rigorous investigations into particular ways in which new technologies threaten privacy.5,6 They also illustrate the way experts successfully utilize or transform techniques from various computer science subfields, such as security engineering, software engineering, and social computing.7 However, few of these efforts are invested in systematizing or generalizing their approaches so other organizations and practitioners can adopt and integrate them into their daily practices. In the research community, some have proposed monolithic privacy-engineering methods;8,9 however, these tend to assume a one-size-fits-all approach that disregards context, such as organization type and development practices, imminent privacy threats, and informational norms.

Motivating a New Field of Study

Privacy engineering addresses the lack of generalization in existing approaches; the shortage of efforts to integrate different subdisciplines’ techniques and tools; the need to evaluate proposed approaches in different social, organizational, technical, and legal contexts; and concrete challenges emerging from the evolution of engineering practices, technical architectures, legal frameworks, and social expectations. Included in this research framework are projects that critically assess ways to respond to regulators’ and organizations’ increasing demands to implement and settle policy “through architecture, configuration, interfaces, and default settings,”10 also called privacy or data protection by design.11,12 The field intends to address these gaps by consolidating existing privacy research.

Over the past several decades, computer scientists have recognized the quest to build privacy-friendly systems as a research challenge. Most efforts have followed three prominent approaches. The first is what Sarah Spiekermann and Lorrie F. Cranor identified as privacy by architecture,8 which aims to minimize the collection or inference of sensitive information by unintended parties, typically service providers. Researchers develop technologies that enhance privacy by applying techniques that hard-code constraints on data collection and processing in systems and by ensuring that no entity can single-handedly undo these constraints. Privacy-enhancing technologies (PETs), such as Tor for anonymous communications or private information retrieval protocols for confidential search, are developed using this approach.

PETs can be used as stand-alone privacy technologies, like Tor, or function as the primitives in a privacy engineer’s toolbox, as in the case of zero-knowledge proofs and differential privacy.

Spiekermann and Cranor identified a second approach as privacy by policy.8 It aims at “protecting consumer data from accidental disclosure or misuses and facilitating informed choice options”—in other words, enforcing measures to ensure compliance with principles of data protection laws in information systems. Depending on jurisdiction, these requirements might include specifying and notifying users of the purpose of collection; limiting collection and use to this purpose; being transparent about additional recipients of the data; and providing users access to their data for verification, correction, and deletion. Proposed technologies include policy specification languages, policy negotiation and enforcement mechanisms, and design techniques to improve the readability of privacy policies.

A third approach, let’s call it privacy by interaction, focuses on sociotechnical designs that would improve users’ agency with respect to privacy in social settings. The approach captures privacy matters that arise, for example, between peers or in a workplace due to the introduction of information systems. These “lateral privacy” concerns are related to but often distinct from concerns regarding organizations collecting and processing data, which privacy-by-policy approaches address, and unintended inferences, which privacy by architecture tackles. The social computing perspective, in which information systems facilitate social interactions, informs the methods and techniques the approach uses. The objective is to design systems that respect social norms regarding information flows and address privacy in the context of collective information practices.13,14 This approach’s techniques can help design teams create interactions respectful of social and ethical norms. Feedback mechanisms about system functionality might help users evaluate the impact of system use on their privacy and change their future behavior accordingly. In addition to attending to individuals’ concerns, researchers evaluate the potential impact of complex information systems on groups of users and society in general. For instance, this approach attempts to answer ambitious questions such as whether we can develop mechanisms that use machine learning to reveal discrimination and social sorting, and what properties constitute a fair sociotechnical system.15

All three approaches fundamentally differ in what they consider to be privacy problems and solutions. This might be seen as productive plurality in research. However, isolation between the research communities; their varying positions on the role of technology, law, and society; and the distance between researchers and practitioners lead to gaps and vulnerabilities.

In practice, privacy-by-policy activities are often limited to privacy policy statements and checkboxes for consent and don’t result in changes to engineering practice or system design. These activities have mainly been the bailiwick of the legal team,10 members of which might not have an in-depth understanding of engineering privacy mechanisms’ potential and limitations. Purely technical approaches might prove insufficient for aligning nuanced legal policies with engineering artifacts and can fall short of addressing responsibilities across organizations. In the absence of normative guiding principles and evaluation, privacy-by-policy approaches might result in a set of procedures that fulfill compliance requirements but provide little effective protection. Moreover, top-down decisions to introduce data protection mechanisms, if insensitive to the organization’s engineering culture, might be met with resistance. In general, transforming existing practices might be a precondition for engaging engineers who feel that privacy is an abstract problem, not an immediate problem, not their problem, or not a problem at all.8

In contrast, in privacy-by-architecture approaches, the conception and implementation of privacy-protecting measures are mainly under the purview of technical experts with in-depth knowledge of cryptographic and traffic analysis techniques. The objective is to develop privacy tools or mechanisms that offer formal guarantees—that is, fulfill quantifiable privacy properties such as differential privacy. Developing PETs requires mastering sophisticated engineering skills mainly acquired through participation in the community of experts. Efforts to integrate PETs into system engineers’ toolboxes or into larger systems have been limited. The absence of methods to implement, integrate, and maintain PETs and the scant attention given to socialization of the tools might pose obstacles to taking them from the lab into the wild. Even when experts integrate PETs into systems, they can face backlash,16 especially if the proposed mechanisms introduce usability or performance tradeoffs or meet political resistance.4

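As a minimal illustration of what a quantifiable privacy property can look like from an engineer’s perspective, the following Python sketch releases a count query under epsilon-differential privacy using the Laplace mechanism. The dataset, the opt-in predicate, and the choice of epsilon = 0.5 are hypothetical and serve only to make the idea of a formally bounded privacy loss tangible; they don’t reproduce any specific system or paper discussed here.

import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed,
    # which avoids the edge cases of inverse-CDF sampling.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one record changes
    # the true count by at most 1, so Laplace noise with scale 1/epsilon makes
    # the released value epsilon-differentially private.
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: release how many users opted in, with epsilon = 0.5.
users = [{"id": i, "opted_in": i % 3 == 0} for i in range(1000)]
print(dp_count(users, lambda u: u["opted_in"], epsilon=0.5))

The point of the sketch is the guarantee rather than the arithmetic: whatever the underlying data, the distribution of the released value changes only by a bounded factor when any single record is added or removed, which is the kind of property PET experts design for.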
Vulnerabilities might arise as a result of treating the different approaches as if they’re solutions that can be applied independently of one another. For instance, whether a social network service’s photo-tagging feature is more acceptable when tags are made public before or after the data subject’s confirmation, and whether these tags should be revocable varies depending on each community’s data-sharing practices. Because their focus is on tag design, UX engineers might treat photo storage and accessibility to the service provider as irrelevant to their task. However, such separation of concerns assumes that those potential risks that arise due to the underlying system architecture are independent of the local tagging practices. As a consequence, users might feel empowered in negotiating their privacy in social settings, while becoming increasingly vulnerable to violations of privacy by powerful service providers. Similarly, system engineers might constrain information flows in a way that strongly complicates and limits user-facing design. Especially in the context of PETs, such matters could lead to usability problems that dampen system adoption.

Finally, even if all three approaches are applied in concert, some privacy concerns might fall out of scope. Illustrative of such shortcomings is the tendency of all three approaches to produce solutions that scale only to a single organization—a model that doesn’t reflect the way new services, such as software as a service, are provisioned or the way free software projects are organized. Similarly, the Internet and mobile communication networks are examples of global infrastructure that require a different lens. Design decisions applied to infrastructures can have grave implications for privacy protections that can be applied to technologies built on them.

Since the Snowden revelations, efforts to apply privacy-by-architecture methods and techniques in digital infrastructure design have gained in prominence, for instance, considering data minimization to protect against TLS client fingerprinting. These efforts have shown that addressing privacy in the Internet’s underlying protocols, Web browsers, or GSM standards is slow, complex, and readily dominated by those with the greatest resources to influence the process. Such processes can be stalled easily if, for example, the parties paying the tradeoff costs for privacy protection aren’t reaping the benefits. Although engineering methodologies can’t solve these political conflicts, they might help improve the process of developing inclusive and effective privacy solutions for global infrastructures.

Building Blocks

In defining the field of privacy engineering, we lean on software engineering, the subfield of computer science concerned with all aspects of the production of information systems, including the conceptualization, design, maintenance, and removal from service. Owing to the complexity of privacy as a social and legal concept, we also borrow knowledge and know-how from privacy research and practice.

Responding to the methodological shortcomings we described, we follow Sjaak Brinkkemper’s lead on method engineering and define privacy engineering as the field of research and practice that designs, implements, adapts, and evaluates theories, methods, techniques, and tools to systematically capture and address privacy issues when developing sociotechnical systems.17 In this context

■ privacy-engineering methods are approaches for systematically capturing and addressing privacy issues during information system development, management, and maintenance;
■ privacy-engineering techniques are procedures, possibly with a prescribed language or notation, to accomplish privacy-engineering tasks or activities; and
■ privacy-engineering tools are (automated) means that support privacy engineers during part of a privacy-engineering process.

The definition would benefit from some further elaboration. First of all, what justifies identifying an engineering activity as pertaining to privacy? And, how can we answer this question if we don’t settle on a definition of privacy? As Deirdre Mulligan expressed elegantly in her IWPE keynote, privacy-engineering work requires embracing the plurality, contextuality, and contestability of privacy as a social, political, and legal concept.

A primary example illustrating privacy’s plurality is the work of Daniel Solove, who distinguishes between the right to be left alone, limited access, control, personhood, secrecy, and intimacy based on an extensive study of torts in the US legal system.18

Contextuality is best described using the justificatory framework developed by Helen Nissenbaum, who argues that privacy isn’t about control over or confidentiality of information, but rather ensuring appropriate information flows respectful of social norms in a given context.13 For example, during a consultation, it’s appropriate for a patient to disclose health information to the doctor, but not vice versa.

Contestability refers to the availability of multiple concepts around which disputes exist that can’t be settled by an appeal to “empirical evidence, linguistic usage, or the canons of logic alone.”19 Contestability provides a language for conversing about privacy’s meaning, allowing it to be flexible enough to capture very different privacy issues in rapidly changing sociotechnical systems (plurality) introduced in different contexts with varying information norms (contextuality).
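To make the consultation example above concrete, here is a toy Python sketch that records information flows as (sender role, recipient role, data subject, information type, context) tuples and checks them against a small table of declared norms. The norm table, field names, and roles are illustrative assumptions for this article only; they aren’t a technique proposed by Nissenbaum or presented at IWPE, and a real system would need far richer transmission principles.

from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender: str      # role transmitting the information
    recipient: str   # role receiving it
    subject: str     # whose information it is
    info_type: str   # for example, "health" or "billing"
    context: str     # for example, "consultation"

# Hypothetical norm table: flows matching an entry count as appropriate.
NORMS = {
    ("patient", "doctor", "patient", "health", "consultation"),
    ("doctor", "insurer", "patient", "billing", "claims"),
}

def appropriate(flow: Flow) -> bool:
    # A flow is appropriate only if it matches a declared contextual norm.
    key = (flow.sender, flow.recipient, flow.subject, flow.info_type, flow.context)
    return key in NORMS

# The patient-to-doctor health disclosure matches a norm ...
print(appropriate(Flow("patient", "doctor", "patient", "health", "consultation")))  # True
# ... but the reverse direction doesn't, so it would be flagged for review.
print(appropriate(Flow("doctor", "patient", "doctor", "health", "consultation")))   # False

Even such a crude check illustrates why contextuality resists a single global privacy setting: the same information type can be appropriate in one direction and context and inappropriate in another.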

To preserve its contestability, we refrain from folding a specific conception of privacy into the definition of the field. However, we do assume that any defendable privacy methodology will draw on some normative theory of privacy, be it legal, social, or political. In the absence of such a normative compass, we lack the parameters against which to judge whether a method, technique, or tool attends to privacy. For example, at IWPE, Guy Zyskind and his colleagues illustrated how the blockchain can combine with storage to provide a data management platform that equips users with greater control and transparency over their personal information.20 Although the blockchain can be used to fulfill very different goals, the authors evaluate its potential as a privacy-engineering technique. The selection, conceptualization, and appropriateness of privacy definitions for a privacy-engineering task are topics of substantial interest to the field.

In addition to exploring relevant theories of privacy and engineering, the field calls for the development of methods, techniques, and tools. At IWPE, Nicolas Notario and his colleagues presented and evaluated PRIPARE, a method to integrate existing privacy-engineering best practices—including privacy requirements elicitation and architectural techniques—into the design process.21 Marit Hansen and her colleagues discussed a technique that would help engineers reconcile tensions between potentially conflicting privacy goals, like unlinkability and transparency.22 And finally, Fateme Shirazi and her colleagues compared experimental techniques and tools that engineers can use to assess the performance of or attacks on the Tor network without violating user privacy.23

Privacy engineering foresees the use of these methods, techniques, and tools in developing information systems. By information systems, we refer to not only the technical artifact but the greater sociotechnical system. In using this term, we recognize that any system exists only in interplay with a host of social, political, legal, and economic arrangements.1 We argue that those in the privacy-engineering field need to be cognizant of the greater material and social networks that the engineered artifacts exist in. They should also support sociotechnical design practices that aspire to develop efficient and effective approaches to privacy and that, in the process, help improve the lives of those affected by these systems.

Three papers at IWPE beautifully teased out the sociotechnical aspects of addressing privacy in infrastructures. Nick Doty, after providing an overview of the methods followed by the Internet Engineering Task Force and the World Wide Web Consortium, described some tools to incentivize and evaluate the way privacy issues are addressed during the standards-making process.24 Gina Fisk and her colleagues presented a method to minimize privacy risks in cybersecurity data sharing that prompted a discussion on the appropriateness of making privacy claims in infrastructures built for national security and surveillance.25 Eve Maler introduced an Internet-scalable consent mechanism that might prove to be the “sweet spot” that attends to both technical and regulatory challenges in the context of the Internet of Things.26 In addition, our second keynote speaker, Ian Oliver from Nokia Networks, illustrated how privacy engineers can leverage concepts and techniques from the safety-critical domain. He highlighted checklists, dataflow modeling, and organizational roles as aids to enabling a good engineering culture.

The field’s robustness depends as much on the development of methodologies as it does on their implementation, adaptation, and evaluation. Reports and case studies that expose the challenges of implementing privacy technologies are elemental to the generalization and systematization of privacy-engineering knowledge. In their IWPE paper on secure two-party computation, Henrik Ziegeldorf and his colleagues implemented and evaluated the performance of different protocols to help nonexpert developers pick the framework that fits their needs.27 In the process, the authors documented implementation challenges unique to each protocol. Rainer Hörbe and Walter Hötzendorfer developed an evaluation technique for federated identity management systems that can also aid engineers in translating normative privacy principles into architectures.28

In addition to addressing gaps in research, our definition of privacy engineering is comprehensive enough to encapsulate recent efforts in developing standardized processes. Ann Cavoukian and her colleagues defined privacy as a nonfunctional requirement in the engineering process,11 whereas MITRE and the National Institute of Standards and Technology characterized privacy engineering as a form of risk analysis.29,30 These definitions frame privacy narrowly and constrain the type of methodologies that can be used to those that are risk based. They’re skewed toward privacy-by-policy approaches and barely attend to privacy-by-interaction methodologies. Informed by the diversity of research and practice, our definition of privacy engineering provides a broader framework in which existing and future efforts can be cultivated.

Looking Ahead

Efforts to address privacy using technical means are still scattered and disconnected. Few of these efforts explicitly attend to generalizing and systematizing associated engineering practices so as to be accessible to a wider community. Public and private organizations’ continuing negative track record of privacy blunders suggests both would benefit from the development of a privacy-engineering practice.

Privacy engineering responds to these gaps, and IWPE is a forum where community members can come together to actively engage in the nascence of this new field. At its first successful iteration, IWPE participants responded to the field’s challenges, identified gaps, and agreed on three issues that require urgent attention.

First, we need to develop methodologies to address concerns about parties’ increased capacity to use machine learning to draw inferences from datasets. With advances in software as a service, big data infrastructures, and machine learning, greater inferences can be made about individuals and user populations. These inferences can be used to profile users, organize future interactions, and drive a shift to data-centric software engineering practice. What methods, techniques, and tools address surveillance, discrimination, and accountability concerns attributed to such semantic power in sociotechnical systems?

Second, we must conduct empirical studies that reflect on different contextual challenges to applying privacy-engineering methods, techniques, and tools. Implicit assumptions about system architectures, labor, expertise, and organization type underlie methods, techniques, and tools. Empirical studies that explore how privacy issues are (or aren’t) currently addressed in different engineering contexts and that evaluate which methods, techniques, and tools are more appropriate in a given context are crucial to the field’s success.

Finally, we need metrics and analytics to evaluate the efficacy of privacy-engineering activities. Metrics can be used to indicate the number of privacy violations, track the number of a system’s fulfilled privacy requirements, choose privacy tools, or evaluate privacy and performance tradeoffs. In some cases, rather than quantification, analytical evaluation based on interdisciplinary methodologies might be more appropriate. Both approaches are hot topics of future research.

These are a subset of the exciting challenges at the core of privacy engineering. We welcome the growing community of privacy-engineering research and practice to join us in further shaping this field at the next IWPE, to be held 25–26 May 2016, in San Jose, California, co-located with the 37th IEEE Symposium on Security and Privacy. Further information on IWPE 2016 can be found at http://ieee-security.org/TC/SPW2016/IWPE.

Acknowledgments

We thank our Organizing and Program Committee, the authors and attendees who contributed to the fruitful discussions held during the workshop sessions, and the IEEE Symposium on Security and Privacy workshops organizers for giving us the chance to kick off the First International Workshop on Privacy Engineering. We’re especially indebted to Helen Nissenbaum, Carmela Troncoso, Jaap-Henk Hoepman, and Yod-Samuel Martin as well as the anonymous reviewers of this article for their comments. This work was completed with the generous support of the Information Law Institute at New York University, the Center for Information Technology and Policy at Princeton University, a grant from the Flemish Research Council, and the PRIPARE project funded by the EU’s 7th Framework Programme under grant agreement ICT-610613.

References
1. K.A. Bamberger and D.K. Mulligan, “New Governance, Chief Privacy Officers, and the Corporate Management of Privacy in the United States: An Initial Inquiry,” Law & Policy, vol. 33, no. 4, 2011, pp. 477–508.
2. “Chronology of Data Breaches: Security Breaches 2005–Present,” Privacy Rights Clearinghouse, Apr. 2005; https://www.privacyrights.org/data-breach.
3. N. Perlroth, J. Larson, and S. Shane, “N.S.A. Able to Foil Basic Safeguards of Privacy on Web,” New York Times, 5 Sept. 2013; www.nytimes.com/2013/09/06/us/nsa-foils-much-internet-encryption.html.
4. G. Danezis and S. Gürses, “A Critical Review of 10 Years of Privacy Technology,” Proc. Surveillance Cultures: A Global Surveillance Society?, 2010; http://homes.esat.kuleuven.be/~sguerses/papers/DanezisGuersesSurveillancePets2010.pdf.
5. A. Narayanan and V. Shmatikov, “Robust De-anonymization of Large Sparse Datasets,” Proc. IEEE Symp. Security and Privacy, 2008, pp. 111–125.
6. G. Acar et al., “FPDetective: Dusting the Web for Fingerprinters,” Proc. ACM SIGSAC Conf. Computer & Communications Security (CCS 13), 2013, pp. 1129–1140.
7. S. Gürses and C. Diaz, “Two Tales of Privacy in Online Social Networks,” IEEE Security & Privacy, vol. 11, no. 3, 2013, pp. 29–37.
8. S. Spiekermann and L.F. Cranor, “Engineering Privacy,” IEEE Trans. Software Eng., vol. 35, no. 1, 2009, pp. 67–82.
9. C. Kalloniatis, E. Kavakli, and S. Gritzalis, “Addressing Privacy Requirements in System Design: The PriS Method,” Requirements Eng., vol. 13, no. 3, 2008, pp. 241–255.
10. D.K. Mulligan and J. King, “Bridging the Gap between Privacy and Design,” Univ. Pennsylvania J. Constitutional Law, vol. 14, no. 4, 2011, p. 989.
11. A. Cavoukian, S. Shapiro, and R.J. Cronk, “Privacy Engineering: Proactively Embedding Privacy, by Design,” Information and Privacy Commissioner Office, Government of Ontario, 2014; https://www.ipc.on.ca/images/Resources/pbd-priv-engineering.pdf.

12. “Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation),” COM/2012/011, European Commission, 2012.
13. H. Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life, Stanford Univ. Press, 2009.
14. L. Palen and P. Dourish, “Unpacking Privacy for a Networked World,” Proc. SIGCHI Conf. Human Factors in Computing Systems, 2003, pp. 129–136.
15. C. Dwork et al., “Fairness through Awareness,” Proc. 3rd Conf. Innovations in Theoretical Computer Science, 2012, pp. 214–226.
16. R. Dingledine and N. Mathewson, “Anonymity Loves Company: Usability and the Network Effect,” Security and Usability: Designing Secure Systems that People Can Use, L. Cranor and S. Garfinkel, eds., 2005, pp. 547–559.
17. S. Brinkkemper, “Method Engineering: Engineering of Information Systems Development Methods and Tools,” Information and Software Technology, vol. 38, no. 4, 1996, pp. 275–280.
18. D. Solove, “A Taxonomy of Privacy,” Univ. of Pennsylvania Law Rev., vol. 154, no. 3, 2006, p. 477.
19. D.K. Mulligan and C. Koopman, “Theorizing Privacy’s Contestability: A Multi-Dimensional Analytic of Privacy,” iConference Proc., Special Workshop on Information Privacy, 2013, pp. 1026–1029.
20. G. Zyskind et al., “Decentralizing Privacy: Using Blockchain to Protect Personal Data,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 180–184.
21. N. Notario et al., “PRIPARE: Integrating Privacy Best Practices into a Privacy Engineering Methodology,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 151–158.
22. M. Hansen et al., “Protection Goals for Privacy Engineering,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 159–166.
23. F. Shirazi et al., “Tor Experimentation Tools,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 206–213.
24. N. Doty, “Reviewing for Privacy in Internet and Web Standard-Setting,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 185–192.
25. G. Fisk et al., “Privacy Principles for Sharing Cyber Security Data,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 193–197.
26. E. Maler, “Extending the Power of Consent with User-Managed Access: A Standard Architecture for Asynchronous, Centralizable, Internet-Scalable Consent,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 175–179.
27. J.H. Ziegeldorf et al., “Choose Wisely: A Comparison of Secure Two-Party Computation Frameworks,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 198–205.
28. R. Hörbe and W. Hötzendorfer, “Privacy by Design in Federated Identity Management,” Proc. IEEE Security and Privacy Workshops, 2015, pp. 167–174.
29. S. Shapiro et al., “Privacy Engineering Framework,” MITRE, Aug. 2014; www.mitre.org/publications/technical-papers/privacy-engineering-framework.
30. “NISTIR 8062: Privacy Risk Management for Federal Information Systems,” S. Brooks and E. Nadeau, eds., Nat’l Inst. Standards and Technology, May 2015; http://csrc.nist.gov/publications/drafts/nistir-8062/nistir_8062_draft.pdf.

Seda Gürses is a postdoctoral research associate at Princeton University’s Center for Information Technology Policy and an FWO (Fonds Wetenschappelijk Onderzoek–Vlaanderen) fellow at COSIC, University of Leuven. She works on privacy and requirements engineering, privacy enhancing technologies, and surveillance. Gürses received a PhD in computer science at the University of Leuven, Belgium. Contact her at [email protected].

Jose M. del Alamo is an associate professor in the Information and Communications Technology (ICT) Systems Engineering Department at the Universidad Politécnica de Madrid. His research focuses on personal data management issues, including privacy and identity management, in the context of software and systems engineering. Del Alamo received a PhD in ICT systems engineering from the Universidad Politécnica de Madrid in 2009. Contact him at [email protected].
