

lessons from the identity trail
anonymity, privacy and identity in a networked society

edited by ian kerr, valerie steeves, and carole lucock

oxford university press

Oxford University Press, Inc., publishes works that further Oxford University’s objective of excellence in research, scholarship, and education.

Oxford New York Auckland Cape Town Dar es Salaam Hong Kong Karachi Kuala Lumpur Madrid Melbourne Mexico City Nairobi New Delhi Shanghai Taipei Toronto

With offices in Argentina Austria Brazil Chile Czech Republic France Greece Guatemala Hungary Italy Japan Poland Portugal Singapore South Korea Switzerland Thailand Turkey Ukraine Vietnam

Copyright © 2009 by Oxford University Press, Inc.

Published by Oxford University Press, Inc. 198 Madison Avenue, New York, New York 10016

Oxford is a registered trademark of Oxford University Press. Oxford University Press is a registered trademark of Oxford University Press, Inc.

All rights reserved. Subject to the Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Canadian License, no part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of Oxford University Press, Inc.


Library of Congress Cataloging-in-Publication Data
Lessons from the identity trail : anonymity, privacy and identity in a networked society / Editors: Ian Kerr, Valerie Steeves, Carole Lucock.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-19-537247-2 (hardback : alk. paper)
1. Data protection—Law and legislation. 2. Identity. 3. Privacy, Right of. 4. Computer security—Law and legislation. 5. Freedom of information. I. Kerr, Ian (Ian R.) II. Lucock, Carole. III. Steeves, Valerie M., 1959-
K3264.C65L47 2009
342.08'58—dc22

2008043016

1 2 3 4 5 6 7 8 9 Printed in the United States of America on acid-free paper

Note to Readers This publication is designed to provide accurate and authoritative information in regard to the subject matter covered. It is based upon sources believed to be accurate and reliable and is intended to be current as of the time it was written. It is sold with the understanding that the publisher is not engaged in rendering legal, accounting, or other professional services. If legal advice or other expert assistance is required, the services of a competent professional person should be sought. Also, to confirm that the information has not been affected or changed by recent developments, traditional legal research techniques should be used, including checking primary sources where appropriate.

(Based on the Declaration of Principles jointly adopted by a Committee of the American Bar Association and a Committee of Publishers and Associations.)

You may order this or any other Oxford University Press publication by visiting the Oxford University Press website at www.oup.com

To Ejay – "And wait for it, There are only two of us now This great black night, scooped out And this fireglow"

To David – for everything.

contents

About this Book xi
Acknowledgements xiii
Contributors xv
The Strange Return of Gyges' Ring: An Introduction xxiii

i. privacy 1
Chapter 1. Soft Surveillance, Hard Consent: The Law and Psychology of Engineering Consent 5
ian kerr, jennifer barrigar, jacquelyn burkell, and katie black
Chapter 2. Approaches to Consent in Canadian Data Protection Law 23
philippa lawson and mary o'donoghue
Chapter 3. Learning from Data Protection Law at the Nexus of Copyright and Privacy 43
alex cameron
Chapter 4. A Heuristics Approach to Understanding Privacy-Protecting Behaviors in Digital Social Environments 65
robert carey and jacquelyn burkell
Chapter 5. Ubiquitous Computing and Spatial Privacy 83
anne uteck
Chapter 6. Core Privacy: A Problem for Predictive Data Mining 103
jason millar
Chapter 7. Privacy Versus National Security: Clarifying the Trade-Off 121
jennifer chandler
Chapter 8. Privacy's Second Home: Building a New Home for Privacy Under Section 15 of the Charter 139
daphne gilbert
Chapter 9. What Have You Done for Me Lately? Reflections on Redeeming Privacy for Battered Women 157
jena mcgill
Chapter 10. Genetic Technologies and Medicine: Privacy, Identity, and Informed Consent 173
marsha hanen

Chapter 11. Reclaiming the Social Value of Privacy 191
valerie steeves

ii. identity 209
Chapter 12. A Conceptual Analysis of Identity 213
steven davis
Chapter 13. Identity: Difference and Categorization 227
charles d. raab
Chapter 14. Identity Cards and Identity Romanticism 245
a. michael froomkin
Chapter 15. What's in a Name? Who Benefits from the Publication Ban in Sexual Assault Trials? 265
jane doe
Chapter 16. Life in the Fish Bowl: Feminist Interrogations of Webcamming 283
jane bailey
Chapter 17. Ubiquitous Computing, Spatiality, and the Construction of Identity: Directions for Policy Response 303
david j. phillips
Chapter 18. Dignity and Selective Self-Presentation 319
david matheson
Chapter 19. The Internet of People? Reflections on the Future Regulation of Human-Implantable Radio Frequency Identification 335
ian kerr
Chapter 20. Using Biometrics to Revisualize the Canada–U.S. Border 359
shoshana magnet
Chapter 21. Soul Train: The New Surveillance in Popular Music 377
gary t. marx
Chapter 22. Exit Node Repudiation for Anonymity Networks 399
jeremy clark, philippe gauvin, and carlisle adams
Chapter 23. TrackMeNot: Resisting Surveillance in Web Search 417
daniel c. howe and helen nissenbaum

iii. anonymity 437
Chapter 24. Anonymity and the Law in the United States 441
a. michael froomkin
Chapter 25. Anonymity and the Law in Canada 465
carole lucock and katie black
Chapter 26. Anonymity and the Law in the United Kingdom 485
ian lloyd
Chapter 27. Anonymity and the Law in the Netherlands 503
simone van der hof, bert-jaap koops, and ronald leenes
Chapter 28. Anonymity and the Law in Italy 523
giusella finocchiaro

Index 539

acknowledgments

This book, like the ID Trail project itself, owes its existence to significant funding and other equally important forms of support from the Social Sciences and Humanities Research Council and a number of private and public sector partners, including the Alberta Civil Liberties Association Research Centre, Bell University Labs, the British Columbia Civil Liberties Association, Canadian Internet Policy and Public Interest Clinic, Centre on Values and Ethics, the Department of Justice, Entrust Technologies, Electronic Privacy Information Center, IBM Canada Ltd., Management Board Secretariat Ontario, Microsoft, the Office of the Information and Privacy Commission of Ontario, the Office of the Privacy Commissioner of Canada, the Ontario Research Network in Electronic Commerce, Privacy International, and the Sheldon Chumir Foundation for Leadership in Ethics. We are thankful for the support provided and could not have produced this volume or any of our other key research outcomes without the help of these organizations.

As a project that has sought to mobilize key research outputs in accessible language and across a variety of venues in order to assist policymakers and the broader public, we have worked closely with Canada's Federal and Provincial Information and Privacy Commissioners. We thank them for their invaluable time, effort, and contributions to our work and for their general interest and support. Thanks also to Stephanie Perrin, a longtime member of Canada's privacy advocacy community, for her role in putting us on the map during the early years of the project.

A number of universities and groups have collaborated with us on this book. In particular, we would like to show appreciation to Giusella Finocchiaro and the Università di Bologna, as well as Bert-Jaap Koops, Ronald Leenes, and members of the Tilburg Institute for Law, Technology, and Society for wonderful and animating discussions on the topics we all treasure so dearly.

As our base of operations, the University of Ottawa has been central to everything we have accomplished. Special thanks are due to Gilles Morier and Daniel Lefebvre for their superb and supererogatory efforts in helping us run the show, and to Common Law Dean Bruce Feldthusen, for always saying, "How can I help make it happen?"

With more than a hundred undergraduate, graduate, and postgraduate students working with the project over a four-year period, it is not possible to thank each by name. There are two, however, whose extraordinary involvement at the outset of the project made it possible for all of the others to participate. Our heartfelt appreciation goes to Milana Homsi and Nur Muhammed-Ally for getting the ball rolling. Similar thanks is owed to Francine Guenette, our first project administrator, for paving the road to success. Your special efforts were deeply valued.

Finding harmony among so many different voices while preserving the distinctness of each is no small task. To Amanda Leslie, our über-talented, lightning-quick, and highly reliable copyeditor, we feel privileged to have had the opportunity to work with you on this project. Thanks also to our acquisitions editor, Chris Collins, the ever-helpful Isel Pizarro, and to Jaimee Biggins and all members of the production team at Oxford University Press who have helped improve the quality of this book. We are also grateful to thirty or so anonymous reviewers for their effort and good judgment, upon which we greatly relied. Finally, our deepest and everlasting gratitude is owed to Julia Ladouceur, our project administrator, whose playful smile and gentle manner belie her stunning ability to do just about anything. Always with kindness, always with generosity, and always with a bedazzling perfection that somehow renders invisible any remnant or recollection of the fact that the world was never that way on its own.

contributors

carlisle adams
Carlisle Adams is a Full Professor in the School of Information Technology and Engineering (SITE) at the University of Ottawa. In both his private sector and his academic work, he has focused on the standardization of cryptology and security technologies for the Internet, and his technical contributions include encryption algorithms, security authentication protocols, and a comprehensive architecture and policy language for access control in electronic environments. He can be reached at [email protected].

jane bailey
Jane Bailey is an Associate Professor at the Faculty of Law, University of Ottawa. Her ongoing research focuses on the impact of evolving technology on significant public commitments to equality rights, freedom of expression, and multiculturalism, as well as the societal and cultural impact of the Internet and emerging forms of private technological control, particularly in relation to members of socially disadvantaged communities. She can be reached at [email protected].

jennifer barrigar
A doctoral candidate in the Law and Technology Program at the University of Ottawa, Faculty of Law, Jennifer Barrigar previously worked as legal counsel at the Office of the Privacy Commissioner in Canada. Her current privacy research builds on her interest in the creation, performance, and regulation of identities in online environments, focusing on the creation of the exclusively online "self" and its implications for privacy law and identity management technologies. She can be reached at [email protected].

katie black
Katie Black is an LLB Candidate at the University of Ottawa, Faculty of Law. Her interest in privacy rights has led her to conduct research on the human rights implications of Canada's no-fly list, the effects of battered women's support programs on personal identity, soft paternalism and soft surveillance in consent-gathering processes, and the impact of opening up adoption records on women's reproductive autonomy. She can be reached at [email protected].

jacquelyn burkell
Jacquelyn Burkell is an Associate Professor at the Faculty of Information and Media Studies, University of Western Ontario. Trained in psychology, she conducts empirical research on the interaction between people and technology, with a particular emphasis on the role of cognition in such interactions.
Much of her work focuses on anonymity in online communication, examining how the pseudonymity offered by online communication is experienced by online communicators and how this experience changes communication behavior and interpretation. She can be reached at [email protected].

alex cameron
Alex Cameron is a doctoral candidate in the Law and Technology Program at the University of Ottawa, Faculty of Law. He was previously an associate at the law firm of Fasken Martineau Dumoulin LLP. His current studies focus on privacy and copyright, particularly the interplay between privacy and digital rights management. He can be reached at [email protected].

robert carey
Robert Carey is a Postdoctoral Fellow at the Faculty of Information and Media Studies, the University of Western Ontario. His research at the Faculty concentrates on four different strands of anonymity-related research: conceptual and behavioral models of anonymity on the Internet; behavioral effects of anonymity in computer-mediated communication; the conceptualization of anonymity; and mass media's configuration of anonymity and information technology. He can be reached at [email protected].

jennifer chandler
Jennifer Chandler is an Associate Professor at the University of Ottawa, Faculty of Law. She focuses on law, science, and technology, particularly the social and environmental effects of emerging technologies and the interaction of emerging technologies with law and regulation. She has written extensively in the areas of cybersecurity and cybertorts. She can be reached at [email protected].

jeremy clark
Jeremy Clark is a doctoral candidate with the Centre for Applied Cryptographic Research (CACR) and the Cryptography, Security, and Privacy (CrySP) Group at the University of Waterloo. His research focuses on cryptographic voting, as well as privacy enhancing technologies, applied cryptography, the economics of information security, and usable security and privacy. He can be reached at [email protected].

steven davis
Steven Davis is Professor Emeritus of Philosophy at Simon Fraser University and the former Director of the Centre on Values and Ethics at Carleton University. He researches philosophy of language and philosophy of mind, and his recent research has focused on privacy, identifying, and identification. He can be reached at [email protected].

jane doe
Jane Doe successfully sued the Toronto Police Force for negligence and discrimination in the investigation of her rape, a case that set legal precedent and is taught in law schools across Canada. Jane Doe is an author (The Story of Jane Doe, Random House), teacher, and community organizer. She is currently completing research on the use and efficacy of the Sexual Assault Evidence Kit (SEAK) and police practices of "warning" women regarding stranger and serial rapists.

giusella finocchiaro
Giusella Finocchiaro is Professor of Internet Law and Private Law at the University of Bologna. She specializes in Internet law both at her own Finocchiaro law firm and as a consultant for other law firms. She also acts as a consultant for the European Union on Internet law issues. She can be reached at [email protected].

a. michael froomkin
A. Michael Froomkin is Professor of Law at the University of Miami, Faculty of Law.
He is on the advisory boards of several organizations including the Electronic Frontier Foundation and BNA Electronic Information Policy and Law Report, is a member of the Royal Institute of International Affairs in London, and writes the well-known discourse.net blog. His research interests include Internet governance, privacy, and electronic democracy. He can be reached at [email protected].

philippe gauvin
Philippe Gauvin completed his Master's Degree in Law and Technology at the University of Ottawa and is currently working as counsel, regulatory affairs, for Bell Canada. His work consists of ensuring the company's regulatory compliance with privacy, copyright, telecommunications, and broadcasting laws. He can be reached at [email protected].

daphne gilbert
Daphne Gilbert is an Associate Professor at the University of Ottawa, Faculty of Law. Her privacy research focuses on the constitutionalized protection of online expression and privacy, and she has an interest in the ethics of compelling cooperation between private organizations and law enforcement and in the expectations of user privacy online. She can be reached at [email protected].

marsha hanen
Marsha Hanen is a former president of the University of Winnipeg and an Adjunct Professor of Philosophy at the University of Victoria. She was the president of the Sheldon Chumir Foundation for Ethics in Leadership from 1999 to 2006. Throughout her career, she has had a broad and deep interest in ethics, philosophy of science, and philosophy of law. Recently she has published on patient privacy and anonymity in medical contexts. She can be reached at [email protected].

daniel c. howe
Daniel C. Howe is a digital artist and researcher at NYU's Media Research Lab where he is completing his PhD thesis on generative literary systems. His privacy research has led to TrackMeNot, an artware intervention addressing data profiling on the Internet, which he has worked on with Helen Nissenbaum. He can be reached at [email protected].

ian kerr
Ian Kerr holds the Canada Research Chair in Ethics, Law, and Technology at the University of Ottawa, Faculty of Law with cross-appointments to the Faculty of Medicine, the Department of Philosophy, and the School of Information Studies. He is also the principal investigator of the ID Trail project. Among other things, he is interested in human-machine mergers and the manner in which new and emerging technologies alter our perceptions, conceptions, and expectations of privacy. He can be reached at [email protected].

bert-jaap koops
Bert-Jaap Koops is Professor of Regulation and Technology and the former academic director of the Tilburg Institute for Law, Technology, and Society (TILT) at Tilburg University. His privacy research primarily focuses on cryptography, identity-related crime, and DNA forensics. He can be reached at [email protected].

philippa lawson
Philippa Lawson is the former Director of the Canadian Internet Policy and Public Interest Clinic (CIPPIC), based at the University of Ottawa. Through her research and advocacy work, she has represented consumer interests in privacy issues before policy and law-making bodies. She can be reached at [email protected].

ronald leenes
Ronald Leenes is an Associate Professor at the Tilburg Institute for Law, Technology, and Society (TILT) at Tilburg University. His primary research interests are privacy and identity management, and regulation of, and by, technology.
He is also involved in research in ID fraud, biometrics, and online dispute resolution. He can be reached at [email protected].

ian lloyd
Ian Lloyd is Professor of Information Technology Law at the University of Strathclyde Law School. He has researched and written extensively on data protection and has recently worked on the Data Protection Programme (DAPRO), funded by the European Union. He can be reached at [email protected].

carole lucock
Carole Lucock is a doctoral candidate in the Law and Technology Program at the University of Ottawa, Faculty of Law, and is project manager of the ID Trail. Her research interests include the intersection of privacy, anonymity, and identity, and the potential distinctions between imposed versus assumed anonymity. She can be reached at [email protected].

shoshana magnet
Shoshana Magnet is a Postdoctoral Fellow at McGill University. She has been appointed as an Assistant Professor at the Institute of Women's Studies at the University of Ottawa, commencing in 2009. Her privacy research includes biometrics, borders, and the relationship between privacy and equality. She can be reached at [email protected].

gary t. marx
Gary T. Marx is Professor Emeritus of Sociology at MIT and recently held the position of Hixon-Riggs Professor of Science, Technology, and Society at Harvey Mudd College, Claremont, California. He has written extensively about surveillance, and his current research focuses on new forms of surveillance and social control across borders. He can be reached at [email protected].

david matheson
David Matheson is an Assistant Professor in the Department of Philosophy at Carleton University. Through his privacy-related research, he has written about privacy and knowableness, anonymity and responsible testimony, layperson authentication of contested experts, privacy and personal security, the nature of personal information, and the importance of privacy for friendship. He can be reached at [email protected].

jena mcgill
Jena McGill holds an LLB from the University of Ottawa Faculty of Law and an MA from the Norman Paterson School of International Affairs, Carleton University, Ottawa, Canada. She is currently clerking at the Supreme Court of Canada. She has a strong interest in equality and privacy rights, particularly as they relate to gender issues on a national and international scale. She can be reached at [email protected].

jason millar
Jason Millar is a doctoral candidate in the Department of Philosophy at Queen's University. He is interested in the intersection of ethics, philosophy of technology, and philosophy of mind. His research interests also include cryptography and personal area networks in cyborg applications, particularly their impact on the concept of identity and anonymity. He can be reached at [email protected].

helen nissenbaum
Helen Nissenbaum is Professor of Media, Culture, and Communication at New York University and a Faculty Fellow of the Information Law Institute. She researches ethical and political issues relating to information technology and new media, with a particular emphasis on privacy, the politics of search engines, and values embodied in the design of information technologies and systems. She can be reached at [email protected].

mary o'donoghue
Mary O'Donoghue is Senior Counsel and Manager of Legal Services at the Office of the Information and Privacy Commissioner of Ontario.
She is currently an executive member of the Privacy Law section of the Ontario Bar Association and her privacy research has focused on the constitutional and legal aspects of anonymity. She can be reached at mary.o'[email protected].

david j. phillips
David J. Phillips is Associate Professor at the Faculty of Information, University of Toronto. His research focuses on the political economy and social shaping of information and communication technologies, in particular surveillance and identification technologies. He can be reached at [email protected].

charles d. raab
Charles D. Raab is Professor Emeritus of Government at the School of Social and Political Studies at the University of Edinburgh. His privacy research focuses on surveillance and information policy, with an emphasis on privacy protection and public access to information. He can be reached at [email protected].

valerie steeves
Valerie Steeves is an Assistant Professor in the Department of Criminology and the Faculty of Law at the University of Ottawa. Her main research focus is human rights and technology issues. She has written and spoken extensively on privacy from a human rights perspective and is an active participant in the privacy policymaking process in Canada. She can be reached at [email protected].

anne uteck
Anne Uteck is a doctoral candidate in the Law and Technology Program at the University of Ottawa, Faculty of Law. Her research is on spatial privacy and other privacy interests outside the informational context implicated by emerging surveillance technologies. Recently, she has published on the topic of radio frequency identification (RFID) and consumer privacy. She can be reached at [email protected].

simone van der hof
Simone van der Hof is a Senior Research Fellow and Assistant Professor at the Tilburg Institute for Law, Technology, and Society (TILT) of Tilburg University. Her main research interests are social and legal issues with respect to identity management in electronic public service delivery, regulatory issues concerning profiling technologies and personalization of public and private services, regulation of electronic authentication, electronic evidence and information security, and private (international) law aspects of electronic commerce. She can be reached at [email protected].

the strange return of gyges' ring: an introduction

Book II of Plato's Republic tells the story of a Lydian shepherd who stumbles upon the ancient Ring of Gyges while minding his flock. Fiddling with the ring one day, the shepherd discovers its magical power to render him invisible. As the story goes, the protagonist uses his newly found power to gain secret access to the castle where he ultimately kills the king and overthrows the kingdom. Fundamentally, the ring provides the shepherd with an unusual opportunity to move through the halls of power without being tied to his public identity or his personal history. It also provided Plato with a narrative device to address a classic question known to philosophers as the "immoralist's challenge": why be moral if one can act otherwise with impunity?

the network society

In a network society—where key social structures and activities are organized around electronically processed information networks—this question ceases to be the luxury of an ancient philosopher's thought experiments. With the establishment of a global telecommunications network, the immoralist's challenge is no longer premised on mythology. The advent of the World Wide Web in the 1990s enabled everyone with access to a computer and modem to become unknown, and in some cases invisible, in public spaces—to communicate, emote, act, and interact with relative anonymity. Indeed, this may even have granted users more power than did Gyges' Ring, because the impact of what one could say or do online was no longer limited by physical proximity or corporeality. The end-to-end architecture of the Web's Transmission Control Protocol, for example, facilitated unidentified, one-to-many interactions at a distance. As the now-famous cartoon framed the popular culture of the early 1990s, "On the Internet, nobody knows you're a dog."1 Although this cartoon resonated deeply on various levels, at the level of architecture it reflected the simple fact that the Internet's original protocols did not require people to identify themselves, enabling them to play with their identities—to represent themselves however they wished.

In those heady days bookmarking the end of the previous millennium, the rather strange and abrupt advent of Gyges' Ring 2.0 was by no means an unwelcome event. Network technologies fostered new social interactions of various sorts and provided unprecedented opportunities for individuals to share their thoughts and ideas en masse. Among other things, the Internet permitted robust political speech in hostile environments, allowing its users to say and do things that they might never have dared to say or do in places where their identity was more rigidly constrained by the relationships of power that bracket their experience of freedom. Anonymous browsers and messaging applications promoted frank discussion by employees in oppressive workplaces and created similar opportunities for others stifled by various forms of social stigma. Likewise, new cryptographic techniques promised to preserve personal privacy by empowering individuals to make careful and informed decisions about how, when, and with whom they would share their thoughts or their personal information.

At the same time, many of these new information technologies created opportunities to disrupt and resist the legal framework that protects persons and property. Succumbing to the immoralist's challenge, there were those who exploited the network to defraud, defame, and harass; to destroy property; to distribute harmful or illegal content; and to undermine national security. In parallel with both of these developments, we have witnessed the proliferation of various security measures in the public and private sectors designed to undermine the "ID-free" protocols of the original network. New methods of authentication, verification, and surveillance have increasingly allowed persons and things to be digitally or biometrically identified, tagged, tracked, and monitored in real time and in formats that can be captured, archived, and retrieved indefinitely. More recently, given the increasing popularity of social network sites and the pervasiveness of interactive media used to cultivate user-generated content, the ability of governments, not to mention the proliferating international data brokerage industries that feed them, to collect, use, and disclose personal information about everyone on the network is increasing logarithmically. This phenomenon is further exacerbated by corporate and government imperatives to create and maintain large-scale information infrastructures to generate profit and increase efficiencies.

In this new world of ubiquitous handheld recording devices, personal webcams, interconnected closed circuit television (CCTV) cameras, radio frequency identification (RFID) tags, smart cards, global satellite positioning systems, HTTP cookies, digital rights management systems, biometric scanners, and DNA sequencers, the space for private, unidentified, or unauthenticated activity is rapidly shrinking. Many worry that the regulatory responses to the real and perceived threats posed by Gyges' Ring have already profoundly challenged our fundamental commitments to privacy, autonomy, equality, security of the person, free speech, free movement, and free association. Add in the shifting emphasis in recent years toward public safety and national security, and network technologies appear to be evolving in a manner that is transforming the structures of our communications systems from architectures of freedom to architectures of control.

Are we shifting away from the original design of the network, from spaces where anonymity and privacy were once the default position to spaces where nearly every human transaction is subject to tracking, monitoring, and the possibility of authentication and identification?

1. Peter Steiner, "On the Internet, Nobody Knows You're a Dog," The New Yorker (July 5, 1993), http://www.cartoonbank.com/item/22230.

The ability or inability to maintain privacy, construct our own identities, control the use of our identifiers, decide for ourselves what is known about us, and, in some cases, disconnect our actions from our identifiers will ultimately have profound implications for individual and group behavior. It will affect the extent to which people, corporations, and governments choose to engage in global electronic commerce, social media, and other important features of the network society. It will affect the way that we think of ourselves, the way we choose to express ourselves, the way that we make moral decisions, and our willingness and ability to fully participate in political processes. Yet our current philosophical, social, and political understandings of the impact and importance of privacy, identity, and anonymity in a network society are simplistic and poorly developed, as is our understanding of the broader social impact of emerging network technologies on existing legal, ethical, regulatory, and social structures. This book investigates these issues from a number of North American and European perspectives. Our joint examination is structured around three core organizing themes: (1) privacy, (2) identity, and (3) anonymity.

privacy

The jurist Hyman Gross once described privacy as a concept "infected with pernicious ambiguities."2 More recently, Canadian Supreme Court Justice Ian Binnie expressed a related worry, opining that "privacy is protean."3 The judge's metaphor is rather telling when one recalls that Proteus was a shape-shifter who would transform in order to avoid answering questions about the future. Perhaps U.S. novelist Jonathan Franzen had something similar in mind when he characterized privacy as the "Cheshire cat of values."4 One wonders whether privacy will suffer the same fate as Lewis Carroll's enigmatic feline—all smile and no cat.

Certainly, that is what Larry Ellison seems to think. Ellison is the CEO of Oracle Corporation and the fourteenth richest person alive. In the aftermath of September 11, 2001, Ellison offered to donate to the U.S. Government software that would enable a national identification database, boldly stating in 2004 that "The privacy you're concerned about is largely an illusion. All you have to give up is your illusions, not any of your privacy."5 As someone who understands the power of network databases to profile people right down to their skivvies (and not only to provide desirable recommendations for a better brand!), Ellison's view of the future of privacy is bleak. Indeed, many if not most contemporary discussions of privacy are about its erosion in the face of new and emerging technologies. Ellison was, in fact, merely reiterating a sentiment that had already been expressed some five years earlier by his counterpart at Sun Microsystems, Scott McNealy, who advised a group of journalists gathered to learn about Sun's data-sharing software: "You have zero privacy anyway. Get over it."6

To turn Hyman Gross's eloquent quotation on its head—the Ellison/McNealy conception of privacy is infected with ambiguous perniciousness. It disingenuously—or perhaps even malevolently—equivocates between two rather different notions of privacy in order to achieve a self-interested outcome: it starts with a descriptive account of privacy as the level of control an individual enjoys over her or his personal information and then draws a prescriptive conclusion that, because new technologies will undermine the possibility of personal control, we therefore ought not to expect any privacy. Of course, the privacy that many of us expect is not contingent upon or conditioned by the existence or prevalence of any given technology. Privacy is a normative concept that reflects a deeply held set of values that predates and is not rendered irrelevant by the network society. To think otherwise is to commit what philosopher G. E. Moore called the "naturalistic fallacy,"7 or as Lawrence Lessig has restyled it, the "is-ism":

The mistake of confusing how something is with how it must be. There is certainly a way that cyberspace is. But how cyberspace is is not how cyberspace has to be. There is no single way that the net has to be; no single architecture that defines the nature of the net. The possible architectures of something that we would call "the net" are many, and the character of life within those different architectures are [sic] diverse.8

Although the "character of life" of privacy has, without question, become more diverse in light of technologies of both the privacy-diminishing and privacy-preserving variety, the approach adopted in this book is to understand privacy as a normative concept. In this approach, the existence of privacy rights will not simply depend on whether our current technological infrastructure has reshaped our privacy expectations in the descriptive sense. It is not a like-it-or-lump-it proposition. At the same time, it is recognized that the meaning, importance, impact, and implementation of privacy may need to evolve alongside the emergence of new technologies. How privacy ought to be understood—and fostered—in a network society certainly requires an appreciation of and reaction to new and emerging network technologies and their role in society.

Given that the currency of the network society is information, it is not totally surprising that privacy rights have more recently been recharacterized by courts as a kind of "informational self-determination."9 Drawing on Alan Westin's classic definition of informational privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others,"10 many jurisdictions have adopted fair information practice principles11 as the basis for data protection regimes.12 These principles and the laws that support them are not a panacea, as they have been developed and implemented on the basis of an unhappy compromise between those who view privacy as a fundamental human right and those who view it as an economic right.13 From one perspective, these laws aim to protect privacy, autonomy, and dignity interests. From another, they are the lowest common denominator of fairness in the information trade. Among other things, it is thought that fair information practice principles have the potential to be technology neutral, meaning that they apply to any and all technologies so that privacy laws do not have to be rewritten each time a new privacy-implicating technology comes along. A number of chapters in this book challenge that view.

Our examination of privacy in Part I of this book begins with the very fulcrum of the fair information practice principles—the doctrine of consent. Consent is often seen as the legal proxy for autonomous choice and is therefore anchored in the traditional paradigm of the classical liberal individual, which is typically thought to provide privacy's safest harbor. As an act of ongoing agency, consent can also function as a gatekeeper for the collection, use, and disclosure of personal information. As several of our chapters demonstrate, however, consent can also be manipulated, and reliance on it can generate unintended consequences in and outside of privacy law. Consequently, we devote several chapters to interrogations of the extent to which the control/consent model is a sufficient safeguard for privacy in a network society.

Does privacy live on liberal individualism alone? Some of our chapters seek out ways of illuminating privacy in light of other cherished collective values such as equality and security. Although the usual temptation is to understand these values as being in conflict with privacy, our approach in this book casts privacy as complementary to and in some cases symbiotic with these other important social values. Privacy does not stand alone. It is nested in a number of social relationships and is itself related to other important concepts, such as identity and anonymity. We turn to those concepts in Parts II and III of the book.

2. Hyman Gross, "The Concept of Privacy," N.Y.U. L. REV. 43 (1967): 34–35.
3. R. v Tessling, 2004 SCC 67, [2004] 3 S.C.R. 432, per Justice Binnie, at 25.
4. Jonathan Franzen, How to Be Alone: Essays (New York: Farrar, Straus and Giroux, 2003), 42.
5. Larry Ellison, quoted in L. Gordon Crovitz, "Privacy? We Got Over It," The Wall Street Journal, A11, August 25, 2008, http://online.wsj.com/article/SB121962391804567765.html?mod=rss_opinion_main.
6. Ibid., Scott McNealy quote.
7. G. E. Moore, Principia Ethica (Cambridge: Cambridge University Press, 1903).
8. Lawrence Lessig, Code: And Other Laws of Cyberspace, Version 2.0 (New York: Basic Books, 2006): 32.
9. Known in German as "Informationelles selbstbestimmung," this expression was first used jurisprudentially in Volkszählungsurteil vom 15. Dezember 1983, BVerfGE 65, 1, German Constitutional Court (Bundesverfassungsgerichts) 1983.
10. Alan Westin, Privacy and Freedom (New York: Atheneum, 1967): 7.
11. Organization for Economic Cooperation and Development, Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, Annex to the Recommendation of the Council of 23 September 1980, http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html.
12. Article 29 of Directive EC, Council Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, [1995] O.J. L. 281: 31; Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5.
13. Canada. House of Commons Standing Committee on Human Rights and the Status of Persons with Disabilities. 35th Parliament, 2nd Session. Privacy: Where Do We Draw the Line? (Ottawa: Public Works and Government Services Canada, 1997).

identity

Although lofty judicial conceptions of privacy such as "informational self-determination" set important normative standards, the traditional notion of a pure, disembodied, and atomistic self, capable of making perfectly rational and isolated choices in order to assert complete control over personal information, is not a particularly helpful fiction in a network society. If a fiction there must be, one that is perhaps more worthy of consideration is the idea of identity as a theft of the self. Who we are in the world and how we are identified is, at best, a concession. Aspects of our identities are chosen, others assigned, and still others accidentally accrued. Sometimes they are concealed at our discretion, other times they are revealed against our will. Identity formation and disclosure are both complex social negotiations, and in the context of the network society, it is not usually the individual who holds the bargaining power.

Because the network society is to a large extent premised on mediated interaction, who we are (and who we say we are) is not a self-authenticating proposition in the same way that it might be if we were close kin or even if we were merely standing in physical proximity to one another. Although we can be relatively certain that it is not a canine on the other end of an IM chat, the identity of the entity at the other end of a transaction may be entirely ambiguous. Is it a business partner, an imposter, or an automated software bot? The same could be true of someone seeking to cross an international border, order an expensive product online, or fly an airplane—assuming she or he is able to spoof the appropriate credentials or identifiers.

As we saw in the extreme example of the shepherd in possession of Gyges' Ring, those who are able to obfuscate their identities sometimes take the opportunity to act with limited accountability. This is one of the reasons why network architects and social policymakers have become quite concerned with issues of identity and identification. However, it is important to recognize that identification techniques can preserve or diminish privacy. Their basic function is to make at least some aspects of an unknown entity known by mapping it to a knowable attribute.

An identification technique is more likely to be privacy preserving if it takes a minimalist approach with respect to those attributes that are to become known. For example, an automated highway toll system may need to authenticate certain attributes associated with a car or driver in order to appropriately debit an account for the cost of the toll. But to do so, it need not identify the car, the driver, the passengers, or for that matter the ultimate destination of the vehicle. Instead, anonymous digital credentials14 could be assigned that would allow cryptographic tokens to be exchanged through a network in order to prove statements about them and their relationships with the relevant organization(s) without any need to identify the drivers or passengers themselves. Electronic voting systems can do the same thing.

In Part II of the book we explore these issues by investigating different philosophical notions of identity and discussing how those differences matter. We also address the role of identity and identification in achieving personal and public safety. We consider whether a focus on the protection of "heroic" cowboys who refuse to reveal their identities in defiance of orders to do so by law enforcement officers risks more harm than good, and whether unilateral decisions by the State to mandate control over the identities of heroic sexually assaulted women as a protective measure risk less good than harm. We examine the interaction of self and other in the construction of identity and demonstrate in several chapters why discussions of privacy and identity cannot easily be disentangled from broader discussions about power, gender, difference, and discrimination. We also examine the ways in which identity formation and identification can be enabled or disabled by various technologies. A number of technologies that we discuss—data-mining, automation, ID cards, ubiquitous computing, biometrics, and human-implantable RFID—have potential narrowing effects, reducing who we are to how we can be counted, kept track of, or marketed to. Other technologies under investigation in this book—mix networks and data obfuscation technologies—can be tools for social resistance used to undermine identification and the collection of personal information, returning us to where our story began.

anonymity

We end in Part III with a comparative investigation of the law's response to the renaissance of anonymity. Riffing on Andy Warhol's best known turn of phrase, an internationally (un)known British street artist living under the pseudonym "Banksy"15 produced an installation with words on a retro-looking pink screen that say, "In the future, everyone will have their 15 minutes of anonymity."16 Was this a comment on the erosion of privacy in light of future technology? Or was it a reflection of Banksy's own experience regarding the challenges of living life under a pseudonym in a network society? Whereas Warhol's "15 minutes of fame" recognized the fleeting nature of celebrity and public attention, Banksy's "15 minutes of anonymity" recognizes the long-lasting nature of information ubiquity and data retention.

Although privacy and anonymity are related concepts, it is important to realize that they are not the same thing. There are those who think that anonymity is the key to privacy. The intuition is that a privacy breach cannot occur unless the information collected, used, or disclosed about an individual is associated with that individual's identity. Many anonymizing technologies exploit this notion, allowing people to control their personal information by obfuscating their identities. Interestingly, the same basic thinking underlies most data protection regimes, which one way or another link privacy protection to an identifiable individual. According to this approach, it does not matter if we collect, use, or disclose information, attributes, or events about people so long as the information cannot be (easily) associated with them.

Although anonymity, in some cases, enables privacy, it certainly does not guarantee it. As Bruce Schneier has pointed out17 and as any recovering alcoholic knows all too well, even if Alcoholics Anonymous does not require you to show ID or to use your real name, the meetings are anything but private. Anonymity in public is quite difficult to achieve. The fact that perceived anonymity in public became more easily achieved through the end-to-end architecture of the Net is part of what has made the Internet such a big deal, creating a renaissance in anonymity studies not to mention new markets for the emerging field of identity management. The AA example illustrates another crucial point about anonymity. Although there is a relationship between anonymity and invisibility, they are not the same thing. Though Gyges' Ring unhinged the link between the shepherd's identity and his actions, the magic of the ring18 was not merely in enabling him to act anonymously (and therefore without accountability): the real magic was his ability to act invisibly. As some leading academics have recently come to realize, visibility and exposure are also important elements in any discussion of privacy, identity, and anonymity.19 Indeed, many argue that the power of the Internet lies not in the ability to hide who we are, but in freeing some of us to expose ourselves and to make ourselves visible on our own terms.

Given its potential ability to enhance privacy on one hand and to reduce accountability on the other, what is the proper scope of anonymity in a network society? Although Part III of the book does not seek to answer this question directly, it does aim to erect signposts for developing appropriate policies by offering a comparative investigation of anonymity and the law in five European and North American jurisdictions. How the law regards anonymity, it turns out, is not a question reducible to discrete areas of practice. As we shall see, it is as broad ranging as the law itself. Interestingly, despite significant differences in the five legal systems and their underlying values and attitudes regarding privacy and identity, there seems to be a substantial overlap in the way that these legal systems regard anonymity, which is not generally regarded as a right and certainly not as a foundational right. In the context of these five countries, it might even be said that the law's regard for anonymity is to some extent diminishing. When one considers these emerging legal trends alongside the shifting technological landscape, it appears that the answer to our question posed at the outset is clear: the architecture of the network society seems to be shifting from one in which anonymity was the default to one where nearly every human transaction is subject to monitoring and the possibility of identity authentication.

But what of the strange return of Gyges' Ring and the network society in which it reemerged? And what do we wish for the future of privacy, identity, and anonymity? Let us begin the investigation.

14. David Chaum, "Achieving Electronic Privacy," Scientific American (August 1992): 96–101; Stefan A. Brands, Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy (Cambridge, MA: MIT Press, 2000).
15. Banksy, "By Banksy," http://www.banksy.co.uk/- (accessed September 10, 2008).
16. Banksy, interviewed by Shepard Fairey in "Banksy," Swindle Magazine, no. 8 (2008), http://swindlemagazine.com/issue08/banksy/ (accessed September 10, 2008).
17. Bruce Schneier, "Lesson From Tor Hack: Anonymity and Privacy Aren't the Same," Wired (September 20, 2007), http://www.wired.com/politics/security/commentary/securitymatters/2007/09/security_matters_0920?currentPage=2 (accessed September 10, 2008).
18. Arthur C. Clarke's famous third law states, "Any sufficiently advanced technology is indistinguishable from magic." See Arthur C. Clarke, http://www.quotationspage.com/quotes/Arthur_C._Clarke/, Profiles of the Future; An Inquiry Into the Limits of the Possible (Toronto: Bantam Books, 1971).
19. Hille Koskela, "'In Visible City': Insecurity, Gender, and Power Relations in Urban Space," in Voices from the North. New Trends in Nordic Human Geography, eds. J. Öhman and K. Simonsen (Burlington: Ashgate, Aldershot, 2003): 283–294; Julie Cohen, "Privacy, Visibility, Transparency, and Exposure," University of Chicago Law Review 75, no. 1 (2008); Kevin D. Haggerty and Richard V. Ericson, The New Politics of Surveillance and Visibility (Toronto: University of Toronto Press, 2005).

27. anonymity and the law in the netherlands

simone van der hof, bert-jaap koops, and ronald leenes

I. Introduction 503
II. Anonymity in Constitutional Law 504
   A. A General Right to Anonymity 504
   B. Anonymity as Part of Other Constitutional Rights 505
III. Anonymity in Criminal Law 506
IV. Anonymity in Private Law 509
   A. Civil Proceedings 509
   B. Contract Law 510
V. Anonymity in Public 512
VI. Anonymity in Citizen-Government Relationships 514
   A. Service Delivery 514
   B. e-Voting 516
   C. Anonymized Case-Law and "Naming and Shaming" 517
VII. Conclusion 518
   A. Does a Right to Anonymity Exist in Dutch Law? 518
   B. Should a Right to Anonymity Be Created in Dutch Law? 519

i. introduction

Anonymity is important in current society. The feeling that anonymity is disappearing has raised the question of whether a right to anonymity exists, or whether such a right should be created given technological and societal developments. In this chapter, we address this question from a Dutch legal perspective. Our analysis of the hypothetical legal right to anonymity in the Netherlands may contribute to the overall research into the status and importance of a right to anonymity in contemporary society.

The core of this chapter consists of an overview of relevant areas in Dutch law where a right to anonymity may be found, construed, or contested. Section 2 discusses anonymity in constitutional law. Sections 3 and 4 explore the status of anonymity in criminal law and private law, respectively. Section 5 provides an overview of anonymity in public spaces. In section 6, we focus on anonymity in citizen-government relationships: service delivery, e-voting, anonymized case-law, and naming and shaming. Finally, we draw conclusions regarding the right to anonymity in current Dutch law—something that turns out to be only piecemeal, and rather weak. The chapter concludes with a reflection on these conclusions: should a right to anonymity (or, at least, a more powerful right than the current one) be created in Dutch law?

ii. anonymity in constitutional law

A right to anonymity in the Dutch Constitution (Grondwet, hereafter: DC)1 could be construed in several ways: as a general right to anonymity (i.e., as a separate constitutional right), or as a right subsidiary to or included in other constitutional rights, such as the rights to privacy, secrecy of communications, and freedom of expression. In the following, we explore the DC; the scope of this chapter does not allow us to discuss the equally important European Convention on Human Rights (ECHR)2 as a source of constitutional rights.

A. A General Right to Anonymity

There is no general right to anonymity in the DC, and it is unlikely that one will be created in the foreseeable future. Anonymity was discussed in the late 1990s in relation to the amendment of Art. 13, DC (confidential communications).3 After the amendment floundered in the First Chamber, the Dutch government decided to investigate a broader update of the fundamental rights in the Constitution in light of information and communications technologies. To this aim, the Committee on Fundamental Rights in the Digital Age was instituted to advise the government. The Committee, surprisingly, considered a right to anonymity as an alternative to the right to privacy. Predictably, this was not found to be a sound alternative, anonymity being further-reaching than privacy and therefore requiring more exceptions.4 The Cabinet, in its reaction, agreed, adding that a right to anonymity is unnecessary to substitute or complement the right to privacy. Art. 10, DC (the general right to privacy), provides sufficient protection even if anonymity were considered to be a starting point in society.5 Moreover, the cabinet stated that knowability rather than anonymity is the norm in society, and although there is sometimes a need for anonymity, this need does not warrant safeguards at a constitutional level.6 This view generally reflects academic literature.7

1. An English version of the Dutch Constitution is available at http://www.minbzk.nl/contents/pages/6156/grondwet_UK_6-02.pdf.
2. European Convention on Human Rights (November 4, 1950).
3. Dutch Constitution, 1983 (Grondwet), Art. 13. See generally Bert-Jaap Koops and Marga Groothuis, "Constitutional Rights and New Technologies in the Netherlands," in Constitutional Rights and New Technologies: A Comparative Study, eds. Ronald Leenes, Bert-Jaap Koops, and Paul de Hert (The Hague: T.M.C. Asser Press, 2008), 159.
4. Committee on Fundamental Rights in the Digital Age, "Grondrechten in het digitale tijdperk" (Fundamental Rights in the Digital Age), Kamerstukken II 27460, 2000/01, no. 1, p. 125 (appendix), http://www.minbzk.nl/actueel?ActItmIdt=6427.
5. Ibid., 20.

B. Anonymity as Part of Other Constitutional Rights

Although the DC lacks a proper right to anonymity, the government holds the view that anonymity often goes hand in hand with privacy and data protection (Art. 10, DC).8 Furthermore, other rights can "shelter" anonymity, such as the right to confidential communications (Art. 13, DC) and the right to freedom of expression (Art. 10, ECHR).9 For example, the anonymity of a whistleblower can be protected by a journalist's right to protection of sources.10 In light of the "shelter" provided by these constitutional rights, there seems to be no need to establish a separate right for anonymity. However, it is unclear how far this protection stretches, for conditions for sheltering are lacking. Presumably, these conditions will be fairly strict, in light of the repeated statement by the government that identification rather than anonymity is the norm in current society:

[A]gainst a certain desirability of anonymity, there is the fact that the functioning of our society is based, rather, on identifiability. In order to meet obligations and for law enforcement, knowability is appropriate and necessary in order to adequately protect the interests of third parties. In these frameworks, it is important that the responsibility for acts can be attributed to identifiable persons.11

Also, in Dutch academic literature there is little support for explicit constitutional protection of anonymity, even though anonymity itself is seen as important. The subsumption of anonymity under other constitutional provisions seems sufficient, freedom of expression being a more likely candidate than the right to privacy or the right to secrecy of communications.

6. Kamerstukken II 27460, 2000/01, no. 2, p. 44 (n. 3).
7. For an extensive discussion on the constitutional grounds for anonymity in public speech, see A. H. Ekker, Anoniem communiceren: van drukpers tot weblog (Den Haag: Sdu, 2006).
8. The General Right to Privacy, Dutch Constitution, 1983 (Grondwet), Art. 10.
9. Dutch Constitution, 1983 (Grondwet), Art. 13; European Convention on Human Rights, Art. 10.
10. Explanatory Memorandum, p. 5, from Letter from the Minister, October 29, 2004, with a Draft Bill and Explanatory Memorandum to amend Art. 10 Dutch Constitution (n. 8), and the advice of the Council of State, http://www.minbzk.nl/aspx/get.aspx?xdl=/views/corporate/xdl/page&VarIdt=109&ItmIdt=101328&ActItmIdt=12755; see Voskuil v. The Netherlands, ECHR November 22, 2007, for a case in point.
11. Ibid., 4. Unless otherwise stated, all translations in this chapter are the authors'.

iii. anonymity in criminal law

Anonymity plays a clear role in criminal law enforcement. First, anonymity is of interest when reporting a crime. Generally, reporting a crime is done in writing or orally, put to paper by an officer and signed by the person reporting (Art. 163, Dutch Code of Criminal Procedure (hereafter: DCCP)).12 Signing implies identification, which negatively affects people's willingness to report crimes. To stimulate crime reporting by people who fear retribution, an anonymous reporting system, "M.," was introduced in 2002; its catch-phrase, "Report Crime Anonymously" (Meld Misdaad Anoniem), is actively promoted in the media.13 M. is a toll-free number (0800-7000) that can be called to report serious crimes, which will then be forwarded to the police or other law-enforcement agencies with a guarantee of anonymity. The reporting system is likely to be supplemented by anonymous reporting by victims—of intimidation, for example. This requires changes in the DCCP in order for anonymous reports to be admissible as evidence in court.14

Second, and more important, witnesses can remain anonymous in specific situations. The Witness Protection Act of 1994 has introduced the concept of "threatened witness" (bedreigde getuige)—a witness whose identity is kept secret during interrogation at the court's order (Art. 136c, DCCP).15 The court first has to determine whether a witness really requires anonymity, something judged to be the case only if there is reasonable fear for the life, health, security, family life, or social-economic subsistence of the witness, and if the witness has declared his or her intent to abstain from witnessing because of this threat (Art. 226a, DCCP).16 If anonymity is granted, the witness is heard by the investigating judge (who knows the witness's identity but makes sure that the interrogation safeguards anonymity, Art. 226c, DCCP), if necessary in the absence of the defendant, attorney, and prosecutor (Art. 226d, DCCP).17 The judge investigates and reports on the witness's reliability (Art. 226e, DCCP).18 The Act also provides for a witness-protection program.19

12. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 163.
13. See http://www.meldmisdaadanoniem.nl/ (in Dutch), http://www.meldmisdaadanoniem.nl/article.aspx?id=203 (English). Cf. Gerechtshof [Court of Appeal] Amsterdam, February 7, 2005, LJN AS5816.
14. http://www.nu.nl/news/1118441/14/rss/Ministers_werken_aan_anonieme_aangifte_bij_politie.html.
15. Witness Protection Act (Wet getuigenbescherming) of November 11, 1993, Staatsblad 1993 (Netherlands) 603, entry into force February 1, 1994. The provision is included in the Dutch Code of Criminal Procedure as Art. 136c.
16. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 226a.
17. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 226c and 226d.
18. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 226e.
19. See Art. 226f, DCCP, the Witness Protection Decree (Besluit getuigenbescherming), and the Ruling on the Witness Protection Police Register (Reglement politieregister getuigenbescherming). Note that the latter implies that many identifying data of threatened witnesses can be registered by the police, including old and new identity, address, description and photograph, birth data, and transport and communication data, in order to execute the witness-protection program.

A recent change in the DCCP enables intelligence officers to testify anonymously as a "shielded witness" (afgeschermde getuige, Art. 136d, DCCP) in criminal court proceedings.20 The identity of a shielded witness is kept secret in a way similar to that of a threatened witness if interests of state security or a considerable interest of the witness or another party so requires (Art. 226g and 226h, DCCP).21 Only the Rotterdam investigating judge is authorized to hear shielded witnesses (Art. 178a(3), DCCP).22 Testimony reports should contain no information that undermines the interests of the witness or the state, and are only shown to the defense and included in the case records if the witness consents (Art. 226j(2) and (3), and 226m, DCCP).23 A result of these far-reaching provisions is that the defense has limited possibilities to question the evidence given by intelligence officers. Here, the right to anonymity as a safeguard of state security seems to prevail over the right of the defendant to a fair trial.

A right to anonymity for suspects is absent in Dutch law. Fingerprinting suspects to facilitate identification is deemed lawful on the basis of the Police Act (Art. 2, Police Act).24 Recent "measures in the interest of the investigation" go even further to identify anonymous suspects. They are "indirect means of coercion to force the suspect to reveal identifying data himself."25 Among other measures, Art. 61a, DCCP, allows the police to take photographs and fingerprints, bring about a witness confrontation, conduct a smell-identification test, and cut hair or let it grow.26 These measures to facilitate identification can only be used in cases involving crimes for which custody is allowed. Anonymous suspects who are stopped or arrested can also be asked for their social-fiscal number and frisked (Art. 55b, DCCP), and suspects may be held for interrogation, with the purpose of determining their identity, for a maximum of 6 to 12 hours (Art. 61, DCCP).27

Parties in criminal proceedings have very limited rights to remain anonymous. In contrast, law-enforcement agencies have abundant powers to collect identifying data, which bears on the overall picture of anonymity in Dutch law.

20. Shielded Witnesses Act (Wet afgeschermde getuigen), Staatsblad 2006, 460 (Netherlands), in force since November 1, 2006.
21. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 226g and 226h.
22. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 178a(3).
23. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 226j(2) and (3), and 226m.
24. Police Act (Politiewet), Art. 2.
25. According to the legislator, as quoted in C. P. M. Cleiren and J. F. Nijboer, Tekst & Commentaar Strafvordering, 2nd edition (Deventer: Kluwer, 1997), note 1 to Dutch Code of Criminal Procedure, Art. 61a.
26. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 61a.
27. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 55b.

As of January 1, 2006, a broad range of data-production orders has been put in place, allowing any investigating officer to order the production of identifying data in case of any crime (although not misdemeanors),28 provided that the data are processed for purposes other than personal use (Art. 126nc, DCCP). The production order can also be given in case of "indications" of a terrorist crime—a lower standard than the "probable cause" normally required for investigation (Art. 126zk, DCCP).29 Identifying data that are processed for personal use (e.g., a citizen's address book) can be ordered by a public prosecutor for a crime for which preliminary detention is allowed (Art. 126nd).30 A separate rule with similar conditions (Art. 126na, 126ua, and 126zi, DCCP) allows the identification of telecommunications data, such as IP addresses.31

Separate powers provide for the identification of prepaid-card users, because even telecom providers do not know their identity. A mandatory registration and identification scheme for prepaid-card buyers was briefly considered in the 1990s, but, this being considered too extensive, two less infringing measures were taken instead. Art. 126na(2), DCCP, allows providers to be ordered to retrieve the phone number of a prepaid-card user by means of data mining if the police provide them with two or more dates, times, and places from which the person in question is known to have called.32 To make sure that providers have these data available, a three-month data-retention obligation is in place.33 If data mining by the telecommunications provider is impossible or overly inefficient, the police can also use an IMSI catcher—a device resembling a mobile phone base station that attracts the mobile phone traffic in its vicinity (Art. 126nb and 126ub, DCCP; Art. 3.10(4), Telecommunications Act).34 An IMSI catcher may only be used to collect an unknown telephone number (or IMSI number), not to collect traffic data or to listen in on communications.

28. Or in cases of planned organized crime, on the basis of the Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126uc.
29. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126zk.
30. Or in cases of planned organized crime (Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126ud) or of "indications" of a terrorist crime (Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126zl).
31. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126na, 126ua, and 126zi.
32. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126na(2).
33. Art. 13.4(2), Telecommunications Act (Telecommunicatiewet), juncto Decree on Special Collection of Telecommunications Number Data (Besluit bijzondere vergaring nummergegevens telecommunicatie), Staatsblad 2002, 31, in force since March 1, 2002. Note that the Data Retention Directive, 2006/24/EC, has not yet been implemented in Dutch law. This directive requires electronic-communications providers to store traffic data for a period of 6 to 24 months. A bill is pending in Parliament with a retention period of 12 months.
34. Dutch Code of Criminal Procedure, Art. 126nb and 126ub; Telecommunications Act (Telecommunicatiewet), Art. 3.10(4).

iv. anonymity in private law

A. Civil Proceedings

Identification is the cornerstone of the enforcement of citizens' rights in civil proceedings. Serving a summons, for example, is very difficult if a person's identity, including his or her address, is unknown.35 Anonymity is, however, not a priori excluded in the Dutch Civil Procedure Code (DCPC). Under exceptional circumstances (e.g., in case of genuine fear of retaliation), anonymous witness statements are admissible in civil proceedings.36 In these cases, the identity of the witness is unknown to the other party in the proceedings but is known to the court. Statements of anonymous witnesses differ from those of regular witnesses but produce the same result; it is up to the court to weigh them in the case at hand. Another example concerns summonses to quit vacant properties, which can be given to anonymous persons under certain conditions.37 First, such a summons must relate to (a part of) real estate. Second, it must not be possible to ascertain the name and place of residence of the person(s) concerned with reasonable effort. This obviously applies to squatters, whose identity can only be retrieved with great difficulty—if at all. Their anonymity cannot be maintained in appeal; by then, their identity must be known.

Internet Service Providers (ISPs) may be requested to identify subscribers in civil proceedings as a result of the Dutch Electronic Commerce Act.38 In civil proceedings, the court may order ISPs to disclose the identity of users who post information on sites hosted by the ISPs.39 The Dutch High Court confirmed this position in Lycos v. Pessers.40 It decided that ISPs may have a duty to provide a third party with identity information, the non-observance of which may amount to tort,41 even if the allegations on that person's Web site are not prima facie illegal or unjust. The following considerations are relevant:

• The possibility that the information is unjust and damaging to a third party is sufficiently reasonable.
• The third party has a reasonable interest in receiving the identity information.
• It is reasonably certain that less invasive possibilities of obtaining the information do not exist.
• The third party's interests carry more weight than those of the ISP and the Web site owner (if known).

35. See Dutch Civil Procedure Code (Wetboek van Burgerlijke Rechtsvordering), Art. 45(2).
36. See Dutch Civil Procedure Code (Wetboek van Burgerlijke Rechtsvordering), Art. 165ff.
37. See Dutch Civil Procedure Code (Wetboek van Burgerlijke Rechtsvordering), Art. 45(3) and 61.
38. Dutch Electronic Commerce Act (Aanpassingswet elektronische handel).
39. Kamerstukken II 28197, 2001/02, no. 3, p. 28. See also Art. 15(2) of Directive 2000/31/EC on e-commerce (which was, however, not implemented into Dutch law).
40. Hoge Raad [Dutch High Court] November 25, 2005, LJN AU4019. (LJN refers to the publication number at the Dutch official case-law publication Web site, http://www.rechtspraak.nl).
41. See Dutch Civil Code (Burgerlijk Wetboek), Art. 6:162.

In the 2006 case Brein v. UPC, the court added as a further requirement that the person whose identity information is requested must be, beyond reasonable doubt, the person who conducted the allegedly illegal activities.42 In the same year, the court ruled in Stokke v. Marktplaats that online marketplaces do not have to provide their users' identifying information to third parties unless withholding such information would be unreasonable.43

B. Contract Law

Contracting parties are free to stipulate the conditions of a contract, which means they can decide not to exchange identifying information or to remain mutually anonymous. Anonymity can be purposeful, even instrumental, to the transaction process, as witnessed by the popularity of online marketplaces and brokers remunerated for matching orders and offering contracts that guarantee buyer and seller anonymity (even as the broker knows the identity of both).

Anonymity in contract law is often restricted for practical and legal reasons. From a practical perspective, identity information may be required to perform contractual obligations, such as when physical delivery or payment of the products is involved (there are limited possibilities for delivering or paying anonymously). Moreover, identity information plays a role in building trust; the perceived trustworthiness of an identified business partner appears to be greater than that of an anonymous one. The risks and importance of a transaction determine the required level of assurance regarding the provided identity information. If payment is made up front, there may be no need to know the buyer's identity. In other cases, a check of the trade register will suffice. In yet others, digital certificates may be required to build a high level of trust. Identity data is also collected for the purpose of personalizing services and in determining the value of customers. Furthermore, identity data is nowadays considered economically valuable—a business asset that provides businesses with incentives to not do business anonymously, even though they could.

From a legal perspective, the identity of a defaulting party may be necessary to hold the party accountable.

42. Rechtbank Amsterdam [District Court], August 24, 2006, LJN AY6903. Note that Ekker, Anoniem communiceren (n. 7), offers some recommendations to shape the right to anonymity in legislation, with a procedure for providing identifying data in civil proceedings and a provision in the Telecommunications Act, Ch. 10.
43. Rb. Zwolle, May 3, 2006, LJN AW6288.

There are also legal obligations requiring online businesses to provide identity information.44 "Identity" in this respect means the name of the natural person or of the business providing the online service.45 In the case of businesses, the identity of the owner need not be disclosed on the Web site; this information can be obtained for a marginal fee from the trade register, which was established long ago to provide data about businesses in order to facilitate trust in commerce.

Dutch contract law is ruled by the principle of consensualism (see Art. 3:37(1), Dutch Civil Code (hereafter: DCivC)), yet, in some instances, a certain form (e.g., a signed writing) is required, without which a contract may be void or annulled. Because, according to Dutch case law, signatures are constituted by individualization and identification, signed writings undercut anonymity; a cross, a drawn picture, or a stamp (unless equivalent to a handwritten signature) are not considered legally valid signatures.46

Electronic signatures are legally valid. The DCivC defines an electronic signature as data in electronic form that are attached to or logically associated with other electronic data and that serve as a method of authentication. Authentication does not necessarily mean identity authentication but may be restricted to authentication of data (e.g., the contents of the contract). In our view, the law, therefore, does not necessarily require an electronic signature to fulfill the identification function. Hence, signing electronically (contrary to what is legally allowed with respect to handwritten signatures) can be done anonymously, although possibly bringing with it less legal certainty about the evidential value such a signature holds in court.47

44. See Art. 7:46c(1a), Dutch Civil Code (Burgerlijk Wetboek). In this respect, this provision overlaps with the Dutch Civil Code (Burgerlijk Wetboek), Art. 3:15d(1a), although these provisions address different kinds of information to be provided by online businesses. Another difference between these provisions is the scope of application: the Dutch Civil Code (Burgerlijk Wetboek), Art. 7:46c, is not restricted to online consumer contracts but covers distance contracts more generally.
45. Kamerstukken II 28197, 2001/02, no. 3, p. 38.
46. S. M. Huydecoper and R. E. van Esch, Geschriften en handtekeningen: een achterhaald concept?, ITeR Series, Vol. 7 (Alphen aan den Rijn: Samsom Bedrijfsinformatie, 1997).
47. The principle of contractual freedom, however, also allows contracting parties to determine the evidential value of an electronic signature between them in a probative contract; see also Dutch Civil Code (Burgerlijk Wetboek), Art. 3:15a(6), and consideration 16 of Directive 1999/93/EC. Moreover, the law expressly leaves room for such signatures, which may, nonetheless, be legally equivalent to handwritten signatures so long as the method of authentication is sufficiently reliable in view of the purpose for which the electronic signature was used, and of all other circumstances (see Dutch Civil Code (Burgerlijk Wetboek), Art. 3:15a(1)). This is also called the functional approach. Compare Art. 7 of the UNCITRAL Model Law on Electronic Commerce of 1996, which also takes a functional approach with respect to electronic signatures but expressly requires a method for identifying the signer.

The court may consider nonidentifying electronic signatures less reliable than identifying electronic signatures or secure identifying electronic signatures. Directive 1999/93/EC on e-signatures, on which the Dutch Electronic Signature Act is based,48 explicitly permits the use of pseudonyms, yet both the Dutch act and the explanatory memorandum to the act are silent on this point.49 The principle of contractual freedom also allows parties to agree to use pseudonymous certificates in their transactions. If the law stipulates a specific form of contract encompassing identified parties, pseudonymous digital certificates are not allowed, unless the identity of the certificate's holder can be obtained from the certification service provider (CSP) when necessary. Pseudonym use may also be restricted as a result of the aforementioned information obligations regarding online services. In light of the lack of case law in the area of electronic signatures, the legal status of pseudonymous certificates and identification requirements is not entirely clear. The identification of the signer is one of the basic requirements of advanced and qualified electronic signatures,50 which provide strong and nearly conclusive evidential value, respectively, in court.51

Identification of the parties is a requirement in the regulation concerning the equalization of written and electronic contracts (see Art. 7:227a(1d), DCivC). This provision requires the identity of the parties to be determinable with sufficient certainty. The requirement also has to be interpreted with respect to the reason why a written contract is obligatory in a certain case. If a written document is required solely to provide evidence of the contents of the document, and not identification of the parties, then identification in an electronic environment is not required (as in the offline world).

v. anonymity in public

In recent years, the ability to move anonymously in public spaces has decreased as a result of a combination of legal and de facto technological developments. Until January 1, 2005, the Compulsory Identification Act contained identification obligations only in special circumstances (such as during soccer matches and in public transport, when boarding without a ticket).

48. Which is (mainly) incorporated in the Dutch Civil Code (Burgerlijk Wetboek).
49. The Dutch Minister of Justice has pointed out that special attention to the identification of individuals in an electronic environment is necessary in view of, among other considerations, identity fraud risks and the expected development of anonymous and pseudonymous interaction on the Internet. See Kamerstukken II 27400 VI, 2000/01, no. 2, p. 8.
50. See Dutch Civil Code (Burgerlijk Wetboek), Art. 3:15a(2b).
51. See Dutch Civil Code (Burgerlijk Wetboek), Art. 3:15a(2) and (3).

After this date, a general identification obligation applies: Dutch citizens aged 14 and older have to show an official ID when ordered by a police officer or a public supervisor, provided this is necessary to fulfill the police task (Art. 2, Compulsory Identification Act juncto Art. 8a, Police Act 1993).52 Failure to comply is punishable with a fine of up to €3,350 (Art. 447e, Dutch Criminal Code, henceforth DCC). The extended compulsory identification was somewhat controversial when introduced; for example, the group ID Nee ("ID No") campaigned against it. This group also runs an Identification Abuse Hotline.53 Nevertheless, Parliament accepted the act without much opposition. The effect of the general compulsory identification on public safety has not been empirically studied so far, but there are indications that the law is often used to fine people who commit banal offenses or are found in apparently "suspect" circumstances. For example, the police ordered a man who sat idly on the window ledge of a post office to show his ID.54

Although less omnipresent than in the UK, CCTV surveillance is constantly increasing. Different laws apply to camera surveillance depending on the context in which it is used and the person or organization responsible. According to Art. 151c, Municipal Act, the city council can authorize the mayor to decide upon camera surveillance in public areas for purposes of public order for a specified period of time.55 In case of a justified interest (protection of employees, customers, and property), private parties, such as employers and shop owners, may also install cameras for surveillance at the workplace, in stores, and so on, so long as these do not cover public areas (e.g., a whole street or public square) and do not infringe upon privacy interests (e.g., by monitoring private places). Camera surveillance at the workplace requires the consent of the employees' council (Art. 27(1l), Employees' Councils Act).56 In all instances, camera surveillance must be clearly indicated at the respective locations. Secret camera surveillance of individuals in public and private places is illegal pursuant to Art. 441b and 139f, DCC, respectively.57 The public prosecutor can order camera surveillance without the recording of confidential information (monitoring) in criminal investigations on suspicion of a crime (Art. 126g(3), DCCP).58

In addition to camera surveillance, other anonymity-decreasing technologies are used or experimented with in different areas. For example, shopping areas in Amsterdam and Utrecht are experimenting with facial recognition technologies aiming at identifying shoplifters.

52. Compulsory Identification Act (Wet op de identificatieplicht); Extended Compulsory Identification Act (Wet op de uitgebreide identificatieplicht), Staatsblad 2004, 300.
53. See http://www.id-nee.nl/English.html.
54. Algemeen Dagblad, March 2, 2007; see http://www.id-nee.nl/Actueel.html#499.
55. Municipal Act (Gemeentewet), Art. 151c.
56. Employees' Councils Act (Wet op de ondernemingsraden), Art. 27(1l).
57. Dutch Criminal Code (Wetboek van Strafrecht), Art. 441b and 139f.
58. Dutch Code of Criminal Procedure (Wetboek van Strafvordering), Art. 126g(3).

The city of Groningen and the Netherlands Railways, amongst others, have installed systems to detect aggressive behavior. Soccer club ADO Den Haag has introduced a crowd control system to help locate hooligans (relying on facial recognition at the entrance and in the stadium, and microphone detection of aggressive or otherwise undesirable speech). Furthermore, a public transport chipcard (OV-chipkaart) is being introduced in the Netherlands that facilitates massive tracking and tracing of travelers throughout the country. The card scheme allows for anonymous and personalized cards. The former option is more expensive and excludes card usage for age discounts, season tickets, and so on.

vi. anonymity in citizen-government relationships

A. Service Delivery

Citizens do not have an explicit right to anonymity with respect to public service delivery, but the rule of law requires the government to have specific legal grounds to oblige citizens to identify themselves. If there are no such legal grounds, citizens actually have a right to be or stay anonymous in their relationship with the government, or at least have no obligation to identify themselves. Irrespective of the requirement for legal grounds, identification plays a key role in government-citizen relationships regarding service delivery, because entitlement to particular services is sometimes hard to establish without having access to personal data contained in government records. This requires identification of the individual.

Anonymous interaction with the government is increasingly difficult. A recent research report discusses two important causes: the use of new technologies (CCTVs, data-mining, data-sharing, and data-linking) and the ongoing informatization of government (and private-sector) administrations. As a result, new kinds of personal information are collected and used, and on a bigger scale than before.59 Although the report focused on the impact of trends in criminal investigations and national security on ordinary (nonsuspect) citizens, similar patterns can be expected in public service delivery. The newly introduced biometric passports, which allow automatic identity verification, may spur more obligations to identify for services. Personal data is increasingly linked in order to detect fraudulent use of government benefits, to optimize children's aid, and to provide proactive and personalized public services.

A crucial aspect of the public sector identification web is likely to be the recently introduced Citizen Service Number (Burger Service Nummer; hereafter CSN), which will be used in all communications between citizen and public administration.

59. A. Vedder et al., Van privacyparadijs tot controlestaat? Misdaad- en terreurbestrijding in Nederland aan het begin van de 21ste eeuw (The Hague: Rathenau Instituut, February 2007), available at http://www.rathenau.nl/showpage.asp?steID=1&item=2097.

After decades of opposition to a unique identification number for citizens in the Netherlands, tracing back to the aftermath of World War II, the legislature adopted the CSN with little discussion in July 2007.60 The name CSN is a misnomer, because it is a registration number rather than an instrument to improve service delivery for citizens. The CSN facilitates the sharing and linking of personal data across the public sector and may turn out to be an important inhibitor of anonymity. The CSN consists of the existing social-fiscal number but may be used by many more public entities and in fact replaces other sector-specific numbers.

Most public services, especially transaction services, require citizens to prove their identity, whether actively (by showing an ID card or using a digital identifier called DigiD) or passively (by, for example, using their postal address as recorded in the Municipal Registry). Information services can usually be obtained anonymously, although even in this sector numerous instances exist in which individuals have to identify themselves in order to receive public-sector information. The Dutch Government Information (Public Access) Act 1991 embraces the principle that public information should be public to all.61 In practice, public-sector information is sometimes disclosed selectively (e.g., for scientific purposes only), implying the identification of the requester.62

From an identification perspective, conventional and electronic communications, in principle, should adhere to the same requirements.63 The emerging electronic identification infrastructure for electronic public service delivery, and the potential for the eNIK (an electronic national identity card, which was expected to be deployed in 2008 but the status of which is presently unknown) to introduce advanced electronic signatures therein, will likely promote secure identification in public service delivery.64 Because of the limited possibilities for using pseudonymous digital certificates, these ID schemes could increase citizen identification.

60. Staatsblad 2007, 288.
61. Dutch Government Information (Public Access) Act 1991 (Wet openbaarheid van bestuur).
62. S. van der Hof et al., Over wetten en praktische bezwaren, Een evaluatie en toekomstvisie op de Wet openbaarheid van bestuur (Tilburg: Tilburg University, 2004), 19.
63. Electronic communications between citizens and government are regulated by the Act on Electronic Government Communications (Wet elektronisch bestuurlijk verkeer), which has amended the General Administrative Law Act (Algemene Wet Bestuursrecht). Pursuant to this law, electronic messages should be as reliable and confidential as conventional communications can be (functional approach). Additionally, the electronic signature provisions in the DCivC apply to electronic communications within the public sector.
64. Cf. B. J. Koops, H. Buitelaar, and M. Lips, eds., D5.4: Anonymity in electronic government: a case-study analysis of governments' identity knowledge, FIDIS report, May 2007, available at http://www.fidis.net.

B. e-Voting

Anonymity in voting is closely related to secrecy of the vote, which is an important principle in Dutch democracy. Public elections should reflect the voters' choices and should be free from undue influence. Secrecy and privacy are, therefore, essential characteristics of public elections. Secrecy of the vote is established in Art. 53(2), DC, and is further detailed in the Voting Act for the different ways of casting votes in the Netherlands: ballot boxes (Art. J 15), voting machines (Art. J 33), and postal votes (Art. M 7).65 Although voting secrecy could be seen as optional—the voter may opt to vote anonymously—it is usually taken as a strict, mandatory legal duty, meaning that voter anonymity has to be guaranteed (by the state) for all voters.66 The effect is that voters may tell anyone whom they voted for without being able to prove their claims. Interestingly, voting secrecy is traditionally guaranteed by social control: the polling station officials observe that voters enter the voting booth alone, where they can cast their vote without anyone looking over their shoulder. The ballot paper is subsequently deposited in an opaque box.

Electronic voting machines have been in operation in Dutch polling stations since the early 1980s. Voting secrecy is safeguarded in the same manner as for paper ballots. These electronic voting machines have recently caused controversy. The machines leave no paper trail, nor is their software open (or certified), which basically means that only the manufacturer knows what the machines actually count.67 Also, a group of concerned people, WijVertrouwenStemcomputersNiet ("We don't trust voting machines"), has proven that the votes cast in the machines used in the Netherlands can be remotely monitored through interception of residual screen radiation, which undermines the secrecy of the vote and therefore affects the anonymity of the voting process.

Voter anonymity is even more problematic in postal and Internet voting. Postal voting is an option for Dutch nationals living or residing abroad at the time of general elections (Art. M 1, Voting Act).68 The voter can be coerced, and the voting ballot inspected, before it is sent to the election officer. Furthermore, the vote can be linked to the voter, because it is sent with a separate letter stating that the voter personally cast her vote. Therefore, voter anonymity depends on the election officers who receive the postal votes.

65. Dutch Constitution (Grondwet), Art. 53(2); Voting Act (Kieswet), Art. J 15, J 33, and M 7.
66. See H. Buchstein, "Online Democracy, Is It Viable? Is It Desirable? Internet Voting and Normative Democratic Theory," in Electronic Voting and Democracy—A Comparative Analysis, eds. N. Kersting and H. Baldersheim (London: Palgrave, 2004).
67. See http://www.wijvertrouwenstemcomputersniet.nl/English.
68. Dutch Voting Act (Kieswet), Art. M 1.

Since the late 1990s, the Dutch government has actively pursued the introduction of Internet voting. Experiments with Internet voting for expatriates were conducted during the Dutch elections for the European Parliament in 2004 and the November 2006 general elections. Internet voting is even more problematic than postal ballots are. Not only is family voting unavoidable, but the voting process itself is also highly opaque. An interesting question regarding postal ballots and Internet voting is whether the European Court of Human Rights will sanction them if challenged in the light of Art. 3, Protocol 1, of the European Convention on Human Rights (secrecy of the vote).69

C. Anonymized Case-Law and "Naming and Shaming"

Anonymity, for privacy and data-protection reasons, features in administrative law regarding the publishing of court cases. The official court cases Web site, www.rechtspraak.nl, generally anonymizes cases by replacing names with neutral indications such as "plaintiff" or "defendant." Names of professionals (such as judges, lawyers, interpreters, and expert witnesses) are not, however, anonymized. Names of legal persons, such as companies, are also published, unless these are directly linkable to individual persons.70

However anonymous the cases thus published may be, there are several legal provisions that require complete publication of a case, which may be viewed as a form of "naming and shaming." Examples are:

• Fines or penalties delivered in competition cases (implying that a case is open for inspection at the Dutch Competition Authority (NMa)) must be published in the official journal Staatscourant (Art. 65, Competition Act);
• The Netherlands Authority for the Financial Markets (AFM) can publish the names and addresses of people fined for violating financial-market legislation (see, for example, Art. 48m, Stock Trade Supervision Act). Moreover, the AFM can, in order to "promote compliance with the Act," apart from imposing possible fines, publish facts that violate the Act (Art. 48n);
• The yearly publication in the Staatscourant of a list of companies that have submitted insufficient emission rights in the context of environmental-law obligations (Art. 18.16p(1), Environmental Protection Act).71

Whereas these "naming sanctions" largely affect legal persons, a similar provision affecting citizens can be found in criminal law.

69. L. Pratchett, The implementation of electronic voting in the UK (London: Local Government Association, 2002).
70. See http://www.rechtspraak.nl/Over+deze+site/ under "Anonimiseringsrichtlijnen."
71. Competition Act (Mededingingswet), Art. 65; Stock Trade Supervision Act (Wet toezicht effectenverkeer), Art. 48m; Environmental Protection Act (Wet milieubeheer), Art. 18.16p(1).

The publication of a criminal sentence can be ordered in criminal cases as an additional sanction (Art. 9(1)(b)(3), DCC) or as an alternative sanction (Art. 9(1)(b)(3) juncto Art. 9(5), DCC); this is also possible for economic offences (Art. 7(g), Economic Offences Act).72 Whether publishing a criminal verdict with the name of the convicted person is justified depends on the kind rather than the seriousness of the crime. For example, it may be a useful response to convictions for selling health-threatening food, death by neglect in official functions, embezzlement, or fraudulent bankruptcy. The sanction of publishing is, however, hardly ever imposed in practice.73

vii. conclusion

A. Does a Right to Anonymity Exist in Dutch Law?

Our overview clearly shows that no general right to anonymity exists in Dutch law. Rather, identification is the default. In criminal law, this is unsurprising in light of the obvious importance of identification in crime detection and prosecution. In civil and administrative law, however, anonymity might have been expected to be more important than it actually is. After all, there is often no intrinsic need to know the identity of someone with whom one engages in a legal act.

In today's society, however, with its pervasive ICT infrastructures that increasingly facilitate the performance of legally relevant acts at a distance, the trust that often used to come with face-to-face relationships must be reconstructed by other means. Identification is a prime tool for re-establishing trust between unfamiliar parties who often lack other indicators of the other party's likelihood of keeping his or her part of the deal. This may at least partially explain why anonymity currently does not feature in the legal framework for e-commerce and e-government. On the contrary, powerful identification infrastructures are being built in the private sector, with e-signatures and PKIs, and even stronger ones in the public sector, with the Citizen Service Number and the eNIK. Furthermore, the legal frameworks for these infrastructures are far from sympathetic to anonymity: the possibility of pseudonymous e-signature certificates is not embedded in Dutch law (and is even prohibited in e-government relationships), and the CSN will be used across all areas of government. Moreover, identification duties are on the rise, with recent laws requiring e-commerce providers and—most importantly—citizens aged 14 and older, in general, to identify themselves when requested by an officer.

72. Dutch Criminal Code (Wetboek van Strafrecht), Art. 9(1)(b)(3) and Art. 9(1)(b)(3) juncto Art. 9(5); Economic Offences Act (Wet op de economische delicten), Art. 7(g).
73. C. P. M. Cleiren and J. F. Nijboer, Tekst & Commentaar Strafrecht (Deventer: Kluwer, 2000); Dutch Criminal Code (Wetboek van Strafrecht), Art. 9 (n. 3).

Another significant detail is that many "naming and shaming" provisions, particularly in administrative law, were introduced over the past years, adding to the significance that modern society apparently attaches to identification in public.

This is not to say that anonymity is non-existent in Dutch law. There are various legal fields in which forms of anonymity rights exist. Occasionally, anonymity is the default (in voting and official case-law publication) or a strong right (in relation to freedom of expression). More often, however, a right to anonymity is the exception to the rule: threatened witnesses, both in civil and criminal cases, can testify anonymously—a right which, ironically, can be claimed both by organized criminals and by intelligence officers—and contracts can be concluded anonymously.

The rationale underlying these anonymity rights is diverse. An intrinsic reason for protecting anonymity clearly seems absent. In all fields where some form of anonymity right exists, this right is instrumental in furthering the purpose of the law and policy in those fields. Anonymity is an important tool for fair voting procedures, for stimulating the public exchange of (unpopular) ideas to enhance the freedom of expression and of thought, and, sometimes, for protecting the life and limb of persons at risk. Anonymity thus is a servant to several masters.

Altogether, we conclude that in Dutch law there is merely a very piecemeal and rather weak right to anonymity. Only in some very specific areas of law is there a strong claim to anonymity, but the default position in Dutch law is that identification is preferred and, increasingly, mandated.

B. Should a Right to Anonymity Be Created in Dutch Law?

If a substantial right to anonymity does not exist, should it be created? After all, the discussions in the Netherlands about anonymity (for example, in the debate over digital constitutional rights) indicate that there is some reason to cherish anonymity in the current, ICT-pervaded society—if only to a certain extent. The key questions are whether the need for anonymity is significant enough for a full-bodied right to anonymity, and if so, how and to what extent such a right should be defined.

We think there is insufficient reason to answer the first question, generally, in the affirmative. There is some need for anonymity in numerous situations, and this need is perhaps growing, but the contexts in which anonymity plays a part are so diverse that speaking of "a" right to "anonymity" is hardly justified as such. Instead, the question should be in which contexts, sectors of society, and legal areas there nowadays exists a substantial need to protect anonymity.

Ekker, for example, has pleaded for a right to anonymous communications, which should take the form of a procedure for providing identifying data in civil proceedings and a provision in the Telecommunications Act.74 There is much to be said for this suggestion, for (tele)communication is nowadays a crucial enabler of almost all activities.

74. Ekker, Anoniem communiceren (n. 7).

People increasingly generate traces when communicating via ICT—not only on the Internet, but also with mobile telecommunications; these traces increasingly have to be stored as a result of regulation such as the Data Retention Directive. The variety of parties that can somehow access these traces, legitimately or unlawfully, combined with the failure of data-protection law to effectively limit the wide-scale processing of personal data in practice, warrants the conclusion that privacy is slowly disappearing.75 To stop this process, a right to anonymity, rather than reliance on data-protection law only, could be a valuable add-on in an integral effort to save privacy. A right to anonymous communications would also foster other rights, not least the freedom of expression in an online environment. Naturally, such a right could not and should not be absolute: it would hold only relative to certain parties and could be infringed when sufficient interests required identification. But the important added value would be that the default position would be reversed: rather than the current assumption of identification, there would be an assumption of anonymity, revoked only when necessary.

A right to anonymous communications—something worthy of much more elaboration than we can present here—is one example of an area in which a right to anonymity makes sense. When surveying the developments currently taking place in ICT, and also in DNA fingerprinting and bio-banking, we see that large-scale identification infrastructures are being built.76 Experience shows that infrastructures persist: they can be adapted or rebuilt, but they are rarely removed. This means that we should carefully analyze the consequences of large-scale identification, for these will become the norm in the coming decades. A result may well be that in situations where previously people used to act anonymously, identification will be the standard instead, whether mandatorily (because of legal obligations) or in practice (because technology just happens to implement identification). And because anonymity used to be taken for granted in such situations—buying groceries in a supermarket, walking in another city, travelling by train, visiting a soccer match—there is no history of a "right to" anonymity in these cases: if something is natural, there is no need to protect it by law. Now that the world is changing, with anonymity no longer a matter of course, it is time to consider embedding the protection of anonymity in law for those areas of social life that entail no intrinsic need for identification. Although anonymity hardly has a history as a right, it may well have a future as a right, and in many more fields than is the case today.

75. See B. J. Koops and R. E. Leenes, "'Code' and the Slow Erosion of Privacy," Michigan Telecommunications & Technology Law Review 12, no. 1 (2005): 115–188, http://www.mttlr.org/voltwelve/koops&leenes.pdf.
76. See also Koops, Buitelaar, and Lips, Anonymity in electronic government, 76–77 (n. 64).

It is perhaps one of the few available "tools of opacity"77 that ensure an acceptable balance between the powerful and the power-poor in today's technology-pervaded, identification-driven society.

77. See S. Gutwirth and P. de Hert, “Privacy and Data Protection in a Democratic Constitutional State,” in D7.4: Implications of profiling practices on democracy and rule of law, eds. M. Hildebrandt et al., FIDIS report, September 2005, available at http://www.fidis.net, 11–28.