From Anonymity to Identification

University of Miami School of Law Institutional Repository
Articles, Faculty and Deans, 2015

A. Michael Froomkin
University of Miami School of Law, [email protected]

Follow this and additional works at: https://repository.law.miami.edu/fac_articles
Part of the Privacy Law Commons, and the Science and Technology Law Commons

Recommended Citation
A. Michael Froomkin, From Anonymity to Identification, 1 Journal of Self-Regulation and Regulation 120 (2015).

This Article is brought to you for free and open access by the Faculty and Deans at University of Miami School of Law Institutional Repository. It has been accepted for inclusion in Articles by an authorized administrator of University of Miami School of Law Institutional Repository. For more information, please contact [email protected].

Journal of Self-Regulation and Regulation, Volume 01 (2015)

From Anonymity to Identification

A. Michael Froomkin

Abstract

This article examines whether anonymity online has a future. In the early days of the Internet, strong cryptography, anonymous remailers, and a relative lack of surveillance created an environment conducive to anonymous communication. Today, the outlook for online anonymity is poor. Several forces combine against it: ideologies that hold that anonymity is dangerous, or that identifying evil-doers is more important than ensuring a safe mechanism for unpopular speech; the profitability of identification in commerce; government surveillance; the influence of intellectual property interests in requiring hardware and other tools that enforce identification; and the law at both national and supranational levels. As a result of these forces, online anonymity is now much more difficult than previously, and looks to become less and less possible.
Nevertheless, the ability to speak truly freely remains an important 'safety valve' technology for the oppressed, for dissidents, and for whistle-blowers. The article argues that as data collection online merges with data collection offline, the ability to speak anonymously online will only become more valuable. Technical changes will be required if online anonymity is to remain possible. Whether these changes are possible depends on whether the public comes to appreciate the value of the option of anonymous speech while it is still possible to engineer mechanisms to permit it.

Keywords: Internet, Privacy, Anonymity, Surveillance, Speech

© A. Michael Froomkin
DOI: 10.11588/josar.2015.0.23480

From Anonymity to Identification

A. Michael Froomkin

1 Introduction

This is not a cheerful paper, which mirrors the lecture from which it is adapted.1 It is not cheerful because the prognosis for effective online anonymity has become progressively less cheerful.

By anonymity, I mean something strong: the ability to speak without anybody being able to identify you. This untraceable anonymity is the highest level of technical protection for speech (cf. Froomkin 1995). That it is untraceable brings with it the ability to speak without fear of jail, harm, or other retaliation. This strong version of anonymity is controversial. The arguments for why untraceable anonymity is a good thing include the idea that it contributes to human flourishing; people want to experiment, and the ability to experiment with less fear contributes to human self-realization. In places that are less free, avoiding retribution for saying the wrong thing may be a matter of life and death. Political dissidents, ethnic minorities, religious splinter groups, people campaigning for women's rights or gay rights, and many others are, or have been, subject to the risk of genuine and very palpable violence.
If they wish to speak or write for their causes they need a means to protect themselves. Anonymity is one such tool.

For those of us who do not face the danger of personal violence in retaliation for our writings, there are nonetheless other more subtle dangers. One is the danger of profiling. Profiling enables stereotypical discrimination on the basis of sexual orientation, political opinion, or other characteristics. Less serious, but annoying, is the effort commercial entities make to profile for marketing purposes. Anonymity is a way to defend against that profiling, a protection that is desirable not just because some find it creepy or uncomfortable to be profiled, but also because the people doing the profiling could actually exercise market power by doing price discrimination (DeLong et al. 2000).

When the Internet started, one byproduct of the architecture of the Internet was that online anonymity was easy to achieve. It required only minimal technical knowledge, or fairly simple tools, or assistance from the right people. In those days you could be anonymous and have a great deal of faith in being successfully untraceable. Cryptography made this possible. PGP – "Pretty Good Privacy" – was one of the early and important tools, perhaps the first consumer-oriented cryptography. Another very important tool was the secure anonymous remailer (cf. Froomkin 1997: 129).

An anonymous remailer works as follows: Alice sends out an email destined for Bob which contains within it one or more layers.

1 The lecture from which this paper is adapted was delivered at the University of Heidelberg on Dec 11, 2015, under the auspices of the Netzpolitik AG of the University of Heidelberg. I am very grateful to Dr. Wolf J. Schünemann and all my hosts for the invitation and the warm welcome. Unless otherwise noted, this paper attempts to reflect legal and technical developments up to May 1, 2015.
Each layer consists of an encrypted message, which I will call the payload, and unencrypted delivery instructions, making the entire message something like a sealed paper letter with an address written on the front. Although destined for Bob, Alice addresses the message to a remailer operator. The remailer operator strips off the headers of Alice's email – the part that identified it as coming from her – and forwards the payload message as directed. Alice could direct that the payload go to a second remailer, who would decrypt the payload only to find in it another layer of address and (to him) unreadable payload. By chaining the message in this manner through two or more remailers, Alice could ensure that before the message reached Bob its origins would be thoroughly obfuscated. What is more, if Alice and Bob used encryption to exchange messages, then none of the remailer operators would ever know what Alice was saying. Even if Bob was not so sophisticated, only the last remailer in the chain would be capable of reading the cleartext, and that person would have no way of finding out who originally sent the message unless every remailer in the chain kept logs and cooperated in unraveling the message's path.

Sending messages this way was never easy, and would be harder today because there are fewer reliable remailers. Remailers ran into two serious problems. The first is that spammers abused the remailer network. Strangely, a small amount of spam is a good thing for the remailer network, because the spammers can be relied on to create a certain volume of traffic, and this makes it harder for any observer to trace the genuine messages as they move through the network (ib.). But in short order the volume of spam increased to the point that so much spam went through the network as to first burden it and then choke many of the remailers entirely (Canter 2003). Worse, the remailers faced a serious legal problem.
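The chained, layered construction described above can be sketched in a few lines of code. This is a toy illustration under stated assumptions, not real cryptography: base64 wrapping stands in for the public-key encryption (such as PGP) that actual remailers used, and the remailer names are invented for the example.

```python
import json
from base64 import b64encode, b64decode


# Toy stand-in for real encryption: each "key" is just a label, and base64
# stands in for public-key cryptography. The point is the layering, not
# the cipher -- do not use anything like this for actual privacy.
def toy_encrypt(key: str, plaintext: str) -> str:
    return b64encode(f"{key}:{plaintext}".encode()).decode()


def toy_decrypt(key: str, ciphertext: str) -> str:
    label, _, payload = b64decode(ciphertext).decode().partition(":")
    if label != key:
        raise ValueError("this layer was not encrypted to this remailer")
    return payload


def wrap(message: str, recipient: str, remailers: list[str]) -> dict:
    """Alice builds the onion from the inside out. The innermost item is
    the cleartext the last remailer forwards to Bob; Alice could encrypt
    that too, so that no remailer ever reads it."""
    onion = {"to": recipient, "payload": message}
    for hop in reversed(remailers):
        onion = {"to": hop, "payload": toy_encrypt(hop, json.dumps(onion))}
    return onion


def remail(onion: dict, remailers: list[str]) -> tuple[str, str]:
    """Simulate the chain: each remailer strips one layer, learning only
    the next hop's address, never the message's origin."""
    while onion["to"] in remailers:
        onion = json.loads(toy_decrypt(onion["to"], onion["payload"]))
    return onion["to"], onion["payload"]


msg = wrap("meet at dawn", "bob", ["mix1", "mix2"])
print(remail(msg, ["mix1", "mix2"]))  # → ('bob', 'meet at dawn')
```

Note how the sketch mirrors the text: the first remailer ("mix1") sees only that something should go to "mix2", and only the last remailer sees Bob's address and the cleartext; no single operator can link Alice to Bob without logs from every hop.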
The end point in any chain of remailers – the exit node – carries legal risk for misuse of the network. If, for example, your computer was used to send a death threat to the US President, ignorance of the message's content was not a comforting defense, and certainly was no shield against an investigation. To say that the entire process was automated did not necessarily provide a sufficient defense either.2 As governments and police departments became more technologically savvy, the email operators increasingly decided it was just too unsafe to run a remailer, or at least to run an exit node. As fewer and fewer remailers were willing to be exits, the ones that persisted became overwhelmed with all the spam that sought release into the greater Internet, thus accelerating the network's death spiral.

Almost all these anonymity systems depended on some kind of cryptography. Western governments, and in particular the U.S. government, worked very hard to slow the spread of consumer cryptography. Governments were basically able to restrict the spread of cryptography, originally through export control (Froomkin 1996: 15–75). By preventing standardization, they made it easier to maintain the wiretapping and interception capabilities of both national security services and ordinary law enforcement. For many years this project of prevention was successful, and consumer cryptography was rare; indeed, only specialists used it – the presence of strongly encrypted traffic was sufficiently unusual to raise the possibility that observers might use it as a signal that one had something to hide, drawing extra attention from the authorities. All these things undermine access to anonymity (in the strong sense).

2 Notably, the protections of the CDA against civil liability for third-party postings do not apply to criminal law. 47 U.S.C. § 230(e)(1).
We can divide the history of Internet communications into three periods, using anonymity as a yardstick. In the first period, as noted above, anonymity was easy if you knew what you were doing, but it was not easily available to the consumer. This period ended when the remailers died off and strong anonymity was reduced to what I would call a 'safety valve' technology.