Harvard Journal of Law & Technology
Volume 32, Number 2, Spring 2019

BROKEN1

Andrea M. Matwyshyn* & Stephanie K. Pell**

TABLE OF CONTENTS

I. INTRODUCTION ................ 481
II. DEBUGGING REQUIRED: THE LIMITATIONS OF THE COMPUTER FRAUD AND ABUSE ACT ................ 483
   A. The Problem of "Double Whammy" Conduct: Doctrinal Limitations ................ 484
      1. Void for Vagueness ................ 484
      2. Damaging Contract ................ 487
   B. The Problem of Doctrinal Swapping: Harms to Innovation and National Security ................ 492
   C. The Problem of Contagion: Botnets and Malware ................ 497
      1. Post-Morris Malware and the Need for Security Epidemiology ................ 498
      2. Public-Private Malware Outbreak Management ................ 502
III. THE NEXT RELEASE: A SECURITY EPIDEMIOLOGY MODEL AND THE NEW COMPUTER INTRUSION AND ABUSE ACT ("CIAA") ................ 508
   A. The CIAA ................ 510
      1. The Trespass Fixation ................ 510
      2. Technical Harms + Intent + Consent ................ 514
         a. Technical harms ................ 515
         b. Defendant intent ................ 517
         c. Consent: Kerr's Paradox and Grimmelmann's Resolution ................ 519
            i. Kerr's Trespass Norms and Grimmelmann's Consent ................ 520
            ii. Consent Dualism: Factual versus Legal Consent ................ 522
            iii. Why the Consent Dualism Distinction Matters ................ 524
      3. The New Language ................ 526
         a. Change 1: 1030(a)(1) - Criminal Computer Intrusion ................ 527
         b. Change 2: 1030(a)(2) - Criminal Impersonation with a Credential ................ 536
         c. Change 3: 1030(a)(3) - Abuse of Government Position of Trust ................ 540
         d. Change 4: 1030(a)(4) - Epidemic Malware ................ 541
         e. Change 5: Elimination of the Civil Provisions ................ 552
   B. How the CIAA Would Work in Practice ................ 558
      1. Hypothetical #1: The Malicious Third-Party Intruder ................ 558
      2. Hypothetical #2: The Infrastructure Disrupter ................ 559
      3. Hypothetical #3: The Security Researcher ................ 559
      4. Hypothetical #4: The Scared Consumer ................ 561
      5. Hypothetical #5: The Script Kiddie ................ 562
      6. Hypothetical #6: The DDoS Participants ................ 563
      7. Hypothetical #7: The Fibbing Consumer ................ 563
      8. Hypothetical #8: The Artful CAPTCHA Dodger ................ 564
      9. Hypothetical #9: The Grabby User ................ 564
      10. Hypothetical #10: The Nosy Aggregator ................ 565
      11. Hypothetical #11: The (Un)Advanced Persistent User ................ 566
      12. Hypothetical #12: The Competitor Aggregator ................ 567
      13. Hypothetical #13: The Rogue Corporate Insider ................ 568
      14. Hypothetical #14: The Password Sharer ................ 570
      15. Hypothetical #15: The Rogue Government Insider ................ 571
      16. Hypothetical #16: Bots for Tots, Silver Spears, and Research Recon ................ 572
      17. Hypothetical #17: The Silverphishing Botnet Harpoon ................ 573
IV. CONCLUSION ................ 573

1. "Broken" is a term of art in computing that refers to a system that is not working properly due to malfunction or problems with design. Eric S. Raymond, broken, THE JARGON FILE, VERSION 4.4.8, http://www.catb.org/jargon/html/B/broken.html [https://perma.cc/H88S-BVUW]; Paul Dourish, The Original Hacker's Dictionary, PAUL DOURISH, https://www.dourish.com/goodies/jargon.html [https://perma.cc/5H9F-RCEW]. The authors wish to thank Kendra Albert, Steven M. Bellovin, Matt Blaze, Jen Ellis, Jeremy Epstein, Bob Gellman, Jim Green, James Grimmelmann, Marcia Hoffman, Brian Martin, Terrell McSweeny, Alexander Nally, Allison Nixon, Brendan O'Connor, Kurt Opsahl, Jay Radcliffe, Abigail Slater, Mara Tam, Marcia Tiersky, Tarah M. Wheeler, Beau Woods, Chris Wysopal, the Indiana University Center for Applied Cybersecurity Research, and the participants of EstesCon, Narwahlcon, the DEFCON Ethics Village, the 2018 Privacy Law Scholars Conference, and the Data, Technology and Criminal Law Workshop organized by Duke Law School and the University of Wurzburg for their comments and critiques. Any bugs are our own.

* Andrea M. Matwyshyn is a professor of law and professor of computer science (by courtesy) at Northeastern University, an affiliate scholar of the Center for Internet and Society at Stanford Law School, and a Senior Fellow of the Cyber Statecraft Initiative of the Atlantic Council.

** Stephanie K. Pell is an Assistant Professor and Cyber Ethics Fellow at West Point's Army Cyber Institute, with a joint appointment to the Department of English and Philosophy, and an affiliate scholar of the Center for Internet and Society at Stanford Law School. The views expressed are the author's personal views and do not represent the position of West Point, the Army, or the United States Government.

I. INTRODUCTION

Sometimes the "secret ingredient" is a dash of typhoid fever. In an (in)famous moment in the dramatic history of epidemiology, a cook named Mary Mallon, whose hygiene practices were allegedly suboptimal, accidentally transmitted typhoid fever to diners through the meals she prepared.
Mallon, better known to history as "Typhoid Mary," was a single carrier of typhoid fever.2 By the time she was identified as the source of the New York City outbreak, Mallon had allegedly infected approximately a dozen other individuals with typhoid fever between 1900 and 1907.3 But by doing so, she also spurred the evolution of modern epidemiology as a discipline in the United States.4

Just as Mallon's transmissions of typhoid fever revealed the need for a rigorous study of epidemiology in the U.S., so too do the relentless security compromises of public and private sector organizations today signal the need to revisit our current legal paradigms for computer intrusion. Our traditional criminal law paradigms have proven inadequate to stem the tide of computer intrusion crimes in the U.S. In particular, a fatal flaw in the law lies in a conceptual disconnect: our existing approach to computer intrusion and our attempts at encouraging prophylactic security conduct to prevent malware infections are not effectively working in tandem.

Specifically, our definitive computer intrusion statute, the Computer Fraud and Abuse Act ("CFAA"), belies its last-century crafting, as it strains under the new threat vectors leveraged by this century's formidable attackers. Thousands of pages of jurists' opinions and scholars' law review articles have pointed out the CFAA's doctrinal limitations and struggled to interpret the statute's core provisions.5 The CFAA has generated heated policy debate,6 circuit splits,7 and much public outcry,8 but, alas, none of the attempted solutions have successfully remedied its flaws over thirty years' time.

This Article admits defeat. It argues that the CFAA as currently written is unsalvageable and thus requires a rewrite of its core provisions. Then, shifting paradigms to an approach driven by principles from computer security and

2. See Principles of Epidemiology in Public Health Practice, Third Edition, CTRS. FOR DISEASE CONTROL & PREVENTION (May 18, 2012), https://www.cdc.gov/ophss/csels/dsepd/ss1978/lesson1/section10.html [https://perma.cc/A3AS-NUX8]; see also Typhoid Mary, ENCYCLOPEDIA BRITANNICA, https://www.britannica.com/biography/Typhoid-Mary [https://perma.cc/X6LN-TMU9].
3. See Typhoid Mary, supra note 2.
4. Id.
5. See, e.g., Mark A. Lemley, Place and Cyberspace, 91 CALIF. L. REV. 521, 528–29 (2003) (distinguishing cases where "courts forgot that the information at issue in these cases is a public good to which we have never applied the 'inviolability' rules of real property" from "true cases of unauthorized access, in which crackers exploit software bugs to gain access to a computer system or part thereof that the owner never intended to open to the outside world"); Michael J. Madison, Rights of Access and the Shape of the Internet, 44 B.C. L. REV. 433, 436 (2003) (arguing that "CFAA cases so far suggest that courts have failed to appreciate the depth and complexity of the Internet-as-place metaphor, particularly in light of how users actually experience places on the Internet").