Artificial Intelligence and Law Enforcement

Pages: 16 | File type: PDF | Size: 1020 KB

Artificial Intelligence and Law Enforcement: Impact on Fundamental Rights

STUDY requested by the LIBE committee
Policy Department for Citizens' Rights and Constitutional Affairs
Directorate-General for Internal Policies
PE 656.295 - July 2020 - EN

Abstract
This study, requested by the LIBE Committee, examines the impact on fundamental rights of Artificial Intelligence in the field of law enforcement and criminal justice, from a European Union perspective. It presents the applicable legal framework (notably in relation to data protection), and analyses major trends and key policy discussions. The study also considers developments following the Covid-19 outbreak. It argues that the seriousness and scale of challenges may require intervention at EU level.

This document was requested by the European Parliament's Committee on Civil Liberties, Justice and Home Affairs.

AUTHOR
Prof. Dr. Gloria GONZÁLEZ FUSTER, Vrije Universiteit Brussel (VUB)

ADMINISTRATOR RESPONSIBLE
Alessandro DAVOLI

EDITORIAL ASSISTANT
Ginka TSONEVA

LINGUISTIC VERSIONS
Original: EN

ABOUT THE EDITOR
Policy departments provide in-house and external expertise to support EP committees and other parliamentary bodies in shaping legislation and exercising democratic scrutiny over EU internal policies. To contact the Policy Department or to subscribe for updates, please write to: European Parliament, B-1047 Brussels. Email: [email protected]

Manuscript completed in July 2020
© European Union, 2020

This document is available on the internet at: http://www.europarl.europa.eu/supporting-analyses

DISCLAIMER AND COPYRIGHT
The opinions expressed in this document are the sole responsibility of the authors and do not necessarily represent the official position of the European Parliament. Reproduction and translation for non-commercial purposes are authorised, provided the source is acknowledged and the European Parliament is given prior notice and sent a copy.

CONTENTS

LIST OF ABBREVIATIONS 5
LIST OF BOXES 7
LIST OF FIGURES 7
EXECUTIVE SUMMARY 8
1. GENERAL INFORMATION 10
1.1. Objectives and structure 11
1.2. Methodology 12
2. EU LEGAL FRAMEWORK 13
2.1. Primary law 13
2.2. Secondary law 14
2.2.1. Rules on the processing of personal data 14
2.2.2. Rules on the processing of non-personal data 19
3. AI IN LAW ENFORCEMENT AND CRIMINAL JUSTICE 21
3.1. 22
3.2. Facial recognition 24
3.3. AI and criminal justice 27
3.4. AI and borders 29
3.4.1. General trends 29
3.4.2. PNR 31
3.4.3. ETIAS 32
4. IMPACT ON FUNDAMENTAL RIGHTS 37
4.1. Privacy and data protection 38
4.2. Freedom of expression and information, and freedom of assembly and association 40
4.3. Non-discrimination 40
4.4. Right to an effective remedy and to a fair trial 43
4.5. Rights of the child 43
5. POLICY DEVELOPMENTS 45
5.1. Introduction 47
5.2. European Commission 48
5.3. European Parliament 54
5.4. The role of ethics 54
5.5. EU-funded support of AI 57
6. COVID-19 OUTBREAK RESPONSE 61
6.1. Data-driven responses 63
6.2. Law enforcement initiatives 66
7. POLICY RECOMMENDATIONS 67
7.1. Assess fully the relevance and gaps of the existing legal framework 67
7.2. Adapt initiatives to the specificities of the field 68
7.3. Consider the need for special rules for certain practices 68
7.4. Clarify the role of ethics 69
7.5. Reinforce trust in EU-funded AI efforts 69
7.6. Screen developments related to the Covid-19 outbreak 69
REFERENCES 71
ANNEX: A COMPARISON OF THE GDPR AND THE LED 83

LIST OF ABBREVIATIONS

ACLU American Civil Liberties Union
ADM Automated Decision Making
AI Artificial Intelligence
CJEU Court of Justice of the European Union
DAPIX Working Party on Information Exchange and Data Protection
DPA Data Protection Authority
EASO European Asylum Support Office
EC European Commission
ECHR European Convention on Human Rights
ECRIS-TCN European Criminal Records Information System for Third-Country Nationals
ECtHR European Court of Human Rights
EDPB European Data Protection Board
EDPS European Data Protection Supervisor
EDRi European Digital Rights Initiative
EES Entry/Exit System
EIO European Investigation Order
EPIC Electronic Privacy Information Center
EPPO European Public Prosecutor's Office
EU European Union
FAT Fairness, Accountability and Transparency
FIU Financial Intelligence Unit
GDPR General Data Protection Regulation
GSMA Global System for Mobile Communications association
GVM Gangs Violence Matrix
ICO Information Commissioner's Office
IT Information Technology
JRC Joint Research Centre
LED Law Enforcement Directive
LFR Live Facial Recognition
OECD Organisation for Economic Co-operation and Development
PNR Passenger Name Record
RBI Remote Biometric Identification
SIS Schengen Information System
UNICRI United Nations Interregional Crime and Justice Research Institute
VIS Visa Information System

LIST OF BOXES

Example 1: Real-time surveillance of public space 59
Example 2: Autonomous robots at the borders 59
Example 3: Pre-occurring crime prediction and prevention 60
Example 4: Intelligent dialogue-empowered bots for early terrorist detection 60

LIST OF FIGURES

Figure 1: EDPB work on police and justice 19
Figure 2: AI HLEG Framework for Trustworthy AI 50

EXECUTIVE SUMMARY

Background

Artificial Intelligence (AI) is high on the agenda of the European Union (EU). Discussions on its possible regulation have been particularly prominent since Ursula von der Leyen announced, already before her nomination as President of the European Commission (EC), a strong will to situate AI among the EU's top priorities. The endorsement of AI as an EU-level policy priority has been accompanied by a reflection on how to promote trust in AI technologies, and how to guarantee that AI systems do not compromise, during their development or deployment, EU fundamental rights. This specific reflection is actually not completely novel, but entrenched in prior legal and policy debates on fundamental rights related, more generally, to the regulation of the processing of both personal and non-personal data.

Aim

This study aims at analysing the impact on EU fundamental rights of AI in the field of law enforcement and criminal justice. It approaches the subject from an EU law and policy perspective. It first succinctly presents the main features of the applicable legal framework. Second, it introduces recent and ongoing developments in the use of AI in law enforcement, AI and criminal justice, and AI and borders. It discusses the impact on fundamental rights of these trends, and reviews relevant policy discussions around AI regulation. After such exploration, the study puts forward some concrete recommendations.

Findings

The field of law enforcement and criminal justice is particularly sensitive, as it touches upon core issues of the relation between the individual and the State.
There is a broad consensus on the fact that reliance on AI systems in this area can substantially impact EU fundamental rights, and more generally democracy itself. The issue thus deserves special consideration in the context of any discussion on the future of trustworthy AI.

This study:

• Shows that the advent of AI in the field of law enforcement and criminal justice is already a reality, as AI systems are increasingly being adopted or considered; such systems might take many different shapes, and occur at the cross-roads of different sustained trends to increase the use of data, algorithms and computational power; these developments are connected to pre-existing trends, some of which had been previously broadly framed under other labels;

• Documents that such developments, in law enforcement, AI and criminal justice, and AI and borders (including a reflection on the European Travel Information and Authorization System, ETIAS), have already generated significant controversies, for instance in litigation and in calls from civil society to better prevent or mitigate associated risks, both in the EU and beyond;

• Points out that discussions around AI regulation in the EU have been deeply embedded in the EU Digital Single Market agenda, and that, generally speaking, although they might refer to the need to consider law enforcement and criminal justice specificities, such policy discussions are most often not based on a detailed review and taking into account of concrete applicable rules (and, especially, of applicable restrictions and derogations);

• Warns that such policy discussions suffer from ambiguity in relation to the safeguarding of fundamental rights, generating not only a lack of conceptual precision but, most importantly, uncertainty as to whether effective protection of fundamental rights will be delivered;

• Emphasises that the current EU data protection legal framework should not be assumed to offer sufficiently solid safeguards for individuals in light of the increased uses of automated decision-making and profiling for law enforcement and criminal justice purposes, as:
  o the general safeguards provided by the General Data Protection Regulation (GDPR) do not necessarily apply when the processing is for such purposes, as restrictions and derogations might be applicable;
  o the Law Enforcement Directive (LED), which might be the applicable instrument in such cases, provides for safeguards that are similar to those of the GDPR, but nevertheless not exactly equivalent, and equally provides for possible restrictions and derogations;

• Reviews a variety of fundamental rights considerations
Recommended publications
  • Clearview AI Has Billions of Our Photos. Its Entire Client List Was Just Stolen (by Jordan Valinsky, CNN Business; updated 1:24 PM ET, Wed February 26, 2020)
    New York (CNN Business) Clearview AI, a startup that compiles billions of photos for facial recognition technology, said it lost its entire client list to hackers. The company said it has patched the unspecified flaw that allowed the breach to happen. In a statement, Clearview AI's attorney Tor Ekeland said that while security is the company's top priority, "unfortunately, data breaches are a part of life. Our servers were never accessed." He added that the company continues to strengthen its security procedures and that the flaw has been patched. Clearview AI continues "to work to strengthen our security," Ekeland said. In a notification sent to customers obtained by Daily Beast, Clearview AI said that an intruder "gained unauthorized access" to its customer list, which includes police forces, law enforcement agencies and banks. The company said that the person didn't obtain any search histories conducted by customers, which include some police forces. The company claims to have scraped more than 3 billion photos from the internet, including photos from popular social media platforms like Facebook, Instagram, Twitter and YouTube. The firm garnered controversy in January after a New York Times investigation revealed that Clearview AI's technology allowed law enforcement agencies to use its technology to match photos of unknown faces to people's online images. The company also retains those photos in its database even after internet users delete them from the platforms or make their accounts private.
  • EDRi Report: A Legal Analysis of Biometric Mass Surveillance Practices in Germany, the Netherlands, and Poland
    EUROPEAN DIGITAL RIGHTS (EDRi): A LEGAL ANALYSIS OF BIOMETRIC MASS SURVEILLANCE PRACTICES IN GERMANY, THE NETHERLANDS, AND POLAND. By Luca Montag, Rory Mcleod, Lara De Mets, Meghan Gauld, Fraser Rodger, and Mateusz Pełka. INDEX: About the Edinburgh International Justice Initiative (EIJI) 5; Introductory Note 6; List of Abbreviations 9; Key Terms 10; Foreword from European Digital Rights (EDRi) 12; Introduction to Germany country study from EDRi 15; Germany 17; 1 Facial Recognition 19; 1.1 Local Government 19; 1.1.1 Case Study – Cologne 20; 1.2 Federal Government 22; 1.3 Biometric Technology Providers in Germany 23; 1.3.1 Hardware 23; 1.3.2 Software 25; 1.4 Legal Analysis 31; 1.4.1 German Law 31; 1.4.1.1 Scope 31; 1.4.1.2 Necessity 33; 1.4.2 EU Law 34; 1.4.3 European Convention on Human Rights 37; 1.4.4 International Human Rights Law 37; 1.4.5 ‘Biometric-Ready’ Cameras 38; 1.4.5.1 The right to dignity 38; 1.4.5.2 Structural Discrimination 39; 1.4.5.3 Proportionality 40; 2. Fingerprints on Personal Identity Cards 42; 2.1 Analysis 43; 2.1.1 Human rights concerns 43; 2.1.2 Consent 44; 2.1.3 Access Extension 44; 3. Online Age and Identity ‘Verification’ 46; 3.1 Analysis 47; 4. COVID-19 Responses 49; 4.1 Analysis 50; 4.2 The Convenience of Control 51; 5. Conclusion 53; Introduction to the Netherlands country study from EDRi 55; The Netherlands 57; 1. Deployments by Public Entities 60; 1.1. Dutch police and law enforcement authorities 61; 1.1.1 CATCH Facial Recognition Surveillance Technology 61; 1.1.1.1 CATCH - Legal Analysis 64; 1.1.2.
  • EPIC FOIA Request to ICE regarding Facial Recognition Software (June 10, 2020)
    VIA EMAIL. June 10, 2020. Fernando Pineiro, FOIA Officer, U.S. Immigration and Customs Enforcement, Freedom of Information Act Office, 500 12th Street, S.W., Stop 5009, Washington, D.C. 20536-5009. Email: [email protected]
    Dear Mr. Pineiro: This letter constitutes a request under the Freedom of Information Act ("FOIA"), 5 U.S.C. § 552(a)(3), and is submitted on behalf of the Electronic Privacy Information Center ("EPIC") to U.S. Immigration and Customs Enforcement ("ICE"). EPIC seeks records related to the agency's use of facial recognition software ("FRS").
    Documents Requested:
    1. All contracts between ICE and commercial FRS vendors;
    2. All documents related to ICE's approval process for new commercial FRS vendors, including but not limited to: a. Document detailing the approval process and requirements for commercial FRS vendors; b. Documents related to the approval of all commercial FRS vendors currently in use by ICE; c. Documents related to reviewing commercial FRS vendors that have been used under "exigent circumstances" without prior approval;
    3. All training materials and standard operating procedures for Homeland Security Investigations ("HSI") agents regarding use of FRS, including but not limited to the following: a. Existing training materials and standard operating procedures around HSI agent use of FRS; b. Existing training materials and standard operating procedures regarding restrictions on the collection of probe photos; c. Training materials and standard operating procedures being developed for HSI agent use of FRS; d. Training materials and standard operating procedures being developed regarding the use of probe photos;
    4.
  • In re: Clearview AI, Inc. Consumer Privacy Litigation, No. 1:21-cv-00135 (N.D. Ill.): Plaintiffs' Memorandum of Law in Support of Motion for Preliminary Injunction (filed April 9, 2021)
    IN THE UNITED STATES DISTRICT COURT FOR THE NORTHERN DISTRICT OF ILLINOIS, EASTERN DIVISION. Civil Action File No.: 1:21-cv-00135. In re: Clearview AI, Inc. Consumer Privacy Litigation. Judge Sharon Johnson Coleman; Magistrate Judge Maria Valdez. PLAINTIFFS' MEMORANDUM OF LAW IN SUPPORT OF MOTION FOR PRELIMINARY INJUNCTION.
    INTRODUCTION: Without providing notice or receiving consent, Defendants Clearview AI, Inc. ("Clearview"); Hoan Ton-That; and Richard Schwartz (collectively, "Defendants"): (a) scraped billions of photos of people's faces from the internet; (b) harvested the subjects' biometric identifiers and information (collectively, "Biometric Data"); and (c) created a searchable database of that data (the "Biometric Database") which they made available to private entities, friends and law enforcement in order to earn profits. Defendants' conduct violated, and continues to violate, Illinois' Biometric Information Privacy Act ("BIPA"), 740 ILCS 14/1, et seq., and tramples on Illinois residents' privacy rights. Unchecked, Defendants' conduct will cause Plaintiffs David Mutnick, Mario Calderon, Jennifer Rocio, Anthony Hall and Isela Carmean and Illinois class members to suffer irreparable harm for which there is no adequate remedy at law. Defendants have tacitly conceded the need for injunctive relief. In response to Plaintiff Mutnick's preliminary injunction motion in his underlying action, Defendants desperately sought to avoid judicial oversight by claiming to have self-reformed. But Defendants have demonstrated that they cannot be trusted. While Defendants have represented that they were cancelling all non-law enforcement customer accounts, a recent patent application reveals Defendants' commercial aspirations.
  • EPIC FOIA Request to the FBI regarding Clearview AI (March 6, 2020)
    VIA FACSIMILE. March 6, 2020. David M. Hardy, Section Chief, Record/Information Dissemination Section, Federal Bureau of Investigation, 170 Marcel Drive, Winchester, VA 22602-4843. Fax: (540) 868-4997. Attn: FOIA Request.
    Dear Mr. Hardy: This letter constitutes a request under the Freedom of Information Act ("FOIA"), 5 U.S.C. § 552(a)(3), and is submitted on behalf of the Electronic Privacy Information Center ("EPIC") to the Federal Bureau of Investigation ("FBI"). EPIC seeks records related to the agency's use of Clearview AI technology.
    Documents Requested:
    1. Emails and communications about Clearview AI, Inc., including but not limited to the following individuals and search terms: a. Clearview; b. Clearview AI; c. Hoan Ton-That; d. Tor Ekeland; e. Third parties with an email address ending in "@clearview.ai";
    2. All contracts or agreements between the FBI and Clearview AI, Inc.;
    3. All presentations mentioning Clearview;
    4. All Clearview sales materials;
    5. All privacy assessments, including but not limited to Privacy Threshold Analysis and/or Privacy Impact Assessments, that discuss the use of Clearview AI technology.
    Background: A recent New York Times investigation revealed that a secretive New York-based company, Clearview AI, Inc. (https://clearview.ai/), has developed an application that allows law enforcement agencies to identify people in public spaces without actually requesting identity documents or without a legal basis to determine the person's identity. Clearview
  • EDRi Submission to UN Special Rapporteur David Kaye's Call on Freedom of Expression and the Private Sector in the Digital Age
    EDRi submission to UN Special Rapporteur David Kaye's call on freedom of expression and the private sector in the digital age Introduction EDRi is a not-for-profit association of digital civil rights organisations. Our objectives are to promote, protect and uphold civil rights in the field of information and communication technology. European Digital Rights (EDRi) welcomes the UN Special Rapporteur on freedom of opinion and expression David Kaye’s public consultation on the role of ICT companies vis-à-vis freedom of expression in the digital environment to identify: I. the categories of actors in the ICT sector whose activities implicate the freedom of opinion and expression; II. the main legal issues raised for freedom of opinion and expression within the ICT sector; and III. the conceptual and normative work already done to develop corporate responsibility and human rights frameworks in these spaces, including governmental, inter-governmental, civil society, corporate and multistakeholder efforts. I. ICT actors with a potential to impact freedom of expression and opinion In order to communicate online, it is necessary to rely on a chain of different service providers, often based in different jurisdictions and with whom one may have no direct relationship at all. For example, if "zero-rating" or even data caps are imposed by Internet access providers in a country and you have no organisational and financial capacity to be "zero-rated", your ability to communicate with people that are using such restricted services is limited and there is no "leverage" with the ISPs that connect your intended audience with the Internet.
  • Green Views on Digital Rights and Digital Commons
    Adopted Policy Paper GREEN VIEWS ON DIGITAL RIGHTS AND DIGITAL COMMONS Introduction As information, digital technologies and the internet play an increasingly important role in today’s society, new forms of cooperation and work can emerge, as well as alternative economic models and new possibilities for involving citizens in society and political participation. Meanwhile, a whole array of new laws since the end of the 90s – from “intellectual property” to security – keep raising deep concerns. We Greens are taking a stand to defend individual rights and serve public interests. The European Greens want to take a leading role in building and strengthening digital commons and enabling all people to take part in digital society. As Greens we like to see the internet, the worldwide digital infrastructure as digital commons1. That means a free and open common ground for all citizens to communicate, share and profit from available resources. However, we know that in reality the internet is far from being free, open and for the benefit of all citizens. We also recognise the limits of governmental intervention in a global infrastructure and the dangers of state interference in the free flow of information. But to safeguard digital rights we need the intervention by national states, the European Union and the entire global community. We Greens want to achieve an internet and digital society where civil rights are respected and all citizens can profit equally from the rich amounts of information and culture that are now accessible while acknowledging author rights. In view of rapid technological developments these goals must also be applicable to devices and tools outside the internet where the right to privacy is at stake.
  • California Activists Challenge Clearview AI on Biometric Surveillance
    For Immediate Release: March 9, 2021. Press contact: Joe Rivano Barros, [email protected]; Sejal Zota, [email protected]; Ellen Leonida, [email protected]
    California Activists Challenge Clearview AI on Biometric Surveillance
    Mijente, NorCal Resist, and local activists sued the facial recognition company Clearview AI today over its unlawful surveillance activities. The lawsuit alleges that the company's surveillance technology violates privacy rights and facilitates government monitoring of protesters, immigrants, and communities of color. Clearview's facial recognition tool—which allows instantaneous identification and tracking of people targeted by law enforcement—chills political speech and other protected activities, the suit argued. The lawsuit was filed in the Alameda County Superior Court on behalf of Mijente, NorCal Resist, and four individual plaintiffs, including a DACA recipient, who allege that Clearview AI violated their rights under the California constitution, California's consumer protection and privacy laws, and California common law. "Privacy is enshrined in the California constitution, ensuring all Californians can lead their lives without the fear of surveillance and monitoring," said Sejal Zota, a lead attorney in the case and the legal director of Just Futures Law, a law project working to support grassroots organizations. "Clearview AI upends this dynamic, making it impossible to walk down the street without fear your likeness can be captured, stored indefinitely by the company, and used against you any time in the future. There can be no meaningful privacy in a society with Clearview AI." Clearview AI has amassed a database of over three billion facial scans by scraping photos from websites like Facebook, Twitter, and Venmo, violating these companies' terms of service and storing user images without permission.
  • The Jewish Observer, Vol. 80, No. 5 (May 2015)
    The Jewish Observer (www.jewishobservernashville.org), Vol. 80 No. 5 • May 2015 • 12 Iyyar-13 Sivan 5775
    Nashville crowd remembers Israel's fallen and celebrates its independence. By CHARLES BERNSEN
    Watching as about 230 people gathered on April 23 for a somber remembrance of Israel's fallen soldiers and terror victims, followed immediately by a joyful celebration of the 67th anniversary of the Jewish state's birth, Rabbi Saul Strosberg couldn't help but marvel. After all, it has been only eight years since the Nashville Jewish community started observing Yom Hazikaron, the Israeli equivalent of Memorial Day. Organized by several Israelis living in Nashville, including the late Miriam Halachmi, that first, brief ceremony was held in his office at Congregation Sherith Israel. About 20 people attended. Now here he was in a crowd that filled the Gordon Jewish Community Center auditorium to mark Yom Hazikaron and then Yom Ha'atzmaut, the Israeli independence day. ... of three fallen Israelis – a soldier killed in combat, a military pilot who died in a training accident and a civilian murdered in a terror attack. Their stories were pre- ... Catering and music by three Israeli Defense Force veterans who are members of the musical troupe Halehaka (The Band). ... For the third year, the highlight of ... Martha and Alan Segal, who made their first ever visits to Israel this spring on a congregational mission. • Rabbi Mark Schiftan of The Temple
    Photo caption: For the third year, members of the community who have helped build relations between Nashville and Israel were given the honor of lighting torches at the annual celebration of Israel's independence. Photos by Rick Malkin
  • ACLU Facial Recognition–Jeffrey
    Thursday, May 7, 2020 at 12:58:27 PM Eastern Daylight Time
    Subject: Identifying the Impostor - October 10, 2019
    Date: Monday, September 30, 2019 at 10:09:08 AM Eastern Daylight Time
    From: NESPIN Training
    To: jeff[email protected]
    Training Announcement: Identifying the Impostor, October 10, 2019. Location: Gardner Police Department, Gardner, MA. Time: 8:00 am - 4:00 pm.
    This program is a Universal Law Enforcement Identity Theft recognition program that will assist you in identifying the presence of an Impostor. This one-day course will teach participants a new and dynamic method of identifying identity thieves, aka Impostors, using multiple State and Federal databases. The instructor will focus on foreign national impostors using stolen identities to hide in plain sight and commit various offenses in the New England area and beyond with a focus on trafficking drugs. This program teaches a comprehensive study of NCIC Triple I and how to understand the search results, along with combining all this information provided from the RMV, BOP, AFISR and "DQs" from Puerto Rico. We also cover facial recognition and altered print recognition. Please go to the following link for complete details: https://extranet.riss.net/public/1419b76f-32ff-49df-a514-513433d66fb7
    The views, thoughts, materials and opinions articulated by the instructor(s) of this training belong solely to the instructor(s) and do not necessarily represent the official position or views of NESPIN.
    Please do not reply to this e-mail as it is an unmonitored alias. If you do not wish to receive these training mailings, please choose the Opt-out feature at the bottom of this email.
  • Privacy International, Human and Digital Rights Organizations, and International Legal Scholars As Amici Curiae in Support of Respondent
    No. 17-2. IN THE Supreme Court of the United States. IN THE MATTER OF A WARRANT TO SEARCH A CERTAIN EMAIL ACCOUNT CONTROLLED AND MAINTAINED BY MICROSOFT CORPORATION. UNITED STATES OF AMERICA, Petitioner, v. MICROSOFT CORPORATION, Respondent. ON WRIT OF CERTIORARI TO THE UNITED STATES COURT OF APPEALS FOR THE SECOND CIRCUIT. BRIEF OF PRIVACY INTERNATIONAL, HUMAN AND DIGITAL RIGHTS ORGANIZATIONS, AND INTERNATIONAL LEGAL SCHOLARS AS AMICI CURIAE IN SUPPORT OF RESPONDENT.
    LAUREN GALLO WHITE, RYAN T. O'HOLLAREN, WILSON, SONSINI, GOODRICH & ROSATI, P.C., One Market Plaza, Spear Tower, Suite 3300, San Francisco, California 94105, (415) 947-2000, [email protected], [email protected]; BRIAN M. WILLEN, Counsel of Record, BASTIAAN G. SUURMOND, WILSON, SONSINI, GOODRICH & ROSATI, P.C., 1301 Avenue of the Americas, 40th Floor, New York, New York 10019, (212) 999-5800, [email protected], [email protected]. Attorneys for Amici Curiae. (Counsel continued on inside cover.) CAROLINE WILSON PALOW, SCARLET KIM, PRIVACY INTERNATIONAL, 62 Britton Street, London, EC1M 5UY, United Kingdom, [email protected], [email protected]
    QUESTION PRESENTED: Whether construing the Stored Communications Act ("SCA") to authorize the seizure of data stored outside the United States would conflict with foreign data-protection laws, including those of Ireland and the European Union, and whether these conflicts should be avoided by applying established canons of construction, including presumptions against extraterritoriality and in favor of international comity, which direct U.S. courts to construe statutes as applying only domestically and consistently with foreign laws, absent clear Congressional intent.
  • Annual Report 2017
    EDRi - EUROPEAN DIGITAL RIGHTS: DEFENDING RIGHTS AND FREEDOMS ONLINE. ANNUAL REPORT 2017, January 2017 - December 2017.