The Privacy Policy Landscape After the GDPR


Proceedings on Privacy Enhancing Technologies; 2020 (1):47–64

Thomas Linden, Rishabh Khandelwal, Hamza Harkous, and Kassem Fawaz*

Abstract: The EU General Data Protection Regulation (GDPR) is one of the most demanding and comprehensive privacy regulations of all time. A year after it went into effect, we study its impact on the landscape of privacy policies online. We conduct the first longitudinal, in-depth, and at-scale assessment of privacy policies before and after the GDPR. We gauge the complete consumption cycle of these policies, from the first user impressions until the compliance assessment. We create a diverse corpus of two sets of 6,278 unique English-language privacy policies from inside and outside the EU, covering their pre-GDPR and post-GDPR versions. The results of our tests and analyses suggest that the GDPR has been a catalyst for a major overhaul of the privacy policies inside and outside the EU. This overhaul of the policies, manifesting in extensive textual changes, especially for the EU-based websites, comes at mixed benefits to the users. While the privacy policies have become considerably longer, our user study with 470 participants on Amazon MTurk indicates a significant improvement in the visual representation of privacy policies from the users' perspective for the EU websites. We further develop a new workflow for the automated assessment of requirements in privacy policies. Using this workflow, we show that privacy policies cover more data practices and are more consistent with seven compliance requirements post the GDPR. We also assess how transparent the organizations are with their privacy practices by performing specificity analysis. In this analysis, we find evidence for positive changes triggered by the GDPR, with the specificity level improving on average. Still, we find the landscape of privacy policies to be in a transitional phase; many policies still do not meet several key GDPR requirements, or their improved coverage comes with reduced specificity.

Keywords: GDPR, Privacy, Security, Automated Analysis of Privacy Policy, Polisis

DOI 10.2478/popets-2020-0004
Received 2019-05-31; revised 2019-09-15; accepted 2019-09-16.

Thomas Linden: University of Wisconsin, E-mail: [email protected]
Rishabh Khandelwal: University of Wisconsin, E-mail: [email protected]
Hamza Harkous: École Polytechnique Fédérale de Lausanne, E-mail: [email protected]
*Corresponding Author: Kassem Fawaz: University of Wisconsin, E-mail: [email protected]

1 Introduction

For more than two decades since the emergence of the World Wide Web, the "Notice and Choice" framework has been the governing practice for the disclosure of online privacy practices. This framework follows a market-based approach of voluntarily disclosing the privacy practices and meeting the fair information practices [25]. The EU's recent General Data Protection Regulation (GDPR) promises to change this privacy landscape drastically. As the most sweeping privacy regulation so far, the GDPR requires information processors, across all industries, to be transparent and informative about their privacy practices.

Research Question

Researchers have conducted comparative studies around the changes of privacy policies through time, particularly in light of previous privacy regulations (e.g., HIPAA¹ and GLBA²) [1, 3, 5, 26]. Interestingly, the outcomes of these studies have been consistent: (1) the percentage of websites with privacy policies has been growing, (2) the detail level and descriptiveness of policies have increased, and (3) the readability and clarity of policies have suffered.

The GDPR aims to address the shortcomings of previous regulations by going further than any prior privacy regulation. One of its distinguishing features is that non-complying entities can face hefty fines: the maximum of 20 million Euros or 4% of the total worldwide annual revenue. Companies and service providers raced to change their privacy notices by May 25th, 2018 to comply with the new regulations [9]. With the avalanche of updated privacy notices that users had to accommodate, a natural question follows: What is the impact of the GDPR on the landscape of online privacy policies?

Researchers have recently started looking into this question by evaluating companies' behavior in light of the GDPR. Their approaches, however, are limited to a small number of websites (at most 14) [8, 32]. Concurrent to our work, Degeling et al. [9] performed the first large-scale study focused on the evolution of cookie consent notices, which have been hugely reshaped by the GDPR (with 6,579 EU websites). They also touched upon the growth of privacy policies, finding that the percentage of sites with privacy policies has grown by 4.9%.

¹ The Health Insurance Portability and Accountability Act of 1996.
² The Gramm-Leach-Bliley Act for the financial industry of 1999.

Methodology and Findings

Previous studies have not provided a comprehensive answer to our research question. In this paper, we answer this question by presenting the first at-scale, longitudinal study of privacy policies' content in the context of the GDPR. We develop an automated pipeline for the collection and analysis of 6,278 unique English privacy policies by comparing their pre-GDPR and post-GDPR versions. These policies cover the privacy practices of websites from different topics and regions. We approach the problem by studying the change induced on the entire experience of the users interacting with privacy policies. We break this experience into five stages:

A. Presentation (Sec. 4). To quantify the progress in privacy policies' presentation, we gauge the change in user perception of their interfaces via a user study involving 470 participants on Amazon Mechanical Turk. Our results find a positive change in the attractiveness and clarity of EU-based policies; however, we find that outside the EU, the visual experience of policies is not significantly different.

B. Text-Features (Sec. 5). We study the change in the policies' text using high-level syntactic text features. Our analysis shows that both EU and Global privacy policies have become significantly longer, with (+35%, +25%) more words and (+33%, +22%) more sentences on average, respectively. However, we do not observe a major change in the metrics around the sentence structure.

We devise an approach, inspired by goal-driven requirements engineering [35], to evaluate coverage, compliance, and specificity in the privacy policies. While previous longitudinal studies either relied on manual investigations or heuristics-based search queries [1, 3, 9], we build on the recent trend of automated semantic analysis of policies. We develop a total of 24 advanced, in-depth queries that allow us to assess the evolution of content among the set of studied policies. We conduct this analysis by building on top of the Polisis framework (Sec. 6), a recent system developed for the automated analysis of privacy policies [12]. Following this approach, we perform a deeper level of semantic analysis that overcomes the limitations of keyword-based approaches.

C. Coverage (Sec. 7).

D. Compliance (Sec. 8). We design seven queries that codify several of the GDPR compliance metrics, namely those provided by the UK Information Commissioner's Office (ICO). The GDPR's effect is evident; for both of our analyzed datasets, we find a positive trend in complying with the GDPR's clauses: substantially more companies improved (15.1% for EU policies and 10.66% for Global policies, on average) on these metrics compared to those that worsened (10.3% for EU policies and 7.12% for Global policies, on average).

E. Specificity (Sec. 9). Finally, we design eight queries capturing how specific policies are in describing their data practices. Our results show that providers are paying special attention to make the users aware of the specific data collection/sharing practices; 25.22% of EU policies and 19.4% of Global policies are more specific in describing their data practices. Other providers, however, are attempting to cover more practices in their policies at the expense of specificity; 22.7% of EU policies and 17.8% of Global policies are less specific than before.

Building on the above, we draw a final set of takeaways across both the time and geographical dimensions (Sec. 12).
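To make the query-based workflow more concrete, the following is a minimal sketch of how a coverage/compliance-style check could be expressed over a policy that has already been annotated with high-level data-practice labels by a Polisis-like classifier. The label names, the `policy_labels` input format, and the `REQUIRED_PRACTICES` mapping are illustrative assumptions, not the paper's actual 24 queries or the real Polisis API.

```python
# Sketch of a coverage/compliance-style query over a privacy policy that has
# been annotated with high-level data-practice labels (e.g., by a Polisis-like
# classifier). The label names and the required-practice set are hypothetical,
# loosely inspired by ICO-style disclosure guidance; they are not the paper's
# actual queries or the real Polisis output schema.

from typing import Dict, List, Set

# Hypothetical practices a transparency check might expect a policy to address.
REQUIRED_PRACTICES: Set[str] = {
    "first-party-collection",   # what data is collected and why
    "third-party-sharing",      # who it is shared with
    "data-retention",           # how long it is kept
    "data-security",            # how it is protected
    "user-access-edit-delete",  # the user's rights over their data
}

def covered_practices(policy_labels: List[str]) -> Set[str]:
    """Return the required practices that the annotated policy mentions."""
    return REQUIRED_PRACTICES & set(policy_labels)

def compliance_delta(pre_labels: List[str], post_labels: List[str]) -> Dict[str, object]:
    """Compare the pre- and post-GDPR versions of one policy on coverage."""
    pre, post = covered_practices(pre_labels), covered_practices(post_labels)
    return {
        "added": sorted(post - pre),       # practices newly covered post-GDPR
        "dropped": sorted(pre - post),     # practices no longer covered
        "improved": len(post) > len(pre),  # crude per-policy trend signal
    }

if __name__ == "__main__":
    # Toy pre/post annotations for a single (hypothetical) website.
    pre = ["first-party-collection", "data-security"]
    post = ["first-party-collection", "data-security",
            "data-retention", "user-access-edit-delete"]
    print(compliance_delta(pre, post))
    # {'added': ['data-retention', 'user-access-edit-delete'], 'dropped': [], 'improved': True}
```

Aggregating such per-policy deltas over the EU and Global corpora is, at a high level, how trend percentages of the kind reported above could be produced.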
2 GDPR Background

The European Union began its expedition into privacy legislation in the 1990s. The Data Protection Directive (95/46/EC) [11] set forth a set of principles for data protection as a preemptive attempt to safeguard EU constituents in a data-heavy future. This directive was further amended with the ePrivacy Directive (2002/58/EC) [10] to enhance the privacy of high-sensitivity data types such as e-communications data. While the two directives maintained some focus on protecting user privacy, Article 1, Paragraph 2 of the Data Protection Directive very clearly states that "Member States shall neither restrict nor prohibit the free flow of personal data between Member States for reasons connected with the protection afforded under paragraph 1" [11]. Thus, while the set of principles intended to increase user privacy, the primary focus of the legislation was to prevent the inhibiting of personal data flow. Furthermore, the directives designated the Member State where a particular controller is located as the supervising authority. Naturally, the decentralized approach led to a wide vari…