
The Technological Downside of Algorithms: an 'Elsagate' Case Study


Master's Thesis TFG 8 – Changing Power Relationships

The technological downside of algorithms: an 'ElsaGate' case study

Author: Mathijs Stals
Student number: u1273312

Date of conclusion: August 2020

Supervisors: Mr. Dr. Colette Cuijpers, Sascha van Schendel

Department: TILT

Table of contents

Chapter 1: The introduction to the ElsaGate scandal p.1
1.1 Introduction p.1
1.1.1 YouTube Kids p.1
1.1.2 Violence & sexual misconduct on the YouTube Kids app p.2
1.1.3 The rationale behind the ElsaGate videos p.3
1.2 What is the legal/social problem associated with the ElsaGate? p.4
1.3 Existing Literature & Gap p.5
1.4 Research question p.8
1.5 Methodology p.8
1.6 Structure/Roadmap p.9
Chapter 2: What are the algorithms that YouTube currently uses, and in what way do they not function as intended? p.9
2.1 Which algorithms are utilized by the YouTube platform? p.9
2.2 How does the recommendation algorithm work? p.10
2.2.1 The input data p.11
2.2.2 Related videos p.11
2.2.3 The selection of recommendation candidates p.12
2.2.4 Ranking p.12
2.2.5 User interface p.13
2.2.6 System implementation p.13
2.2.7 Sub conclusion p.13
2.3 The challenges of the personalized recommendation system used by YouTube p.13
2.4 How does the algorithm not function as intended? p.14
Chapter 3: What is the impact of COPPA and the GDPR in relation to YouTube and how are these legal instruments perceived in legal literature? p.15
3.1 US regulation on the privacy and data protection of minors p.16
3.1.1 The Children's Online Privacy Protection Act p.16
3.1.2 The required compliance for commercial websites and online services p.17
3.1.3 The relevance of the COPPA Rule for YouTube p.17
3.2 The case between the FTC and YouTube p.18
3.3 COPPA's reception in legal literature p.20
3.4 The European situation p.21
3.4.1 The General Data Protection Regulation (GDPR) p.21
Chapter 4: How does YouTube implement the various international regulations on their platform? p.25
4.1.1 Terms of Service YouTube p.25
4.1.2 Privacy policy p.26
4.1.3 YouTube Kids privacy notice p.28
4.1.4 Sub conclusion p.29
4.2 The application of the COPPA Rule by YouTube p.30
4.3 The impact of the COPPA Rule on ElsaGate p.31
4.4 The relationship between COPPA and GDPR p.33
4.5 The difference in compliance between COPPA and the GDPR on YouTube p.34
4.6 Conclusion p.35
Chapter 5: Conclusion p.36
Bibliography p.40

Chapter 1: The introduction to the ElsaGate scandal

1.1 Introduction

1.1.1. YouTube Kids

On 23 February 2015, YouTube LLC launched an alternative to its widely popular video website YouTube, tailored specifically for kids. This 'YouTube Kids' app, which can be downloaded on all devices via the Play Store or the Apple App Store, is a different product from the standard YouTube app. YouTube Kids features a child-friendly layout, which, according to YouTube, is designed to "make it safer for children to explore the world through online video".1 The app has multiple integrated parental controls. Prior to using the YouTube Kids app, for instance, a parent is required to unlock the app and verify their children's age. Other parental controls include the possibility to turn the 'search' option on or off, with the latter meaning that the kid can only see videos from video creators verified by YouTube itself, and a timer which limits the amount of time that a user can use the app.

The YouTube Kids app therefore offers a 'barebones' version of the original YouTube app, with several features removed. It is not possible to leave a rating on videos in the YouTube Kids app, and there is no comment section below the videos where viewers can leave their thoughts. This is purposefully designed to limit unwanted exposure to some of the content available on YouTube that was deemed inappropriate for younger audiences. To prevent exposure to inappropriate content, all videos on the YouTube Kids app are checked to determine whether they are child-friendly. The YouTube Kids app contains a 'recommended' tab under videos, which displays other videos related to the video that a user is currently watching. These recommendations come from the YouTube Kids app only, and are subject to the same age restrictions as other YouTube Kids videos. Advertisements are also displayed on the videos. These advertisements are extensively checked by YouTube to ensure that they are family-friendly.2 All content submitted to the YouTube Kids app is subject to a verification process by a machine algorithm.3 4 If the algorithm approves a video for YouTube Kids, every user can view that video.

1 This snippet is taken from the app description available on the Apple App Store.

2 The statement regarding advertisements on the YouTube Kids app can be found here: https://support.google.com/youtube/answer/6168681?hl=en

3 Kantrowitz, Alex. 'YouTube Kids Is Going To Release A Whitelisted, Non-Algorithmic Version Of Its App' (BuzzFeed News, April 6, 2018). Retrieved April 2, 2019, from https://www.buzzfeednews.com/article/alexkantrowitz/youtube-kids-is-going-to-release-a-whitelisted-non#.ftVwoX5dp

4 Wojcicki, Susan. 'Protecting Our Community' (YouTube Creator Blog, 2017). Accessed April 2, 2019. https://youtube-creators.googleblog.com/2017/12/protecting-our-community.html


1.1.2. Violence and sexual misconduct on the YouTube Kids app

A problem emerged for YouTube, however, when it was discovered that the algorithm did not function as intended and approved videos that contained content not suited for kids, such as violence and sexual misconduct. After the inappropriate videos were discovered by parents and other (older) users, multiple articles were written about this phenomenon.5 This period of controversial videos being widely spread on the website was later referred to as the 'ElsaGate'.6 These ElsaGate videos featured recurring topics and characters, such as Spiderman, Elsa from Frozen and the Joker from Batman.7 These animated figures engaged in controversial and inappropriate behavior, such as decapitation, pornographic acts and criminal behavior including, but not limited to, murder, theft and sexual assault. Younger audiences were thus subjected to severely disturbing material, which is detrimental to their emotional development. Multiple studies have found that media have a vast impact on youth, reporting correlations between exposure to violent television programming and increased violent behavior,8 as well as between media exposure and sexual behavior.9 10 These ElsaGate videos were seen by millions of kids, whose behavior and emotional development have been impacted by these videos.

The ElsaGate videos were discovered by Reddit users and later became the subject of countless articles written by media outlets such as the BBC,11 The Guardian12 and Der Standard.13 The respective journalists documented fragments from the videos, ranging from teeth being pulled out of the mouth of Peppa Pig to animated figurines being buried alive while distressing music played in the background. This was anything but the content that should have been available on the kids app. Scientific research concerning deep learning architectures was published in response to the ElsaGate, bringing further discussion alongside potential solutions to the problem.14

5 Maheshwari, Sapna. 'On YouTube Kids, Startling Videos Slip Past Filters' (New York Times, 2017). Accessed April 2, 2019.
6 Brandom, Russell. 'Inside Elsagate, The Conspiracy-Fueled War On Creepy YouTube Kids Videos' (The Verge, 2017). Accessed May 1, 2019. https://www.theverge.com/2017/12/8/16751206/elsagate-youtube-kids-creepy-conspiracy-theory

7 Dornhoeschen, 'What is ElsaGate?' (Reddit, 2017). Retrieved April 2 from https://www.reddit.com/r/ElsaGate/comments/6o6baf/what_is_elsagate/
8 Johnson, JG et al. 'Television viewing and aggressive behavior during adolescence and adulthood' (2002). Accessed June 23, 2019. https://www.ncbi.nlm.nih.gov/pubmed/11923542
9 Strasburger, Victor C. 'Adolescent Sexuality and the Media' (1989). Accessed June 23, 2019. https://www.sciencedirect.com/science/article/pii/S0031395516366949
10 Brown, Jane D. 'Mass media influences on sexuality'. Accessed June 23, 2019. https://www.tandfonline.com/doi/abs/10.1080/00224490209552118
11 Subedar, Anisa. 'The Disturbing YouTube Videos That Are Tricking Children' (BBC, 2019). Accessed April 2, 2019. https://www.bbc.com/news/blogs-trending-39381889
12 Dredge, Stuart. 'YouTube's Latest Hit: Superheroes, Giant Ducks And Plenty Of Lycra' (The Guardian, 2016). Accessed April 2, 2019. https://www.theguardian.com/technology/2016/jun/29/youtube-superheroes-children-webs-tiaras
13 'YouTube: Wie gefälschte Disney-Cartoons Kinder verstören' ['How fake Disney cartoons disturb children'] (DER STANDARD, 2019). Accessed April 2, 2019. https://derstandard.at/2000055049856/Youtube-Wie-gefaelschte-Disney-Cartoons-Kinder-verstoeren

14 Ishikawa, Akari et al. 'Combating the ElsaGate Phenomenon: Deep Learning Architectures for Disturbing Cartoons' (arXiv, 2019). Accessed July 3, 2020, from <https://arxiv.org/abs/1904.08910>


The disturbing content remained under the radar for a while. It was not detected immediately because it mainly consisted of short fragments embedded within videos of over 10 minutes long. Parents would therefore not immediately see what their kids were watching. Once a video has finished, the YouTube Kids app immediately opens a new video based on the viewing behavior of the user, recommended by the algorithm. This leads to videos gathering millions and sometimes billions of views. YouTube expected the ElsaGate videos to be picked up and filtered by the algorithm in place, but in reality it proved incapable of dealing with these videos on the platform. The videos were not detected until users on the platform reported them, which led to journalists, and later the scientific community, expressing their concerns about the content of these videos.15 16 YouTube acted upon the complaints made by the community, even though the algorithm had been purposely designed to prevent the videos from publicly appearing on the platform in the first place.

1.1.3. The rationale behind the ElsaGate videos

Because the videos displayed on the YouTube Kids app generated so many views, it became attractive for other content creators to switch to the kids section of YouTube. Because little kids tend to click on everything they see, including recommended videos, and watch these videos for extended periods of time, the popularity of the YouTube Kids app led to a 'gold rush' for content creators.17 All content creators on YouTube can enable monetization on their videos once they have amassed a total of 4,000 hours of 'watch time', which equals 240,000 minutes that a creator's videos have to be watched for within the last 12 months.
In addition, the content creator needs to have more than 1,000 subscribers18 on their YouTube channel to be eligible for monetization.19 Any user on the YouTube platform can subscribe for free with the click of a button. Once monetization is enabled on a YouTube channel, the owner of the channel can put advertisements on their videos, which are placed there by Google. These advertisements can be displayed next to a video when watching on a computer, under a video when watching on a mobile device, or during a video. During the video, ads can be displayed before the video has started or after it has ended. If a video is longer than 8 minutes in

15 Placido, Dani Di. 'YouTube's "ElsaGate" Illuminates The Unintended Horrors Of The Digital Age' (Forbes, 2017). Accessed July 24, 2020.
16 Ishikawa, Akari et al. 'Combating the ElsaGate phenomenon: Deep learning architectures for disturbing cartoons' (arXiv, 2019). Accessed July 24, 2020, from <https://arxiv.org/abs/1904.08910>
17 Placido, Dani Di. 'YouTube's "Elsagate" Illuminates The Unintended Horrors Of The Digital Age' (Forbes, 2017). Accessed May 1, 2019. https://www.forbes.com/sites/danidiplacido/2017/11/28/youtubes-elsagate-illuminates-the-unintended-horrors-of-the-digital-age/#2d3826536ba7

18 Subscribing to a YouTuber is free of charge and can easily be done by clicking the ‘subscribe’ button under the video of a content creator, or via their own YouTube channel page. Subscribing to a YouTube content creator means that you will be notified by YouTube when that creator uploads a new video. Subscribers on YouTube are often seen as ‘fans’ of a YouTube content creator.

19 The two mentioned requirements are the two most important eligibility criteria. More information can be found on the YouTube website. https://support.google.com/youtube/answer/72851

total length, multiple advertisements can also be placed in the middle of the video. The advertisements shown are from companies that have bought advertisement space on YouTube. These companies therefore pay to have their products shown via advertisements on YouTube, and YouTube content creators that display these ads on their channel get paid by YouTube depending on the number of times that the advertisements have been seen and clicked on. This means that there is a big monetary benefit in making videos longer than 8 minutes, because it can significantly increase the revenue that content creators receive from their videos.

These advertisements generate substantial revenue for content creators. For instance, the YouTube channel 'Kids Club TV' collected around 25 million views in the past month, which equates to roughly 5,500 to 88,000 euros per month, depending on the country where the viewers are from.20 In comparison, in January 2019 the same channel generated 250 million views in one month, ten times as much as in the last 30 days. This is merely an illustration of one of the rationales of users making content for the YouTube Kids app. The rationale behind making kids' videos controversial, however, remains unclear. Some suggestions, such as grooming, normalizing inappropriate behavior and profit maximization, have been named, but conclusive proof is lacking.21

What the ElsaGate has illustrated, however, is that algorithms are not foolproof. As a result of the ElsaGate, YouTube turned towards manual curation to check the content, killing two birds with one stone: removing disturbing content as fast as possible and making sure that advertisements are only placed on videos that adhere to YouTube's code of conduct. If a video violates the code of conduct, YouTube can disable monetization on that video, meaning that the YouTuber will not be able to earn revenue from it.
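The monetization thresholds described above (4,000 watch hours, i.e. 240,000 minutes, within 12 months and more than 1,000 subscribers) can be sketched as a simple check. The function and parameter names below are illustrative only and are not YouTube's actual API:

```python
# Illustrative sketch of the YouTube Partner Program thresholds described
# above; the function and parameter names are hypothetical, not YouTube's API.

def is_eligible_for_monetization(watch_hours_last_12_months: float,
                                 subscribers: int) -> bool:
    """A channel needs 4,000 watch hours (240,000 minutes) in the last
    12 months and more than 1,000 subscribers."""
    return watch_hours_last_12_months >= 4_000 and subscribers > 1_000

# Example: 25 million views of a 10-minute video, if each view were a full
# watch, would amount to 250,000,000 minutes of watch time.
minutes_watched = 25_000_000 * 10
print(is_eligible_for_monetization(minutes_watched / 60, subscribers=50_000))  # True
```

This also illustrates why the thresholds are trivial to clear for channels of the scale discussed above: a single month of 'Kids Club TV' viewing exceeds the yearly watch-time requirement many times over.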

1.2 What is the legal/social problem associated with the ElsaGate?

The social problem associated with the ElsaGate scandal is that millions of kids have been exposed to disturbing and severely shocking material, which has a detrimental effect on their emotional development. An app specifically tailored for younger audiences should not have had these videos on the platform in the first place; they appeared because the YouTube algorithm failed to detect the shocking fragments in the videos. Phenomena such as the ElsaGate should not happen again, so it is important to analyze the regulatory framework that currently applies to YouTube, in order to prevent another ElsaGate from happening.

The ElsaGate scandal forms a great case study for analyzing the functioning of the algorithms on YouTube, and how regulation such as COPPA,22 an American law enacted by the United States Congress in 1998 and implemented by YouTube as of January 2020, can play a role in effectively tackling unwanted and disturbing content on the YouTube platform. COPPA requires the American Federal Trade Commission to issue and enforce

20 This is the YouTube analytic page for the YouTube channel; ‘Kids Club TV’ https://socialblade.com/youtube/channel/UCwKLDhm_3kneligmQZEVRzw

21 Dornhoeschen, 'What is ElsaGate?' (Reddit, 2017). Retrieved April 2 from https://www.reddit.com/r/ElsaGate/comments/6o6baf/what_is_elsagate/
22 COPPA – Children's Online Privacy Protection Act. Accessed January 8, 2020.

regulations concerning the privacy of children in the online environment.23 The official regulation, known as the COPPA Rule, became effective on April 21, 2000. An amended version of the COPPA Rule was issued on December 19, 2012, which took effect on July 1, 2013. This final version of the COPPA Rule is what will be analyzed in this thesis.

With the recent implementation of the COPPA Rule on the platform, it is important to analyze whether YouTube has properly implemented this regulation. As COPPA is a US law, European legislation relevant to the processing of children's data, such as the General Data Protection Regulation, will also be included in the research. Human intervention is required to address the inappropriate content, as the ElsaGate videos demonstrated that the algorithms alone cannot prevent these videos from appearing on the platform. The aim of this thesis is therefore to assess the current regulatory framework in place for the YouTube platform and to analyze whether it is appropriately implemented or whether adjustments are needed in order to prevent another ElsaGate scandal from occurring. It is crucial to evaluate how algorithms can be used in the online environment, and how the correct implementation of a regulatory framework in combination with an algorithm can help in tackling the issue of disturbing content. A proper privacy policy and optimization of the existing algorithms can help prevent disturbing content from appearing on the platform again, as these instruments can filter content that is inappropriate for younger audiences before the videos appear on YouTube. This thesis therefore aims to provide an overview of the regulatory framework currently in place on YouTube, and to assess whether it can be improved to prevent another ElsaGate situation from happening.

1.3 Existing Literature & Gap

The main focus of the thesis is to analyze the legal framework used by YouTube to address the privacy and personal data of its users, and to assess whether the platform has appropriately reacted to the ElsaGate scandal to prevent it from happening again. This thesis will therefore provide insight into the regulation of user-generated content on platforms like YouTube, which must adhere to worldwide rules and codifications such as COPPA24 and the GDPR.25

Firstly, background information must be gathered concerning the topic of algorithms and regulation, to appropriately understand the material prior to investigating the recommendation system in use by YouTube. The article 'Exploring Echo-Systems: How Algorithms Shape Immersive Media Environments' by James N. Cohen describes how algorithms like the one used on YouTube can lead to social phenomena like the ElsaGate. This is how Cohen explains the issue: "To operate at the scale of over a billion simultaneous users, YouTube operates its feed with the algorithm-based recommendation system. As the algorithm is operating so vastly and

23 Federal Trade Commission, COPPA Rule. Accessed July 24, 2020, from <https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule>
24 Children's Online Privacy Protection Rule. Accessed July 3, 2020.
25 General Data Protection Regulation (GDPR). Accessed July 3, 2020, from <https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32016R0679>

globally, it cannot always account for exploits in the hundreds of hours of content uploaded every minute. This technological oversight can lead to feed manipulation, dangerous content bypassing the filters, and potentially the weaponization of media content."26

The article outlines the functioning of the YouTube algorithm, explaining the way in which it is designed. The functioning of the YouTube algorithm is further clarified by the article 'Deep Neural Networks for YouTube Recommendations'27 by Paul Covington et al. The article states that YouTube is one of the biggest, if not the biggest, platforms for user-generated video content. The authors state that the recommendation system is vital in guiding users of the platform towards new content that they might enjoy, and the article explains how YouTube's recommendation algorithm is implemented with the help of deep learning and neural networks. These articles are therefore important for understanding how the recommendation system operates.

With this information in mind, the recommendation algorithm in use by YouTube was assessed, which has been extensively written about in the article 'The YouTube Video Recommendation System' by Davidson et al.28 The article outlines which factors the algorithm takes into account for video recommendation, how it processes the input data gathered from these factors, and how it determines a list of recommended videos based on the preferences of the user, which the algorithm derives from the data collected. Several machine learning techniques are described in the article, which are explained in more detail in the articles 'Association Rule Mining' by Surya Remanan29 and 'Using Co-Visitation Networks for Detecting Large Scale Online Display Advertising Exchange Fraud' by Stitelman et al.30 These articles further clarify how videos are selected by the YouTube algorithm, and which technique it uses to compile a list of recommendations for its users.
They can thus help explain how the algorithm came to recommend the inappropriate videos. In addition to exploring the current algorithmic system in place at YouTube and how it contributed to the ElsaGate scandal, other ways of regulation will also be addressed in this thesis. The article 'Regulation and social practice online'31 by Jenny Kennedy explains how individual interactions between people on platforms can create regulatory norms, exploring how regulation on social media websites can take place. It provides another

26 Cohen, James N. 'Exploring Echo-Systems: How Algorithms Shape Immersive Media Environments' (Journal of Media Literacy Education, 2018), p. 143. Accessed April 3, 2019. https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1366&context=jmle

27 Covington et. al, “Deep Neural Networks for YouTube Recommendations”. Accessed May 3 2019, from https://ai.google/research/pubs/pub45530

28 Davidson et. al. ‘The YouTube Video Recommendation System’. Accessed July 31, 2019.

29 Remanan, Surya. 'Association Rule Mining'. Accessed July 31, 2019.

30 Stitelman et al. 'Using Co-Visitation Networks for Detecting Large Scale Online Display Advertising Exchange Fraud', p. 1244. Retrieved August 1, 2019.

regulatory trajectory beyond mere lawmaking and is useful for analyzing the different ways in which regulation of the YouTube platform can take place.

The abovementioned articles, dealing with potential solutions for properly implementing algorithms, are supplemented by the article 'An FDA for Algorithms'32 by Andrew Tutt. Tutt explains that complex algorithms require critical thought in order to be properly implemented, to prevent harm and to compensate for it if need be. Tutt calls for a specialist agency that should deal with enforcing algorithmic safety. This solution is important to consider for the algorithm that YouTube uses, as the platform is already using human curation to assess content. Tutt's ideas can therefore elaborate on the current system that YouTube uses.

There is a substantial amount of literature to be found online concerning algorithms and their flaws, but little has been written about the ElsaGate scandal. Another Master's thesis, by a student of Utrecht University, analyzes the ElsaGate,33 but focuses on the algorithmic imaginary rather than on algorithms as a solution to monitor content. In addition, to my knowledge, there are no published works containing a legal analysis of how US and EU privacy and data protection legislation can prevent another ElsaGate from occurring. There are a substantial number of articles concerning the accountability of algorithms, explaining how these can function in algorithmic decision making, or delving deeper into the authority of algorithms. The gap in the literature, however, is that none of these articles deal with YouTube in particular.
There are some articles that bring up the issue of privacy protection on YouTube and its problem with data privacy for children, such as the article by Ivana Stepanović, 'Challenges to Privacy Protection on YouTube'.34 Alongside this article, there is plenty to be found about the problems of profiling and privacy, and how the GDPR can provide guidance in tackling these challenges, such as the article by Sandra Wachter, 'Normative challenges of identification in the Internet of Things: Privacy, profiling, discrimination, and the GDPR'.35 The paper 'Should User-generated Content be a Matter of Privacy Awareness' by Nicolás Emilio Díaz Ferreyra et al. claims that social network sites can cause unwanted incidents concerning sensitive and private information, due to the user-generated content being shared on the platforms.36 None of these articles deal with the COPPA Rule, however, nor do they focus on YouTube from the perspective of both the US and the EU. This thesis aims to analyze the ElsaGate scandal and how COPPA and the GDPR tackle privacy protection of minors on YouTube, which differs from the available literature on the subject. Most of the articles above deal with just the GDPR, while YouTube is a global platform dependent on user-generated content, filled with sensitive content which can be inappropriate for younger audiences. By

32 Tutt, Andrew. 'An FDA for Algorithms'. Accessed June 10, 2019, from <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2747994>
33 Tuijl, J. van. 'Imagining Algorithms in Everyday Social Media Life: An Investigation into the Algorithmic Imaginary within the Elsagate Discussion on Reddit' (Utrecht University Repository, 2018). Accessed April 3, 2019. https://dspace.library.uu.nl/handle/1874/373842
34 Stepanović, Ivana. 'Challenges to Privacy Protection on YouTube' (Institute of Criminological and Sociological Research, 2018).
35 Wachter, Sandra. 'Normative challenges of identification in the Internet of Things: Privacy, profiling, discrimination, and the GDPR' (Elsevier, 2018). Accessed July 3, 2020, from <https://www.sciencedirect.com/science/article/pii/S0267364917303904?casa_token=Yap3mrivwzMAAAAA:Cgxy1xX8dlVa-gSukQpyHKnOjv1oUH81ifrI2Cyark2fT5qOmN4C3yTWOCGM2-0ohtfbcExv6jE>
36 Díaz Ferreyra, Nicolás Emilio et al. 'Should User-generated Content be a Matter of Privacy Awareness' (2017).

analyzing the US-EU situation on privacy protection and how YouTube implemented both instruments on its platform, this thesis will contribute to the literature currently published. Because the ElsaGate scandal happened very recently, it provides food for thought for other researchers in discovering the potential flaws of algorithms as YouTube implemented them, and in preventing future exploits on other websites. This thesis will therefore combine the existing literature on the recommendation system used by YouTube and focus on the legal rather than the technical dimension, by exploring different ways in which regulation or legal remedies can prevent another ElsaGate scandal from happening.

1.4 Research question (central question)

The above leads to the following central question for research:

What legal requirements are put in place for user-generated content providers such as YouTube, based on the frameworks of COPPA and the GDPR, and are these required measures sufficient to prevent another ElsaGate?

Sub-questions:
- What are the algorithms that YouTube currently uses, and in what way do they not function as intended?
- What is the impact of COPPA and the GDPR in relation to YouTube, and how are these legal instruments perceived in legal literature?
- In what way has YouTube implemented the relevant US and EU regulations, and is it enough to prevent another ElsaGate?

1.5 Methodology

In order to answer the research question and sub-questions, I will study the related articles and legislation found for each sub-question, combining the knowledge gathered from the various sources and applying it to my research questions. These sources will help provide insight into the type of algorithms currently in use by YouTube and how they do not function as intended. The technical aspects of the YouTube system will also be addressed, using the technical literature and the explanations given by YouTube concerning the functioning of their algorithm. In addition, the recent adoption of the COPPA Rule on the platform provides room for legal analysis, to determine whether YouTube has adequately implemented the measures required by the COPPA Rule. The General Data Protection Regulation is another matter of interest, as this is the guiding legal document concerning data protection in the European Union. I will compare the legal situation with both the COPPA Rule and the GDPR in mind to determine whether YouTube appropriately implements the regulations. After the analysis of the legal documents, I will conclude my findings by applying them to the ElsaGate case, in order to determine whether


YouTube has adequately resolved the scandal, or to give recommendations on how to prevent a future scandal from happening.

1.6 Structure/Roadmap

This first chapter of the thesis serves as an introduction, where I sketch the facts of the ElsaGate and how the algorithm proved to be faulty in effectively monitoring the content that was allowed on the website. The second chapter will focus on the algorithm that YouTube currently has in use on its platform, explaining its functioning and the challenges that YouTube faces with it. The chapter will delve deeper into how YouTube determines the list of videos recommended to its users by collecting personal data based on their viewing history, and will address how this data is processed. In addition, the chapter will touch upon the question of how the algorithm did not function as intended and thus made the ElsaGate scandal possible. The third chapter deals with the regulatory framework currently in place for user-generated content services like YouTube; both the European and the American side will be discussed. Apart from the American COPPA, YouTube is also required to adhere to European legislation regarding data protection, namely the General Data Protection Regulation. This chapter will thus set out the regulation relevant to this thesis. Chapter four will elaborate on the third chapter by assessing whether YouTube has appropriately implemented both the European law and the COPPA Rule on its platform. As of January 2020, YouTube implemented the COPPA Rule, which stems from COPPA as enacted by the United States Congress in 1998 and is enforced by the Federal Trade Commission, and which requires additional protection of the privacy and data of minors. Chapter five will serve as the conclusion of this thesis, rounding up the points made and concluding whether another ElsaGate scandal can be prevented from taking place again.

Chapter 2 ‘What are the algorithms that YouTube currently uses, and in what way do they not function as intended?’

2.1 Which algorithms are utilized by the YouTube platform?

YouTube makes use of personalized recommendations for information retrieval, stating that the personalization of these recommendations is a key factor in the gathering of information.37 Personalized recommendations mean that the videos displayed on the main page of YouTube and on the 'browse' tab of the website depend on the viewing behavior of that particular user. The recommendation system tracks the interests of the viewer and recommends videos related to this viewing behavior. When a person is very fond of cat videos, for instance, he or she will be shown more videos that contain cats, because the algorithm has detected that this particular user is interested in watching videos of felines. This

37 Davidson et. al. ‘The YouTube Video Recommendation System’. Accessed July 31, 2019.

recommendation system is implemented by YouTube in order to make navigating the platform as efficient and pleasant as possible for the user, and to provide high-quality videos that align with the interests of the individual user. YouTube aims to translate the viewing of a single video (referred to as 'direct navigation' by Google) into viewing multiple videos on the same topic ('search & goal-oriented browse'), via the implementation of the recommendation algorithm. The goal-oriented browse can be improved with the use of the recommendation system, to ensure that users are satisfied when watching videos on the platform. YouTube reports that this algorithm is very effective, noting that about 60% of the videos clicked on from the YouTube home page derive from the personalized recommendations generated by the algorithm. Because of this, YouTube has put an emphasis on optimizing the recommendation algorithm that it has been using for several years. The next section discusses the functioning of the algorithm in detail.

2.2 How does the recommendation algorithm work? The batch of recommended videos displayed to the user depends on the user's personal activity on the platform. Factors that the algorithm considers are the types of videos watched, the videos that have received a 'like' rating from the user, and the videos that the user has favorited. The latter means that the user has added the video to a specific list on their YouTube account, collecting all their favorite videos in one overview. This is different from liking a video, which merely indicates that the video interested the user but does not fall within their list of favorite videos on the platform. Based on these factors, the recommendation system recommends a set of videos, which are ranked on relevance and diversity. These factors are purposely both coupled and isolated by the system, to understand the interests of the viewer more accurately. This means that YouTube does not base its recommendations solely on the input data it has received, but couples it with other factors, such as the related videos that a user has watched within a specific timespan. The recommendation system incorporates multiple components: (A) Input data; (B) Related videos; (C) The selection of recommendation candidates; (D) Ranking of the recommended videos; (E) The user interface; (F) System implementation.38 These components will each be briefly explained below.

38 Ibid, section 2 and onward.


2.2.1. The input data YouTube gathers personalized data from multiple sources39. Most of the data stems from two main sources: content data and user activity data. The number of views that a video has, along with other general data such as the video's title, likes and dislikes, and description, falls under content data, while activity data focuses on the behavior of the individual watching the video. Activity data is further divided into explicit and implicit categories. Explicit data relates to factors such as the rating that the video has received from the viewer, whether the video is liked or favorited, and whether the user is subscribed to the creator of that video. In addition, YouTube considers the implicit activities, which relate to the watch time of the user. This means that the recommendation algorithm differentiates between users who watch a video for only a short period of time and those who watch large portions of the video, or the entire video. This input data is rather unreliable, however, because it does not indicate the person's satisfaction while watching the video. The fact that someone watched a 20-minute video about how to solve a specific mathematics problem does not have to mean that that person is interested in other, related mathematics subjects. Due to these unreliable factors, the input data is only a small portion of the data considered when recommending videos to individuals.
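To illustrate the distinction described above, the two data sources can be sketched in a few lines of Python. All class and field names are my own illustrative choices and do not reflect YouTube's actual data schema; the 0.8 watch-fraction threshold is likewise a hypothetical stand-in for a 'large portion of the video'.

```python
from dataclasses import dataclass

@dataclass
class ContentData:
    # General data about the video itself (content data).
    title: str
    description: str
    views: int
    likes: int
    dislikes: int

@dataclass
class ActivityData:
    # Explicit signals: deliberate actions by the user.
    rated: bool = False
    favorited: bool = False
    subscribed_to_creator: bool = False
    # Implicit signal: fraction of the video actually watched.
    watch_fraction: float = 0.0

def is_strong_interest_signal(activity: ActivityData) -> bool:
    """An explicit action, or watching most of the video, counts as a
    strong (though, as noted above, still noisy) interest signal."""
    return (activity.rated or activity.favorited
            or activity.watch_fraction > 0.8)
```

The sketch only captures the taxonomy; as the section notes, even a "strong" signal such as a completed watch does not guarantee viewer satisfaction.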

2.2.2. Related videos YouTube aims to recommend videos that are related to the interests of the individual and calculates this via a specific (simplified) formula.40 To put this formula into words: the algorithm takes the video that the user is currently watching (the 'seed video') and recommends videos using techniques such as 'association rule mining' and 'co-visitation counts'. Association rule mining is a type of machine learning in which an algorithm is programmed to find patterns in data, by searching for dimensions that either occur simultaneously or are interrelated.41 Co-visitation counts refer to the number of times that specific videos are watched after one another within a time span of 24 hours.42 If the algorithm finds, for instance, that many viewers who have watched cat video A also watch cat videos B, D and E within 24 hours, then those videos will be ranked higher in the recommendations than cat video C, which has enjoyed considerably less co-visitation than the other cat videos. The more often videos have been co-watched and the more data patterns they share, the more related they are to each other, and thus the more likely they are to be recommended to the individual based on their seed video. These two techniques are the foundation for analyzing related videos on YouTube, as calculated by the aforementioned formula.
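The co-visitation technique described above can be made concrete with a short Python sketch. The implementation and the input format are my own simplification and should not be read as YouTube's actual code: `sessions` is assumed to be a list of video-ID lists, one per user per 24-hour window.

```python
from collections import Counter
from itertools import combinations

def co_visitation_counts(sessions):
    """Count, for each pair of videos, how often both were watched by
    the same user within one 24-hour session."""
    counts = Counter()
    for watched in sessions:
        for a, b in combinations(set(watched), 2):
            counts[frozenset((a, b))] += 1
    return counts

def related(seed, counts, top_n=3):
    """Rank candidate videos by their co-visitation count with the seed."""
    scores = {next(iter(pair - {seed})): n
              for pair, n in counts.items() if seed in pair}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

With sessions in which cat video A is co-watched with B, D and E more often than with C, `related("A", counts)` returns B, D and E and omits C, mirroring the example in the text.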

39 Ibid, p. 294.
40 The formula can be found on p. 294 of the article by Davidson et. al.
41 Remanan, Surya. 'Association Rule Mining'. Accessed July 31, 2019.
42 Stitelman et. al. 'Using Co-Visitation Networks for Detecting Large Scale Online Display Advertising Exchange Fraud'. P. 1244. Retrieved August 1, 2019.


2.2.3. The selection of recommendation candidates In order to select the videos to recommend to the individual, the algorithm combines the input data and the related-videos data explained above, creating an overview of recommended videos. This overview is thus a selection of videos that were either 'liked' or 'favorited' by the user, or that were (co-)watched within a time span of 24 hours. These videos combined form the so-called 'seed set', stemming from the 'seed video' mentioned before. To form an appropriate selection of recommendations from this seed set, the algorithm checks the related videos of each individual video in the seed set, to determine whether those related videos are appropriately placed in the user's recommendations. By gathering an initial selection of videos related to the user's first viewed video (the 'seed video') and then making a re-selection based on the related videos of all the individual videos in the seed set, the algorithm can recommend videos that match the interests of the viewer much more accurately. One of the drawbacks of this system is that, because the refreshing of the seed set is based on the related videos of each of the individual videos in the seed set, the final list of recommended videos will be highly similar. This might lead to the viewer not finding any content that is fundamentally 'new'. YouTube aims to prevent this by limiting the number of identical factors between videos, remaining true to the seed video while at the same time providing recommendations that are new and diverse.
2.2.4. Ranking When the selection of recommendation candidates is made, this set of videos is ranked for relevance to the user's interests by considering several factors43.
One of these factors is the quality of the video, based on the number of views the video has and the number of like and favorite ratings it has received, along with other factors such as the upload time of the video and the activity around the video (how many times it has been shared via media). Another factor the algorithm considers is the specificity to the user: the algorithm ranks videos higher when they are closely related to the preferences of the viewer. To determine these preferences, the algorithm analyzes the seed video and checks the watch history. A third factor is the diversity of the recommendation list. Because YouTube is only able to showcase a limited number of recommended videos, ranging from a mere four to as many as sixty, the algorithm favors videos that are diverse in their category over videos that would typically be the most relevant. If a user has primarily watched videos of cats but has also shown interest in videos of dogs, the algorithm opts to also incorporate dog videos, and even videos of other animals, based on the user's watch history: the algorithm has detected that this user has an interest in animal videos, not only in cats.

43 Ibid, p. 295.
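The interplay between relevance and diversity in the ranking step described above can be sketched as follows. The scoring formula (quality multiplied by specificity) and the cap of two videos per category are hypothetical stand-ins of my own, not YouTube's actual weights.

```python
def rank_candidates(candidates, max_per_category=2, slots=4):
    """Toy ranking pass: sort candidates by a relevance score, then
    enforce diversity by capping how many videos of the same category
    make the final list."""
    ranked = sorted(candidates,
                    key=lambda v: v["quality"] * v["specificity"],
                    reverse=True)
    result, per_cat = [], {}
    for v in ranked:
        cat = v["category"]
        if per_cat.get(cat, 0) < max_per_category:
            result.append(v["id"])
            per_cat[cat] = per_cat.get(cat, 0) + 1
        if len(result) == slots:
            break
    return result
```

Given three high-scoring cat videos and lower-scoring dog and bird videos, the cap pushes a dog and a bird video into the final list, illustrating how diversity can outrank raw relevance.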


2.2.5. User interface The user interface of YouTube is specifically tailored to improve the experience of its users44. All videos chosen by YouTube appear under a specific tab on the website called 'recommended for you', displaying information relevant to the video: the thumbnail is shown, along with the number of views, the day the video was uploaded, and more.

2.2.6. System implementation Rather than opting for an on-demand method of displaying recommended videos to its users, YouTube chose a batch-oriented pre-computation approach. This is an approach in which a computer executes all tasks it has been instructed to perform, in this case the gathering and processing of the data explained in the sections above, without any human intervention.45 The official patent description states the following: 'Each batch re-computation order instructs the computation platform to compute a plurality of database query results. The computation platform processes a batch re-computation order by computing the plurality of database query results according to a batch processing schedule within a given time frame. It returns the computed database query results resulting from the batch re-computation order to the search platform.'46 This approach was taken to efficiently manage resources given the large amounts of data, while maintaining low latency in refreshing and displaying the recommended videos. The downside, however, is that there is a greater delay in serving new batches of recommended videos than with the on-demand method, because of the pre-computation element. To remedy this as much as possible, YouTube updates its data sets several times per day to prevent long delays from occurring.
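The trade-off between on-demand computation and batch pre-computation described above can be sketched as follows. The pipeline stand-in and all names are illustrative assumptions of mine, not YouTube's implementation.

```python
def compute_recommendations(user_history):
    # Stand-in for the expensive pipeline of sections 2.2.1-2.2.4.
    return sorted(set(user_history))[:3]

class BatchRecommender:
    """Recomputes every user's list on a schedule; serving is then a
    cheap dictionary lookup, at the cost of staleness between runs."""

    def __init__(self):
        self._cache = {}

    def run_batch(self, all_histories):
        # Executed a few times per day, without human intervention.
        self._cache = {user: compute_recommendations(history)
                       for user, history in all_histories.items()}

    def serve(self, user):
        # Low-latency read; may be stale until the next batch run.
        return self._cache.get(user, [])
```

The `serve` call never runs the expensive pipeline itself, which is why latency stays low but freshly uploaded videos only appear after the next batch run, exactly the delay the section describes.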

2.2.7. Subconclusion The steps described above can be summarized in three main steps. First, YouTube collects data from the individual, based on their viewing habits. After this data has been gathered, the algorithm creates a batch of recommended videos based on it, in the manner explained above. Finally, these recommended videos are displayed to the user via the pre-computed batch method.

2.3 The challenges of the personalized recommendation system used by YouTube YouTube acknowledges that it faces several challenges because of the recommendation algorithm. Firstly, the videos available on the platform lack metadata: there are few descriptive elements that define what the videos are about, or data related to

44 Ibid, p. 295.
45 'Batch Processing' definition.
46 More on the patent can be read here: 'Database system using batch-oriented computation'. <https://patents.google.com/patent/WO2013160721A1/en>

the subject/purpose of the video. Streaming services that offer series and films gather far more metadata, based on what a user watches. Those videos are much longer and contain more information (the title, genre, actors, etcetera). Compare this to YouTube, where much of the available content is very short, under ten minutes in total. To visualize this problem, consider cats again as an example. Only a very small percentage of users want to watch a video about the same cat for over ten minutes, unless it is their own cat. Exceptions to this are compilations, in which several short clips from different video makers are combined into one video, featuring numerous different cat videos in one. Generally, however, according to the data extracted by YouTube47, the videos are rather short. Because these interactions with the videos are short and do not provide the algorithm with much information regarding the viewing experience of the user, it becomes harder for the recommendation system to recommend videos that align with the user's interests. Another phenomenon that YouTube must consider are current trends, often referred to as 'hypes' or 'viral videos'. Examples of hypes on YouTube are the 'Harlem Shake'48 phenomenon and the 'Ice Bucket Challenge' of several years ago, in which a person would throw a bucket of ice water over their body to raise awareness of the disease ALS. These hypes typically last for a couple of weeks and are then succeeded by another hype. This means that even though a user has watched numerous Harlem Shake videos during a brief period, this is not a reliable source for extracting the interests of the viewer. When the hype is over, the person may no longer be interested in the Harlem Shake, and leaves the website dissatisfied because he or she is only recommended Harlem Shake videos.
YouTube aims to prevent this, but due to the lack of available metadata it remains a challenge to implement the recommendation system correctly, so the algorithm does not function properly in special cases such as viral videos.

2.4 How does the algorithm not function as intended? The subsections above have explained how the recommendation algorithm works and what challenges YouTube faces in implementing it, which leads to the algorithm not functioning as intended. The challenges acknowledged by YouTube are not, however, the only things to look out for, as the ElsaGate controversy has demonstrated. Because of the recommendation system in place on YouTube, users receive batches of videos related to their watch history. This reliance on a user's watch history creates an issue when the initial video watched by the user contains shocking content, as was the case with the ElsaGate-themed videos. Children on the YouTube Kids app would click on a video, not knowing that its content would be harmful to them. These videos were all animated cartoons with flashy colors and figures such as Elsa, Spiderman and Peppa Pig engaging in all sorts of inappropriate activities. As soon as a handful of these videos had been watched by kids, the algorithm picked up on this data and 'fed' these users related animated videos featuring Peppa Pig, Elsa and Spiderman. Due to all these videos

47 Ibid, p. 293.
48 The 'Harlem Shake' was an internet gag in the form of a video, where a user would dance to the song 'Harlem Shake' by American DJ Baauer in a ludicrous manner.

being so related to each other, the recommendation algorithm created a situation in which these videos were all recommended alongside each other in a loop, leading to these ElsaGate-themed videos generating millions, and sometimes even billions, of views49. Kids' videos have become among the most watched videos on the entire platform. Because of the increasing popularity of these videos, other creators on the YouTube platform made the same type of videos to make kids click on their content as well; thus the recommendation loop only got worse, and at one point the recommendation system failed to recommend any material other than these animated videos, which contained shocking content such as decapitation, rape and violence. The algorithm thus functioned as it was programmed to, according to the recommendation-algorithm article published by YouTube: it recommends videos based on the viewing behavior of the individual. With the introduction of YouTube Kids, however, the algorithm was faced with a whole new audience: toddlers that click on everything they can lay their fingers on. On the YouTube Kids app, toddlers clicked on all videos in sight and within reach, and after watching a video for a few seconds they clicked on another video in their direct line of sight, thereby feeding the algorithm the information that they liked these types of videos (they had, after all, clicked on one of the videos recommended to them). The algorithm failed to detect the shocking material present in many of these animated videos. Because millions of videos are watched every day, it is impossible for the human eye to track all videos manually; the algorithm was thus in charge of monitoring the content, and of continuing to supply users with content based on their individual preferences.
Because the algorithm did not detect the shocking content, these videos were all recommended to kids using the YouTube Kids app, as the algorithm is built upon techniques such as 'association rule mining' and 'co-visitation counts', described earlier in this chapter. Thus far, it has been discussed which algorithm is in place on the platform, and what challenges YouTube faces in making sure that the video recommendation algorithm functions as intended. The next chapter discusses the current regulation concerning user-generated content services such as YouTube, to provide an overview of the legislation currently in place to prevent situations such as the ElsaGate controversy from happening.
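The recommendation loop described above can be made concrete with a toy example: once the related-videos graph links a cluster of videos only to each other, following the top recommendation never escapes the cluster. The graph below is entirely hypothetical and only illustrates the mechanism.

```python
def follow_recommendations(related, start, steps):
    """Follow the top recommendation repeatedly, starting from one
    video, and record the path taken."""
    path, current = [start], start
    for _ in range(steps):
        current = related[current][0]  # always click the top recommendation
        path.append(current)
    return path

# Illustrative related-videos graph: the three themed videos only
# recommend each other, forming a closed loop.
related = {
    "elsa_1": ["spiderman_1"],
    "spiderman_1": ["peppa_1"],
    "peppa_1": ["elsa_1"],        # the cluster links back to itself
    "nursery_rhyme": ["elsa_1"],  # one click in, and the loop begins
}
path = follow_recommendations(related, "nursery_rhyme", 6)
```

After a single click away from the harmless starting video, the walk cycles through the cluster indefinitely and never returns to content outside it, which is the loop the chapter describes.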

Chapter 3 ‘What is the impact of COPPA and the GDPR in relation to YouTube and how are these legal instruments perceived in legal literature?’ In order to answer the central question of this thesis, namely how the COPPA Rule and the GDPR can tackle the issue of unwanted and shocking content on user-generated content platforms such as YouTube, an overview must be made of the regulation already available on the matter. This chapter will begin by covering the Children’s Online Privacy

49 Lafrance, Adrienne. ‘The Algorithm that makes preschoolers obsessed with YouTube’. Accessed August 2, 2019, from < https://www.theatlantic.com/technology/archive/2017/07/what-youtube-reveals-about-the-toddler- mind/534765/>


Protection Act (COPPA)50. COPPA is a federal law enacted by the United States Congress and monitored by the United States Federal Trade Commission (FTC). The FTC is required by COPPA to safeguard the privacy of children online, and to provide parents with more control over the collection of data from children under the age of thirteen.51 The official regulation is called the COPPA Rule, the revised version of which has been in effect since July 1, 2013. The COPPA Rule has been implemented on the YouTube platform as of January 2020. The COPPA Rule protects the privacy of American children only, based on the definition of an 'operator' under the COPPA Rule.52 The definition specifies that foreign-based websites and online services must comply with COPPA if they collect personal data of minors in the United States: if a foreign-based operator has any commercial activity with the United States or any of its territories, it is bound by the COPPA Rule. Unlike the United States, the European Union has no specific law concerning the data processing of minors. It does have a general regulation concerning data protection, the General Data Protection Regulation (GDPR), which also governs the privacy and data protection of minors. Both the COPPA Rule for the US and the GDPR for the EU will be analyzed in this chapter, to provide an overview of what these legal instruments require for appropriate compliance. The fourth chapter will subsequently check whether both instruments are applied on the platform.

3.1 US regulation on the privacy and data protection of minors 3.1.1. The Children’s Online Privacy Protection Act The Children’s Online Privacy Protection Act is a US-based law, enacted by the United States Congress and monitored by the Federal Trade Commission.53 The Rule aims to protect the privacy and safety of children in the online environment. This safeguarding includes the prohibition of unnecessary and unauthorized collection of children's personal information by any operator of an Internet website or an online service.54 3.1.2. The required compliance for commercial websites and online services

50 Federal Trade Commission, Children’s Online Privacy Protection Rule (“Coppa”). Accessed January 6, 2020, from <https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule>.
51 Federal Trade Commission, ‘Children’s Online Privacy Proposed Rule Issued by FTC’. Accessed July 3, 2020, from <https://www.ftc.gov/news-events/press-releases/1999/04/childrens-online-privacy-proposed-rule-issued-ftc>.
52 COPPA “Definitions”. Accessed July 3, 2020, from <https://www.ecfr.gov/cgi-bin/text-idx?SID=4939e77c77a1a1a08c1cbf905fc4b409&node=16%3A1.0.1.3.36&rgn=div5#se16.1.312_12>
53 COPPA – Children’s Online Privacy Protection Act. Accessed January 8, 2020.
54 Electronic Code of Federal Regulations, ‘Children’s Online Privacy Protection Rule’. Accessed April 7, 2020.


All commercial websites and online service providers bound by COPPA are obligated to abide by the following requirements55:
A) Create an online privacy policy that is clear and easy to understand, detailing how personal information is collected;
B) Provide all parents and guardians with a notice in the situation that personal information is collected from their children. Consent must be acquired prior to collecting the information;
C) Parents must be given the opportunity to give consent for the collection of personal data, while preventing the website and/or service from disclosing this data to any third party. Disclosure is only allowed when it is integral to the functioning of the service, albeit that this must be made known to the parents;
D) Offer parents an overview of the data that is collected, along with the possibility to delete any information gathered;
E) Enable parents to prevent the use and/or online collection of any personal information of the child;
F) Maintain appropriate security, confidentiality and integrity of all data collected from children. Reasonable steps must also be taken to only share this collected information with third parties that can guarantee the same safeguards of confidentiality and security;
G) The collected data can only be stored by the operator for as long as is necessary to fulfill the purpose of the collection. When this purpose has been fulfilled, the collected personal data shall be deleted using reasonable measures.
The COPPA Rule is applicable to all operators of online services and commercial websites that collect, use or disclose personal data from children under the age of thirteen. Personal data under the COPPA Rule is any type of data that can be used to identify a person, ranging from a first and last name to geolocation data that can help identify the residence of a child.

3.1.3. The relevance of the COPPA Rule for YouTube Although COPPA is an American law, YouTube has implemented it worldwide as of January 2020.56 This means that YouTube content creators from countries in the European Union, Asia and all other regions must adhere to the COPPA Rule, and are required to designate whether their videos are directed to children. If a video is designated as being for children, it falls under COPPA's scope. The reasoning behind the decision to apply the COPPA Rule on YouTube worldwide can be explained by looking at the scope of the Rule. In paragraph 312.2 of the COPPA Rule, an ‘operator’ is defined as “any person who operates a Web site located on the Internet or an online service and who collects or maintains personal information from or about the users of or visitors to such Web site or online service, or on whose behalf such information is collected or maintained, or

55 Federal Trade Commission, ‘Complying with COPPA: Frequently Asked Questions’. Accessed January 8, 2020, from <https://www.ftc.gov/tips-advice/business-center/guidance/complying-coppa-frequently-asked-questions>.
56 YouTube Official Blog, ‘An update on kids and data protection on YouTube’. 2019. Accessed July 24, 2020.

offers products or services for sale through that Web site or online service, where such Web site or online service is operated for commercial purposes involving commerce among the several States or with 1 or more foreign nations; in any territory of the United States or in the District of Columbia, or between any such territory and another such territory or any State or foreign nation; or between the District of Columbia and any State, territory, or foreign nation. Personal information is collected or maintained on behalf of an operator when: (1) It is collected or maintained by an agent or service provider of the operator; or (2) The operator benefits by allowing another person to collect personal information directly from users of such Web site or online service.” 57 Following this definition, YouTube is considered an operator, as it is an online service which collects and maintains personal information from its users for commercial purposes in both the United States and in foreign countries, and thus falls under the scope of COPPA. The FTC specifies that it does not matter whether a content creator who uploads videos on YouTube is from outside the US; COPPA applies in the same way as if that content creator had its own website58. Depending on the nature of the content, the YouTube channel of a content creator can be defined as an ‘operator’, which is consequently covered by COPPA. If the content is ‘made for kids’ and that content creator, or someone on behalf of the content creator such as an advertising network, collects data from the viewers of that content, then that channel is covered by the COPPA Rule. What it means to be covered by the COPPA Rule, and when content is ‘made for kids’, will be discussed later, in chapter four.
The decision to make the COPPA requirements apply to every content creator prevents circumvention of the COPPA Rule by profiles operating outside the US, which would otherwise not be covered by COPPA's scope. YouTube did not implement the COPPA Rule, which had already been in effect since April 2000, until January 2020, and only after allegations were made by the FTC that YouTube illegally collected the personal data of children without parental consent.59 This resulted in a case between the FTC and YouTube/Google, which was settled for $170 million. The case will be discussed in the next paragraph.

3.2. The case between the FTC and YouTube The Federal Trade Commission is an independent body of the United States Government, able to initiate federal court proceedings in the case of any violations of the COPPA Rule. Together with the People of the State of New York, represented by Attorney General Letitia

57 COPPA Rule, paragraph 312.2. Accessed July 3, 2020, from <https://www.ecfr.gov/cgi-bin/text-idx?SID=4939e77c77a1a1a08c1cbf905fc4b409&node=16%3A1.0.1.3.36&rgn=div5#se16.1.312_12>.
58 Federal Trade Commission, “YouTube channel owners: Is your content directed to children?”. “How Coppa Applies to Channel Owners”. Accessed July 3, 2020.
59 Federal Trade Commission, ‘Google and YouTube will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law’. Accessed July 3, 2020.


James, a complaint was filed against the defendants, Google LLC and YouTube LLC, for violating the COPPA Rule.60 The case facts state that the YouTube platform falls under the scope of the COPPA Rule, because it collects personal information of minors for advertising purposes and identifies the personal behavior of its users on behalf of commercial entities advertising via the platform. The FTC claims that YouTube has knowledge of this collection of personal information via channels that are focused on creating content for children.61 In addition to collecting this data, YouTube failed to provide parents and/or guardians with a notice that the personal information of their children was being collected. YouTube actively promotes its sub-section of the platform, YouTube Kids, to potential advertisers.62 The platform markets itself as the ‘favorite website for kids aged 2-12’. While YouTube categorizes content into different age groups, with ratings such as ‘Y’ for ages 0-7, ‘PG’ for ages 10 and older and ‘X’ for those older than 18, it does not treat the collection of data from these videos any differently. This means that videos with a ‘Y’ rating still display behavioral advertisements, even though these videos are intended for an age group that explicitly falls under the scope of the COPPA Rule. Children-oriented content in general was treated identically to the other content available on the platform. Apart from assigning content ratings to videos, YouTube hosts several online profiles which are directed to children. The videos uploaded by these profiles are manually curated by the platform to also appear on the YouTube Kids app, and the channels in question actively promote themselves as being made for kids in the ‘About’ section of the channel, a component of a YouTube profile which serves as a biography of the channel and its creator.
Furthermore, these videos include the use of animated characters and children playing with toys, or other activities which are, according to the FTC, deemed ‘children-oriented activities’. Examples of these children-specific channels include a channel displaying kids’ television shows such as the Powerpuff Girls, and the Hasbro channel, which hosts programs such as My Little Pony, Play-Doh Town, and many more. The owner of the Hasbro channel specifically indicates that the general demographic of My Little Pony is children between 5 and 8 years old.63 Other channels, such as Masha and the Bear, display animated videos of Masha and her bear friend; the “about” section of that channel states that the show is intended for children and parents alike, and sets the target audience at children aged 3-9 years old. The FTC discovered that YouTube earned close to $50 million in behavioral advertising from channels such as the ones described above, stating that these represent only a part of the content on the platform designed specifically for kids. The FTC concludes that YouTube has knowledge of the collection of personal information via behavioral identifiers, gathered through content ratings, communications with children-oriented channel owners emphasizing that

60 Federal Trade Commission & People of the State of New York vs. Google LLC & YouTube LLC. Accessed April 7, 2020, from 61 Ibid, ‘Overview’ 62 Federal Trade Commission & People of the State of New York vs. Google LLC & YouTube LLC. Revised, ‘Exhibits A-C’. Accessed April 7, 2020, from

63 Ibid, P. 11.

their content is directed towards children under the age of 13, and the manual content curation done by YouTube for videos that consequently appear on the YouTube Kids app. Based on the conclusion described above, the FTC determined that YouTube and Google violated the COPPA Rule, as they are ‘operators’ as described by the Rule and collect personal information of children under the age of 13. Additionally, the platform did not notify parents and legal guardians of the data collected, nor did it receive consent for collecting the data in question. The FTC requested injunctive relief from the Court, and stated that any collection of data of children under the age of 13 violates the COPPA Rule. As a result of the $170 million settlement between the FTC and Google, it was agreed that YouTube had to create a mechanism requiring YouTube content creators to designate which of the videos they upload are ‘made for kids’64. A video is not catered to kids merely because a few kids under the age of thirteen happen to watch it; the situation is different if the intended audience of the video is specifically kids under the age of thirteen. In the latter case the content is considered “made for kids”. What YouTube has done to comply with the COPPA Rule following the settlement with the FTC will be analyzed in chapter four of this thesis.

3.3. COPPA's reception in legal literature
Although the FTC has filed numerous cases against operators that failed to comply with the COPPA Rule, such as the one against YouTube, COPPA is perceived negatively in legal literature. A study of 162 children's websites found that only four fully complied with all the requirements set out by COPPA, whilst 109 websites collected personal information of minors65. 67 percent of the websites that collected personal information made no effort to obtain permission from parents to do so. The study calls upon the FTC to enforce the COPPA requirements more proactively, as the current situation leads to most websites either not complying at all, or engaging in minimal compliance. Another study found that most websites simply ban users under the age of twelve66. This is, however, an ineffective way of dealing with the matter, as it simply encourages minors to fill in a fake age. Given the huge increase in young users on the internet, the article by Matecki calls for a complete overhaul of COPPA, requiring more regulation concerning the collection of personal information by operators. A recent analysis of the 5,855 most popular Android apps for children found that a lot of

64 Federal Trade Commission, 'YouTube channel owners: Is your content directed to children?'. Accessed July 3, 2020.
65 Cai, Xiaomei et al. 'Children's Website Adherence to the FTC's Online Privacy Protection Rule'. 2010. Accessed August 3, 2020.
66 Matecki, Lauren A. 'COPPA is Ineffective Legislation! Next Steps for Protecting Youth Privacy Rights in the Social Networking Era'. 2010. Accessed August 3, 2020.

smartphone apps are in violation of COPPA, due to the software development tools used to create the apps, which enable tracking and behavioral advertising67.

Based on these analyses, COPPA is not held in high regard. The consensus of the studies is that COPPA requires more active enforcement in order to protect minors against unlawful data collection, as currently most operators implement the COPPA Rule only partly, if they do so at all. The application of the COPPA Rule on YouTube, however, is done in conjunction with the FTC. This means that the FTC actively monitored YouTube to ensure that a proper mechanism was put in place to combat the illegal collection of personal data. YouTube was able to collect data from children up until the final decision by the Court, which forced YouTube to change its approach. As the studies illustrate, however, operators will not implement COPPA on their platform independently; thus, according to legal literature, stricter enforcement by the FTC is required. Chapter four of this thesis will discuss in depth how YouTube has applied the COPPA Rule on its platform, to see whether this tailored approach by the FTC and YouTube is enough to prevent the collection of personal data of minors on YouTube.

3.4. The European situation
The European Union has multiple instruments that concern privacy and the collection of data, such as the ePrivacy Directive and the Charter of Fundamental Rights of the European Union. The regulation most relevant for this thesis is the General Data Protection Regulation, as it is the uniform law concerning the processing of personal data in the EU and of EU citizens, and it addresses the transfer of data to countries outside of the European Union68. It governs data processing by international companies, which makes it interesting in the case of YouTube, as YouTube also operates outside of the EU and collects data from its European viewers. The legal text of the GDPR will be discussed in this chapter; chapter four will deal with the implementation of the GDPR on YouTube.

3.4.1. General Data Protection Regulation (GDPR)
The General Data Protection Regulation is built upon the foundation of Article 8(1) of the Charter of Fundamental Rights of the European Union and Article 16(1) of the Treaty on the Functioning of the European Union, as the protection of natural persons in relation to the processing of personal data is a fundamental right.69 The GDPR therefore functions as the harmonizing body within the European Union on the subject of the processing of personal data, and is the go-to legal document concerning the protection of personal data of minors. Personal data is defined by the GDPR as: "Any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an

67 Reyes, Irwin et al. 'Won't Somebody Think of the Children? Examining COPPA Compliance at Scale'. 2018. Accessed August 3, 2020.
68 General Data Protection Regulation, Article 3. Accessed July 3, 2020, from <https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32016R0679>.
69 Regulation 2016/679 (General Data Protection Regulation). Preamble. Accessed April 7, 2020, from <https://eur-lex.europa.eu/eli/reg/2016/679/oj>.

identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person".70 Processing of data is defined as "Any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction".71

As per the territorial scope of the GDPR, the regulation applies to all forms of personal data processing in connection with any professional or commercial activity, by a controller or processor who processes data of a data subject in the European Union72. This means that a data processor or controller from outside the European Union, with a commercial or professional goal, that offers goods or services to, or monitors the behavior of, data subjects inside the EU, is covered by the GDPR. Content data, activity data and all other types of personalized data collected by YouTube from a European data subject must therefore be processed in compliance with the GDPR. If a company from the US were to process data from a US data subject, it would not be covered by the GDPR. Any processing of data must be in accordance with the regulation, and failure to do so can result in fines of up to 20 million euros, or 4% of the annual worldwide turnover if that amount surpasses the fixed sum of 20 million73.

In paragraph 38 of the preamble, the regulation acknowledges the special protection for minors concerning the processing of personal data, as children are less aware of the risks and consequences that flow from the processing of data. Any processing of data for the purpose of marketing or personal profiling in particular falls within the scope of the GDPR's protection.
If processing of personal data of a child takes place, any form of communication and information must be provided in plain language that the child is able to understand.74 Children enjoy the same rights as adults, with extra safeguards in place. These rights are the following:

A) The right to be informed. Individuals must be informed about the collection and use of their personal data75. This information includes, but is not limited to, the purposes of processing, how long the data is kept, and with whom the data is shared. The information about processing must be given in a transparent and comprehensible way, so that any person is able to understand what their data is used for. Children are provided with extra protection, so the language used must be intelligible to minors if they are subject to data processing.

B) The right of access. Individuals have the right to access their personal data76. Data controllers must provide a copy of the data processed when requested by the data

70 Ibid, Article 4(1).
71 Ibid, Article 4(2).
72 General Data Protection Regulation, Article 3. Accessed July 3, 2020, from <https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32016R0679>.
73 Ibid, Article 83(5).
74 Ibid, preamble, paragraph 58.
75 Ibid, Articles 13 & 14.
76 Ibid, Article 15.


subject. The aim of this right is to provide individuals with insight into how and why their data are being used, and to assess the lawfulness thereof.

C) The right to rectification. In the case that any information about an individual is incorrect or incomplete, the person can have this information rectified.77

D) The right to erasure. Also referred to as the 'right to be forgotten'. Individuals are entitled to have their personal data deleted by the data controller upon request78. Erasure is an obligation for the data controller when consent has been withdrawn, when the information stored about the individual is no longer necessary, or when the information was unlawfully processed in the first place. This right is particularly important for children, in the case that an individual gave consent when he or she was still a child79. As an adult, that individual can still withdraw the consent at any time.

E) The right to restrict processing. Individuals can restrict the processing of personal data, so that organizations are limited in their ability to use the data80. Individuals must have a reason for the restriction of data processing. The right to restriction of processing is seen as an alternative to the right to erasure.

F) The right to data portability. Any individual is able to acquire and reuse the personal data gathered by the data controller, to transmit the data to another controller or to use it for other personal purposes81. Data portability offers individuals ease of access in moving or transferring personal data from one place to another.

G) The right to object. Individuals are able to object to the processing of their personal data.82 If the data is used for direct marketing purposes, processing must be stopped indefinitely upon objection. This objection also applies to information processed for scientific or historical purposes, unless the research is of general public interest. Upon receiving the objection of the individual, the data controller must stop the processing immediately.

H) Rights in relation to automated decision-making and profiling. Automated decision-making refers to any decision-making conducted without human involvement. An example of this is an aptitude test for recruitment purposes, built upon pre-programmed algorithms and criteria.83 Individuals have the right to object

77 Ibid, Article 16.
78 Ibid, Article 17.
79 Ibid, preamble, paragraph 65.
80 Ibid, Article 18.
81 Ibid, Article 20.
82 Ibid, Article 21.
83 Information Commissioner's Office, Rights related to automated decision making including profiling. Accessed April 8, 2020, from <https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/rights-related-to-automated-decision-making-including-profiling/>.


to this form of decision-making, obligating companies not to base their decision solely on automated decision-making and profiling algorithms if the automated system alone would produce legal or other significant effects for the individual in question84. In chapter four of this thesis, it will be assessed whether YouTube has implemented these rights accordingly in its privacy policy.

Next to these universal rights, children are given extra protection via Article 8 of the GDPR. This article sets out additional requirements for the consent ground of Article 6(1)(a) GDPR if the data subject is under the age of 16. Article 6(1)(a) states the following: "Processing shall be lawful only if and to the extent that at least one of the following applies: (a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes."85 Article 8 GDPR supplements this requirement by stating that processing of data by information society services on the basis of consent is only lawful if the data subject is at least 16 years old. If the individual is under the age of 16, processing is only lawful if the data controller can verify that consent has been given by a parent or guardian. Data controllers are required to make reasonable efforts to verify that consent has been given.

Although recital 38 and Article 8 of the GDPR indicate that special provisions have been put in place to provide children with extra protection, in reality the added protection is lackluster. None of the articles in the GDPR address the question of how children can seek redress when their privacy has been infringed or how they can invoke one of the rights established in the GDPR, and no specific guidelines are given as to how parental consent can be verified86. Another point to consider is the awareness that children have concerning the implications of data processing, and the consequences it has on their online experience.
Children are often the first to use new technologies and explore the digital environment, and without specific protection in place, they are at risk of providing sensitive data to a data processor. At the same time, parents are often less informed about the internet than their children are, making it difficult for both parent and child to manage the processing of data online. While Article 57(1)(b) of the GDPR lays down that every supervisory authority shall promote public awareness and understanding of the risks, rules, safeguards and rights related to processing, specifically for children, the difference in the level of 'digital literacy' between parents and children opens up issues concerning both age verification and parental consent87. A child can rather easily circumvent age restrictions, and parents are often not aware of the data that their children are sharing. These are concerns that should be addressed by the GDPR, but the regulation fails to be specific enough towards data processors and controllers, leaving the text of many articles, particularly Article 8, open to interpretation of terms such as 'reasonable efforts' or when consent is 'given or authorized'. Scholars have found that, as with the COPPA Rule, children in the EU are still subjected to data tracking and personal

84 GDPR, Article 22.
85 Ibid, Article 6(1)(a).
86 Livingstone, Sonia. 'Children: a special case for privacy?'. 2018. Accessed August 3, 2020, from <http://eprints.lse.ac.uk/89706/1/Livingstone_Children-a-special-case-for-privacy_Published.pdf>.
87 Ibid, p. 19.

advertisements by third-party companies88. Another point to consider is the lack of harmonization when it comes to the digital age of consent: Article 8 allows Member States to set this age anywhere from 13 to 16 years old. This is a vast difference in age when considering the level of comprehension that the child has as to what is being done with their data. In Belgium and the United Kingdom, a child can consent to the collection of their data when they reach the age of thirteen, while in the Netherlands this age is set at 16. Instead of giving Member States the freedom to set an age to their preference, the age should be fixed depending on the capacity of the child to understand and consciously accept privacy policies and how data is collected89. Furthermore, the GDPR's lack of verifiable age and consent provisions makes it an instrument which leaves more to be desired when it comes to the protection of minors.
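The consent mechanism of Article 8 discussed above can be sketched as a simple decision rule. The following Python fragment is illustrative only: the country codes and function name are assumptions, the ages are those named in the text, and 16 is used as the default that applies absent a national derogation.

```python
# Illustrative sketch (not legal advice) of the Article 8 GDPR consent
# rule for information society services. Names are hypothetical.
DIGITAL_AGE_OF_CONSENT = {
    "BE": 13,  # Belgium
    "UK": 13,  # United Kingdom
    "NL": 16,  # Netherlands
}

def consent_based_processing_lawful(age, member_state, parental_consent_verified):
    """A child can consent once they reach the member state's digital age
    of consent; below it, verified parental consent is required."""
    threshold = DIGITAL_AGE_OF_CONSENT.get(member_state, 16)  # GDPR default
    if age >= threshold:
        return True
    return parental_consent_verified
```

The sketch makes the fragmentation criticized above visible: the same 14-year-old can consent on their own behalf in Belgium but not in the Netherlands.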

Chapter 4: 'In what way has YouTube implemented the relevant US and EU regulations, and is it enough to prevent another ElsaGate?'
In the previous chapter the legal frameworks of both the United States and Europe were outlined via the COPPA Rule and the GDPR. In this chapter it will be assessed whether YouTube has implemented the measures necessary to comply with these legal instruments, starting with an overview of the policies in place to adhere to the GDPR in Europe: the terms of service, the privacy policy and the YouTube Kids privacy notice. These policies have all been explicitly tailored to the European Economic Area and Switzerland after the GDPR entered into application in 2018. After this overview, the attention will shift towards the COPPA Rule, explaining how YouTube has applied the COPPA Rule on the platform, as well as how the Rule affects potential future ElsaGates. In the subsequent sections, both instruments will be analyzed together, to illustrate the relationship between COPPA and the GDPR on YouTube, alongside the difference in compliance between both instruments on the platform. Finally, a conclusion will be drawn based on the research done in this chapter, to recap what YouTube has done, and whether it can prevent another ElsaGate from happening in the future.

4.1.1. Terms of Service YouTube
YouTube has a unique terms of service policy for the European Economic Area (EEA) and Switzerland.90 These terms of service were updated as of July 22, 2019, which should mean that the General Data Protection Regulation is complied with on the basis of the

88 Vlajic et al. 'Online Tracking of Kids and Teens by Means of Invisible Images: COPPA vs. GDPR'. 2018. Accessed August 4, 2020.
89 Mc Cullagh, Karen. 'The General data protection regulation: a partial success for children on social networking sites?'. 2016. Accessed August 3, 2020, from <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2985724>.
90 YouTube Terms of Service. Accessed April 9, 2020, from <https://www.youtube.com/t/terms?preview=20191210#summary>.

measures taken in this document. YouTube has set the minimum age for using the service at 16 years old, or 13 years old if a parent or guardian has given consent via the Google 'Family Link' app. Children under the age of 13 are able to watch the YouTube Kids app without any restrictions, provided that a parent or guardian has enabled their child to access the app. If a child is at least 16 but under 18 years old, consent must still be given by a parent or guardian, but the terms of service provide no specific requirement for how this consent must be given, in contrast to kids between 13 and 16 years old, for whom it is specifically stated that the parent or guardian needs to give consent via the Family Link app. The terms of service merely state that parents or guardians are bound by the terms, and are therefore responsible for the behavior of their children.

The terms of service also lay down several privacy-related restrictions on users. Any user on the platform is prohibited from collecting information which can be used to identify a person, unless consent is given by that specific individual91. Furthermore, the terms of service state that by uploading content on the platform, the uploader grants a worldwide, non-exclusive, royalty-free license to every user on the platform to use that content. 'Use' is defined by YouTube as the right to reproduce, distribute, edit, display, and demonstrate. This license remains intact until the content is removed from the platform.
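The age and consent tiers that the terms of service set out can be summarized as a small decision function. This is a hypothetical sketch of the rules as described above, not YouTube's actual implementation; the tier names and parameters are invented for illustration.

```python
# Hypothetical sketch of the age/consent tiers in YouTube's EEA/Swiss
# terms of service, as described in the text above.
def allowed_access(age, parental_consent, family_link_consent):
    """Map age and consent situation to the service tier the terms allow."""
    if age >= 18:
        return "youtube"            # adults: no consent needed
    if age >= 16:
        # 16-17: parental consent required, but no mechanism is specified
        return "youtube" if parental_consent else "none"
    if age >= 13:
        # 13-15: consent must be given via the Google Family Link app
        return "youtube" if family_link_consent else "none"
    # under 13: only YouTube Kids, if a parent has enabled access
    return "youtube_kids" if parental_consent else "none"
```

The sketch also exposes the gap discussed later in this chapter: nothing in the terms verifies the `age` input itself.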

4.1.2. Privacy Policy
The terms of service also refer to the privacy policy of Google, YouTube's parent company. The privacy policy explains what Google and YouTube do with the data they collect from their users, and was updated on March 31, 2020.92 Google states in its privacy policy that information is generally collected to provide a better service to all users. Google mentions that the information helps it choose the appropriate language settings for the user, determine which advertisements are the most useful to the user, and establish which videos or online personas are most in line with the user's preferences.93 Even if a user is not signed in to a Google account, information is still being collected via 'unique identifiers' linked to the browser or device that the person is using. These unique identifiers are strings of characters specific to a browser, application, or device. These identifiers help Google to remember preferences, choose relevant personal advertisements and prevent fraud94. Apart from the information described above, Google stores personal information such as names and passwords, phone numbers and payment information. Google monitors a user's activity on apps, browsers and devices that access Google services, which covers a wide range of information. This information is collected from the search queries a user enters online, the videos that are being watched, and voice and audio information when voice control is enabled, but also concerns matters such as purchasing activity, the people that the user communicates with, and interaction with content and advertisements, alongside many other factors95.

91 Ibid, 'Rechten en beperkingen' ['Rights and restrictions'].
92 Google Privacy Policy. Accessed April 9, 2020.
93 Google Privacy Policy, 'Information that Google collects'.
94 Google Privacy Policy, 'Unique Identifiers'.
95 Ibid, 'Your Activity'.


According to Article 6 GDPR, processing of this data is lawful if the data subject, i.e. the user of the Google service, has given consent to the processing of his or her personal data for one or more specific purposes. Consent is given to Google/YouTube when a user accepts the terms of service and consequently uses one of Google's services96, but Google will also ask for explicit consent for matters not included in the privacy policy, for the sharing of sensitive data, or for the display of personalized advertisements. Google states in its privacy policy that the collected data is used for several purposes. These purposes include:

A) Improving Google's services, by returning better search results based on the personal information gathered, or by suggesting to share content with the personal contacts of the user;

B) Maintenance of the service, by using the information collected to troubleshoot issues reported to Google or to track any other technical problems;

C) Developing new services based on the usage of existing Google services, to improve upon the services that are already available;

D) Providing personalized advertisements and content, such as custom search results, personalized security tips and advertisements based on searches made by the user. The settings concerning personal advertisements can be altered within the 'ad settings' of the user's Google account. Information that directly identifies the user, such as name or e-mail address, is not shared with advertisers. In addition, personalized advertisements based on sensitive data, i.e. information concerning religion, health, sexual orientation, or race, are not displayed, and Google requests explicit consent when sensitive data is concerned. This appears to be in line with Articles 9(1) and 11 GDPR;
E) Measuring performance, to analyze how Google's services are being used;

F) Communication with users, such as using the user's e-mail address to interact with the user directly about suspicious activity or about improvements made to any of Google's services;

G) Protection of Google and its services, by improving safety and reliability and by identifying security risks as well as any technical issues.97

The data collected is processed by Google via automated systems and algorithms which recognize identifiable patterns of a user, helping Google to provide personal search results and advertisements. Google will also combine several strings of collected information to process the data as described above. The privacy policy states that personal information is only shared outside of Google with the user's consent. Google also emphasizes that specific consent will be asked for information collected for any purpose not stated in the privacy policy. Google not only outlines how a user's information is processed, but also provides several choices for regulating the information gathered. Every user is able to review the information collected in their profile settings, choose how advertisements are displayed on the services and what type of activity is monitored by Google, and decide what other users can see

96 YouTube Terms of Service, 'Toepasselijke voorwaarden' ['Applicable terms']. Accessed July 4, 2020, from <https://www.youtube.com/t/terms>.
97 Google privacy policy, 'Why Google collects data'. Accessed July 4, 2020, from <https://policies.google.com/privacy?hl=en-GB&gl=uk#whycollect>.

about them. Users can also download and export a copy of the data collected from their Google account, and request the removal of content from specific Google services. Google provides detailed instructions on how a user can download all data collected, and explains what steps it takes to comply with its privacy policy98. Via the 'Data & personalization' tab of the Google account, advertisement settings can be changed in the 'ad personalization' section, where advertisement personalization can be turned on or off. This section displays the various interests that Google has gathered and on which it consequently bases advertisements; categories include 'Computers & Electronics', 'Career Resources & Planning' and many more. Deletion of data can be requested via the 'Data & personalization' tab as well, which deletes all data that Google has collected about the user99.

Finally, the Google privacy policy has a specific section about compliance and cooperation with regulators. In this section it is explained that Google Ireland Limited is the data controller within the European Economic Area and Switzerland, and is thus responsible for the data processing within this area. Google reiterates that all information is processed with the user's consent, which is given to Google when an individual uses any of the services operated by Google and accepts the terms of service, and with the legitimate interests in mind that have been summarized earlier in this paragraph.100 Apart from these grounds, Google also processes users' data if it receives a governmental request to do so in the case of legal matters. If any information in the policy is still unclear, Google provides a link to contact Google or its data protection office, and explains that local authorities can be contacted for any data protection issues.

4.1.3. YouTube Kids Privacy Notice
In contrast to the localized terms of service and privacy policy, the YouTube Kids Privacy Notice is only available in English, and provides no specific policy for the European Economic Area and Switzerland.101 As this is the only available version of the privacy policy for the YouTube Kids app specifically, it appears to be a universal notice and thus applicable worldwide. The privacy notice was last updated on 27 July 2019 at the time of writing this thesis, which is prior to the measures taken to comply with the COPPA Rule at the start of 2020. The privacy notice reiterates many points made in the general privacy policy discussed in the previous paragraph, but some notable differences can be observed in the specific YouTube Kids Privacy Notice. It is also worth noting that the YouTube Kids Privacy Notice overrules the general privacy policy in the case of conflicting terms.102 YouTube states that it does not collect personal information such as the name, contact information or address of the child using the app, focusing the collection of data on the aforementioned unique identifiers, the device type and log information to analyze the IP address and details about the usage of the service. The data is processed for internal matters such as improvement of the service and spam prevention, but it is also used for generating

98 Google Account Help, 'Download your data'. Accessed June 8, 2020, from <https://support.google.com/accounts/answer/3024190?hl=en>.
99 Winder, Davey. 'Google Confirms It Will Automatically Delete Your Data – What You Need To Know'. Forbes. Accessed July 4, 2020, from <https://www.forbes.com/sites/daveywinder/2019/05/05/google-confirms-it-will-automatically-delete-your-data-what-you-need-to-know/#62c80dc920d4>.
100 Google privacy policy, 'Compliance & Cooperation with Regulators'.
101 YouTube Kids Privacy Notice. Accessed April 9, 2020.
102 Ibid, 'interpretation of conflicting terms'.

personalized content. YouTube collects the user's viewing history and the search terms entered, and processes them to provide the user with search results and videos relevant to their preferences. Google has changed the way of advertising on the YouTube Kids app, only providing contextual advertisements and preventing personalized advertisements from showing up on the app. Google also made it impossible for users to share or make public any personal information with a third party.

Upon downloading the YouTube Kids app and opening it, the app asks whether the user is a child or a parent. By choosing 'I am a child', the app locks itself, stating that only a parent can unlock the app. By choosing 'I am a parent', the app requires the parent to set up a couple of parental controls. First, the birth year of the parent is requested, to verify the age. After the age verification, the Kids app displays a video showing what the parent can do on the app, such as creating separate accounts for each kid, blocking and reporting videos, seeing the watch history of their child, or setting timers that limit the amount of time that their child can watch videos on the app. After exploring the app for a few minutes, scrolling and browsing through the videos, an advertisement intro popped up prior to a video starting, followed by an advertisement highlighted by the tag "Ad". While there is age verification present on the app, a child could pretend to be a parent and fill in a bogus birth date. For the purpose of data processing this does not matter, however, as the advertisements displayed on the app are contextual, and therefore not personalized ads based on the data collected by Google. Whether a 40-year-old or a 10-year-old uses the app does not matter, as the contextual advertisements will be the same for all ages. It does not deal with the matter of obtaining consent from parents or guardians for using the app, however.
A child can simply download the app on their device, and fill in that their parents have given consent to use the app. Google does not check whether this is actually the case.

4.1.4. Subconclusion
Based on the wording of all the documents above, at first glance it seems that YouTube has addressed the main points of the GDPR relevant to this thesis. The rights of data subjects which have been discussed in chapter three of this thesis can all be found in YouTube/Google's privacy policy, and the terms of service indicate that extra care has been given to minors' access to the services. Whether the platform fully complies with the GDPR would require an in-depth analysis, which is not the focus of this thesis. There are some concerns, however. While there are rules in effect that govern the form of consent needed for minors, there is no age verification procedure in place. While the platform explicitly states that children are not allowed to use the service unless parents or guardians give consent, no policy whatsoever is provided by YouTube to verify whether a user correctly entered their age. Viewer restrictions based on age are therefore not nearly as effective when an individual under the age of eighteen creates an account on YouTube and states that he is twenty years old, nullifying the entire purpose behind these restrictions. The only way to prevent this is if the parents or guardians themselves create the account together with the child, and prevent the child from creating another account of their own on which they lie about their age. It therefore looks as if YouTube relies on the integrity of its users, while ignoring the implementation of any verification procedure to make sure that consent is given in a valid way.


In the next paragraphs the application and impact of the COPPA Rule on YouTube will be explained, alongside remaining concerns and points of interest. The reasoning behind focusing on the COPPA Rule specifically is that the implementation of the COPPA measures led to additional worldwide safeguards on the platform, which help in tackling the ElsaGate scandal, as these new measures are specifically put in place for the protection of children. While the GDPR has also been in effect on the platform since 2018, the ElsaGate still took place. By analyzing the new situation, it can be assessed whether the COPPA Rule will provide extra protection against another ElsaGate.

4.2. The application of the COPPA Rule by YouTube

YouTube implemented measures to comply with the COPPA Rule on its platform on January 6, 2020. In the months prior to this change, all YouTube content creators were required by YouTube to review their channel settings, to determine whether their content was made for kids or not. All creators were given a selection of the following three options:

A) This channel is made for kids, and focuses on videos specifically made for kids;
B) This channel is not made for kids, and never uploads videos that are made for kids;
C) My channel uploads videos both made for kids and not made for kids, so I need to review this for every video that is uploaded to my channel.103

Along with this mandatory review, an article was uploaded to the YouTube Help center, indicating the changes that were being made concerning kids content on the YouTube platform.104 Firstly, the creator had to determine whether their content is 'made for kids' or for a general audience. The FTC outlines the following factors that need to be considered in deciding whether content is made for kids:

A) "Subject matter of the video (e.g. educational content for preschoolers).
B) Whether children are your intended or actual audience for the video.
C) Whether the video includes child actors or models.
D) Whether the video includes characters, celebrities, or toys that appeal to children, including animated characters or cartoon figures.
E) Whether the language of the video is intended for children to understand.
F) Whether the video includes activities that appeal to children, such as play-acting, simple songs or games, or early education.
G) Whether the video includes songs, stories, or poems for children.
H) Any other information you may have to help determine your video's audience, like empirical evidence of the video's audience.
I) Whether the content is advertised to children."105, 106
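For illustration, the factors above could be operationalized as a crude heuristic pre-screen over video metadata. This is a hypothetical sketch: the keyword set, weights, and function names are my own assumptions, not the FTC's criteria or YouTube's actual method:

```python
# Hypothetical heuristic that scores video metadata against a few of the
# FTC factors above. Keywords and weights are illustrative assumptions.
KID_KEYWORDS = {"nursery", "preschool", "cartoon", "toys", "kids", "songs"}

def made_for_kids_score(title: str, tags: list[str],
                        has_child_actors: bool,
                        advertised_to_children: bool) -> float:
    """Return a score in [0, 1]; a high score would flag the video for review."""
    words = {w.lower() for w in title.split()} | {t.lower() for t in tags}
    hits = len(words & KID_KEYWORDS)
    score = min(hits / 3, 1.0) * 0.5                  # subject-matter and language cues
    score += 0.25 if has_child_actors else 0.0        # factor C
    score += 0.25 if advertised_to_children else 0.0  # factor I
    return score

print(made_for_kids_score("Nursery Rhymes and Kids Songs",
                          ["toys", "cartoon"], False, True))             # 0.75
print(made_for_kids_score("Quarterly earnings call", [], False, False))  # 0.0
```

A real classifier would of course weigh far richer signals, but the sketch shows why a purely automated check struggles: every factor has to be reduced to a machine-readable proxy.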

103 These options can be found in the 'advanced settings' tab under 'channel' on any user's YouTube channel. 104 YouTube, 'Upcoming changes to kids content on YouTube.com'. Accessed January 8, 2020. 105 Cohen, Kristin. 'YouTube channel owners: Is your content directed to children?' (2019). Accessed January 8, 2020. 106 COPPA Rule, paragraph 312.2, definition of 'Web site or online service directed to children'. Accessed July 4, 2020, from <https://www.ecfr.gov/cgi-bin/text-idx?SID=4939e77c77a1a1a08c1cbf905fc4b409&node=16%3A1.0.1.3.36&rgn=div5#se16.1.312_12>.


YouTube has stated in its official blog that as of January 6, 2020, data retrieved from children's content will be treated in a different way.107 If a content creator makes videos to which multiple of the abovementioned factors apply, they should evaluate their channel settings to determine whether the content is made for kids. If this is the case, it has severe consequences for their channel. YouTube will not display personalized advertisements on these kids videos108, and engagement will be minimized. This means that the ability to leave a comment on the video is turned off, and the like/dislike ratio of videos is taken away.109 If a user is watching a video which is made for kids according to the criteria explained above, YouTube automatically treats the user as a kid, regardless of the actual age of the viewer. This limits the data collected from the user, only using the data necessary for the optimal functioning of the service.

The implication of these changes is that creators of kids content will receive less revenue from their videos, which has a big impact on the business model of these creators. These content creators earn revenue from personalized advertisements and from the number of views that a video receives, but both are impacted by the measures taken to comply with the COPPA Rule. Views will be impacted because, apart from the restriction on personalized advertisements, engagement features such as the ability to leave a like/dislike or a comment improve the number of views of a video through active engagement; under the COPPA Rule this is no longer possible, thus hampering the potential number of views. YouTube has thus taken a very strict interpretation of COPPA, to prevent any possible violation of unlawful data processing from happening on the platform.

4.3. The impact of the COPPA Rule on ElsaGate

As has been stated earlier in this section, the COPPA Rule is a big step towards preventing another ElsaGate scenario on the platform. The ElsaGate videos included animated characters and real-life depictions of cartoon characters, which is covered by the requirements of YouTube's new COPPA policy. This means that the ElsaGate videos will be marked as "made for kids", and will consequently not receive personalized advertisements; the videos will have no engagement possibilities such as comments, and the data of the viewers of these videos will not be collected. YouTube has stated that it will actively monitor whether creators appropriately label their videos as 'made for kids' or not in their profile settings110. YouTube can take legal action if a video creator is caught falsely labelling their videos, or it can lead to the termination of the YouTube channel. It has been explained in the introduction of this thesis that one of the reasons why these ElsaGate-style videos were made was that they gathered millions of views, which led to a

107 YouTube Official Blog, ‘An Update on kids and data protection on YouTube’. Accessed April 9, 2020, from . 108 Non-personalized advertisements will still be shown. These, however, are less lucrative to the creator due to it being general rather than personalized on the audience. 109 YouTube Official Blog, ‘An Update on kids and data protection on YouTube’. Accessed April 9, 2020, from . 110 YouTube Help, ‘Determining if your content is “made for kids”. Accessed June 8, 2020, from < https://support.google.com/youtube/answer/9528076?hl=en#:~:text=According%20to%20guidance%20from%2 0the,made%20for%20kids%20just%20because%3A&text=Children%20may%20incidentally%20see%20it>

lot of revenue for the creators of these videos.111 This is however not the only reason why these ElsaGate videos were created. While irrefutable proof is not available, other potential motives behind these videos are grooming and the normalization of inappropriate behavior. As has been stated in the analysis of the YouTube Kids Privacy Notice, personal data is still being collected about these users, to recommend videos to them based on their viewing preferences. Due to the implementation of the COPPA Rule, targeted advertisements have been removed and replaced with contextual advertisements, but this does not address grooming and/or the normalization of inappropriate behavior. In addition, whilst creators will generate far less revenue from contextual advertisements than from personalized advertisements, it is still possible to earn money from these types of videos, so this problem is not completely eliminated either. Another factor to be considered is the lack of age verification on YouTube, which has been discussed in section 4.1.4. of this thesis. What the COPPA Rule has achieved, however, is that videos labeled as 'made for kids' are restricted in their engagement possibilities and advertising. For these safety measures it does not matter whether the child has entered a false age, because the restrictions are put on the video itself, and not on the account of the child.

Currently, YouTube has two systems in place for labeling videos. One is for determining whether videos are made for kids, and the other is for labeling what elements the video contains for advertisement purposes. Creators have to specify whether inappropriate language is used, and whether the video contains violence or other sensitive issues such as death or tragedy. The algorithm then checks whether the creator has correctly filled in what the video is about, and whether it is made for kids or not.
If the creator has correctly entered the information, YouTube notifies the creator in the following way:

[Screenshot of YouTube's confirmation notice; image not reproduced here.]
This is, however, not enough to stop another ElsaGate. As of July 2020, ElsaGate-themed videos can still be found on YouTube quite easily. Whereas Peppa Pig, Spiderman and Elsa were used in the initial spree of ElsaGate videos, the focus has shifted to Minecraft, a popular video game for all ages in which a player can create and build his own world in a pixelated digital environment. These videos still carry personalized advertisements, and contain content such as extreme gore, defecation, and murder, among other shocking material.112 The YouTube channel in question has hundreds of videos with thousands and sometimes millions of views, with sexually themed thumbnails. The videos still have comments enabled, and judging by the comments the videos are watched by many children, as the text of many comments is pure gibberish, or strings of random letters. Another example is the channel called 'P1ck1es ', which also uploads Minecraft videos with inappropriate thumbnails, which are viewed by thousands of users.113 Multiple comments have stated that

111 YouTube operates in percentages concerning the 'advertisement-friendliness' of a video. Factors that impact this percentage are the language used (no swearing leading to higher friendliness), along with the number of minutes that a video is watched and the trust of the account. More on this can be read in the article by Scott S. Bateman. 112 Link to the video in question: <https://www.youtube.com/watch?v=cZ0eM0emoYU>. 113 P1ck1es Animations:

they reported the videos to YouTube, yet they are still online for everyone to see. Even though YouTube has already deleted thousands of channels and even more videos, ElsaGate-themed videos are still available on the platform. The videos are, however, not available on YouTube Kids; searching for them via the app yields no results. While YouTube has taken steps to remove this type of content from the separate Kids app, it has not succeeded in deleting the videos on the regular platform, which can still be accessed by children via their computer, phone or tablet. Combine this with the absence of any verification procedure for age or parental consent, and children can consequently still be exposed to ElsaGate videos.

4.4. The relationship between COPPA and GDPR

In the preceding paragraphs it has been established that both COPPA and the GDPR aim to safeguard the privacy and safety of children in the online environment by prohibiting unauthorized processing of children's personal information. The scope of the two instruments differs, however. The COPPA Rule is specifically aimed at children aged 13 or younger, while the GDPR treats as a child any individual under the age of 16, albeit allowing Member States to lower this age as long as it is not below 13. The territorial scope also differs between the two instruments. The FTC states that the COPPA Rule is applicable to foreign-based service providers as well, if they collect data of children based in the United States.114 The GDPR states in Article 3 that the Regulation applies to any data processor and data controller with an establishment in the European Union, regardless of whether the processing of personal data actually takes place in the European Union.115 Additionally, the Regulation is applicable if the controller or processor is not established in the European Union, where the purpose of processing relates either to the offering of goods and services to data subjects within the Union, or to any monitoring of behavior that takes place within the European Union.116 YouTube has its headquarters in San Bruno, California, but also has establishments in Europe.117 Following the text of both regulations, YouTube therefore falls under the scope of both COPPA and the GDPR: it has establishments in both the European Union and the United States, while offering services to and monitoring the behavior of children in both jurisdictions for marketing purposes.
In the next section of this thesis, it will be assessed whether YouTube adequately adheres to COPPA and the GDPR, by examining the manner in which COPPA is applied on the platform, as well as inspecting the platform’s service agreement and privacy policy. The way in which the GDPR is implemented on YouTube will be discussed later in this chapter.

114 COPPA – Children's Online Privacy Protection Act, par. 312.2 Definitions. Accessed June 8, 2020, from <https://www.ecfr.gov/cgi-bin/text-idx?SID=4939e77c77a1a1a08c1cbf905fc4b409&node=16%3A1.0.1.3.36&rgn=div5>. 115 GDPR, Article 3(1). 116 Ibid, Article 3(2). 117 Craft, YouTube headquarters and office locations. Retrieved April 8, 2020.


4.5. The difference in compliance between COPPA and the GDPR on YouTube

It is interesting to note that YouTube seems to primarily adhere to the COPPA Rule, alongside the GDPR, in its approach to protecting children's data. Whereas the GDPR only applies to users in the European Union, YouTube's COPPA measures apply to all creators worldwide. In this subsection it is therefore important to analyze the difference between COPPA and the GDPR in terms of compliance, to assess whether YouTube has extra work to do in order to comply with the data protection laws concerning children in the European Union. At first glance it can be noticed that the COPPA Rule is much narrower in scope. While the GDPR addresses the protection of both children and adults, merely stating in recital 38 that children should be given extra protection, which is effectuated in Article 8 GDPR, the COPPA Rule specifically deals with the protection of children's data118. COPPA only applies to operators of websites or online services that are directed at children and that knowingly collect personal information from children119, while the GDPR governs all processing by any data processor or data controller that processes data of a data subject in the European Union120. Furthermore, the age that defines a minor is lower, and thus stricter, in the COPPA Rule, which sets the age at thirteen121. The GDPR allows Member States to set the age themselves, no lower than thirteen but no higher than sixteen122. The requirements for obtaining consent from parents or guardians also differ between the two legal instruments. Article 8 of the GDPR states that processing of data is only lawful if consent is given or authorized by the parent or guardian, and that the data controller shall make 'reasonable efforts' to verify whether consent is given by the holder of parental responsibility123. This is a rather vague requirement, and no specifics are given as to what 'reasonable efforts' entails in practice.
Paragraph 312.3 of the COPPA Rule states that the operator shall obtain verifiable parental consent prior to the collection, use and/or disclosure of personal information from children, and also requires operators to provide the parent or guardian with 'reasonable means' to review the information collected from the child, with the possibility to refuse the use of the data. Paragraph 312.5 of the Rule elaborates on this requirement by stating what an operator should do to comply. Methods of obtaining verifiable parental consent include a consent form signed by the parent and returned via postal mail or electronic scan, requiring a parent to use a credit card which provides notification, enabling the parent to call a toll-free number, and many more. Paragraph 312.6 continues with the right of the parent to review the personal information provided by the child, stating extensively what must be done by the operator to comply with the requirement. This includes a description of the different categories of personal data collected, the opportunity for the parent to refuse the further collection of data, and the requirement that the operator shall not make it overly burdensome for the parent to review the information collected. Concerning the data processing of children, the COPPA Rule is therefore much more extensive and stricter in application than the GDPR, explaining elaborately how consent must be given, the parent's right to review information, and how a proper notice that data is

118 COPPA, paragraph 312.1. 119 COPPA, paragraph 312.2. 120 GDPR, Article 3. 121 COPPA, paragraph 312.2. 122 GDPR, Article 8. 123 Ibid.

being collected must be given by the operator. The GDPR formulates these requirements much more vaguely, thereby leaving them more open to interpretation by the Member States. The GDPR is more extensive concerning data processing in general, laying down rights for data subjects regarding data minimization and erasure, processing grounds, and many more. The conclusion of this comparison is therefore that, on the subject of the data processing of children, YouTube can choose to adhere to the COPPA Rule on the platform without violating the specific parts of the GDPR relevant to this thesis, as the requirements set out by the COPPA Rule are narrower and more extensively laid down than those in the GDPR.
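The contrast between the two consent standards can be sketched as follows. The COPPA method names paraphrase the examples from 16 CFR § 312.5(b) discussed above, while the GDPR side is left deliberately open-ended because Article 8 only requires unspecified 'reasonable efforts'; all identifiers are illustrative, not part of either instrument:

```python
# Sketch contrasting COPPA's enumerated consent methods with the GDPR's
# open-ended standard. Method names paraphrase 16 CFR 312.5(b); the
# function names are illustrative assumptions.
COPPA_APPROVED_METHODS = {
    "signed_consent_form",      # returned via postal mail or electronic scan
    "credit_card_transaction",  # payment that notifies the account holder
    "toll_free_call",           # call handled by trained personnel
    "video_conference",
    "government_id_check",
}

def coppa_consent_verifiable(method: str) -> bool:
    """Under COPPA, consent counts only if an enumerated method was used."""
    return method in COPPA_APPROVED_METHODS

def gdpr_consent_acceptable(reasonable_effort_made: bool) -> bool:
    """Under GDPR Art. 8(2), any 'reasonable effort' can suffice; what
    qualifies is left to Member States and controllers to interpret."""
    return reasonable_effort_made

print(coppa_consent_verifiable("self_declared_checkbox"))  # False
print(coppa_consent_verifiable("toll_free_call"))          # True
```

The sketch makes the asymmetry concrete: COPPA compliance is a closed-list membership test, while GDPR compliance turns on an open norm that each controller must interpret.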

4.6. Conclusion

In conclusion, while the COPPA Rule is a big step towards the prevention of another ElsaGate, it is not enough to completely prevent a new ElsaGate scandal from happening. Apart from the ElsaGate videos which are still available on the regular YouTube platform and are being viewed by children, the videos which are made for kids, and are thus available on the YouTube Kids app, will continue to carry advertisements, and personal data about these children is still being collected to feed the recommendation algorithm used by YouTube. Another measure that I therefore recommend YouTube to take is to combine the additional measures resulting from the implementation of the COPPA Rule with extra human curation. YouTube could put in place a reporting system specifically for kids content, so that a user who comes across a video in violation of the new policy can flag and report it to the platform. Human curation would enable the platform to deal more effectively with the reports made by its users. Human curation is currently used by Facebook to manually select news stories to send to its users124, and YouTube itself has implemented a feature called 'Collections' for YouTube Kids, which only features videos from trusted partners, to make it easier for parents to select videos suited for children125. This 'Collections' idea could potentially be expanded to the regular YouTube platform, to allow users to select hand-picked trusted partners that they want to watch on the platform and receive recommendations from. As children still use the normal YouTube app and website, this looks to be a viable strategy to further combat the ElsaGate videos and prevent children from having access to these shocking videos. In addition, YouTube can implement an algorithm that filters videos which meet multiple of the criteria laid down by the FTC for being "made for kids". These can subsequently be subjected to human curation.
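The proposed combination of an algorithmic pre-filter and human curation could be structured as a two-stage pipeline. The sketch below is a hypothetical design illustrating the recommendation made here; all cue names, fields, and data structures are assumptions, not an existing YouTube system:

```python
# Hypothetical two-stage moderation pipeline: an algorithm flags suspect
# videos, and only a human reviewer can confirm removal. All names are
# illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    cues: set = field(default_factory=set)  # content cues detected upstream

KID_CUES = {"cartoon_characters", "nursery_song", "kids_keywords"}
SHOCK_CUES = {"gore", "violence", "sexual_thumbnail"}

def algorithmic_prefilter(videos: list["Video"]) -> list["Video"]:
    """Stage 1: flag videos that mix child-oriented and shock cues."""
    return [v for v in videos if v.cues & KID_CUES and v.cues & SHOCK_CUES]

def human_review(queue: list["Video"], verdicts: dict[str, bool]) -> list[str]:
    """Stage 2: a moderator confirms or clears each flagged video;
    only confirmed IDs are removed, catching what the filter misreads."""
    return [v.video_id for v in queue if verdicts.get(v.video_id, False)]

videos = [
    Video("elsagate_clip", {"cartoon_characters", "gore"}),  # flagged for review
    Video("adult_gaming", {"gore"}),                         # not child-oriented
    Video("lullaby", {"nursery_song"}),                      # harmless kids video
]
queue = algorithmic_prefilter(videos)
print(human_review(queue, {"elsagate_clip": True}))  # ['elsagate_clip']
```

The design choice is deliberate: the algorithm narrows millions of uploads down to a reviewable queue, while the final removal decision stays with a human who can judge context that the cues cannot capture.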
By combining the COPPA Rule with human curation and an algorithm, the filtering of videos will be more effective, as the human element will deal with the inconsistencies of the algorithm. The human eye will be more aware of the elements of the videos that the algorithm may overlook, such as sexual video thumbnails, gore, or other violence, and put them into context. Whereas the algorithm may not filter the video as it contains

124 Isaac, Mike. ‘In New Facebook Effort, Humans Will Help Curate Your News Stories’. 2019. New York Times. Accessed July 4, 2020, from < https://www.nytimes.com/2019/08/20/technology/facebook-news- humans.html> 125 YouTube Official Blog, ‘Introducing new choices for parents to further customize YouTube Kids’. 2018. Accessed July 4, 2020, from < https://youtube.googleblog.com/2018/04/introducing-new-choices-for-parents- to.html>.

material from the video game Minecraft, which is in itself not against the terms of service, human oversight can ensure that these videos will also be filtered out. YouTube has taken huge steps towards the prevention of unwanted shocking behavior on the platform, having deleted thousands of accounts and countless videos with content that was against the terms of service, disabling features on videos that are made for kids, and initiating specific programs such as 'Collections' on YouTube Kids to counteract both the processing of the data of minors and the potential of shocking content reaching this audience. It is still not enough, however, as these videos are still being uploaded to the platform, reaching millions of views and exposing kids to inappropriate content. There is still much to be done for the platform to completely eradicate the issue of ElsaGate videos.

Chapter 5: Conclusion

YouTube is one of the biggest platforms worldwide. Thousands of videos are uploaded every day and millions of people visit the platform to watch content uploaded by creators. In 2017 it was discovered that among the millions of videos available on the platform, a portion turned out to be in violation of YouTube's terms of service, displaying popular cartoon characters such as Peppa Pig, Spiderman and Elsa from Frozen engaging in highly shocking behavior. This behavior included drug use, sexual misconduct, and dangerous acts such as asphyxiation. These videos were highly inappropriate for children, yet were still visible on YouTube's separate child app, YouTube Kids. The shocking content only took place during a small portion of a longer video, which caused the videos to go unnoticed for a long time, until media outlets published articles on the subject. It turned out that these videos were not filtered out by YouTube's algorithm, and thus millions of children were exposed to these videos on the YouTube Kids app, a scandal later referred to as the 'ElsaGate'.

In this thesis, light has been shed on YouTube's algorithm, which pushes videos to its users, designed to meet the viewing interests of the user. This practice, however, indirectly caused the ElsaGate. As has been established in chapter two of this thesis, YouTube collects metadata from its users, from which the algorithm compiles a list of videos. This list of videos consequently appears in the 'recommended videos' tab on the platform, or on the homepage when a user accesses the platform on their device. These videos are carefully ranked and compiled by the platform, using methods such as batch-oriented pre-computation and association rule mining.
As the disturbing content of the videos went unnoticed by the YouTube algorithm, and parents or guardians did not pick up on the inappropriate content that their children were exposed to, these videos started to appear in the recommended videos tab of users who watched similar kids videos about cartoons: the YouTube algorithm compared the metadata of the user's viewing history, determined that the videos must be alike, and concluded that they would therefore be in the interest of the user. This resulted in the ElsaGate videos circulating on the YouTube Kids app and generating millions of views, thus reaching a very large population of minors.

Another matter discussed in this thesis was whether YouTube has continued its efforts to prevent another ElsaGate from happening. To assess this, both the European General Data Protection Regulation and the COPPA Rule, an American law that YouTube

complies with as of January 2020, were analyzed. In chapter three the content of these legal instruments was discussed, detailing the requirements that both legal documents put forward for effective compliance. The chapter outlined the rights that data subjects in the European Union have concerning the protection of their personal data, as well as elaborating on the approach taken by the Federal Trade Commission to protect the privacy of American minors. In chapter four, an analysis of the implementation of the legal documents on the platform took place, to determine which measures YouTube has taken to comply. A key area of interest in this thesis was how YouTube dealt with the information it gathers about minors: which data was collected in the first place, how consent was obtained, and in what way the interests of children were protected. Other documents extensively discussed were YouTube's Terms of Service and its privacy policy. In addition, the difference between the GDPR and the COPPA Rule concerning the data processing of children was analyzed, to determine whether YouTube was violating the GDPR by choosing to adhere to the COPPA Rule for the protection of data. It was discovered that the COPPA Rule is much stricter in its scope than the GDPR, extensively laying down how verifiable consent must be given and what rights parents or guardians have to limit the collection of their child's data. The GDPR was much vaguer in that respect, leaving it to the Member States to decide what the 'reasonable efforts' to obtain verifiable consent under Article 8 entail. The COPPA Rule had vast repercussions for those content creators on YouTube who specialized in creating videos tailored for children. It was implemented after a $170 million settlement with the Federal Trade Commission, which required YouTube to put in place mechanisms to combat the unlawful data processing of children under the age of thirteen.
Since its implementation, users are required to label their videos as "made for kids" or not. YouTube has compiled a list of criteria which need to be met for a video to be labeled as made for kids. If the target audience of the video is kids under the age of thirteen, if the video includes activities that normally appeal to children, or if the language used is simplified and understandable particularly for children, the video is labeled as 'made for kids'. If a video is made specifically for children, engagement possibilities such as comments and likes are disabled on that video. In addition, personalized advertisements are no longer displayed on these videos.

Even though the analysis done in this thesis demonstrates that YouTube has taken measures to comply with both the GDPR and the COPPA Rule, some concerns remain. Although YouTube restricts access to its service for children under the age of eighteen unless they obtain consent from their parent or guardian, no procedures are put in place to control this consent. In addition, the lack of age verification is an issue. An individual can pretend to be of legal age, while in reality being a minor who created an account by entering an incorrect age. No verification procedure is in place to monitor this in any way, thus nullifying many of the efforts put in place by YouTube to restrict content deemed 'inappropriate' for children. The most pressing issue, however, is that ElsaGate videos are still present on the regular YouTube website, albeit in a slightly different form. Where in the beginning the videos featured animated characters or real-life depictions of these characters, the ElsaGate videos have now branched out to Minecraft, a popular video game for children, in which players can create and build their own world in a digital environment. The videos continue to display shocking content such as extreme gore, sexual misconduct and killing.
The videos sometimes get millions of views, and the viewers are predominantly children. Comments are turned on

for these videos, and personalized advertisements are displayed on the videos, despite the measures taken. This means that the procedures YouTube has put in place to comply with COPPA are not sufficient, as the data from these videos is still collected, and the videos are not flagged as "made for kids", while they should be. This indicates that several videos continue to slip past the new algorithmic filters that YouTube has put in place with the aim of removing this content, and children can still be exposed to these ElsaGate-type videos on the YouTube platform.

Although the regular YouTube app still has to deal with ElsaGate videos being published, YouTube Kids has seen drastic improvements. YouTube no longer collects data on viewers for advertising purposes, but merely to improve its services, and it has initiated multiple projects to limit the exposure to inappropriate content. An example of these initiatives is "Collections". Parents can choose to select specific content creators which are trusted and checked by YouTube, whose videos will consequently be available to the user on YouTube Kids. Only the videos from these selected creators will be viewable. Kids can, however, still access the regular YouTube app via their computer, smartphone or tablet, where there remains a risk that they run into ElsaGate videos. Kids may want to watch Minecraft videos which are not available on YouTube Kids, so they move over to the regular YouTube app, where they run the risk of being recommended a Minecraft video which contains inappropriate content. Both the implementation of the COPPA Rule and YouTube's earlier efforts to tackle the ElsaGate scandal are nevertheless huge steps towards preventing the potential exposure to ElsaGate videos. While the actual motives have never been revealed, it has been suggested that these videos were made both for financial gain and for the normalization of sexual behavior, or grooming.
Where YouTube in 2017 initially deleted the majority of channels that featured the ElsaGate videos, the worldwide implementation of the COPPA Rule on the platform further restricted its users from profiting from videos made for kids, as well as limiting the exposure of these videos by turning off comments and likes on them. This limits the exposure of children to potential predators in the comment sections of videos, and largely diminishes the profits that users would make from these types of videos. Money can still be earned from the videos, however, so the potential for financial gain is not completely taken away by the implementation of COPPA. My conclusion on the basis of this research is, however, that YouTube has implemented insufficient measures to prevent the ElsaGate videos from appearing on the platform. YouTube has put no proper age verification procedure in place, relying purely on the integrity of its users to enter their age truthfully, and ElsaGate videos still appear despite all the measures required under the new COPPA Rule. The fact that the ElsaGate videos still appear on the platform is an indication that the platform does not comply with COPPA as of July 2020, and measures need to be taken to further remove these videos. In addition, the consent requirements of the GDPR for the processing of personal information of minors are not properly implemented, as is illustrated by the ElsaGate scandal. The reporting tool in place on YouTube can be enhanced to include a section dedicated to content similar to that of the ElsaGate, namely shocking and disturbing material to which children can be exposed. In addition, efforts can be made by the platform to combine the algorithm with human curation.
The algorithm can filter out a wide pool of videos that possibly violate YouTube's terms and send this list to a separate human review team, which then checks whether the videos meet all the guidelines. In addition, the YouTube community can flag videos that violate YouTube's terms. This added human element ensures that ElsaGate videos such as the Minecraft examples, which are currently viewable without any restriction, can be removed from the platform. YouTube will have to bear the cost of this added step of human curation, but if it does not take this step, it will remain in violation of both the COPPA Rule and the GDPR.

The combined effort of an enhanced reporting system available to the community, an algorithm and human curation can far more effectively prevent shocking content from appearing on the platform. It is inexcusable that YouTube kept a multitude of these videos online for so long, exposing millions of children to images that can traumatize them. This phenomenon should be prevented at any cost, and in my view YouTube has not done all it can to effectively remedy the situation, relying too much on the integrity of its users to regulate the online environment themselves. I therefore conclude from this thesis that the current system is not sufficient to prevent another ElsaGate from happening: these videos are still available on the platform, attracting millions of views and exposing children to shocking content, and YouTube therefore does not comply with the COPPA Rule in effect on the platform as of January 2020.
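The proposed combination of algorithmic filtering, community flagging and human review can be sketched as a simple pipeline. The following Python sketch is purely illustrative: the `Video` structure, the score and flag thresholds, and the function names are all hypothetical assumptions for the sake of the example, and do not describe YouTube's actual systems.

```python
# Illustrative sketch of the proposed moderation pipeline: an automated
# filter and community flags both feed a review queue, and a human team
# makes the final removal decision. All names and thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    algorithm_score: float    # 0..1, model's estimate that the content violates the terms
    community_flags: int = 0  # number of user reports

SCORE_THRESHOLD = 0.7  # illustrative cut-off for the automated filter
FLAG_THRESHOLD = 5     # illustrative cut-off for community reports


def build_review_queue(videos):
    """Collect candidates surfaced by either the algorithm or community flagging."""
    return [v for v in videos
            if v.algorithm_score >= SCORE_THRESHOLD
            or v.community_flags >= FLAG_THRESHOLD]


def human_review(queue, reviewer_decision):
    """The human team has the final say: only confirmed violations are removed."""
    return [v.video_id for v in queue if reviewer_decision(v)]


if __name__ == "__main__":
    catalog = [
        Video("cartoon-ok", 0.1),
        Video("elsagate-like", 0.92),
        Video("minecraft-flagged", 0.3, community_flags=8),
    ]
    queue = build_review_queue(catalog)
    # Stand-in for the human team: here, every queued video is confirmed.
    print(human_review(queue, reviewer_decision=lambda v: True))
    # → ['elsagate-like', 'minecraft-flagged']
```

The point of the design is that the algorithm only narrows the pool, never removes content on its own, and that a video can reach the human team through either route: a high automated score or a sufficient number of community flags.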


Bibliography

Alexander J, YouTube officially rolls out changes to children's content following FTC settlement (2020).

Brown JD, Mass media influences on sexuality (2010).

Cai X et al., Children's Website Adherence to the FTC's Online Privacy Protection Rule (2010).

Cohen JN, Exploring Echo-Systems: How Algorithms Shape Immersive Media Environments (2018).

Cohen K, YouTube channel owners: Is your content directed to children? (2019). <https://www.ftc.gov/news-events/blogs/business-blog/2019/11/youtube-channel-owners-your-content-directed-children>

Children's Online Privacy Protection Rule; Final Rule, 16 CFR 312, 78 FR 4008 (Federal Trade Commission, January 17, 2013). <https://www.ecfr.gov/cgi-bin/text-idx?SID=4939e77c77a1a1a08c1cbf905fc4b409&node=16%3A1.0.1.3.36&rgn=div5>

Covington et al., Deep Neural Networks for YouTube Recommendations (2016). Accessed May 3, 2019, from <https://ai.google/research/pubs/pub45530>

Craft, YouTube headquarters and office locations.

Davidson et al., The YouTube Video Recommendation System (2010).

Dornhoeschen, What is ElsaGate? (2017).

Dredge S, YouTube's Latest Hit: Neon Superheroes, Giant Ducks and Plenty of Lycra (2016).

Federal Trade Commission, Complying with COPPA: Frequently Asked Questions. <https://www.ftc.gov/tips-advice/business-center/guidance/complying-coppa-frequently-asked-questions>

Federal Trade Commission, Google and YouTube Will Pay Record $170 million for Alleged Violations of Children's Privacy Law.


Federal Trade Commission & People of the State of New York v Google LLC & YouTube LLC. Accessed April 7, 2020, from <https://www.ftc.gov/enforcement/cases-proceedings/172-3083/google-llc-youtube-llc>

Google Account Help, Download your data.

Google Privacy Policy.

Information Commissioner's Office, Rights related to automated decision-making including profiling.

Isaac M, In New Facebook Effort, Humans Will Help Curate Your News Stories (2019).

Ishikawa A et al., Combating the ElsaGate Phenomenon: Deep Learning Architectures for Disturbing Cartoons (2019).

Johnson JG et al., Television viewing and aggressive behavior during adolescence and adulthood (2002).

Kantrowitz A, YouTube Kids Is Going To Release A Whitelisted, Non-Algorithmic Version Of Its App (2018).

Kennedy J, Regulation and social practice online (2016).

Lafrance A, The Algorithm that makes preschoolers obsessed with YouTube (2017).

Livingstone S, Children: a special case for privacy? (2018).

Maheshwari S, On YouTube Kids, Startling videos slip past filters (2017).

Matecki LA, COPPA is Ineffective Legislation! Next Steps for Protecting Youth Privacy Rights in the Social Networking Era (2010).


McCullagh K, The General Data Protection Regulation: a partial success for children on social networking sites? (2016).

Nicolas EDF et al., Should User-generated Content be a Matter of Privacy Awareness? A Position Paper (2017). <https://www.researchgate.net/publication/319102794_Should_User-generated_Content_be_a_Matter_of_Privacy_Awareness_-_A_Position_Paper>

Peterson T, Creators Are Making Longer Videos To Cater To The YouTube Algorithm (Digiday, 2018).

Placido D, YouTube's "Elsagate" Illuminates The Unintended Horrors Of The Digital Age (2017). <https://www.forbes.com/sites/danidiplacido/2017/11/28/youtubes-elsagate-illuminates-the-unintended-horrors-of-the-digital-age/#2d3826536ba7>

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1.

Remanan S, Association Rule Mining (2018).

Reyes I et al., "Won't Somebody Think of the Children?" Examining COPPA Compliance at Scale (2018).

Stepanović I, Challenges to Privacy Protection on YouTube (Institute of Criminological and Sociological Research 2018).

Stitelman et al., Using Co-Visitation Networks for Detecting Large Scale Online Display Advertising Exchange Fraud.

Strasburger VC, Adolescent Sexuality and the Media (1989).

Subedar A, The Disturbing YouTube Videos That Are Tricking Children (2019).

Tuijl J, Imagining Algorithms in Everyday Social Media Life: An Investigation into the Algorithmic Imaginary within the Elsagate Discussion on Reddit (2018).

Tutt A, An FDA for Algorithms (2016).

Vlajic et al., Online Tracking of Kids and Teens by Means of Invisible Images: COPPA vs. GDPR (2018).


Wachter S, Normative challenges of identification in the Internet of Things: Privacy, profiling, discrimination, and the GDPR (2018).

Winder D, Google Confirms It Will Automatically Delete Your Data – What You Need To Know (2020).

Wojcicki S, Protecting Our Community (2017).

YouTube Kids Privacy Notice.

YouTube Help, Determining if your content is "made for kids".

YouTube Official Blog, An Update on kids and data protection on YouTube.

YouTube Terms of Service. <https://www.youtube.com/t/terms?preview=20191210#summary>

YouTube, Upcoming changes to kids content on YouTube.com.

YouTube: Wie gefälschte Disney-Cartoons Kinder verstören [YouTube: How fake Disney cartoons disturb children] (derStandard.at, 2019).
