Artificial Intelligence and Children's Rights

This is an executive summary of the research memorandum on artificial intelligence and children's rights.

Executive Summary

As Artificial Intelligence-based technologies become increasingly integrated into modern life, the onus is on companies, governments, researchers and parents to consider the ways in which such technologies impact children's human rights. The potential impact of artificial intelligence on children deserves special attention, given children's heightened vulnerabilities and the numerous roles that artificial intelligence will play throughout the lifespan of individuals who are born in the 21st century. As much of the underlying technology is proprietary to corporations, corporations' willingness and ability to incorporate human rights considerations into the development and use of such technologies will be critical. Governments will also need to work with corporations, parents, children and other stakeholders to create policies that safeguard children's human rights and related interests.1

In this memo, we briefly outline a series of case studies to illustrate the various ways that artificial intelligence-based technologies are beginning to positively and negatively impact children's human rights. We identify valuable opportunities to use artificial intelligence in ways that maximize children's wellbeing, and spotlight critical questions that researchers, corporations, governments, educators and parents should be asking now in order to better protect children from negative consequences. We hope that this memo will help a range of stakeholders better understand and begin to lay a framework for addressing the potential impact of artificial intelligence on today's children, and on future generations.

Methodology

At the request of UNICEF and its research partners, a team of students at the Human Rights Center at UC Berkeley School of Law spent the Fall 2018 semester researching how artificial intelligence technologies are being used in ways that positively or negatively impact children at home, at school, and at play.2 We also reviewed and identified the disparate human rights that may be disproportionately impacted, both positively and negatively, by its use.3 Importantly, while any technology that affects adults will have secondary impacts on children, for the sake of space we focused only on applications that have been designed specifically for children. In this summary, we spotlight three case studies that are particularly illustrative of emerging issues. For other examples, please see the full memorandum.

What is Artificial Intelligence?

With the recent rise of and attention given to artificial intelligence-based technologies, the terms artificial intelligence, machine learning, and deep learning have been used somewhat interchangeably by the general public to reflect the concept of replicating "intelligent" behavior in machines. For purposes of this memo, we use artificial intelligence to mean a subfield of computer science focused on building machines and software that can mimic such behavior. Machine learning is the subfield of artificial intelligence that focuses on giving computer systems the ability to learn from data. Deep learning is a subcategory of machine learning that uses neural networks to learn to represent and extrapolate from a dataset. In this memo, we focus on the ways that machine learning and deep learning processes impact children's lives and, ultimately, their human rights.
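To make the distinction concrete, the sketch below contrasts a hand-coded rule with a model that derives its answer from labeled examples. It is a minimal, purely illustrative Python example with invented data; it does not describe any system discussed in this memo, and a deep learning system would replace the simple nearest-neighbour step with a neural network trained on far larger datasets.

```python
# Minimal, illustrative contrast between rule-based software and machine learning.
# All data below is invented for illustration.

# A hand-coded rule: the programmer decides the logic in advance.
def rule_based_label(x: float) -> str:
    return "large" if x > 5.0 else "small"

# A very simple machine learning approach: the decision comes from data.
# Here the "training" consists of storing labeled examples and predicting
# the label of the closest one (a one-dimensional nearest-neighbour classifier).
training_data = [(1.0, "small"), (2.5, "small"), (6.0, "large"), (9.0, "large")]

def learned_label(x: float) -> str:
    closest = min(training_data, key=lambda pair: abs(pair[0] - x))
    return closest[1]

if __name__ == "__main__":
    for value in (0.5, 4.0, 7.5):
        print(value, rule_based_label(value), learned_label(value))
```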

What are Children's Rights?

The Convention on the Rights of the Child (CRC) is the most comprehensive legal framework that protects children--defined as human beings under 18 years old--as rights bearers.4 The CRC aims to ensure children's equality of treatment by States.5 More than a binding international document, the Convention is an ethical and legal framework for assessing states' progress or regress on issues of particular interest to children.6 Because of the exponential advancement of artificial intelligence-based technologies over the past few years, the current international framework that protects children's rights does not explicitly address many of the issues raised by the development and use of artificial intelligence.7 However, it does identify several rights that may be implicated by these technologies--such as the rights to privacy, to education, to play, and to non-discrimination--and thus provides an important starting place for any analysis of how children's rights may be positively or negatively affected by new technologies.8

CASE STUDY ONE – Children's Rights at Home: YouTube

Social media platforms that rely on streaming technologies are revolutionizing how adults and children consume media content. Platforms are working hard to ensure consumers maximize their time on these sites. YouTube9 stands out as the dominant player in this space, especially when it comes to today's youth. In 2017, 80% of U.S. children ages 6 to 12 used YouTube on a daily basis.10 YouTube was the 2016 and 2017 "top kids brand" according to Brand Love studies.11 In the 2017 study, 96% of children ages 6 to 12 were found to be "aware of YouTube," and 94% of children ages 6 to 12 said they "either loved or liked" YouTube.12 The YouTube phenomenon isn't just occurring in the United States, as YouTube has massive user bases in India, Moscow, across Europe, and beyond.13

In 2015, YouTube launched a dedicated platform called YouTube Kids as a means to provide safe, age-appropriate content for children.14 On both YouTube and YouTube Kids, machine learning algorithms are used to both recommend and mediate the appropriateness of content.15 YouTube representatives, however, have been opaque about differences in the input data and reward functions underlying YouTube Kids and YouTube.16 Lack of transparency about the input data used in algorithms makes it difficult for concerned parties to understand the distinction.17 More generally, the issue of algorithmic opacity is of concern with both YouTube and YouTube Kids, since YouTube, and not YouTube Kids, continues to account for the overwhelming majority of viewership of children's programming within the YouTube brand.18

The machine learning algorithms – primarily the recommendation engine employed by YouTube and YouTube Kids – are optimized to ensure that children view as many videos on the platform as possible.19 Children do not need to enter any information or affirm any acquired permissions to watch thousands of videos on YouTube and YouTube Kids.20 Touchscreen technology and the design of the platforms allow even young children substantial ease of access.21 Unfortunately, neither recommendation system appears to optimize for the quality or educational value of the content.22 Because companies developing children's programming are similarly concerned about maximizing viewers and viewer hours, their posts are often designed around YouTube's privileging of quantity, with little consideration for quality, including educational value.23 There is particular concern that, with YouTube and YouTube Kids' algorithm-derived "related-videos" recommendations, children can become easily trapped in "filter bubbles" of poor-quality content.24

Filtering algorithms also raise other problems, especially when a significant number of external entities are able to co-opt YouTube and YouTube Kids' algorithmic discovery processes to maximize viewer time, with sometimes startling consequences for children.25 For example, anyone over the age of 18 can create and upload content onto YouTube, and their creations are not regulated by professional protocols. YouTube and YouTube Kids' algorithmic discovery processes can be manipulated to push content that the pusher expects will perform well on the platform's "related-videos" engine, incentivizing sensational content.26 Prioritizing such content is one of the critical impacts of YouTube's use of machine learning algorithms.27 Kids are particularly susceptible to content recommendations, so shocking "related videos" can grab children's attention and divert them away from more child-friendly programming.28

Another challenge is children's potential exposure to YouTube and YouTube Kids-related advertising.29 YouTube's business model relies on tracking the IP addresses, search history, device identifiers, location and personal data of consumers so that it can categorize consumers by their interests, in order to deliver "effective" advertising.30 Some of the top advertising companies pay vast sums to guarantee that their ads are placed on YouTube channels with popular children's programs.31 Advertisers also routinely employ keywords such as "kid," "child," "toddler," "baby" or "toy" in order to better target children on YouTube.32 Although YouTube Kids claims to prohibit "interest-based advertising" and ads with "tracking pixels," advertising disguised as programming is ubiquitous on the YouTube Kids application.33 Although YouTube restricts paid advertising of food and beverages on YouTube Kids, for example, food companies may use their own branded channels to spotlight particular food and beverages that they produce, burying what are essentially ads within programs, and thereby target children with their products.34 Thus, corporations are finding ways to target minors that uphold the letter, but not the spirit, of the rules, and that may be opaque to parents and other concerned parties.35

CASE STUDY TWO – Children's Rights at Play: Smart Toys36

Children's leisure activities have changed significantly over the last two decades, from engaging with toys with little interactive capacity to smart toys that are capable of responding back.37 Through the use of weak artificial intelligence, these toys incorporate a set of techniques that allow computers to mimic the logic and interactions of humans.38 Such toys raise a host of human rights-related concerns. These include potential violations of a child's right to privacy, and whether corporations have (or should have) a duty to report sensitive information that is shared with a toy and stored online—such as indications that a child might be being abused or otherwise harmed.39

There are three nodes involved in smart toy processes, each of which comes with a set of challenges and vulnerabilities: the toy (which interfaces with the child); the mobile application, which acts as an access point for Wi-Fi connection; and the toy's/consumer's personalized online account, where data is stored. Such toys communicate with cloud-based servers that store and process data provided by the children who interact with the toy.40

Privacy concerns arising from this model can be illustrated by the CloudPets case, in which more than 800,000 toy accounts were hacked, exposing customers' (including children's) private information.41 Another example is that of the Hello Barbie doll, which raised civil society concerns around the interception of sensitive information and whether the doll allowed for pervasive surveillance in ways that were not transparent to users.42 In that case, the toy's manufacturer, Mattel – in collaboration with ToyTalk, Inc. – released an FAQ to try to address these pressing questions.43 First, the document states that the conversations between the doll and the child cannot be intercepted via Bluetooth technology because the conversation takes place over a secured TLS (HTTPS) network, making it impossible to connect the doll via Bluetooth.44 The document does advise against connecting the doll to third party Wi-Fi, which may be especially vulnerable to interception.45 Further, the document claims that the Hello Barbie doll is not always listening but becomes inactive when not expressly engaged.46 According to the document released by Mattel, the doll has speech recognition technology similar to that of other voice-activated devices, and is activated only when the user pushes down the doll's belt buckle.47 Finally, the company states that the doll does not ask questions that are intended to elicit personal information, in order to minimize the circumstances in which a child might divulge sensitive information during his/her conversation with the doll.48

Notably, parents can access their child's ToyTalk cloud account and listen to what their child has said, deleting any personal information.49 As a safeguard, ToyTalk also participates in the FTC's KidSafe Seal Program, a compliance program for websites and online services targeted towards children.50 There are two types of certificates that a website or online service can obtain: the KidSafe certificate and the KidSafe+ certificate.51 The KidSafe+ certificate imposes additional requirements, including compliance with COPPA.52 Because Hello Barbie targets children in the age range protected by COPPA, ToyTalk makes sure to comply with not only the basic KidSafe requirements but the additional requirements for KidSafe+.53 For example, the communications between Hello Barbie and a child are encrypted and stored on a trusted network.54

One emergent concern, despite these safeguards, is whether a company has a duty to report or otherwise "red flag" sensitive information shared through their toys—for example, children who reveal they are being abused, or children who share suicidal thoughts or other self-harm related behavior.55 Existing privacy laws and common law tort duties fall short of providing directly relevant protection.56 For example, while COPPA protects the privacy rights of minors under the age of thirteen, requiring companies to obtain parental consent and to disclose what information is being collected about a minor, it does not impose any reporting requirements regarding suspected child abuse and neglect.57

Ultimately, most mechanisms for tackling these challenges have been designed by the corporations themselves.58 In the case of Hello Barbie, ToyTalk has created automatic responses for serious conversations such as bullying or abuse. Such responses include "that sounds like something you should talk to a grown-up about."59 While an important step towards addressing this issue, this approach potentially pushes any responsibility for acting to the parents or to the child herself. It is unclear how many children would act on this response to report problems to a grown-up, or what it means for children if an adult in their household is the one perpetrating the harm.
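The ToyTalk approach described above amounts to keyword-triggered canned responses. The Python sketch below is a hypothetical illustration of that general pattern only: the keyword lists, response text and flagging step are invented, not ToyTalk's actual implementation. It also marks the point where a "red flag" to a trusted adult or authority could be raised, which is the open design question this case study highlights.

```python
# Hypothetical sketch of keyword-triggered responses to sensitive disclosures.
# Keywords, responses and the flagging step are invented for illustration;
# real systems need far more care (context, languages, false positives, and
# clear policies on who, if anyone, gets notified).

SENSITIVE_KEYWORDS = {
    "abuse":     ["hit me", "hurts me", "abuse"],
    "self_harm": ["hurt myself", "kill myself"],
    "bullying":  ["bully", "bullied"],
}

CANNED_RESPONSE = "That sounds like something you should talk to a grown-up about."

def respond_to_child(utterance: str):
    """Return (response_text, flag_category or None) for a child's utterance."""
    lowered = utterance.lower()
    for category, phrases in SENSITIVE_KEYWORDS.items():
        if any(phrase in lowered for phrase in phrases):
            # Design question raised in the memo: should this also notify a
            # trusted adult or authority, and if so, whom?
            return CANNED_RESPONSE, category
    return "Tell me more about your day!", None

if __name__ == "__main__":
    print(respond_to_child("A kid at school bullied me today"))
    print(respond_to_child("I built a sandcastle"))
```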

CASE STUDY THREE – Children's Rights at School: AI in Education

AI-based tools have three general orientations in terms of their use in schools: learner-facing, teacher-facing and system-facing.60 Adaptive learning systems that are learner-facing employ algorithms, assessments, student feedback and various media to deliver material tailored to each student's needs and progress.61 For example, AI may be used to enhance social skills, especially for children with special needs. One company that employs AI for this purpose is Brain Power, which addresses the issue of autism through a wearable computer.62 Another example would be the deployment of AI to help high school students build career skills by using GPA calculators and language learning applications. Duolingo is one such language learning application, which gives students personalized feedback in over 300,000 classrooms around the globe.63 Under the teacher-facing category, AI helps teachers in administrative tasks such as grading papers and detecting plagiarism. For example, Carnegie Learning is working on a startup called Lumilo, building an AI augmented reality assistant that will keep teachers in the loop as students work on assignments.64

In addition to the software and tools touched on above, AI-incorporating robots are increasingly transforming educational methods and practices. Robots are being brought to classrooms in a way that alters how students learn, calling attention to a wide variety of applications. Even though educational robots promise great benefits to children—such as personalized learning, helping kids develop social skills, enabling distance education for children in remote regions, etc.—they also pose risks.65 Human rights that may be positively or negatively affected by their use include the right to education, as well as the right to protection from exploitation and abuse, and the protection of children with disabilities.

Surveillance of children is another use case that is booming due to advances in machine learning and deep learning techniques.66 Although some degree of surveillance promises advanced security, surveillance may also leave children more vulnerable than previously.67 On the positive side, police in New Delhi recently trialed facial recognition technology and identified almost 3,000 missing children in four days.68 However, surveillance can also create privacy, safety and security risks and limit children's ability and willingness to take risks and otherwise express themselves, especially in educational contexts.69 Always-on surveillance practices that continuously monitor everything from children's engagement in the classroom to their emotional states throughout the day threaten the creativity, freedom of choice and self-determination of children by potentially fostering an overabundance of self-censorship and social control.70 Once automated surveillance technologies are deployed at schools and in classrooms, children's rights such as the right to privacy, the right not to be subjected to discrimination, the right to flourish, and freedom of expression may be compromised due to the panopticon environment in which children are confined.71 The risks vary depending on who does the surveilling (governments, teachers, parents, etc.) and for what purposes.72 However, the potentially chilling effect of having cameras constantly trained on children is undeniable.73 It is important to consider and evaluate the actors involved, their purposes, the tools and methods they'll use, and the safeguards they'll put in place, so that the emerging trend of classroom surveillance—and surveillance more generally—helps children more than it hurts.74
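As a concrete illustration of the learner-facing category, the sketch below shows one simple way an adaptive system might choose the next exercise from a rolling estimate of a student's recent performance. It is a hypothetical, heavily simplified Python example with an invented item pool and thresholds; it is not how Duolingo, Brain Power or Lumilo actually work.

```python
# Hypothetical sketch of an adaptive, learner-facing exercise selector.
# Item pool, difficulty scale and thresholds are invented for illustration.

from collections import deque

ITEM_POOL = {
    1: ["count to 5", "match shapes"],
    2: ["simple addition", "spell short words"],
    3: ["two-digit addition", "read a short story"],
}

class AdaptiveTutor:
    def __init__(self, window: int = 5):
        # Rolling record of the most recent answers (True = correct).
        self.recent = deque(maxlen=window)
        self.level = 1

    def record_answer(self, correct: bool) -> None:
        self.recent.append(correct)
        if len(self.recent) < self.recent.maxlen:
            return  # wait for enough evidence before adjusting difficulty
        accuracy = sum(self.recent) / len(self.recent)
        # Step difficulty up or down based on recent accuracy.
        if accuracy > 0.8 and self.level < max(ITEM_POOL):
            self.level += 1
            self.recent.clear()
        elif accuracy < 0.4 and self.level > min(ITEM_POOL):
            self.level -= 1
            self.recent.clear()

    def next_exercise(self) -> str:
        return ITEM_POOL[self.level][0]

if __name__ == "__main__":
    tutor = AdaptiveTutor()
    for correct in [True, True, True, True, True]:
        tutor.record_answer(correct)
    print(tutor.level, tutor.next_exercise())  # difficulty has stepped up
```

Production systems typically estimate mastery with statistical models over many interaction signals, which is precisely why the data-collection and surveillance questions raised in this case study follow so closely behind the educational benefits.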

How Corporations and Governments Can Help Mitigate Harmful Impacts of AI on Children

Microsoft and Google have both established principles for the ethical use of AI.75 However, neither has public-facing policies specific to AI and children.76 Several technology centers, trade associations, and computer science groups have also drafted ethical principles with regards to AI.77 However, most have excluded explicit reference to child rights, or discussion of the risks to children of AI-incorporating technologies more generally.78

Like corporations, governments around the world have adopted strategies for becoming leaders in the development and use of AI, fostering environments congenial to innovators and corporations.79 However, in most cases, policymakers have not directly addressed how the rights of children fit into their national strategy.80 While France's strategy deals with the AI-related issues of achieving gender equality and implementing digital literacy through education, the broader scope of impact on children is missing.81 An example of a country that has taken a more proactive look at the potential benefits of AI for children is India, whose AI initiative focuses on using AI in education, such as creating adaptive learning tools for customized learning, integrating intelligent and interactive tutoring systems, adding predictive tools to inform pre-emptive action for students predicted to drop out of school, and developing automated rationalization of teachers and customized professional development courses.82

Ultimately, both corporations and governments would be well advised to think through how their AI strategies can be strengthened to maximize the benefits and minimize the harms of AI for children today, and in the future.

Recommendations

A thorough set of recommendations is beyond the scope of this memo. However, some initial suggestions are touched on below:

Corporations

• Incorporate an inclusive design approach when developing child-facing products, which maximizes gender, geographic and cultural diversity, and includes a broad range of stakeholders, such as parents, teachers, child psychologists, and—where appropriate—children themselves.

• Adopt a multi-disciplinary approach when developing technologies that affect children, and consult with civil society, including academia, to identify the potential impacts of these technologies on the rights of a diverse range of potential end-users.

• Implement safety by design and privacy by design for products and services addressed to or commonly used by children.

• Develop plans for handling especially sensitive data, including revelations of abuse or other harm that may be shared with the company through its products.

Educators

• Be aware of and consider using artificial intelligence-based tools that may enhance learning for students, such as specialized products that can assist non-traditional learners and children with special needs.

• Avoid the overuse of facial and behavioral recognition technologies, including for security purposes, in ways that may constrain learning and appropriate risk taking.

Governments

• Set up awareness campaigns that help parents understand the importance of privacy for their children. Parents should be aware of how their children's data is being used and processed for diverse purposes, including for targeted ad campaigns or non-educative social media recommendations. They should also be aware of the impacts of posting pictures or other information about their children to social media, and the ways that what they post can have a dramatic impact on their children's future.

• Adopt a clear, comprehensive framework for corporations that imposes a duty of care connected to the handling of children's data, and provides an effective remedy (judicial, administrative or other) for breach. This framework should incorporate human rights principles.

• Establish a comprehensive national approach to the development of artificial intelligence that pays specific attention to the needs of children as rights-bearers and integrates children into national policy plans.

Parents

• Carefully review and consider avoiding the purchase and use of products that do not have clear policies on data protection, security, and other issues that impact children.

• Incorporate children into the decision-making process about how their data will be used, including whether to post their information to social media sites and whether to engage with smart toys, helping children understand the potential short- and long-term impacts of that use.

• Identify how schools might be using artificial intelligence-based technologies to assist or surveil children, and raise concerns if some of the policies or procedures are unclear or seem inappropriate—for example, by disincentivizing creativity and exploration. Encourage the use of artificial intelligence-based technologies when they seem likely to enhance learning and when that positive benefit has been confirmed by peer-reviewed research.

Conclusion

The role of artificial intelligence in children's lives—from how children play, to how they are educated, to how they consume information and learn about the world—is expected to increase exponentially over the coming years. Thus, it's imperative that stakeholders come together now to evaluate the risks of using such technologies and assess opportunities to use artificial intelligence to maximize children's wellbeing in a thoughtful and systematic manner. As part of this assessment, stakeholders should work together to map the potential positive and negative uses of AI on children's lives, and develop a child rights-based framework for artificial intelligence that delineates rights and corresponding duties for developers, corporations, parents, and children around the world.

The authoring team of this memorandum comprises Mélina Cardinal-Bradette, Diana Chavez-Varela, Samapika Dash, Olivia Koshy, Pearlé Nwaezeigwe, Malhar Patel, Elif Sert, and Andrea Trewinnard, who conducted their research and writing under the supervision of Alexa Koenig of the UC Berkeley Human Rights Center. The authors thank all members of the Child Rights Working Group for their advice and input, and especially thank Jennie Bernstein for her careful edits and patient guidance throughout the process of researching and drafting this memo.

References

1 Cedric Villani, "For a Meaningful Artificial Intelligence: Towards a French and European Strategy," March 8, 2018, available at https://www.aiforhumanity.fr/pdfs/MissionVillani_Report_ENG-VF.pdf.
2 "Office of Innovation, UNICEF Office of Innovation," UNICEF Innovation Home Page, available at https://www.unicef.org/innovation/.
3 "Human Rights Center," Human Rights Center Home Page, available at https://humanrights.berkeley.edu.
4 UN General Assembly, "Convention on the Rights of the Child, 20 November 1989," United Nations, Treaty Series, vol. 1577, p. 3, Article 1.
5 United Nations News, "UN lauds Somalia as country ratifies landmark children's rights treaties," 20 January 2015, available at https://news.un.org/en/story/2015/01/488692-un-lauds-somalia-country-ratifies-landmark-childrens-rights-treaty.
6 UNICEF, "Convention on the Rights of the Child - Frequently Asked Questions," UNICEF, 30 November 2005, available at https://www.unicef.org/crc/index_30229.html.
7 Geraldine Van Bueren, "The International Law on the Rights of the Child," in International Studies in Human Rights (Dordrecht/Boston/London: Martinus Nijhoff Publishers, 1995), 10-15.
8 UN General Assembly, "Convention on the Rights of the Child," United Nations, Treaty Series, vol. 1577 (November 20, 1989), p. 3, Article 20 and Article 22.
9 YouTube is a subsidiary of Google, whose parent company is Alphabet, Inc.
10 "2017 Brand Love Study: Kid & Family Trends," Smarty Pants: the Youth and Family Experts (2017), 14.
11 Ibid. at 7.
12 Ibid.
13 Alexis Madrigal, "Raised by YouTube," Atlantic 322, no. 4 (November 2018): 72–80.
14 "Introducing the Newest Member of Our Family, the YouTube Kids App--Available on Google Play and the App Store," Official YouTube Blog, https://youtube.googleblog.com/2015/02/youtube-kids.html; "YouTube Kids," https://www.youtube.com/yt/kids/.
15 Karen Louise Smith and Leslie Regan Shade, "Children's Digital Playgrounds as Data Assemblages: Problematics of Privacy, Personalization and Promotional Culture," Big Data & Society, Vol. 5 (2018), at 5.
16 Adrienne LaFrance, "The Algorithm That Makes Preschoolers Obsessed With YouTube Kids," The Atlantic, July 27, 2017, https://www.theatlantic.com/technology/archive/2017/07/what-youtube-reveals-about-the-toddler-mind/534765/.
17 "Terms of Service - YouTube," https://www.youtube.com/static?template=terms, (November 13, 2018); Matt O'Brien, "Consumer Groups Say YouTube Violates Children's Online Privacy," Time.com, April 10, 2018, 1.
18 Madrigal, "Raised by YouTube," 80.
19 Matt O'Brien, "Consumer Groups Say YouTube Violates Children's Online Privacy," Yahoo! News, April 10, 2018, https://news.yahoo.com/consumer-groups-youtube-violates-children-012240300.html.
20 O'Brien, "Consumer Groups Say YouTube Violates Children's Online Privacy."
21 Nelly Elias and Idit Sulkin, "YouTube Viewers in Diapers: An Exploration of Factors Associated with Amount of Toddlers' Online Viewing," Cyberpsychology, November 23, 2017, at 2, available at https://cyberpsychology.eu/article/view/8559/7739.
22 Madrigal, "Raised by YouTube," 79.
23 Adrienne LaFrance, "The Algorithm That Makes Preschoolers Obsessed With YouTube Kids."
24 Ibid.
25 James Bridle, "Something Is Wrong on the Internet," Medium, November 6, 2017, at https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2.
26 Elias and Sulkin, "YouTube Viewers in Diapers," 2.
27 Ibid.
28 Ibid.
29 Sarah Perez, "Over 20 advocacy groups complain to FTC that YouTube is violating children's privacy law," TechCrunch, April 9, 2018, https://techcrunch.com/2018/04/09/over-20-advocacy-groups-complain-to-ftc-that-youtube-is-violating-childrens-privacy-law/.
30 Ibid.
31 "Request to Investigate Google's YouTube Online Service and Advertising Practices for Violating the Children's Online Privacy Protection Act," 2, https://www.law.georgetown.edu/wp-content/uploads/2018/08/Filed-Request-to-Investigate-Google%E2%80%99s-YouTube-Online-Service-and-Advertising-Practices-for-Violating-COPPA.pdf.
32 Ibid.
33 Sapna Maheshwari, "New Pressure on Google and YouTube Over Children's Data," NY Times, September 20, 2018, https://www.nytimes.com/2018/09/20/business/media/google--children-data.html.
34 Cecilia Kang, "YouTube Kids App Faces New Complaints Over Ads for Junk Food," NY Times, December 21, 2017, sec. Technology, https://www.nytimes.com/2015/11/25/technology/youtube-kids-app-faces-new-complaints.html.
35 Smith and Shade, "Children's Digital Playgrounds," 5.
36 Laura Rafferty, Patrick C. K. Hung, Marcelo Fantinato, Sarajane Marques Peres, Farkhund Iqbal, Sy-Yen Kuo, and Shih-Chia Huang, "Towards a Privacy Rule Conceptual Model for Smart Toys," in Computing in Smart Toys, 85-102, available at https://link.springer.com/content/pdf/10.1007%2F978-3-319-62072-5_6.pdf.
37 Chris Nickson, "How a Young Generation Accepts Technology," A Technology Society, September 18, 2018, available at http://www.atechnologysociety.co.uk/how-young-generation-accepts-technology.html.
38 Rafferty et al., "Towards a Privacy Rule," https://link.springer.com/content/pdf/10.1007%2F978-3-319-62072-5_6.pdf.
39 Benjamin Yankson, Farkhund Iqbal, and Patrick C. K. Hung, "Privacy Preservation Framework for Smart Connected Toys," in Computing in Smart Toys, 149-164, https://link.springer.com/content/pdf/10.1007%2F978-3-319-62072-5_6.pdf.
40 Rafferty et al., "Towards a Privacy Rule," https://link.springer.com/content/pdf/10.1007%2F978-3-319-62072-5_6.pdf.
41 Alex Hern, "CloudPets stuffed toys leak details of half a million users," The Guardian, February 28, 2017, https://www.theguardian.com/technology/2017/feb/28/cloudpets-data-breach-leaks-details-of-500000-children-and-adults.
42 Corinne Moini, "Protecting Privacy in the Era of Smart Toys: Does Hello Barbie Have a Duty to Report," 25 Cath. U. J. L. & Tech 281 (2017), 4.
43 "Hello Barbie FAQs," at 4-5.
44 Corinne Moini, "Protecting Privacy in the Era of Smart Toys: Does Hello Barbie Have a Duty to Report," Catholic University Journal of Law and Technology, no. 25 (2017): 4; Hello Barbie FAQ.
45 James Vlahos, "Barbie Wants To Get To Know Your Child," New York Times, September 16, 2015.
46 Mattel, "Hello Barbie Frequently Asked Questions," 1 (2015), http://hellobarbiefaq.mattel.com/wp-content/uploads/2015/12/hellobarbie-faq-v3.pdf.
47 Ibid.
48 Ibid.
49 Ibid.
50 Federal Trade Commission, "KidSafe Seal Program: Certification Rules Version 3.0 (Final)," (2014), https://www.ftc.gov/system/files/attachments/press-releases/ftc-approves-kidsafe-safeharbor-program/kidsafe seal-programs certification rules ftcapprovedkidsafe-coppa-guidelinesfeb_2014.pdf [hereinafter KIDSAFE SEAL PROGRAM].
51 Federal Trade Commission, "KidSafe Seal Program," (2014).
52 Corinne Moini, "Protecting Privacy in the Era of Smart Toys: Does Hello Barbie Have a Duty to Report," 25 Cath. U. J. L. & Tech 281 (2017), 12-291.
53 Moini, "Protecting Privacy in the Era of Smart Toys: Does Hello Barbie Have a Duty to Report," 4.
54 Hello Barbie FAQs, supra note 3, at 4-5.
55 CAL. PENAL CODE 11165.7 (2016).
56 "Phrasee's AI Ethics Policy," (2018), https://phrasee.co.
57 "Children's Online Privacy Protection Rule," 16 C.F.R. 312.1 (2001).
58 James Vincent, "Google Releases Free AI Tool to Help Companies Identify Child Sexual Abuse Material," The Verge, September 3, 2018, https://www.theverge.com/2018/9/3/17814188/google-ai-child-sex-abuse-material-moderation-tool-internet-watch-foundation.
59 Mattel, "Hello Barbie Frequently Asked Questions."
60 Anissa Baker Smith, "Educ-AI-tion Rebooted?," Nesta, https://www.nesta.org.uk/report/education-rebooted/.
61 Ibid.
62 Brain Power, "About Us," http://www.brain-power.com/.
63 Jackie Snow, "AI Technology is disrupting the traditional classroom," https://www.pbs.org/wgbh/nova/article/ai-technology-is-disrupting-the-traditional-classroom/.
64 Jackie Snow, "AI Technology is disrupting the traditional classroom," https://www.pbs.org/wgbh/nova/article/ai-technology-is-disrupting-the-traditional-classroom/.
65 Jon-Chao Hong, Kuang-Chao Yu, and Mei-Yung Chen, "Collaborative Learning in Technological Project Design," International Journal of Technology and Design Education 21, no. 3 (August 2011): 335–47; Elvis Mazzoni and Martina Benvenuti, "A Robot-Partner for Preschool Children Learning English Using Socio-Cognitive Conflict," Journal of Educational Technology & Society 18, no. 4 (2015): 474–85; Moshe Barak and Yair Zadok, "Robotics Projects and Learning Concepts in Science, Technology and Problem Solving," International Journal of Technology and Design Education 19, no. 3 (August 2009): 289–307.
66 Emmeline Taylor, "Surveillance Schools: A New Era in Education," in Surveillance Schools: Security, Discipline and Control in Contemporary Education (London: Palgrave Macmillan UK, 2013), 15–39, https://doi.org/10.1057/9781137308863_2.
67 Joy Buolamwini and Timnit Gebru, "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification," Conference on Fairness, Accountability, and Transparency: Proceedings of Machine Learning Research (2018), 81.
68 Anthony Cuthbertson, "Police Trace 3,000 Missing Children in Just Four Days Using Facial Recognition Technology," The Independent, April 24, 2018, https://www.independent.co.uk/life-style/gadgets-and-tech/news/india-police-missing-children-facial-recognition-tech-trace-find-reunite-a8320406.html.
69 Article 19, "The Global Principles on Protection of Freedom of Expression and Privacy," op. cit.
70 Rich Haridy, "AI in Schools: China's Massive and Unprecedented Education Experiment," New Atlas - New Technology & Science News, May 28, 2018, https://newatlas.com/china-ai-education-schools-facial-recognition/54786/.
71 Article 19, "Privacy and Freedom of Expression in the Age of Artificial Intelligence," (2018), 8, https://www.article19.org/wp-content/uploads/2018/04/Privacy-and-Freedom-of-Expression-In-the-Age-of-Artificial-Intelligence-1.pdf.
72 William Michael Carter, "Big Brother Facial Recognition Needs Ethical Regulations," Phys.org, July 23, 2018, https://phys.org/news/2018-07-big-brother-facial-recognition-ethical.html#jCp.
73 Ibid.
74 GDPR Article 5(1)(a).
75 Microsoft, "Salient Human Rights Issues," Report - FY17, file:///Users/dreatrew/Downloads/Microsoft_Salient_Human_Rights_Issues_Report-FY17.pdf; Google, "Responsible Development of AI" (2018).
76 Microsoft, "The Future Computed: Artificial Intelligence and Its Role in Society" (2018).
77 Alex Hern, "'Partnership on AI' Formed by Google, Facebook, Amazon, IBM and Microsoft," The Guardian, September 28, 2016.
78 John Gerard Ruggie, "Global Governance and New Governance Theory: Lessons from Business and Human Rights," Global Governance 20 (2014), 5, http://journals.rienner.com/doi/pdf/10.5555/1075-2846-20.1.5.
79 Council of Europe, "Recommendation CM/REC (2018)7 of the Committee of Ministers to member States on Guidelines to respect, protect and fulfil the rights of the child in the digital environment," July 4, 2018.
80 Cedric Villani, "For a Meaningful Artificial Intelligence: Towards a French and European Strategy," March 8, 2018.
81 Ibid.
82 NITI Aayog, "Discussion paper: National Strategy for Artificial Intelligence," June 2018, 7, http://niti.gov.in/writereaddata/files/document_publication/NationalStrategy-for-AI-Discussion-Paper.pdf.

Attribution-ShareAlike 3.0

Read the full report here: unicef.org/innovation/reports/memoAIchildrights
