Topics for consideration 2019

Voice assistants, constantly listening to your private life

Cloud computing under the GDPR

Data sharing: public interest issues

Political communication & GDPR: updating recommendations and clarifying best practices

The reuse of data accessible “online” by the research community: challenges and prospects

How is children’s data protected?

Voice assistants, constantly listening to your private life

Smart speakers, voice-controlled personal assistants, conversational interfaces or chatbots: the vocabulary is vast and attests to the current significance of voice assistants. First rolled out on phones, and later on speakers and headphones, voice assistants are gradually being added to our cars, our household appliances, and much more. Considered the butlers of the 21st century, these assistants aim to help users with their day-to-day lives: answering questions, playing music, providing weather forecasts, adjusting heating, turning on lights, booking a chauffeur-driven vehicle/taxi, buying tickets, etc. Yet, whilst the aim of industrialists is to render technology invisible and to simplify exchanges, it is now clear that these technologies pose some real issues relating to users’ privacy.

Voice assistants are often paired with smart or connected speakers. However, it is important to note that speakers are only vectors and that assistants can be integrated into any type of appliance. In practice, voice assistants are devices with a speaker and a microphone, more or less developed computational abilities depending on the assistant and, in nearly all cases, the ability to connect to the internet.

Having appeared in mass consumption goods in the early 2010s, many voice assistants are now being rolled out. Depending on the activity carried out by the enterprises developing them, these assistants meet very specific needs: online sales, listening to music, task planning, home automation, etc.

These products are offered by major international groups such as GAFAM (Google, Amazon, Facebook, Apple, Microsoft), BATX (Baidu, Alibaba, Tencent, Xiaomi) and even smaller enterprises (such as Snips) with different economic positions.

How do these devices work?

Vocal assistants generally work according to five main steps. Take the example of a “smart” speaker:

Step 1: The user “wakes the speaker up” using a key phrase (“Hey Snips”/“Ok Google”/“Hey Alexa”/etc.)
The speaker is always listening for this key phrase. It does not record anything and does not perform any operations until it hears this key phrase. This step is performed locally and does not require any external exchanges.

Step 2 (optional): The speaker recognises the user
Some assistant models ask the user to pre-record samples of his/her voice in order to create a voice profile and therefore recognise the user during interactions with the assistant. The purpose of identifying the user is to be able to suggest different services to the device’s different users (parents, children, guests, etc.). We call this vocal biometrics. It should be noted that biometric data are considered sensitive data within the meaning of the GDPR, and as such can only be processed with the data subject’s explicit consent.

Step 3: The user makes a request
The phrase spoken by the user is recorded locally by the assistant. This audio recording can then be: stored in the device, in order to allow the user to manage his/her data (e.g. a smart speaker with a vocal assistant from the company Snips); or sent to the cloud, in other words to the company’s processing servers (which is the case, for example, for Amazon Echo, Google Home and other speakers).

Step 4: The audio recording is transcribed into text and interpreted in order to provide a suitable response
Speech is automatically transcribed into text (speech-to-text) and interpreted using Natural Language Processing technology in order to provide a suitable response. A response is then synthesized (text-to-speech) and played, and/or an order is executed (open the blinds, increase the temperature, play music, answer a question, etc.). Whether these different processing operations are performed locally or on remote servers, the device (or its services) may store: a history of transcribed requests, in order to enable the individual to view them and the publisher to adapt the service’s features; a history of audio requests, in order to enable the individual to listen to them and the publisher to improve its speech processing technology; and metadata relating to the request, such as the date, time, account name, etc.

Step 5: The speaker returns to “stand-by”
The assistant returns to a passive state of listening and waits to hear the key phrase before being reactivated.
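To make the sequence above easier to follow, the short sketch below restates the five steps as a simplified control loop written in Python. It is purely illustrative: the function names, wake phrases and return values are placeholders invented for this example and do not correspond to any vendor’s actual implementation or API, and real devices typically perform the wake-word detection of step 1 in dedicated on-device software rather than in application code of this kind.

# Illustrative sketch only: the five steps above as a simplified control loop.
# All names and values below are hypothetical placeholders, not a real assistant's API.
import time
from typing import Optional

WAKE_PHRASES = ("hey snips", "ok google", "hey alexa")  # example key phrases only


def detect_key_phrase(audio_frame: bytes) -> bool:
    """Step 1: local wake-word detection; nothing is stored or sent before a match."""
    return False  # placeholder: a real device runs an on-device keyword spotter here


def recognise_speaker(audio_frame: bytes) -> Optional[str]:
    """Step 2 (optional): match the voice against pre-recorded profiles (vocal biometrics)."""
    return None  # placeholder: this biometric processing requires explicit consent under the GDPR


def record_request() -> bytes:
    """Step 3: record the spoken request, kept on the device or sent to remote servers."""
    return b""  # placeholder audio buffer


def transcribe_and_interpret(audio: bytes) -> str:
    """Step 4: speech-to-text followed by natural language processing."""
    return "what is the weather"  # placeholder transcription/intent


def synthesise_and_reply(intent: str) -> None:
    """Step 4 (continued): text-to-speech answer and/or execution of a command."""
    print(f"Responding to: {intent}")


def assistant_loop(max_frames: int = 100) -> None:
    """Chains steps 1 to 4; between frames the device idles in stand-by (step 5)."""
    for _ in range(max_frames):
        frame = b""  # placeholder for a short audio frame captured by the microphone
        if detect_key_phrase(frame):
            speaker = recognise_speaker(frame)  # may be None when no voice profile exists
            audio = record_request()            # only from this point is audio stored or transmitted
            intent = transcribe_and_interpret(audio)
            synthesise_and_reply(intent)
        time.sleep(0.01)  # passive listening pause before the next frame


if __name__ == "__main__":
    assistant_loop()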

What are the stakes and what are the recommendations to protect privacy?

The voice is an essential component of human identity. As such, it is extremely personal. Personal characteristics are found both in acoustics (identity, age, gender, geographical and sociocultural origin, physiognomy, state of health and emotional state, etc.) and in linguistics (meaning of the words used, vocabulary used, etc.). Thus, providing voice recordings is not an inconsequential act.

Furthermore, by developing outside of telephones, vocal assistants have progressively transitioned from a “personal” status to a “shared” status. They have now permeated intimate areas and areas shared by several individuals such as living rooms, bedrooms and even the insides of vehicles. These paradigm shifts raise new questions such as those relating to the collection and processing of third-party data.

Lastly, as stated above, in most cases, vocal assistants rely on the processing of speech signals on remote servers. As a result, although speech is generally associated with a certain degree of volatility, vocal requests are still recorded in the cloud, as are text requests when entered into some search engines.

The CNIL has identified three areas to be aware of, which add to the various questions that users may be faced with:

The confidentiality of exchanges
While they are permanently on stand-by, vocal assistants can be activated unexpectedly and record a conversation when they consider that they have detected the key phrase. To better protect users’ privacy and to avoid these types of malfunctions, we advise:
• favouring the use of devices with a button to deactivate the microphone;
• turning off the microphone, turning off the device or unplugging it when users do not want to be listened to (some devices do not include an on/off button and must be unplugged);
• warning third parties/guests of the potential recording of any conversations (or turning off the microphone when they are present);
• supervising children’s interactions with these devices (staying in the same room, turning off the device when not with them); in this case, checking that the device is configured by default to filter information targeting children.

The monetisation of intimate data
As they mainly target homes by controlling smart devices and entertainment services, devices with a voice assistant are now central to households. Users’ profiles are therefore fed by the different interactions between them and the assistant (e.g. life habits: the time they wake up, heating settings, cultural taste, purchases made, interests, etc.). To control the use of such data, we recommend:
• syncing only services that are actually useful to the user, while weighing the risks of sharing intimate data or sensitive features (opening doors, alarm systems, etc.);
• being aware that any speech around the device can serve to fuel marketing profiles;
• not hesitating to contact the support service of the company providing the assistant in case of questions.

The lack of a screen
The purpose of vocal assistants is to provide a human-machine interface without any visual media. However, to both configure these devices and manage data, tools such as dashboards are still necessary. Without an external screen or any display features, it is hard to see which traces are recorded, to assess the relevance of suggestions, to learn more or to have access to answers from other sources. In order to manage how such data is used, we recommend regularly visiting the dashboard (or the application) provided with the assistant to delete the history of conversations or questions asked and to personalise the device based on needs, e.g. configuring the default search engine or source of information used by the assistant.

What future developments and CNIL works are yet to come?

Manufacturers are very active in improving voice assistants’ abilities and security. Although some are increasingly interested in putting an end to the use of a key phrase to wake assistants, others are working to carry out processing operations to separate sound sources in order to improve systems’ listening capacity - for example, to reduce TV volume, to separate one person’s speech from another’s, etc. Lastly, professionals are also looking into new places where these systems can be used, such as in hotels and workspaces, thereby reviving questions relating to the effective use of data.

The topic of vocal assistants was identified as an area for works to be carried out by the CNIL in 2017. The latter quickly made contact with various stakeholders in order to gain a thorough understanding of the systems deployed. It has led important discussions within the CNIL’s digital innovation laboratory (LINC), its structure dedicated to experiments and studies on emerging digital use trends. As such, a thematic dossier comprised of articles and interviews with professionals was published on its website.

In 2019, the CNIL plans to pursue these works, by continuing to exchange with the industrialists and academics working on these topics, but also by continuing tests on these devices. In particular, the aim is to study how we can guarantee that users are well informed of the data collected, the uses made of such data and the means at their disposal to exercise their rights of access, modification, erasure and portability, as well as to assess the security of the data processed and how the artificial intelligence algorithms that are inherent to these devices learn.

Cloud computing under the GDPR

In 2012, the CNIL and the G29 published recommendations on cloud computing. Since then, use of the cloud has only increased: Gartner1 recently estimated that public cloud revenue had grown by 21.4% in 2018, and that this revenue would rise to $300 billion by 2021. At the same time, Eurostat2 stated that 55% of enterprises used the cloud for critical functions (finance, accounting, CRM or business applications). With the advent of the GDPR – which modernises the obligations of data controllers and subcontractors alike – a situational analysis of how the cloud is used within organisations is necessary. Have the 2012 recommendations been applied? Is personal data protection guaranteed when migrating data to the cloud?

Concentration of risks and limited bargaining power

In December 2018, a survey3 published by the Independent Oracle Users Group estimated that one quarter of all enterprises’ data were now stored on the cloud. Of course, such data includes industrial data, but it also includes a large amount of often-sensitive personal data, which are usually hosted and processed within the infrastructures of a very limited number of major cloud computing providers. Yet, opposite these players, those who wish to use the cloud actually have very limited contractual bargaining power.

1/4 of enterprises’ data stored on the cloud (according to the Independent Oracle Users Group)

The movement of data within these infrastructures can increase systemic risks, even when the enterprises providing these services – whether data controllers or subcontractors – fall within the material scope of the GDPR.

Furthermore, in 2018, the adoption of the Cloud Act by the United States – which provides American authorities with a legal framework enabling them to access data beyond their borders – and the European proposal for a regulation on terrorist content online present the challenge of finding a balance in terms of privacy when using these services. The stakes in terms of data protection, and more widely in terms of economy and strategy, are high. It is therefore vital that the CNIL carries out a technical and precise situational analysis of these infrastructures and services.

1 https://www.gartner.com/en/newsroom/press-releases/2018-04-12-gartner-forecasts-worldwide-public-cloud-revenue-to-grow-21-percent-in-2018
2 https://ec.europa.eu/eurostat/statistics-explained/index.php/Cloud_computing_-_statistics_on_the_use_by_enterprises
3 “2019 IOUG Databases in the Cloud Survey” by Joseph McKendrick, produced by Unisphere Research, a division of Information Today, Inc., December 2018. https://www.ioug.org/d/do/8551

The future of the cloud examined by the CNIL

The CNIL and the cloud, an old story
The CNIL has been working on this topic since the early 2010s, alongside other European data protection authorities. After a broad consultation, in June 2012, the CNIL published its first recommendations relating to cloud computing, which were followed by the WP29 one month later. The concept of joint responsibility thereby entered CNIL doctrine. Today, this concept is also provided for in the GDPR, and processors - who up until now had no legal responsibilities with regard to the Data Protection Act - now have responsibilities.

The CNIL would first like to delve further into the technical aspects of the cloud to better understand the infrastructures used by the main cloud service providers and, more generally, this ecosystem. As a second step, it will analyse the constraints and risks which client companies are actually faced with today. Lastly, these works will allow the CNIL to update its recommendations and to identify new levers which should be mobilised to regulate this sector.

In this context, the aim will be to develop the following lines of focus:

Clauses between data controllers and processors
For cloud services considered processors, what margins for negotiation are clients afforded in terms of security and privacy protection? What clauses govern the relationships between data controllers and processors? Do these clauses cover all items set out in Article 28 of the GDPR?

Information on data localisation
Several improvements have been noted in the information provided to clients and to data controllers as regards data localisation. But how do data localisation clauses take account of exceptional situations such as cases of force majeure or hardware connections by administrators or remote support?

Data breach notifications
How have cloud suppliers set up personal data breach notifications? Particularly when such suppliers are regarded as processors, what means are provided to their clients, the data controllers, to meet their obligations (whether for a breach detected by the service provider or for a breach notified to the client by a third party)?

The impact of legislation
What impact do some specific laws - and particularly those with an anti-terrorist purpose - have on cloud contracts and on privacy protection? This applies to the Freedom Act and the Cloud Act in the United States as well as to the draft regulation “on preventing the dissemination of terrorist content online” proposed by the European Commission in September 2018.

Encryption
Encryption is one of the most powerful methods for ensuring data confidentiality: how is this method used and applied in practice in the new cloud architectures? How can we comply with best practices in the field whilst managing keys in a secure manner? For example, when virtual machines are randomly restarted, what means are put in place to ensure the secure sharing of keys between the client’s Hardware Security Module (HSM) and the new entity?

The end of the contract
During its previous works, data controllers informed the CNIL of significant difficulties in retrieving their data at the end of a contract and of the complexity of ensuring that data are properly erased. Has the management of account closures improved? At first glance, it appears that progress has been made in this field, but is such progress enough to enable data controllers to effectively meet their obligations?

Lastly, at a time in which trust has become vital for cloud service providers, standards, codes of conduct and other frameworks are multiplying. Their relevance must therefore be assessed in terms of data protection. A map of these frameworks would allow suppliers and their clients to choose those that are most relevant to them.

Recommendations relating to the cloud:
1. Map data and processing in the cloud
2. Set out technical and legal requirements
3. Carry out a PIA or at least a privacy risk analysis
4. Identify the relevant type of cloud for each processing operation
5. Choose a service provider offering sufficient guarantees
6. Update the internal security policy
7. Monitor development over time

Data sharing: public interest issues

In its IP5 report, “la plateforme d’une ville”, the CNIL explored four data-sharing scenarios to attempt to find a new balance between public and private stakeholders using data. Since then, the topic of data sharing has entered the public debate, presented as a possible solution to several vital societal needs, particularly in terms of regulation and research.

Since 2017, in keeping with the open data movement and the notion of public-interest data, several works have advocated for a wider sharing of data. The President of the Republic made this point during his speech on artificial intelligence in March 2018. In particular, the Villani Report “Pour donner un sens à l’intelligence artificielle” (“For a meaningful artificial intelligence”) recommended:
• “encouraging economic stakeholders to pool data”;
• “providing for the opening of certain data held by private entities, on a case-by-case basis”;
• “implementing portability with a citizen-based approach”;
• “facilitating dialogue between AI stakeholders and regulators”, thereby picking up on some of the topics developed by the CNIL.

The topic of data sharing has also been the subject of works in the framework of the General Situation of the New Digital Regulations, launched in July 2018.

The spectrum of data concerned by such sharing initiatives extends to all or part of personal data, and these initiatives must be undertaken in compliance with individuals’ rights. For this reason, the CNIL considers that developing an effective and sustainable data-sharing model, with a strong ethical component and based on compliance with fundamental rights - which naturally includes personal data protection and privacy - should be encouraged.

Sharing initiatives already identified
Several data sharing initiatives have already been developed: for example, in La Rochelle, where inspiration has been drawn from the principle of citizen portability to access certain data held by private operators. A different approach has been taken by the CASD (the Secure Access Data Centre), which has extended its service offer to hosting private data (banks, services, transport, private health, etc.) and to making them available to researchers or private operators, on a purely voluntary basis, in order to develop value-added services.

Capitalising on and furthering works already started

The “Health data hub”, containing suggestions regarding organisation arrangements for operating platforms and the sharing of health data, was also provided to the Minister of Solidarity and Health. In early 2019, this fuelled the submission of the draft act on organising and transforming the health system, on which the CNIL provided an opinion in January 2019. On an international scale, private stakeholders have designed data-sharing models: e.g. the city of Toronto and Sidewalk Labs are considering developing a locally controversial civic data trust to manage urban data.

Internally, the CNIL has already launched works to clarify the legal framework applicable to data sharing, by addressing the major cross-cutting issues relating to compliance with the GDPR (legal ground for the provision of data, modalities for the exercise of individuals’ rights throughout the sharing chain, etc.), in order to provide support with securing projects from a legal point of view. However, such a legal framework can only be very general, as the topics of respecting rights, governance, and sharing arrangements (directly or through a third party) can only be studied in light of concrete projects. Yet, in general, beyond strict compliance with texts, the CNIL strongly encourages the integration of the necessary personal data protection as from the creation of sharing approaches, both on a legal and ethical basis.

The CNIL will continue its work on frameworks and will combine these works with a pro-active policy of providing prior accompaniment for given sectoral projects, including experiments, as part of its role to advise public authorities and assist professionals. For all of its tasks, it will ensure that the safeguards set out in terms of personal data protection are effective and, where necessary, will suggest changes or corrections to these frameworks.

Setting out a framework rather than a unique sharing model

The GDPR was established to reconcile both technological innovation and the protection of individuals’ rights, with the belief that the former would become stronger and more sustainable if the latter is complied with and promoted.

In this respect, in itself, the sharing of data between public and private stakeholders is not contrary to the right to personal data protection, especially when clearly defined general interest purposes are involved. However, these initiatives do call for more clarity in terms of the applicable framework and for the accompaniment of project leaders as from the early stages. It is by approaching these systems as from their creation that stakeholders will be able to produce virtuous and sustainable data sharing models which place the citizen and his/her rights at the centre of processes and of data governance for general interest purposes.

The CNIL’s experience in regulating data sharing platforms demonstrates that there is no unique model but rather several possible and desirable options. In particular, the scenarios set out in the IP5 report for the reuse of private data by public stakeholders could be completed and detailed, especially should it be decided to open certain personal data belonging to private stakeholders to other private stakeholders.

Depending on the sector or the project, the data sharing scenarios available must therefore be adapted and combined. In some cases, recourse to intermediaries (platforms, data departments) in charge of providing access to data and, where necessary, other functions in the general framework could be particularly suitable, particularly in cases in which several stakeholders pool their data. However, recourse to intermediaries does not seem suited to all sharing frameworks, particularly when data comes from only one stakeholder and is made accessible to one or more entities. In this case, simple sets of legal (licence) and technical (API) rules, without any intermediary, could be quite suitable and sufficient to ensure that the framework complies with data protection rules.

In this respect, the regulator has a role to play, by committing to promoting innovative models and ensuring the highest level of protection of individuals’ rights.

“The sharing of general interest data is an opportunity to implement a European ethical innovation model.”

Political communication & GDPR: updating recommendations and clarifying best practices

The CNIL assists all electoral process stakeholders, whether they be electoral candidates or their parties, elected representatives or voters, both in the context of bringing the processing implemented into compliance and to enable data subjects to exercise their rights. Following the entry into force of the GDPR and in a context of European and municipal elections, the CNIL intends to continue the works started on this topic and plans to update its 2012 recommendation.

The impact of the implementation of the GDPR on political communication: principles to observe and best practices.

While regulations on data protection have evolved since 25 May, the new legal framework has not resulted in any changes as regards the qualification of political opinions as sensitive data, the principle of prohibition from collecting and processing such data, and the exceptions that the data controller may raise to circumvent this prohibition (Article 9 of the GDPR).

In addition, the GDPR does not change the main principles governing personal data protection, compliance with which should define the conditions for the use of data relating to political opinions.

Determining which rules are applicable in terms of political communication is one of the CNIL’s long-standing tasks, with its first recommendation in the field of political communication having been adopted in 1991. In recent years, this task has taken on a particular dimension with the development of digital tools and the use of social networks. We are now witnessing a complex overlapping of relations between stakeholders from various backgrounds: individuals who, due to their activity, produce a large amount of data and voluntarily provide such data to certain stakeholders; platforms whose economic model relies on the processing of such data; consultancy firms and research structures; and political parties likely to use such data.

In addition, several scandals, and particularly the Cambridge Analytica scandal, have made the general public aware of the stakes relating to the use of their data and, in particular, of the need to prevent the misuse of personal data during political campaigns. Over the last year, complex legal challenges in particular have appeared around the notions of profiling, data localisation and around the means that should be implemented to ensure the security of data. Lastly, in addition to the legal stakes, these different cases highlight significant ethical issues, in particular by touching on the fairness of elections.

In this context, the CNIL’s role in the field of political communication is to accompany innovation whilst ensuring that individual freedoms are respected in an environment in which new technologies are now among the instruments frequently used during political campaigns.

Electoral campaign and use of personal data: main principles and areas for vigilance

The wide range of information that could be considered a political opinion means that the data collected must be processed in a transparent manner. Sufficient and easily accessible information must therefore be provided to data subjects.

The evolution of applicable regulations in terms of personal data protection has also led to the application of new provisions contained in the regulation, and particularly those relating to the rights of individuals whose data is processed. The appearance of new rights (right to the restriction of processing, to portability, etc.) must also be taken into account in practices such as political prospection in order to guarantee their effectiveness. Similarly, more general discussions must be held on the conditions under which such prospection may be carried out, particularly when it relies on files initially created for another purpose (such as commercial files).

With the entry into force of the GDPR resulting in the disappearance of most prior formalities and the improved protection of each citizen by creating new rights and obligations for political stakeholders, the CNIL has launched an in-depth analysis which will soon lead it to adapt its recommendations and to set out best practices in this field.

At the same time, the CNIL intends to continue its works on new tools and uses mobilised for political communication with a view to improving and solidifying its doctrine on the topic during a period marked by the organisation of both European and national campaigns. In particular, through the works launched on electoral prospection software, the aim is to specify under which conditions data from social networks can be used and the role of each of the stakeholders involved in electoral campaigns.

Respecting the golden rules of data protection
Illustrated through a few key principles:
• Determine the purpose of the processing carried out: precisely identifying the purpose pursued by the processing of data relating to political opinions is essential.
• Ensure the proportionality of the data collected: only data which is strictly necessary for the identified purpose must be processed.
• Secure the data collected: the particular nature of the data processed requires that further measures be taken to ensure their security (setting out appropriate storage measures, access management measures, etc.).
These principles are more relevant than ever in order to prevent any misuse of personal data during electoral campaigns.

The CNIL has drafted an article for the AJCT journal (Actualité Juridique Collectivités territoriales), to be published in February 2019 within a dossier devoted to “The Internet, social networks and electoral campaigns”.

Personal data, social networks and democracy: on the 2019 agenda for French-speaking authorities

On 19 October 2018, the CNIL hosted the annual meeting held by the French-speaking association of personal data protection authorities (Association francophone des autorités de protection des données personnelles, AFAPDP). French-speaking authorities were invited to a morning of discussions centred around the topic of “Personal data, social networks and democracy”. This topic was a natural result of the revelations of the “Cambridge Analytica” case, but falls within the much broader scope of the discussions which have been carried out by the AFAPDP on electoral issues for several years now. In particular, the association took part in the creation of a practical guide on the consolidation of the civil registry, electoral lists and the protection of personal data4, published by the Organisation internationale de la Francophonie (OIF) in 2014.

During this session, the AFAPDP decided to promote a cross-cutting approach, which goes beyond the mere framework of personal data protection. To do so, the AFAPDP called on the complementary expertise of the French-speaking network of electoral competencies (Réseau des compétences électorales francophones, RECEF), the French-speaking network of media regulators (Réseau francophone des régulateurs médias, REFRAM) and Reporters without Borders (Reporters sans frontières, RSF). All of these actors are questioning their respective roles with the appearance of new digital tools, media and social networks at different stages of the electoral process.

While relevant legal principles already exist, these either completely or partially ignore social networks and, more generally, the digital sphere. This legal coverage, which can only be qualified as partial, could be improved with new legislative provisions. A multi-regulator approach could also consolidate the reliability and fairness of electoral processes in the digital era, and for this purpose a multi-network working group is already being put in place by the AFAPDP, the RECEF and the REFRAM, with support from the OIF.

4 https://www.francophonie.org/IMG/pdf/oif_guide-pratique_etatcivil-27-11-14.pdf

The reuse of data accessible “online” by the research community: challenges and prospects

Recent revelations and scandals have illustrated the potential abuses arising from the exploitation of data that is freely accessible on the internet for research purposes: misuse or the uncontrolled dissemination of results, leading to data protection breaches. More generally, given the interest of using data accessible “online” for research purposes, the CNIL has decided to look into the conditions under which these operations are carried out.

Research and use of personal data: inextricable links

The pools of data created by use of the Internet and social networks attract many stakeholders with the intention of exploiting them and drawing added value from them for their activities. The use of such data presents significant challenges and risks in terms of data protection. The “Disinfo Lab” scandal illustrated the sensitivity of issues relating to the reuse of data that is accessible “online” for research purposes; the revelations of the Cambridge Analytica scandal showed how data initially collected for research purposes could ultimately be misused for other purposes.

It can be difficult to articulate some of the principles arising from these regulations with the aims of research. In terms of referrals to the CNIL, it was noted that some stakeholders either considered that the Data Protection Regulation did not apply to their works, or that the corresponding legal principles either prohibited or went against most of these works. Indeed, carrying out research most often requires the collection of a large amount of data, for which setting a precise data retention period in advance can be difficult.

It is against this backdrop that the CNIL has begun to think about the legal conditions under which data that is accessible online can be reused for research purposes. This reflection must also serve to clarify which regulations are applicable in a political and social context marked by several “scandals” concerning the reuse of data accessible “online” by researchers. It appears vital that personal data protection reflexes are included in research projects as from their creation, at the risk of these projects being carried out without taking them into account.

Researchers are some of the actors using this wealth of data. As such, from time to time, research projects using such data are referred to the CNIL to ensure that these projects comply with applicable regulations on personal data protection.

As such, the compilation of datasets that may be useful for other studies, and access to said datasets to ensure that research can be reproduced, are issues that must be reconciled with the requirement of complying with the Data Protection Regulation.

“Carrying out research requires the collection of a large amount of data, for which it can be difficult to precisely set a data retention period beforehand.”

A Data Protection Act serving the research sector

In a time in which the regulations applicable to personal data protection are being strengthened, the CNIL recalls that many provisions can be mobilised to favour the processing of data for research purposes. For example, under certain conditions, the GDPR sets out exceptions to the prohibition from processing sensitive data, to the obligation of informing the individuals whose data is collected, and to the exercise of the rights to erasure and to object.

The GDPR differentiates between scientific research, historical research and statistical research. It states that scientific research must be understood in the broad sense and must include “for example technological development and demonstration, fundamental research, applied research and privately funded research”, but also “studies conducted in the public interest in the area of public health” (recital 159 of the GDPR).

As such, it appeared necessary that the CNIL launch works to present and more clearly define the applicable legal framework, both prior to and after the entry into force of the GDPR. These works must therefore accompany both public and private stakeholders from the research community in their compliance efforts. In particular, the aim is to further explain the conditions under which the planned processing operations can be implemented - whilst recalling that consent is only one of the legal grounds that can be invoked, alongside, for example, the performance of a task carried out in the public interest or the legitimate interests pursued by the data controller -, to verify the fairness of the collection of data carried out and to check that data subjects have been provided with the necessary information.

As regards information in particular, one of the difficulties raised by the application of the provisions contained in the GDPR resides in the balance that must be found between the requirement of delivering sufficient information and that of encouraging participation in a study or in research.

Similarly, these works must clarify the relations that may exist between the different fields of research (public research, private research) and any potential differences that must be made depending on which field is concerned. Each of the stakeholders involved must be able to understand what can and cannot be done with the data collected, depending on the context in which they find themselves. This clarification appears all the more important given that some researchers are, for example, likely to move from one field to another during their career.

These works, which will be carried out in collaboration with research stakeholders, must also determine which appropriate safeguards must be implemented to govern the collection, use and reuse of data accessible “online” by the research community.

In particular, the aim is to determine under which conditions researchers may access online data on social network platforms, for example to avoid situations in which only some researchers may access such data, or to avoid that only research serving platforms in terms of marketing or commercial effectiveness is able to rely on access to the data possessed.

The approach set out by the CNIL therefore aims to:
• ensure better legal security to research stakeholders by clarifying the legal framework applicable to research projects based on the processing of data accessible online and by providing these stakeholders with simple tools to understand the GDPR applied to their projects;
• issue recommendations in line with the needs and constraints experienced by research organisations, in order to enable them to successfully carry out their projects whilst ensuring their compliance with the applicable legal framework. These recommendations could be supported by consultation with stakeholders from the research community, particularly on the following topics: access to data deposited on platforms, implementation of data subjects’ rights, definition of data retention periods to be applied, provision of personal datasets to the research community or security measures to be implemented.

How is children’s data protected?

In 2018, 83% of 12 to 17-year-olds owned a smartphone. 91% of them connected to the Internet every day and spent on average 27 hours per week surfing the web. 37% of this age group made online purchases (source: le baromètre 2018 du numérique - the 2018 digital barometer - by Crédoc).

If underaged individuals have become digital stakeholders in their own right, it is precisely because of their massive use of smartphones (instant messaging, social networks, photo- and video-sharing networks) to share information on themselves or on other underaged individuals. And when children themselves aren’t doing it, it is their parents who are posting photos and videos of their children on the Internet.

The number of connected objects for children - watches, bracelets, connected toys and even soft toys (CloudPets) - and for families - “smart” speakers, home automation, “smart homes”, etc. - has increased.

Lastly, digital technologies have invaded the education sector, and particularly teaching methods, whether through tools to manage school life, digital workspaces, online educational services or the development of learning analytics.

These digital practices and uses are the result of a large amount of data processing and of trace analyses, which provide a wealth of information on children - their interests, their behaviour, their movements, their intellectual potential, their personality profile - and which, as such, are extremely sought after. How can such data - which is particularly sensitive when it comes to children - be protected?

At the same time, our societies favour the development of autonomy and personal accountability, regardless of age. In the words of Michel Foucault, it is a question of making each individual an “Entrepreneur of the Self” and therefore preferring to assist children towards discovering their own personal identity and originality.

The growing recognition of children’s rights

The 1989 International Convention on the Rights of the Child set out this concept by acknowledging that each child has not only a right to protection – to compensate for his/her vulnerability – but also a right to a series of social benefits, to accompany his/her development, and to “freedom” rights which must prepare the child for his/her future adult life. The Convention sets out the principle of “the best interest of the child”, a dynamic concept whose scope can only be assessed in concreto, on an individual basis.

In France, the societal changes in progress have already been translated into law.

Many legal provisions authorise unemancipated minors to take initiatives. The principle of legal incapacity does not mean that they do not have any individual rights.

In the justice sector, the general rule is that a child is entitled to exercise his/her rights alone, without age restriction, since 2007 and the entry into force of the European Convention on the Exercise of Children’s Rights of 25 January 1996, which aims “in the best interests of children, to promote their rights, to grant them procedural rights and to facilitate the exercise of these rights”. Thus, a child may make a complaint, bear witness, be heard by a judge in civil or criminal proceedings, request legal aid or refer to the Defender of Rights. Furthermore, no age conditions may restrict a minor’s right to request political asylum, to request to give birth anonymously or to become a member of an association.

From the age of 13, a minor is entitled to request an organ donor card. He/she must consent to his/her full adoption and to the change of his/her name.

On occasion, the youth’s independence requires prior intervention from his/her parents. Thus, from the age of 16, a youth may sign an employment contract or open a bank account with a bank card alone, but only after having gained his/her parents’ consent.

In other cases, the child’s opinion must always prevail as a last resort. When a minor receives medical care without his/her parents’ knowledge, said minor is entitled to object to his/her parents being informed. The doctor must play the role of conciliator in the event of a family conflict, with the final decision always being made by the minor.

This is also the case when medical research on a minor is considered. The authorisation to participate in such research must be collected from the individuals with parental authority. However, the participating minor must also be consulted, when his/her condition allows it, and provided with information adapted to his/her ability to understand. His/her personal acceptance of the research must be sought. If the minor does not wish or no longer wishes to participate, his/her decision must be respected under all circumstances.

Ensuring children an effective right to data protection

In France, children’s right to personal data protection witnessed its first major legal step forward in the field of medical research. The act for a digital republic of October 2016 provided that a 15-year-old minor can object to his/her parents having access to data collected for this purpose concerning him/her. Said minor can also object to his/her parents being informed of any preventive, screening or diagnostic acts. The minor is the only individual to receive such information and able to exercise his/her rights.

The CNIL – considering that French law already took into consideration a minor’s age in order to allow them to perform certain acts – has questioned, as from the early 2000s, whether minors beyond a certain age of maturity should be allowed to perform certain “daily acts” on the Internet, for example the mere act of creating an electronic messaging account or registering on a child’s website.

The advice that the CNIL provides on its website, on the Educnum website or through the awareness-raising initiatives that it carries out targeting youths attests to its desire to improve children’s rights over their personal data, whilst ensuring them further protection. These discussions must take into account the new legal framework set out by the GDPR.

For the first time, the GDPR has introduced provisions especially relating to minors into personal data protection law, considering that they are particularly vulnerable individuals who must benefit from specific protection due to their lesser awareness of the risks run in the event of the processing of their personal data, on social networks for example. This is true in terms of both advertising and profiling. Thus, the GDPR provides that, for online services targeting minors under the age of 16 and requiring consent, parental consent is necessary. However, Member States may provide for a lower minimum age in their national law, no lower than 13 years old; in France, this age is set at 15 years old.

For this reason, the GDPR encourages the adoption of codes of conduct specific to the protection of children’s data and urges data protection authorities to pay special attention to activities specifically targeting children.

“The GDPR recognises that minors have individual rights.”

However, the GDPR does not merely content itself with setting out a protective framework for the processing of data relating to children. It also recognises that they have specific individual rights:
• As from a certain age, set out by each State (15 years old in France), their consent may be considered as a legal ground for the processing of data related to direct service offers via the Internet, notwithstanding exceptions for being underaged.
• Compliance with the principle of transparency implies that the information targeting children is drafted in a way that is easily understandable for them.
• The rights to rectification and to be forgotten are considered especially important when consent to the processing of information has been given by a minor.

Several questions are raised as regards the interpretation and application of the GDPR. However, beyond the general regulation, we must also consider under which conditions children should be able to exercise their rights, as is the case in the works carried out by other European data protection authorities:
• Which operational mechanisms should be promoted to verify children’s age?
• Under what circumstances must the prior consent of those with parental authority be requested?
• How should such consent be collected?
• How can we ensure that a child has indeed been authorised by his/her parents to consent?
• How can information targeting minors be adapted to ensure that it is easily understandable?
• How can we facilitate the exercise of the right to rectification and the right to be forgotten especially granted to minors?
• More generally, under what conditions can minors be granted the ability to directly exercise their rights (right to access, to erasure, etc.) and what safeguards can be put in place, particularly as regards the use of their data for commercial purposes?

In this context, in light of the GDPR, it is important to start a comprehensive discussion on the rights of minors over their personal data, and on the conditions under which they should be encouraged to exercise them. For all these reasons, the CNIL has decided to launch discussions on these topics this year, in collaboration with all stakeholders involved, including parents, youths, the educational community and professionals from the digital sphere.

At the same time, the International Working Group on Digital Education - created by the International Conference of Data Protection and Privacy Commissioners and steered by the CNIL - has also started working on best practices to inform children of their rights.