2019-2024

Committee on Women's Rights and Gender Equality

2020/0361(COD)

15.7.2021

AMENDMENTS 28 - 169

Draft opinion Jadwiga Wiśniewska (PE693.717v02-00)

Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

Proposal for a regulation (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))

AM\1236481EN.docx PE695.317v01-00

EN United in diversity EN

AM_Com_LegOpinion

Amendment 28
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 2

Text proposed by the Commission

(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.

Amendment

(2) Up till now, politics has relied on voluntary cooperation with a view to address these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.

Or. en

Amendment 29
Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Recital 3

Text proposed by the Commission

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.

Amendment

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to gender equality and non-discrimination. Children, especially girls, have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.

Or. en

Amendment 30
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 3

Text proposed by the Commission

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.

Amendment

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, the gender equality principle and non-discrimination. In order to exercise these rights, the online world needs to be a safe space, especially for women and girls, where everybody can move freely. Therefore, measures to protect from, and prevent, phenomena such as online violence, cyberstalking, harassment, hate speech and exploitation of women and girls are essential.

Or. en

Amendment 31
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 3

Text proposed by the Commission

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.

Amendment

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination based on any grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation.

Or. en

Amendment 32
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 3 a (new)

Text proposed by the Commission

Amendment

(3 a) Gender equality is one of the founding values of the European Union (Articles 2 and 3(3) of the Treaty on European Union (TEU)). These objectives are also enshrined in Article 21 of the Charter of Fundamental Rights. Article 8 TFEU gives the Union the task of eliminating inequalities and promoting equality between genders in all of its activities and policies. In order to protect women's rights and tackle gender-based online violence, the principle of gender mainstreaming should be applied in all European policy, including regulating the functioning of the internal market and its digital services.

Or. en

Amendment 33

Proposal for a regulation Recital 4

Text proposed by the Commission

(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.

Amendment

deleted

Or. en

Amendment 34
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 5

Text proposed by the Commission

(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities.

Amendment

(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. Given that online platforms are part of our everyday-life and have become indispensable, even more so since the pandemic, the spread of illegal and harmful content, such as child sexual abuse material, online sexual harassment, unlawful non-consensual sharing of private images and videos, cyber violence, has risen dramatically as well. Ensuring a safe space online implies targeted actions against all phenomena harmfully affecting our social life, including through an awaited proposal on how to deal with harmful but not illegal content online.

______
26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).

Or. en

Amendment 35
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 9

Text proposed by the Commission

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.

______
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation.

Amendment

(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 Regulation (EU) 2021/784 of the European Parliament and of the Council29 and the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.

______
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).

Or. en

Amendment 36
Silvia Modig

Proposal for a regulation Recital 12

Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable, accessible (including persons with disabilities) and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly within the principle that what is illegal offline should also be illegal online. The concept should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings and online sexual exploitation of women and girls, forced marriages, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, hacking, doxing, mobbing, sextortion, grooming adolescents, online sexual harassment, impersonation, cyberbullying, cyber sexual harassment, intersectional hate speech online against women and LGBTIQ+ persons, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Or. en

Amendment 37
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 12

Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in order to underpin the general idea that what is illegal offline should also be illegal online. The concept should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings, sexual exploitation of women and girls, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images and videos, online stalking, grooming adolescents, online sexual harassment and other forms of gender based violence, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Or. en

Amendment 38
Sirpa Pietikäinen

Proposal for a regulation Recital 12

Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly, considering that what is illegal offline should also be illegal online and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking and gendered hate speech, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. The social media platforms and other online operating actors need to be legally obliged to act against and remove all hate speech targeting women and girls.

Or. en

Justification

Hate speech targeting women and girls is multiform: it can focus on appearance or sexuality, or take the form of general intimidation, belittlement or inappropriate naming and shaming. More severe forms, including violent threats, death threats and sexual harassment, are part of gendered violence targeting women online. Online violence can target any gender, but it disproportionately affects women and LGBTI persons, who are more likely to be targeted by digital abuse. The FRA survey on violence against women found that 1 in 10 women have experienced some kind of cyber violence since the age of 15.

Amendment 39
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 12

Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, sexual exploitation and abuse of women and girls, revenge porn, online stalking, online sexual harassment, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Or. en

Amendment 40
Christine Anderson

Proposal for a regulation Recital 12

Text proposed by the Commission

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Amendment

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined clearly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse for personal enjoyment, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

Or. en

Amendment 41
Silvia Modig

Proposal for a regulation Recital 12 a (new)

Text proposed by the Commission

Amendment

(12 a) Online violence is a phenomenon which needs to be efficiently addressed for the safety of all users. It affects women and girls disproportionately not only causing them psychological harm and physical suffering but also deterring them from digital participation in political, social, cultural and economic life. As in real life, women, and particularly those women with intersecting identities and vulnerabilities, experience on the internet a continuum of aggressions that ranges from unwanted sexual advances, sexist and/or racist (ageist/ableist/homophobic, transphobic/sizeist etc.) insults, to frequent, harmful, frightening, sometimes life-threatening abuse. Research by the World Health Organization estimates that one in ten women has already experienced a form of cyber violence since the age of 15. In addition, the growing trend of online violence and abuse against women has accelerated during Covid-19. Access to the internet is fast becoming a necessity for economic well-being and must be viewed as a fundamental human right. In this regard, it is crucial to ensure that the digital public space is a safe and empowering place for everyone, particularly for women and girls in all their diversity.

Or. en

Amendment 42
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 12 a (new)

Text proposed by the Commission

Amendment

(12 a) Online violence is a phenomenon which needs to be addressed for the safety of all users. In specific, special attention should be paid to tackling online gender- based violence against women. An European Union Agency for Fundamental Rights' survey of 2014, the most comprehensive at EU level in the field, has shown that 1 in 10 women in the EU aged 15 or over has faced online harassment. According to The Declaration on the Elimination of Violence against Women (A/RES/48/104) violence against women is "any act of gender-based violence that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or in private life". Online violence against women should, therefore, be understood as any act of gender-based violence against women that is committed, assisted or aggravated in part or fully by the use of ICT, such as mobile phones and smartphones, the Internet, social media platforms or email, against a woman because she is a woman, or affects women disproportionately.

Or. en

Amendment 43
Sirpa Pietikäinen

Proposal for a regulation Recital 12 a (new)

Text proposed by the Commission

Amendment

(12 a) As there is no common definition accepted for the recognition of cyber violence and hate speech online against women, there is an urgent need to define and adopt a common definition to the various forms of violence and hate speech targeting women and sexual minorities online that would serve as a basis for legislation.

Or. en

Amendment 44 Silvia Modig

Proposal for a regulation Recital 12 b (new)

Text proposed by the Commission Amendment

(12 b) The Covid-19 pandemic has had a significant impact on almost all spheres of life, including on organised crime. For example, traffickers have increasingly moved online for every phase of trafficking, which has a gender dimension, since women and girls constitute the majority (75%) of victims of trafficking for all purposes. Perpetrators of human trafficking for sexual exploitation are using cyberspace to recruit, advertise and exercise control over women and children, who are intrinsically more vulnerable to this crime. Other forms of organised crime facilitated by digital tools include different types of exploitation, forced begging, forced and sham marriages, forced criminality, the removal of organs and the illegal adoption of children.

Or. en

Amendment 45 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 22

Text proposed by the Commission

(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.

Amendment

(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. For example, sharing of images depicting child sexual abuse, content related to sexual exploitation and abuse of women and girls and revenge porn. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, gender, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat and physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or within administrative or judicial orders. The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in the observance of the principle of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.

Or. en

Amendment 46 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 25

Text proposed by the Commission

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.

Amendment

(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent and non-discriminatory manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.

Or. en

Amendment 47 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 26 a (new)

Text proposed by the Commission Amendment

(26 a) Being aware that the intermediary services have already applied a risk assessment, there is still potential for improvement for the security and safety of all users, especially children, women, and other vulnerable groups. Therefore providers of intermediary services, more precisely online platforms and very large online platforms, shall regularly evaluate their risk assessment and, if found necessary, improve it. Given the importance of providers of intermediary services and their potential to impact social life, common rules determining how users shall behave online should be applied. The implementation of a code of conduct should be obligatory for every provider of intermediary services covered by this Regulation.

Or. en

Amendment 48 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 30

Text proposed by the Commission

(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.

Amendment

(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 addressing the dissemination of terrorist content online, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.

Or. en

Amendment 49 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Recital 34

Text proposed by the Commission

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Amendment

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, the safety and trust of the recipients of the service, including minors, women, LGBTIQ+ people and vulnerable users such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. The World Health Organisation defines ‘health’ as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity. This definition supports the fact that the development of new technologies might bring new health risks to users, in particular for children and women, such as psychological risk, development risks, mental risks, depression, loss of sleep, or altered brain function.

Or. en

Amendment 50 Kira Marie Peter-Hansen

Proposal for a regulation Recital 34

Text proposed by the Commission

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Amendment

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Or. en

Amendment 51 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 34

Text proposed by the Commission

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Amendment

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, to ensure a safe and transparent online environment and to ensure the right to non-discrimination, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Or. en

Amendment 52 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 34

Text proposed by the Commission

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Amendment

(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and girls, as well as vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.

Or. en

Amendment 53 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 39

Text proposed by the Commission

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40

Amendment

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 Aligned with the annual reports broken down by actions of content moderation and Member State, the results of all forms of violence against women and girls online, hate speech and of other illegal content should reappear in the crime statistics. All forms of violence against women and girls shall be reported as an own category in those criminal statistics and law enforcement entities shall list them separately.

______
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).

Or. en

Amendment 54 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 39

Text proposed by the Commission

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40

Amendment

(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Data should be reported as disaggregated as possible. For example, anonymised individual characteristics such as gender, age group and social background of the notifying parties should be reported, whenever available. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40

______
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).

Or. en

Amendment 55 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 40

Text proposed by the Commission

(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.

Amendment

(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Online platforms in particular may also allow for users or trusted flaggers to notify content, including their own, to which others are responding with illegal content at large, such as illegal hate speech. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.

Or. en

Amendment 56 Christine Anderson

Proposal for a regulation Recital 41

Text proposed by the Commission

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Amendment

deleted

Or. en

Amendment 57 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 41

Text proposed by the Commission

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Amendment

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the gender equality principle and the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Or. en

Amendment 58 Kira Marie Peter-Hansen

Proposal for a regulation Recital 41

Text proposed by the Commission

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Amendment

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Or. en

Amendment 59
Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation
Recital 41

Text proposed by the Commission

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Amendment

(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.

Or. en

Amendment 60
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation
Recital 46

Text proposed by the Commission

(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43

Amendment

(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material, organisations committed to notifying illegal racist and xenophobic expressions online and women’s rights organizations such as the European Women’s Lobby. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43

__________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.

Or. en

Amendment 61
Silvia Modig

Proposal for a regulation
Recital 52

Text proposed by the Commission

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Amendment

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens in particular with regard to gender equality as well as LGBTIQ+ equality and racial equality. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Or. en

Amendment 62
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation
Recital 52

Text proposed by the Commission

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Amendment

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have both an impact on the equal treatment and opportunities of citizens and on the perpetuation of harmful gender stereotypes and norms. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Or. en

Amendment 63
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation
Recital 52

Text proposed by the Commission

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Amendment

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising reproducing stereotypical content with an impact on the equal treatment and opportunities of citizens against the gender equality principle. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Or. en

Amendment 64
Kira Marie Peter-Hansen

Proposal for a regulation
Recital 52

Text proposed by the Commission

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Amendment

(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens, in particular with regard to gender equality. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.

Or. en

Amendment 65
Silvia Modig

Proposal for a regulation
Recital 57

Text proposed by the Commission

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images, online stalking, revenge porn, unsolicited receiving of sexually explicit materials, doxing, cyberbullying, rape threats or illegal hate speech, including online sexist and anti-LGBTIQ+ hate speech or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to equality between women and men, the rights of the child and the right to full control over personal data and information online with a rejection of practices by private companies to use data for profit and to manipulate behaviour, as well as surveillance practices by any actor to control and restrict women’s speech and activism. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition or putting gender equality at risk. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Or. en

Amendment 66
Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation
Recital 57

Text proposed by the Commission

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images and videos, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the gender equality principle with the right to non-discrimination and the rights of the child. The social dimension, as online platforms play a major role in our everyday-life, is also affected by phenomena as online harassment and cyber violence. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, including when algorithms are misinformed causing widening of gender gaps, or the misuse of their service through the submission of abusive notices or other methods for silencing speech, causing harm, such as long term mental health damage, psychological damage and societal damage, or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Or. en

Amendment 67
Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation
Recital 57

Text proposed by the Commission

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, content related to the sexual exploitation and abuse of women and girls, revenge porn, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through advertising, recommender systems or through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to personal data protection, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, which might amplify discriminatory speech and content, or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Or. en

Amendment 68 Kira Marie Peter-Hansen

Proposal for a regulation Recital 57

Text proposed by the Commission

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Or. en

Amendment 69 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Recital 57

Text proposed by the Commission

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Amendment

(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.

Or. en

Amendment 70 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 58

Text proposed by the Commission

(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.

Amendment

(58) Very large online platforms should deploy the necessary means to diligently cease, prevent and mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions to cover aspects such as online gender-based violence. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. They may also consider providing training to their staff and specifically to content moderators so they can stay up to date on covert language used as a form of illegal hate speech and violence against certain groups such as women and minorities. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers and civil society organizations that represent the interests of certain groups such as women’s rights organisations, organise training sessions and exchanges with these organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.

Or. en

Amendment 71 Silvia Modig

Proposal for a regulation Recital 58

Text proposed by the Commission

(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.

Amendment

(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, including gender equality so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They should include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers and women’s rights and human rights organizations, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and gender equality and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.

Or. en

Amendment 72 Silvia Modig

Proposal for a regulation Recital 58 a (new)

Text proposed by the Commission

Amendment

(58 a) Transparency and effectivity of processes is the key to make online platforms safer to use and tackling online violence and illegal content. The actions of online platform’s decisions on how they act, or not, to remove illegal, abusive and harmful content vary hugely and some of the reports can remain unanswered. There must be an easily accessible knowledge for all users on how and why the content is being removed. These processes needs to be fully transparent. Very large online platforms should actively report and publish meaningful data on how they handle gender and other identity-based violence and they should share this information in an easy and accessible way on their platforms on an annual basis. This should include the number of reports they receive per year, and also the number of reports they receive that failed to receive any response from them, disaggregated by the category of the illegal, harmful and abusive content being reported. Very large platforms should ensure that experts and academics have access to the relevant data, e.g. to enable them to compare and evaluate how measures are working in order to gain a better understanding of the extent of the problem. They should also link their measures to international human rights, regularly evaluate, and update the implementation of their own ethical standards.

Or. en

Amendment 73 Silvia Modig

Proposal for a regulation Recital 58 b (new)

Text proposed by the Commission

Amendment

(58 b) There is a need for sufficient resources to remove illegal, abusive and harmful content on the services that very large platforms offer to their users and a need for transparency on how and how much the very large online platforms educate their workers on this issue. Neither users nor trusted flaggers are fully capable of providing the effective moderation needed to ensure that the very large online platforms become safer. Very large online platforms should actively share information on the number of content moderators they employ, including the number of moderators employed per region and per language. Continuous training is also needed on issues such as gender, anti-Roma and anti-LGBTIQ+ discrimination and other identity-based violence and racism. Very large online platforms should actively share the information on how moderators are trained to identify gender and other identity-based violence and abuse against users, as well as how moderators are trained about international human rights standards and about their responsibility to respect the rights of users on their platform. They should also actively involve human rights organisations in their training programmes.

Or. en

Amendment 74 Silvia Modig

Proposal for a regulation Recital 58 c (new)

Text proposed by the Commission

Amendment

(58 c) The content of very large online platforms needs to be fully and easily accessible to all of their users. This can be achieved by implementing user-friendly measures into the services that very large online platforms offer. Very large online platforms should present their terms of service in machine-readable format and also make all their previous versions of their terms of service easily accessible to the public, including by persons with disabilities. Options to report potentially illegal, abusive and harmful content must be easy to find and to use in the native language of the user. Information on support for persons affected and on national contact points should be easily achievable. Very large online platforms should offer and evolve easily accessible services to all users in these kind of and similar cases. They should also make moderation as easy as possible, with the help of tools, training etc. for people administrating and moderating online groups that are using their platforms and services. They should also improve and ensure accessibility of elements and functions of their services for persons with disabilities.

Or. en

Amendment 75 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 59

Text proposed by the Commission

(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.

Amendment

(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services such as consumer’s and women’s rights organizations, independent experts and civil society organisations.

Or. en

Amendment 76 Silvia Modig

Proposal for a regulation Recital 62

Text proposed by the Commission

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.

Amendment

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. These algorithms may lead to setbacks in gender equality, such as an increase in cases of gender and racial discrimination, cyber violence against women and LGBTIQ+ persons, or they can exacerbate toxic masculinity as well as amplify existing harmful gender stereotypes, including ableist, ageist and racial stereotyping. Consequently, very large online platforms should be obliged to regularly review their algorithms in an intersectional gender-responsive manner in order to minimise any negative consequences on gender equality and should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible and accessible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. They should provide meaningful information about the algorithmic tools they use in content moderation and content curation. Very large online platforms should also offer easily accessible and comprehensive explanations that allow users to understand when, why, for which tasks, and to which extent algorithmic tools are used. They should let users in an easy and accessible way to accept or not the algorithms being used in the provided service. They should allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended.

Or. en

Amendment 77 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 62

Text proposed by the Commission

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.

Amendment

(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should be obliged to regularly review their algorithms to minimise such negative consequences and should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have alternative options for the main parameters, including a visible, user-friendly and readily available option to turn off algorithmic selection with the recommender system entirely and options that are not based on profiling of the recipient. The gender-based algorithm bias must be prevented to avoid discriminatory impact on women and girls.

Or. en

Amendment 78 Silvia Modig

Proposal for a regulation Recital 63

Text proposed by the Commission

(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.

Amendment

(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. Disinformation, especially political disinformation has become a huge problem and very large online platforms have more and more become the platforms of sharing this kind of content, especially via advertising. Very large online platforms should take out extremist actors in consultation with independent experts, in case of repeated violations. Very large online platforms should implement comprehensive and verifiable standards and measures to limit the scope of extremist actors and purposeful disinformation.

Or. en

Amendment 79 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 64

Text proposed by the Commission

(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.

Amendment

(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. This data should be provided as disaggregated as possible in order to allow for meaningful conclusions to be drawn from it. For example, it is important that very large online platforms provided gender disaggregated data as much as possible in order for vetted researchers to have the possibility to explore whether and in what way certain online risks are experienced differently between men and women. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.

Or. en

Amendment 80 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 74

Text proposed by the Commission

(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.

Amendment

(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. Moreover, it is important to assure that the Digital Services Coordinator, as well as other competent authorities have the necessary knowledge to guarantee the rights and obligations of this Regulation. Therefore, they should promote education and training on fundamental rights and discrimination for their staff, including training in partnership with law enforcement authorities, crisis management authorities or civil society organisations that support victims of illegal online and offline activities such as harassment, gender-based violence and illegal hate speech.

Or. en

Amendment 81 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 82

Text proposed by the Commission

(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services being used by a third party to infringe an intellectual property right, are not reasonably available.

Amendment

(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Service Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, content associated with the sexual exploitation and abuse of women and girls and revenge porn, or to disable access to services being used by a third party to infringe an intellectual property right, are not reasonably available.

Or. en

Amendment 82 Silvia Modig

Proposal for a regulation Recital 88

Text proposed by the Commission

(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.

Amendment

(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. This advisory group should strive to achieve a gender-balanced representation in its composition. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite in its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.

Or. en

Amendment 83 Silvia Modig

Proposal for a regulation Recital 91

Text proposed by the Commission

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairpersonship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, in particular gender equality, LGBTIQ+ equality, and non-discrimination, gender and other identity-based online violence and harassment, online stalking, online sex trafficking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Or. en

Amendment 84 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Recital 91

Text proposed by the Commission

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as gender equality and non-discrimination, eradicating all forms of violence against women and girls, including online violence, harassment and sexual exploitation, online stalking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Or. en

Amendment 85 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Recital 91

Text proposed by the Commission

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities such as the European Data Protection Supervisor and the European Union Agency for Fundamental Rights under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Or. en

Amendment 86 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Recital 91

Text proposed by the Commission

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Or. en

Amendment 87 Kira Marie Peter-Hansen

Proposal for a regulation Recital 91

Text proposed by the Commission

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Amendment

(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.

Or. en

Amendment 88 Silvia Modig

Proposal for a regulation Article 1 – paragraph 2 – point b

Text proposed by the Commission

(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.

Amendment

(b) set out uniform rules for a safe, accessible, including persons with disabilities, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including non-discrimination and gender equality, are effectively protected.

Or. en

Amendment 89 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 1 – paragraph 2 – point b

Text proposed by the Commission

(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.

Amendment

(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including equality, are effectively protected.

Or. en

Amendment 90 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 1 – paragraph 5 – point d

Text proposed by the Commission

(d) Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted];

Amendment

(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;

Or. en

Amendment 91 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 1 – paragraph 5 – point d a (new)

Text proposed by the Commission Amendment

(d a) Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combating child sexual abuse online;

Or. en

Amendment 92 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 2 – paragraph 1 – point d a (new)

Text proposed by the Commission Amendment

(d a) ‘child’ means any natural person under the age of 18;

Or. en

Amendment 93 Frances Fitzgerald

Proposal for a regulation Article 2 – paragraph 1 – point g

Text proposed by the Commission

(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;

Amendment

(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is manifestly not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; Reporting or warning of an illegal act shall not be deemed illegal content;

Or. en

Amendment 94 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 2 – paragraph 1 – point q a (new)

Text proposed by the Commission Amendment

(q a) ‘Gender-based online violence’ means any act of gender-based violence that is committed, assisted or aggravated in part or fully by the use of ICT, such as mobile phones and smartphones, the Internet, social media platforms or email, against a woman because she is a woman or affects women disproportionately, or against LGBTI people because of their gender identity, gender expression or sex characteristics, and results in, or is likely to result in, physical, sexual, psychological or economic harm, including threats to carry out such acts, coercion or arbitrary deprivation of liberty, in public or private life;

Or. en

Amendment 95 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 5 – paragraph 1 a (new)

Text proposed by the Commission Amendment

1 a. Without prejudice to specific deadlines set out in Union law or in administrative or legal orders, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event:
(a) within 24 hours where the illegal content can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety;
(b) within seven days where the illegal content cannot seriously harm public policy, public security, public health or consumers’ health or safety;
Where the provider of hosting services cannot comply with the obligation in paragraph 1a on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority that has issued an order pursuant to Article 8 or the recipient of the service that has submitted a notice pursuant to Article 14, of those grounds.

Or. en

Amendment 96

Proposal for a regulation Article 8 a (new)

Text proposed by the Commission Amendment

Article 8 a
Injunction orders

Member States shall ensure that recipients of a service are entitled under their national law to seek an injunction order as an interim measure for removing manifestly illegal content.

Or. en

Amendment 97 Arba Kokalari

Proposal for a regulation Article 10 a (new)

Text proposed by the Commission Amendment

Article 10 a
Point of contact for recipients of a service

1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible.

2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients.

Or. en

Amendment 98 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 12 – title

Text proposed by the Commission

Terms and conditions

Amendment

Terms and conditions, code of conduct

Or. en

Amendment 99 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 12 – paragraph 1

Text proposed by the Commission

1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.

Amendment

1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application.

Or. en

Amendment 100 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 12 – paragraph 1 a (new)

Text proposed by the Commission Amendment

1 a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate, promote gender equality and the rights of LGBTIQ+ people and meet the highest European or International standards, pursuant to Article 34.

Or. en

Amendment 101 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 12 – paragraph 2

Text proposed by the Commission

2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.

Amendment

2. Providers of intermediary services shall act in a diligent, non-discriminatory and transparent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.

Or. en

Amendment 102 Silvia Modig

Proposal for a regulation Article 12 – paragraph 2 a (new)

Text proposed by the Commission Amendment

2 a. Very large online platforms as defined in Article 25(1) shall publish their terms and conditions in all official languages of the Union. They shall present their terms and conditions in machine-readable format and make all previous versions of their terms and conditions easily accessible to the public, including for persons with disabilities. Options to report potentially illegal, abusive and harmful content shall be easy to find and use in the native language of the user. They shall also make moderation as easy as possible, with the help of tools, training etc. for people administering and moderating online groups using their platforms and services.

Or. en

Amendment 103 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 12 – paragraph 2 a (new)

Text proposed by the Commission Amendment

2 a. Providers of intermediary services shall be obliged to include on their platforms a code of conduct, setting out behavioral rules for their users. These rules shall be publicly accessible in an easy format and shall be set out in clear and unambiguous language.

Or. en

Amendment 104 Kira Marie Peter-Hansen

Proposal for a regulation Article 12 – paragraph 2 a (new)

Text proposed by the Commission Amendment

2 a. Terms and conditions of providers of intermediary services shall respect the essential principles of human rights as enshrined in the Charter and international law.

Or. en

Amendment 105 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 12 a (new)

Text proposed by the Commission Amendment

Article 12 a
Child impact assessment
1. All providers shall assess whether their services are accessed by, likely to be accessed by, or impact children, especially girls. Providers of services likely to impact children, especially girls, shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year, any systemic risks stemming from the functioning and use of their services in the Union for children, especially girls. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments shall also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and in the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any negative effects on the right to gender equality, as enshrined in Article 23 of the Charter, particularly the right to live free from violence as envisaged by the Council of Europe Convention on preventing and combating violence against women and girls (Istanbul Convention);
(d) any negative effects on the right to non-discrimination, as enshrined in Article 21 of the Charter;
(e) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on children's rights, especially of girls.
2. When conducting such impact assessments, providers of intermediary services likely to impact children, especially girls, shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child, especially of girls.

Or. en

Amendment 106 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation

AM\1236481EN.docx 69/105 PE695.317v01-00 EN Article 12 b (new)

Text proposed by the Commission Amendment

Article 12 b
Mitigation of risks to children, especially girls
Providers of intermediary services likely to impact children, especially girls, shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 12 a. Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children's best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy-preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child and gender equality;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing profiling of children, including for commercial purposes like targeted advertising;
(g) ensuring published terms are age-appropriate and uphold children's rights and gender equality;
(h) providing child-friendly and inclusive mechanisms for remedy and redress, including easy access to expert advice and support.

Or. en

Amendment 107 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 13 – paragraph 1 – introductory part

Text proposed by the Commission

1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

Amendment

1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member States level and, in particular, information on the following, as applicable:

Or. en

Amendment 108 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 13 – paragraph 1 – point b

Text proposed by the Commission

(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;

Amendment

(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, anonymised data on individual characteristics of those who submit these notices such as gender, age group and social background, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;

Or. en

Amendment 109 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 13 – paragraph 1 – point d

Text proposed by the Commission

(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.

Amendment

(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, anonymised data on individual characteristics of those who submit these complaints, such as gender, age group and social background, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.

Or. en

Amendment 110 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 13 – paragraph 1 – point d

Text proposed by the Commission

(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.

Amendment

(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, gender and (if children) the age of complainants, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.

Or. en

Amendment 111

Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 13 – paragraph 1 a (new)

Text proposed by the Commission Amendment

1 a. Providers of intermediary services that impact children, especially girls, shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks; these impact assessments shall comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the gender equality and child risk mitigation measures undertaken, their efficacy and any outstanding actions required; these reports shall comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design that equally promotes gender equality.

Or. en

Amendment 112 Frances Fitzgerald

Proposal for a regulation Article 13 – paragraph 1 a (new)

Text proposed by the Commission Amendment

1 a. Protection of the identity of the victims concerned shall be ensured, in line with GDPR standards;

Or. en

Amendment 113 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 13 a (new)

Text proposed by the Commission Amendment

Article 13 a
Recipients' consent for advertising practices
1. Providers of intermediary services shall, by default, not make the recipients of their services subject to targeted, micro-targeted and behavioural advertisement unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information, including information about the value of giving access to and about the use of their data, including an explicit mention in cases where data are collected with recourse to trackers and if these data are collected in websites from other providers.
2. When asking for the consent of recipients of their services considered as vulnerable consumers, providers of intermediary services shall implement all the necessary measures to ensure that such consumers have received sufficient and relevant information before they give their consent.
3. When processing data for targeted, micro-targeted and behavioural advertising, online intermediaries shall comply with relevant Union law and shall not engage in activities that can lead to pervasive tracking, such as disproportionate combination of data collected by platforms, or disproportionate processing of special categories of data that might be used to exploit vulnerabilities.
4. Providers of intermediary services shall organise their online interface in such a way that recipients of services, in particular those considered as vulnerable consumers, can easily and efficiently access and modify advertising parameters. Providers of intermediary services shall monitor the use of advertising parameters by recipients of services on a regular basis and make best efforts to improve their awareness about the possibility to modify those parameters.

Or. en

Amendment 114 Kira Marie Peter-Hansen

Proposal for a regulation Article 14 – paragraph 1

Text proposed by the Commission

1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.

Amendment

1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them, in at least any of the official EU languages the notifier may wish, of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.

Or. en

Amendment 115 Silvia Modig

Proposal for a regulation Article 14 – paragraph 1

Text proposed by the Commission

1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.

Amendment

1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal or harmful content. Those mechanisms shall be easy to access, user-friendly, gender-responsive and allow for the submission of notices exclusively by electronic means.

Or. en

Amendment 116 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 14 – paragraph 2 – point d a (new)

Text proposed by the Commission Amendment

(d a) the option for those submitting notices to outline some of their individual characteristics, such as gender, age group and social background; the providers shall make clear that this information shall not be part of the decision-making process with regard to the notice, shall be completely anonymised and used solely for reporting purposes.

Or. en

Amendment 117 Kira Marie Peter-Hansen

Proposal for a regulation Article 14 – paragraph 5 a (new)

Text proposed by the Commission Amendment

5 a. The provider of intermediary services shall also notify the recipient who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.

Or. en

Amendment 118 Kira Marie Peter-Hansen

Proposal for a regulation Article 14 – paragraph 6 a (new)

Text proposed by the Commission Amendment

6 a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.

Or. en

Amendment 119 Kira Marie Peter-Hansen

Proposal for a regulation Article 14 – paragraph 6 b (new)

Text proposed by the Commission Amendment

6 b. The provider of hosting services shall ensure that processing of notices is undertaken by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, including anti-discrimination, as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.

Or. en

Amendment 120 Arba Kokalari

Proposal for a regulation Article 17 – paragraph 1 – point a

Text proposed by the Commission

(a) decisions to remove or disable access to the information;

Amendment

(a) decisions to remove or not to remove or disable access to the information;

Or. en

Amendment 121 Arba Kokalari

Proposal for a regulation Article 17 – paragraph 1 – point b

Text proposed by the Commission

(b) decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients;

Amendment

(b) decisions to suspend or terminate or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;

Or. en

Amendment 122 Arba Kokalari

Proposal for a regulation Article 17 – paragraph 1 – point c

Text proposed by the Commission

(c) decisions to suspend or terminate the recipients' account.

Amendment

(c) decisions to suspend or terminate or not to suspend or terminate the recipients' account.

Or. en

Amendment 123 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 17 – paragraph 1 – point c a (new)

Text proposed by the Commission Amendment

(c a) decisions whether or not to restrict the ability to monetize content provided by the recipients.

Or. en

Amendment 124 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 17 – paragraph 2

Text proposed by the Commission

2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.

Amendment

2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, especially girls and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.

Or. en

Amendment 125 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 17 – paragraph 4 a (new)

Text proposed by the Commission Amendment

4 a. Online platforms shall give the option for those submitting complaints to outline some of their individual characteristics, such as gender, age group and social background. Online platforms shall make clear that this information shall not be part of the decision-making process in regard to the complaint, shall be completely anonymised and used solely for reporting purposes.

Or. en

Amendment 126 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 24 a (new)

Text proposed by the Commission Amendment

Article 24 a
Recommender systems
1. Online platforms shall not make the recipients of their services subject to recommender systems based on profiling, unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Online platforms shall ensure that the option that is not based on profiling is activated by default.
2. Online platforms shall set out in their terms and conditions and when content is recommended, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679 of the European Parliament and of the Council. Online platforms shall also enable the recipients of the service to view, in a user-friendly manner, any profile or profiles used to curate their own content. They shall provide users with an easily accessible option to delete their profile or profiles used to curate the content the recipient sees.
3. The parameters referred to in paragraph 2 shall include, at a minimum:
(a) the recommendation criteria used by the relevant system;
(b) how these criteria are weighted against each other;
(c) what goals the relevant system has been optimised for; and
(d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
4. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible function on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
5. Online platforms shall inform their users about the identity of the person responsible for the recommender system.
6. Online platforms shall ensure that the algorithm used by their recommender system is designed in such a way that it does not risk misleading or manipulating the recipients of the service when they use it.
7. Online platforms shall ensure that information from trustworthy sources, such as information from public authorities or from scientific sources, is displayed as first results following search queries that are related to areas of public interest.

Or. en

Amendment 127 Kira Marie Peter-Hansen

Proposal for a regulation Article 24 a (new)

Text proposed by the Commission Amendment

Article 24 a

Protections against image-based sexual abuse
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure
(a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;
(b) professional human-powered content moderation in line with Article 14(6d), where content having a high probability of being illegal, such as content purporting to be voyeuristic or enacting rape scenes, is reviewed;
(c) the accessibility of an anonymous qualified notification procedure in the form that additionally to the mechanism referred to in Article 14 and respecting the same principles with the exception of paragraph 5 a of that Article, individuals may notify the platform with the claim that image material depicting them or purporting to be depicting them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be considered manifestly illegal in terms of Article 14(6 a) and shall be suspended within 48 hours.

Or. en

Amendment 128 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 26 – paragraph 1 – introductory part

Text proposed by the Commission

1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:

Amendment

1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:

Or. en

Amendment 129 Kira Marie Peter-Hansen

Proposal for a regulation Article 26 – paragraph 1 – introductory part

Text proposed by the Commission

1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:

Amendment

1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:

Or. en

Amendment 130 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 26 – paragraph 1 – point b

Text proposed by the Commission

(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;

Amendment

(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;

Or. en

Amendment 131 Kira Marie Peter-Hansen

Proposal for a regulation Article 26 – paragraph 1 – point b

Text proposed by the Commission

(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;

Amendment

(b) any negative effects for the exercise of the fundamental rights listed in the Charter, in particular the right to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;

Or. en

Amendment 132 Silvia Modig

Proposal for a regulation Article 26 – paragraph 1 – point b

Text proposed by the Commission

(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;

Amendment

(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to equality between women and men and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;

Or. en

Amendment 133 Silvia Modig

Proposal for a regulation Article 26 – paragraph 1 – point c

Text proposed by the Commission

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Amendment

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on gender equality or on the protection of public health (including mental health), minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Or. en

Amendment 134 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 26 – paragraph 1 – point c

Text proposed by the Commission

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Amendment

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on online violence, the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Or. en

Amendment 135 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation

Article 26 – paragraph 1 – point c

Text proposed by the Commission

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Amendment

(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, gender equality, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.

Or. en

Amendment 136 Silvia Modig

Proposal for a regulation Article 26 – paragraph 2

Text proposed by the Commission

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.

Amendment

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or the content risking an increase in online violence, gender and other identity-based violence and discrimination and disinformation against women, LGBTIQ+ persons and persons with disabilities or deepening the marginalization of vulnerable communities who are often targets of hate speech online and of information that is incompatible with their terms and conditions.

Or. en

Amendment 137 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 26 – paragraph 2

Text proposed by the Commission

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.

Amendment

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or content associated with online violence and gender-based violence and of information that is incompatible with their terms and conditions.

Or. en

Amendment 138 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation Article 26 – paragraph 2

Text proposed by the Commission

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.

Amendment

2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or the content that risks an increase in online violence and of information that is incompatible with their terms and conditions.

Or. en

Amendment 139 Silvia Modig

Proposal for a regulation Article 26 – paragraph 2 a (new)

Text proposed by the Commission

Amendment

2 a. Very large online platforms shall regularly review their algorithms in an intersectional gender-responsive manner in order to minimise negative consequences on gender equality, such as an increase in cases of online violence and abuse against women, girls and LGBTIQ+ persons, and consequently physical violence, or the promotion of content spreading harmful gender stereotypes, including ableist, ageist and racial stereotyping. Very large online platforms shall implement comprehensive and verifiable standards and measures to limit the scope of extremist actors and purposeful disinformation.

Or. en

Amendment 140 Silvia Modig

Proposal for a regulation Article 26 – paragraph 2 b (new)

Text proposed by the Commission

Amendment

2 b. Very large online platforms shall offer easily accessible explanations that allow users to understand when, why, for which tasks, and to what extent algorithmic tools are used. They shall allow users, in an easy and accessible way, to accept or refuse the algorithms used in the platforms and services provided. They shall allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended.

Or. en

Amendment 141 Silvia Modig

Proposal for a regulation Article 26 – paragraph 2 c (new)

Text proposed by the Commission

Amendment

2 c. Very large online platforms shall yearly share information on the number of content moderators they employ, including the number of moderators employed per region and by language. They shall also share information on how moderators are trained to identify gender and other identity-based violence, disinformation and abuse against users, as well as how moderators are trained about international human rights standards and their responsibility to respect the rights of the users on their platforms and services. They shall also actively involve human rights organisations in their training programmes.

Or. en

Amendment 142 Silvia Modig

Proposal for a regulation Article 27 – paragraph 1 – introductory part

Text proposed by the Commission

1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:

Amendment

1. Very large online platforms shall put in place reasonable, proportionate and effective and gender-responsive mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:

Or. en

Amendment 143 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 27 – paragraph 1 – introductory part

Text proposed by the Commission

1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:

Amendment

1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate systemic risks, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:

Or. en

Amendment 144 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 27 – paragraph 1 – introductory part

Text proposed by the Commission

1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:

Amendment

1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:

Or. en

Amendment 145 Silvia Modig

Proposal for a regulation Article 27 – paragraph 1 – point a

Text proposed by the Commission

(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;

Amendment

(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions, including gender equality;

Or. en

Amendment 146 Silvia Modig

Proposal for a regulation Article 27 – paragraph 1 – point b

Text proposed by the Commission

(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide;

Amendment

(b) targeted measures aimed at limiting the display of advertisements or harmful content or the discriminatory display of advertising with an impact on gender equality in association with the service they provide;

Or. en

Amendment 147 Kira Marie Peter-Hansen

Proposal for a regulation Article 27 – paragraph 1 – point b

Text proposed by the Commission

(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide;

Amendment

(b) targeted measures aimed at limiting the display of advertisements or illegal content in association with the service they provide;

Or. en

Amendment 148 Lena Düpont, Frances Fitzgerald, Elissavet Vozemberg-Vrionidi, Christine Schneider

Proposal for a regulation

Article 27 – paragraph 1 – point b

Text proposed by the Commission

(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide;

Amendment

(b) targeted measures aimed at limiting the display of advertisements or harmful content in association with the service they provide;

Or. en

Amendment 149 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 27 – paragraph 1 a (new)

Text proposed by the Commission

Amendment

1 a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).

Or. en

Amendment 150 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 27 – paragraph 2 – point b

Text proposed by the Commission

(b) best practices for very large online platforms to mitigate the systemic risks identified.

Amendment

(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified.

Or. en

Amendment 151 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 28 – paragraph 1 – point a

Text proposed by the Commission

(a) the obligations set out in Chapter III;

Amendment

(a) the obligations set out in Chapter III; in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;

Or. en

Amendment 152 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 29

Text proposed by the Commission

Article 29
Recommender systems

1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679.

2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.

Amendment

deleted

Or. en

Justification

Article moved to the previous Section.

Amendment 153 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 30 – paragraph 1

Text proposed by the Commission

1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.

Amendment

1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable through easy to access, functionable and reliable tools through application programming interfaces a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.

Or. en

Amendment 154 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 31 – paragraph 2

Text proposed by the Commission

2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).

Amendment

2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to verify the effectiveness of the risk mitigation measures taken by the very large online platform in question under Article 27.

Or. en

Amendment 155 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 31 – paragraph 3 a (new)

Text proposed by the Commission

Amendment

3 a. The data provided to vetted researchers should be as disaggregated as possible, unless the researcher requests it otherwise.

Or. en

Amendment 156 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 31 – paragraph 4

Text proposed by the Commission

4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.

Amendment

4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the funding financing the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.

Or. en

Amendment 157 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 33 a (new)

Text proposed by the Commission

Amendment

Article 33 a
Algorithm accountability

1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.

2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements:
(a) the compliance with corresponding Union requirements;
(b) how the algorithm is used and its impact on the provision of the service;
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on elements referred to in point (c).

3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.

4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation on the conclusion of the findings, or, when the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.

5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure the algorithm complies with the criteria set out in paragraph 2.

6. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a), (c), or (d) of paragraph 2, on the basis of the information provided by the very large online platform, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.

Or. en

Amendment 158 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 34 – paragraph 2 a (new)

Text proposed by the Commission

Amendment

2 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child and the right to gender equality, observance of which, once adopted, will be mandatory, at least for the following:
(a) age assurance and age verification pursuant to Article 13;
(b) child impact assessments pursuant to Article 13;
(c) age-appropriate terms and conditions that equally promote gender equality pursuant to Article 12;
(d) child-centred design that equally promotes gender equality pursuant to Article 13.

Or. en

Amendment 159 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 35 – paragraph 2

Text proposed by the Commission

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

Amendment

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission and the Board may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, trusted flaggers and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. Trusted flaggers and vetted researchers may submit to the Commission and the Board requests for codes of conduct to be considered based on the systemic risk reports referred to in Article 13 and research evaluating the impact of the measures put in place by online platforms to address these.

Or. en

Amendment 160 Kira Marie Peter-Hansen

Proposal for a regulation Article 35 – paragraph 2

Text proposed by the Commission

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

Amendment

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, including civil society organisations working on gender equality, experts on fundamental rights and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

Or. en

Amendment 161 Silvia Modig

Proposal for a regulation Article 35 – paragraph 2

Text proposed by the Commission

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

Amendment

2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, women’s rights organizations, LGBTIQ+ support groups and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

Or. en

Amendment 162 Maria-Manuel Leitão-Marques, Vilija Blinkevičiūtė, Alessandra Moretti

Proposal for a regulation Article 36 a (new)

Text proposed by the Commission

Amendment

Article 36 a
Codes of conduct for online violence

1. The Commission should encourage the development of codes of conduct for online violence at Union level, between online platforms and other relevant service providers, organisations representing victims of online violence, civil society organisations, and law enforcement authorities. These codes of conduct shall contribute to further transparency and reporting requirements with regard to instances of online violence, with special attention to gender-based violence, beyond the requirements of Articles 13 and 26. These codes of conduct shall also strengthen requirements on how online platforms and other service providers deal with these instances, beyond the requirements of Articles 14 and 21.

2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights of all parties involved, and clarify how online platforms and other relevant service providers should deal with particularly sensitive cases of illegal content, such as content related to sexual exploitation and abuse of women and girls, and revenge porn, in accordance with Union and national law. The Commission shall aim to ensure that the codes of conduct address at least:
(a) the categories of illegal content related to online violence which should be used by providers of intermediary services in the detailed reports mentioned in Article 13;
(b) types of illegal content associated with online violence, such as content related to sexual exploitation and abuse of women and girls and revenge porn, that all very large online platforms should consider as possible systemic risks when pursuing their risk assessments mentioned in Article 26;
(c) the information that online platforms and other relevant service providers should provide to law enforcement or judicial authorities when there is suspicion of a serious criminal offence related to online violence, such as sexual exploitation and abuse of women and girls and revenge porn, as per Article 21;
(d) any standardised information that should be provided in addition to Article 14(4) to the individual or entity that submitted a notice of the presence of alleged illegal content related to online violence, such as the contact details of organisations supporting victims of gender-based violence as well as how to access relevant public services, such as psychological support.

3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.

Or. en

Amendment 163 Silvia Modig

Proposal for a regulation Article 37 – paragraph 4 – point e

Text proposed by the Commission

(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

Amendment

(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information, the right to equality between women and men, the right to non-discrimination and the rights of the child;

Or. en

Amendment 164 Silvia Modig

Proposal for a regulation Article 48 – paragraph 5

Text proposed by the Commission

5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.

Amendment

5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts, in areas such as equality, in particular gender equality, LGBTIQ+ equality and non-discrimination, gender and other identity-based online violence and harassment, online sex trafficking, online stalking, child abuse, disinformation, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the Union budget as regards custom duties, or consumer protection, if necessary for the performance of its tasks. The Board shall make the results of this cooperation publicly available and easily accessible, including for persons with disabilities.

Or. en

Amendment 165 Kira Marie Peter-Hansen

Proposal for a regulation Article 48 – paragraph 5

Text proposed by the Commission

5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.

Amendment

5. The Board may invite experts and observers to attend its meetings, in particular on fundamental rights and gender equality, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.

Or. en

Amendment 166 Kira Marie Peter-Hansen

Proposal for a regulation Article 48 – paragraph 5 a (new)

Text proposed by the Commission

Amendment

5 a. The composition of the Board shall be gender balanced.

Or. en

Amendment 167 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1

Text proposed by the Commission

The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.

Amendment

The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.

Or. en

Amendment 168 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 51 – paragraph 1 – introductory part

Text proposed by the Commission

1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:

Amendment

1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:

Or. en

Amendment 169 Hilde Vautmans, Karen Melchior, María Soraya Rodríguez Ramos, Samira Rafaela

Proposal for a regulation Article 51 – paragraph 2 – introductory part

Text proposed by the Commission

2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.

Amendment

2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.

Or. en
