On 11 May 2022, the European Commission proposed a new set of rules to combat child sexual abuse material (CSAM) online, laying out new obligations for providers to detect, report, and remove CSAM from their services. (The European Parliament recommends the term "child sexual abuse material" rather than "child pornography" to accurately describe the exploitation and abuse of children and to protect the dignity of victims.) The proposal follows efforts such as the 2020 EU Strategy for a More Effective Fight Against Child Sexual Abuse and falls under the Commission's recent EU Strategy on the Rights of the Child.
Concretely, the proposed regulation builds on the upcoming Digital Services Act and aims to replace the current interim solution on the processing of personal and other data for the purpose of combating online child sexual abuse.
New obligations for Online Service Providers
The obligations set forth in the new proposal are directed at all online service providers (OSPs) operating in EU Member States, including providers of hosting, interpersonal communication services, and app stores. Currently, the new obligations discussed include:
| Obligation | Description |
| --- | --- |
| Mandatory risk assessment and risk mitigation measures | Online service providers (OSPs) will be required to assess the risks of their services being misused for grooming (the solicitation of children) or for the dissemination of CSAM. Appropriate risk mitigation measures will subsequently need to be taken, and the results of the risk assessments must be reported to the competent national authorities in the relevant Member State. |
| Reduced exposure to grooming | App stores will need to assess whether any apps on their platform are at risk of being used for solicitation. Reasonable measures should subsequently be taken to identify child users and prevent them from accessing such apps. |
| Proactive content detection | OSPs should carry out proactive content detection using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies may only be used to detect child sexual abuse, and OSPs will need to demonstrate that the technology used is proportionate. |
| Removal and blocking orders | National authorities can issue removal orders in cases where CSAM is not swiftly taken down, and hosting providers will be required to disable access to a server hosting CSAM that cannot be taken down. |
| Reporting obligations | OSPs must report any detected CSAM to the relevant authorities and to the newly created EU Centre. |
| Data collection and transparency obligations | OSPs will be required to collect aggregated data relating to their processes and activities under the regulation and make the relevant information available to the EU Centre. An annual transparency report should also be published and made accessible to the general public. |
| Single point of contact | OSPs should establish a single point of contact for direct communication with Coordinating Authorities, other competent authorities of the Member States, the Commission, and the EU Centre. |
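To make the "proactive content detection" obligation concrete, the sketch below shows the simplest form such detection could take: comparing a cryptographic hash of uploaded content against a list of known indicators. This is an illustration only; the indicator value here is a hypothetical stand-in for the verified indicators the EU Centre would provide, and real deployments typically rely on perceptual hashing (e.g. PhotoDNA-style techniques) rather than exact SHA-256 matching.

```python
import hashlib

# Hypothetical indicator list, standing in for the verified hashes
# that the EU Centre would provide under the proposed regulation.
# (This value is the SHA-256 digest of the string "test".)
KNOWN_INDICATORS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_indicator(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest is on the indicator list."""
    return hashlib.sha256(content).hexdigest() in KNOWN_INDICATORS

# A match would trigger the provider's reporting obligations;
# a non-match would pass through unflagged.
print(matches_known_indicator(b"test"))    # True
print(matches_known_indicator(b"benign"))  # False
```

Exact-hash matching only catches previously identified material, which is why the proposal's scope (covering new material and grooming) raises the proportionality questions noted above.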
Enforcement Measures and Heavy Penalties
The regulation presented by the Commission also proposes the creation of an independent EU Centre on Child Sexual Abuse that will act as a "hub of expertise, provide reliable information, identify and analyse any erroneous reports, forward relevant reports to law enforcement, and provide victim support". The EU Centre will work alongside online service providers, national law enforcement agencies and Europol, Member States, and victims.
In addition, and in line with the Digital Services Act, Member States will be required to designate competent authorities, called Coordinating Authorities, which will be responsible for the application and enforcement of the regulation. Among other powers, they will be able to impose fines (or request a judicial authority in the Member State to do so), impose periodic penalty payments to ensure that an infringement of the regulation is brought to an end, and adopt interim measures to avoid the risk of serious harm.
Penalties for infringements can reach up to 6% of the provider's annual income or global turnover in the preceding business year.
Next steps in the legislative process
The proposed regulation has a long way to go before adoption: the Commission's initial proposal will need to be agreed by both the European Parliament and the Council of the EU, first separately and then jointly. This process is likely to take up to two years.
The proposal comes as part of a European strategy for a Better Internet for Kids (BIK+) that rests on the pillars of creating a safe digital environment, empowering children in the digital world, and improving children’s active participation.
For further information on this regulation and how it can affect your organization you can contact us at firstname.lastname@example.org
Tremau Policy Team