
New EU Proposal to Combat Child Sexual Abuse Online

On 11 May 2022, the EU Commission proposed a new set of rules to combat child sexual abuse material (CSAM) online, laying out new obligations for providers to detect, report, and remove CSAM from their services. This follows efforts such as the 2020 EU Strategy for a More Effective Fight Against Child Sexual Abuse and falls under the EU's recent Strategy on the Rights of the Child.

Concretely, the new proposed regulation builds on the upcoming Digital Services Act and aims to replace the current interim solution regarding the processing of personal and other data for the purpose of combating online child sexual abuse.

New obligations for Online Service Providers

The obligations set forth in the new proposal are directed at all online service providers (OSPs) operating in EU Member States, including providers of hosting, interpersonal communication services, and app stores. Currently, the new obligations discussed include:

Mandatory risk assessment and risk mitigation measures

OSPs will be required to assess the risks of their services being misused for grooming (the solicitation of children) or for the dissemination of CSAM.

Appropriate risk mitigation measures will subsequently need to be taken by the OSPs.

OSPs will be required to report the results of risk assessments to the competent national authorities in their relevant Member State.

Reduced exposure to grooming

App stores will need to assess whether any apps on their platform are at risk of being used for solicitation.

Reasonable measures should subsequently be taken to identify child users and prevent them from accessing such apps.

Proactive content detection

Proactive content detection should be carried out by OSPs using indicators of child sexual abuse verified and provided by the EU Centre (an illustrative sketch of indicator-based matching follows this list of obligations).

Detection technologies put in place by OSPs should only be used to detect child sexual abuse.

OSPs will need to prove that the technology used for proactive content detection is proportionate.

Effective removal
National authorities can issue removal orders in cases where CSAM is not swiftly taken down, and hosting providers will be required to disable access to servers hosting CSAM that cannot be taken down.

Reporting obligations

OSPs will have to report any detected CSAM to the relevant authorities and the newly created EU Centre.

Data collection and transparency obligations

OSPs will be required to collect aggregated data relating to their processes and activities under this regulation and make the relevant information available to the EU Centre.

An annual transparency report should be published and made accessible to the general public.

Single point of contact

OSPs should establish a single point of contact for direct communication with Coordinating Authorities, other competent authorities of the Member States, the Commission, and the EU Centre.
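To make the proactive detection obligation more concrete, the sketch below shows one way an OSP might check uploaded content against a set of verified indicators, here assumed to be cryptographic hashes of known CSAM. This is a purely illustrative Python sketch: the indicator format, the hash-based approach, and every name in the code are assumptions for illustration, and the proposed regulation does not prescribe any particular detection technology.

```python
import hashlib

# Hypothetical indicator set: hex digests assumed to be verified and provided
# by the EU Centre. The real indicator format is not specified in the proposal.
VERIFIED_INDICATORS: set[str] = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",  # placeholder
}


def matches_verified_indicator(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears in the indicator set."""
    # Matching only against verified indicators mirrors the requirement that the
    # technology be used solely to detect child sexual abuse.
    return hashlib.sha256(content).hexdigest() in VERIFIED_INDICATORS


def handle_upload(content: bytes) -> str:
    """Hypothetical moderation hook: escalate matches, take no action otherwise."""
    if matches_verified_indicator(content):
        return "escalate for human review and reporting"
    return "no action"


if __name__ == "__main__":
    print(handle_upload(b"example uploaded bytes"))
```

Exact cryptographic hashes miss even slightly modified copies, which is one reason deployed systems tend to rely on perceptual hashing or classifiers instead; the proportionality requirement above concerns justifying whichever technology is ultimately chosen.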

Enforcement Measures and Heavy Penalties

The regulation presented by the Commission also proposes the creation of an independent EU Centre on Child Sexual Abuse that will act as a “hub of expertise, provide reliable information, identify and analyse any erroneous reports, forward relevant reports to law enforcement, and provide victim support”. The EU Centre will work alongside online service providers, national law enforcement agencies and Europol, Member States, and victims.

In addition, and in line with the Digital Services Act, Member States will be required to designate competent authorities, called Coordinating Authorities, which will be responsible for the application and enforcement of the regulation. Among other powers, they will be able to impose fines (or request that a judicial authority in the Member State do so), to impose periodic penalty payments to ensure that an infringement of the Regulation is brought to an end, and to adopt interim measures to avoid the risk of serious harm.

Penalties for infringements can reach up to 6% of the provider's annual income or global turnover in the preceding business year.

Next steps in the legislative process

The proposed regulation still has a long way to go before being adopted: the Commission's initial proposal will need to be agreed on by both the European Parliament and the EU Council (separately and collectively). This process is likely to take up to two years.

The proposal comes as part of a European strategy for a Better Internet for Kids (BIK+) that rests on the pillars of creating a safe digital environment, empowering children in the digital world, and improving children’s active participation.

For further information on this regulation and how it can affect your organization, you can contact us at info@tremau.com.

Tremau Policy Team
