A 2022 Pew Research Center survey found that 95% of teenagers (aged 13-17) use YouTube and 67% use TikTok, with nearly one in three reporting near-constant use. Teenagers' average daily screen time has also risen in recent years and now hovers around five and a half hours.
With a growing number of underage users and ever more opportunities to create and share content comes a greater risk of exposure to illegal and harmful content online. The EU's landmark legislation, the Digital Services Act (DSA), responds to these child protection challenges by setting out a number of obligations that aim to keep children safe online.
How will the DSA change platforms’ trust and safety policies related to minors?
The obligations addressing child protection are spread throughout the DSA. At the most basic level, any service provider whose service is directed at or used by minors has to make its terms of service understandable to them. The most impacted, however, are likely to be online platforms: social media, video-sharing services, and many online gaming platforms, for example, will need to take measures to ensure a high level of privacy, safety, and security for minors when designing their services.
The broad wording of this new obligation is challenging, as the DSA gives little detail on which exact measures achieve compliance and which fall short. Diving into the text, there are hints of what compliance could mean: for example, services should ensure that minors can easily access the mechanisms referenced in the DSA, such as notice-and-action and complaint mechanisms. They should also take measures to protect minors from content that may impair their physical, mental, or moral development, and provide tools that enable conditional access to such content.
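To make the idea of conditional access concrete, here is a minimal sketch of what an age-gating check might look like. The DSA does not prescribe any particular implementation, and every name here (the `User` and `ContentItem` types, the `may_access` function, the age-18 cut-off, and the parental-consent flag) is an illustrative assumption, not a requirement of the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    age: Optional[int]        # None if the user's age is unknown or unverified
    parental_consent: bool = False

@dataclass
class ContentItem:
    content_id: str
    age_restricted: bool      # flagged as potentially impairing minors' development

def may_access(user: User, item: ContentItem) -> bool:
    """Conditional-access check: unrestricted content is open to everyone;
    restricted content is limited to adults or to minors with parental consent."""
    if not item.age_restricted:
        return True
    if user.age is None:
        return False          # treat unverified users conservatively
    if user.age >= 18:
        return True
    return user.parental_consent

# Example: a 15-year-old with parental consent may view restricted content
teen = User(age=15, parental_consent=True)
video = ContentItem(content_id="v123", age_restricted=True)
print(may_access(teen, video))  # True
```

In a real service, the age signal would come from an age-assurance process rather than a self-declared field, and the access policy itself would be defined by legal and trust and safety teams rather than hard-coded.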
Will there be guidance on compliant content moderation practices?
There is no obligation on the Commission to publish guidance on how platforms should safeguard their younger user base before the overall compliance deadline in February 2024. However, we can expect co-regulatory measures to be developed as part of the Better Internet for Kids+ strategy. In the meantime, companies must seek out and apply existing best practices, and develop their own measures, in order to comply.
Future best practices on keeping children safe online are also likely to be developed through the risk assessment cycles of very large online platforms. Platforms with more than 45 million monthly active users will have to assess systemic risks related to minors, such as the risk of exposure to content that may harm their physical or mental health, or that promotes addictive behavior.
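As a rough illustration of how the size threshold gates these obligations, the sketch below checks whether a platform crosses the 45 million user mark and, if so, which minor-related risk areas its assessment would need to cover. The threshold figure comes from the text above; the function names and the risk taxonomy are hypothetical assumptions for demonstration only.

```python
VLOP_THRESHOLD = 45_000_000  # "more than 45 million monthly active users" (see above)

# Hypothetical taxonomy of minor-related systemic risk areas; the DSA does not
# prescribe these category names.
MINOR_RISK_AREAS = [
    "exposure_to_content_harming_physical_or_mental_health",
    "promotion_of_addictive_behavior",
]

def is_vlop(monthly_active_users: int) -> bool:
    """True if the platform exceeds the very-large-online-platform threshold."""
    return monthly_active_users > VLOP_THRESHOLD

def minor_risk_areas_to_assess(monthly_active_users: int) -> list:
    """VLOPs must assess systemic risks to minors; smaller platforms may still
    track the same areas voluntarily as part of their own best practices."""
    return MINOR_RISK_AREAS if is_vlop(monthly_active_users) else []

# Example: a platform with 50 million monthly active users
print(minor_risk_areas_to_assess(50_000_000))
```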
How can Tremau help you?
If you are an online platform, you are likely already working hard to ensure children are protected on your service. However, whether your existing measures are enough to comply with the DSA's new obligations requires careful assessment and benchmarking against best practices.