August 26, 2023, marked a historic day for the Digital Services Act (DSA) as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) completed their inaugural risk assessment reports. These risk assessments initiate a series of annual requirements, which encompass transparency reporting, external audits, and subsequent risk assessments.
DSA deadlines, including the upcoming October 26, 2023, deadline for transparency reporting by VLOPs and VLOSEs, have garnered international attention. However, the implications of the DSA for non-VLOPs and non-VLOSEs have not received the same level of publicity. The DSA imposes substantial obligations on the approximately 10,000 online services that could fall in scope, and as the DSA-Day for all other services approaches quickly, it is essential to take stock of what they need to do and whether any lessons can be drawn from the experiences of VLOPs and VLOSEs.
How will the DSA impact non-VLOPs?
The European Union’s online ecosystem predominantly consists of intermediary services, hosting services, and online platforms, which together account for roughly 90% of the landscape. The DSA employs a tiered obligation structure, wherein all three of these service categories are subject to general obligations. These obligations include cooperating with national authorities when required, designating EU-specific points of contact, engaging in transparency reporting, and more. Additionally, hosting services and online platforms have specific obligations tailored to their functions. All intermediary services, hosting services, and online platforms must meet the obligations outlined in the DSA and in the table below by February 17, 2024:
D(SA) Day 1 is approaching quickly and preparation is key
While a six-month timeframe may appear ample at the moment, fulfilling these obligations is a formidable undertaking, even for well-established platforms. Some obligations apply to every single service, including micro and small enterprises, and they will not be easy to comply with. A notice and action mechanism may seem simple at first, but ensuring that notices are processed in a ‘timely’ manner will very likely require changes to existing processes. Likewise, providing users with a statement of reasons for every content moderation decision will very likely require resources and time to build out, both operationally and technically.
Further, take transparency reporting, a requirement that applies to every medium-sized online service. It requires not only details of legal orders received from Member States, categorized by type of illegal content, but also of notices submitted by trusted flaggers, along with an explanation of the actions taken in accordance with terms and conditions or for legal compliance. Furthermore, services are obligated to provide comprehensive insight into their content moderation systems, whether human or automated, classified by type of violation. Finding this complex? There’s more: as businesses are well aware, content moderation is an inherently imprecise task, so under Article 15 of the DSA, services must also provide indicators of accuracy, error rates of automated moderation, and the safeguards implemented, to satisfy the EU Member States. These are merely some of the many requirements that intermediary services, hosting services, and online platforms must adhere to, and their impact will vary from business to business. What is certain is that preparation, if not already in progress, should start immediately to ensure compliance.
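As an illustration of the kind of aggregation an Article 15-style report implies, the sketch below tallies hypothetical moderation actions by category and source, and derives a crude error-rate indicator for automated decisions. The record fields, category taxonomy, and metric names are assumptions for illustration, not the DSA’s official reporting format.

```python
from collections import Counter

def summarize_report(actions: list[dict]) -> dict:
    """Aggregate hypothetical moderation-action records into report figures.

    Each record looks like:
    {"source": "trusted_flagger", "category": "hate_speech",
     "automated": True, "overturned_on_appeal": False}
    """
    by_category = Counter(a["category"] for a in actions)
    by_source = Counter(a["source"] for a in actions)
    automated = [a for a in actions if a["automated"]]
    overturned = [a for a in automated if a["overturned_on_appeal"]]
    # A crude accuracy indicator for automated moderation: the share of
    # automated decisions later reversed through an appeal.
    error_rate = len(overturned) / len(automated) if automated else 0.0
    return {
        "actions_by_category": dict(by_category),
        "notices_by_source": dict(by_source),
        "automated_decisions": len(automated),
        "automated_error_rate": round(error_rate, 3),
    }
```

The point of the sketch is that producing these figures presupposes that every moderation action is logged with its source, category, and appeal outcome from day one; retrofitting that logging is usually the hard part.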
How can Tremau help?
Recognizing that compliance with the DSA’s provisions is an ongoing commitment, businesses must establish safety and compliance mechanisms with scalability in mind. This is where Tremau steps in: we offer comprehensive compliance solutions not only for today’s regulatory obligations but also for upcoming ones, such as the UK’s Online Safety Bill. Backed by a team of regulatory experts, Tremau simplifies your transition to global compliance obligations, safeguarding your users, optimizing content moderation efficiency, and bolstering your organization’s reputation.