
D(SA)-Day for VLOPs is behind us. What now?

Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) reached a significant milestone on August 26, their first day of compliance with the Digital Services Act (DSA). This included completing the first-ever systemic risk assessments for online platforms, which marks the beginning of a series of annual requirements, including transparency reporting, external audits, and a yearly risk assessment cycle. Below are the key things to look out for in year one of the DSA for VLOPs.

Transparency reporting  

As the next immediate step in this compliance process, all 19 designated platforms are required to submit transparency reports by October 26, 2023. Transparency reporting, as defined by Articles 15, 24, and 42 of the DSA, is a twice-yearly process in which platforms must publish publicly accessible, machine-readable reports. These reports must include extensive details on legal orders received from member states relating to various forms of illegal content, user-reported incidents, content moderation actions, user complaints, appeals processes, and more.

In short, platforms are required to report on the full range of factors that collectively influence the availability, visibility, and accessibility of information, categorized by violation type and detection method, while also including accuracy indicators for their automated detection tools. A specific requirement for VLOPs is that these accuracy indicators must be broken down by each official language of the member states.
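To make this more concrete, here is a minimal, purely illustrative sketch of how one entry in such a machine-readable report could be structured. The DSA does not prescribe a specific schema, so every field name below (violation_type, detection_method, the per-language accuracy map) is an assumption chosen for illustration only.

```python
# Illustrative sketch only: the DSA does not prescribe a schema, so all field
# names here are hypothetical examples of how the required breakdowns
# (violation type, detection method, per-language accuracy) could be encoded.
import json
from dataclasses import dataclass, asdict

@dataclass
class ModerationActionRecord:
    violation_type: str        # e.g. "illegal_hate_speech", "scam", "csam"
    detection_method: str      # e.g. "automated", "user_report", "trusted_flagger"
    actions_taken: int         # number of moderation actions in the reporting period
    automated_accuracy: dict   # hypothetical precision of automated tools, per member state language

record = ModerationActionRecord(
    violation_type="illegal_hate_speech",
    detection_method="automated",
    actions_taken=12450,
    automated_accuracy={"de": 0.91, "fr": 0.88, "pl": 0.84},  # broken down by official language
)

# A machine-readable (here: JSON) export of the record.
print(json.dumps(asdict(record), indent=2))
```

In practice a report would aggregate many such records across services, violation types, and languages; the point of the sketch is simply that the required breakdowns lend themselves to a structured, exportable format rather than narrative PDFs.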

Risk assessments prior to new functionalities   

Buried in the fine print of Article 34 is a crucial obligation for VLOPs and VLOSEs to assess, prior to deploying new functionalities, whether these are likely to have a critical impact on systemic risks in the EU. These pre-deployment assessments will likely need to be built into both product and policy change processes to evaluate whether a change would meet the ‘critical impact’ threshold. As the DSA gives little indication of which indicators or benchmarks to use for this assessment, platforms will need to develop a defensible process in-house, pending specific regulatory guidance on the issue. Presumably, where a functionality change is deemed to have critical impact, a targeted risk assessment of the relevant systemic risk areas should follow.
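As a hypothetical illustration of how such a pre-deployment check could be embedded in a product change process, the sketch below gates a feature launch on a handful of internally defined impact signals. The signal names and the threshold are assumptions, not regulatory benchmarks, and any real process would need far richer criteria.

```python
# Hypothetical sketch: a pre-deployment gate that flags when a functionality
# change may have a "critical impact" on DSA systemic risk areas (Article 34).
# Signal names and the threshold are internal assumptions, not DSA benchmarks.

SYSTEMIC_RISK_AREAS = [
    "illegal_content",
    "fundamental_rights",
    "civic_discourse_and_elections",
    "public_health_and_minors",
]

def requires_targeted_risk_assessment(change: dict, threshold: int = 2) -> bool:
    """Return True if the proposed change should trigger a targeted risk assessment."""
    impact_signals = [
        change.get("alters_recommender_system", False),
        change.get("expands_reach_of_user_content", False),
        change.get("affects_minors", False),
        change.get("changes_ad_targeting", False),
    ]
    return sum(impact_signals) >= threshold

proposed_change = {
    "name": "autoplay_feed_for_new_users",
    "alters_recommender_system": True,
    "expands_reach_of_user_content": True,
    "affects_minors": False,
    "changes_ad_targeting": False,
}

if requires_targeted_risk_assessment(proposed_change):
    print(f"Assess {proposed_change['name']} against: {', '.join(SYSTEMIC_RISK_AREAS)}")
```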

This requirement underlines that, while pulling together the inaugural risk assessment was undoubtedly a huge effort across a multitude of teams inside the VLOPs, it is only the first step. Given the cyclical nature of DSA risk assessments and the obligation to check risks prior to deploying new functionalities, such processes will likely need to be fully integrated into the daily operations of the relevant internal teams. This could mean dedicated DSA risk assessment policies and procedures, including mechanisms for incorporating feedback from regulators and auditors, a process for requesting the required data from engineering teams, and a timeline for the delivery of assessments.

Preparation for audits  

In May 2023, the European Commission initiated a public consultation on the Draft Delegated Act on audits, which sets out the details of this obligatory annual exercise. Once enacted, it will establish the methodology and approach auditors must follow when conducting annual DSA compliance audits of VLOPs and VLOSEs. Given the complex and specialized nature of these audits, it is anticipated that only a few global firms will be able to take on the task. To be well positioned to help VLOPs, these firms will likely have to find partners with niche skills in assessing content- and conduct-related risks, as well as risks linked to AI and recommender systems.

VLOPs and auditors alike are now eagerly awaiting publication of the final delegated act so that they can engage in contract negotiations and ensure there are no conflicts of interest before the audit begins. Given the breadth and novelty of the exercise, time is of the essence: audit reports must be made publicly available within three months of the completion and receipt of the audit (August 2024).

Commission’s breadcrumbs on what a risk assessment should look like  

While the Commission has not yet issued official regulatory guidance on risk assessments, it recently published a study that gives some insight into what may be expected of the risk assessments to come.

The study adopts a scenario-based approach, assessing the effectiveness of online platforms’ measures against Russian disinformation on the basis of the DSA’s risk management principles and risk mitigation requirements. Its scope is much narrower than the required VLOP risk assessments, as disinformation is only one sub-category within the wide range of illegal content, civic discourse, electoral, and fundamental rights risks to be evaluated under the DSA. The study could nevertheless set the expectation that platforms will be asked for root cause analyses of specific risk manifestations. What’s more, it gives an insight into which risks may be on the Commission’s mind this regulatory cycle, including a heightened focus on disinformation ahead of upcoming elections.

How can Tremau help?

Day one of the DSA is here for VLOPs, and it is just the beginning. With the transparency reporting deadline just around the corner, VLOPs and VLOSEs should now turn their attention to those requirements and to contracting their audits. In doing so, they should not lose sight of the mitigation roadmap outlined in their risk assessments: these measures must also be diligently implemented so that the audit report can serve as a reflection of ongoing efforts to address both inherent and residual risks.

Recognizing that compliance with the DSA’s provisions is an ongoing commitment, businesses must establish safety and compliance mechanisms with scalability in mind. This is where Tremau steps in: we offer comprehensive compliance solutions not only for today’s regulatory obligations, but also for upcoming ones such as the UK’s Online Safety Bill. Backed by a team of regulatory experts, Tremau simplifies your transition to global compliance obligations, helping you safeguard your users, optimize content moderation efficiency, and bolster your organization’s reputation.
