Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) reached a significant milestone on August 26 – their first day of compliance with the Digital Services Act (DSA). This included completing the first ever online platform systemic risk assessments, which marks the beginning of a series of annual requirements, including transparency reporting, external audits, and a yearly risk assessment cycle. Below is a list of things to look out for in year one of the DSA for VLOPs.
Transparency reports due by October 26

As the next immediate step in this compliance process, all 19 designated platforms are required to submit transparency reports by October 26, 2023. Transparency reporting, as defined by Articles 15, 24, and 42 of the DSA, is a twice-yearly process in which platforms must publish publicly accessible, machine-readable reports. These reports should include extensive details concerning legal orders received from member states related to various forms of illegal content, user-reported incidents, content moderation actions, user complaints, appeals processes, and more.
In short, platforms are required to report on all factors that collectively influence the availability, visibility, and accessibility of information, categorized by violation type and detection method, while also including accuracy indicators for their automated content moderation tools. A specific requirement for VLOPs is that these accuracy indicators must be broken down by each official language of the member states.
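To make the "machine-readable" requirement concrete, below is a minimal sketch of what one report record might look like. Note that the DSA does not prescribe an official schema, so every field name and value here is a purely illustrative assumption, not the regulatory template.

```python
import json

# Illustrative sketch only: the DSA requires machine-readable transparency
# reports but prescribes no official schema, so all field names below are
# hypothetical examples chosen for this sketch.
def build_report_entry(violation_type, detection_method, actions_taken,
                       accuracy_by_language):
    """Assemble one report record, categorized by violation type and
    detection method, with accuracy indicators broken down per official
    member-state language, as required for VLOPs."""
    return {
        "violation_type": violation_type,              # e.g. "hate_speech"
        "detection_method": detection_method,          # e.g. "automated" or "user_report"
        "actions_taken": actions_taken,                # counts of moderation actions
        "accuracy_by_language": accuracy_by_language,  # per-language accuracy of automated tools
    }

entry = build_report_entry(
    violation_type="hate_speech",
    detection_method="automated",
    actions_taken={"removals": 120, "restrictions": 30},
    accuracy_by_language={"de": 0.91, "fr": 0.88},
)
print(json.dumps(entry, indent=2))  # publish as machine-readable JSON
```

The key point the sketch illustrates is structural: each reported figure sits at the intersection of a violation type and a detection method, with accuracy indicators keyed by language rather than aggregated platform-wide.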
Risk assessments prior to new functionalities
Buried in the fine print of Article 34 is a crucial obligation for VLOPs and VLOSEs to assess, prior to deploying new functionalities, whether these are likely to have a critical impact on systemic risks in the EU. These pre-deployment assessments will likely need to be built into both product and policy change processes to evaluate whether changes would meet the ‘critical impact’ threshold. As the DSA gives little indication of what indicators or benchmarks to use for this assessment, platforms will need to develop a defensible process in-house, pending specific regulatory guidance on this issue. Presumably, where a functionality change is deemed to have critical impact, a mini risk assessment covering any relevant systemic risk areas should follow.
This requirement highlights that while pulling together the inaugural risk assessment was undoubtedly a huge effort across a multitude of teams inside the VLOPs, it is only the first step. Given the cyclical nature of the DSA risk assessments and the obligation to check risks prior to deploying new functionalities, such processes will likely need to be fully integrated into the daily operations of relevant internal teams. This could mean dedicated DSA risk assessment policies and procedures, including mechanisms for incorporating feedback from regulators and auditors, a process for requesting the required data from engineering teams, and a timeline for the delivery of assessments.
Preparation for audits
In May 2023, the European Commission initiated a public consultation on the Draft Delegated Act on audits, which sets out the details of this annual obligatory exercise. Once enacted, it will establish the methodology and approach auditors must follow when conducting annual DSA compliance audits of VLOPs and VLOSEs. Given the complex and specialized nature of these audits, it is anticipated that only a few global firms will be able to take on the task. To be well positioned to help VLOPs, these firms will likely have to find partners with niche skills in assessing content- and conduct-related risks, as well as risks related to AI and recommender systems.
VLOPs and auditors alike are now eagerly awaiting the publication of the final delegated act so they can engage in contract negotiations and ensure that there are no conflicts of interest before commencing the actual audit. Given the breadth and novelty of the exercise, time is of the essence, as audit reports must be made publicly available within three months of completion and receipt of the audit (August 2024).
Commission’s breadcrumbs on what a risk assessment should look like
While no official regulatory guidance on risk assessments has yet been issued, the Commission did recently publish a study that offers some insight into potential expectations for the risk assessments to come.
The study adopts a scenario-based approach, assessing the effectiveness of online platforms’ measures against Russian disinformation on the basis of the DSA’s risk management principles and risk mitigation requirements. Its scope is much narrower than that of the required VLOP risk assessments, as disinformation forms a sub-risk category within the wide range of illegal content, civic discourse, electoral, and fundamental rights related risks to be evaluated under the DSA. The study could nevertheless set the expectation that root cause analyses of specific risk manifestations will be expected from platforms. What’s more, it gives an insight into which risks may be on the Commission’s mind this regulatory cycle, including a heightened focus on disinformation due to upcoming elections.
How can Tremau Help?
DSA day one is here for VLOPs, and it is just the beginning. With the transparency reporting deadline just around the corner, VLOPs and VLOSEs should now turn their attention to the DSA’s transparency reporting requirements and prepare to contract their audits. In doing so, they should not lose sight of the mitigation roadmaps outlined in their risk assessments, remembering that these must also be diligently implemented so that the audit report can serve as a reflection of ongoing efforts to address both inherent and residual risks.
Recognizing that compliance with the DSA’s provisions is an ongoing commitment, businesses must establish safety and compliance mechanisms with scalability in mind. This is where Tremau steps in: we offer comprehensive compliance solutions not only for today’s regulatory obligations but also for upcoming ones, such as the UK’s Online Safety Bill. Backed by a team of regulatory experts, Tremau simplifies your transition to global compliance obligations, safeguarding your users, optimizing content moderation efficiency, and bolstering your organization’s reputation.