Digital Services Act Database

Access up-to-date trackers on Digital Services Coordinators, Trusted Flaggers, out-of-court dispute settlement bodies, enforcement actions, and the latest VLOP and VLOSE risk and transparency reports, all curated by Tremau’s T&S Research Team.

Maintained by Tremau T&S Research Team

VLOPs & VLOSEs

Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are defined as platforms and search engines with more than 45 million average monthly active users in the EU. They are subject to the highest tier of regulatory requirements under the Digital Services Act and attract the most intense enforcement scrutiny.

While the initial designations occurred in April 2023, the European Commission continues to designate new VLOPs and VLOSEs as they meet the user threshold.

The average monthly user figures are updated based on the latest transparency reports published by the platforms. A platform will lose its VLOP or VLOSE status if it has fewer than 45 million average monthly users for an entire year, and the European Commission will make a formal announcement in those cases.

The table can be filtered by DSA Classification, Country of Establishment, and Digital Services Coordinator.

Enforcement Actions

With the DSA in full effect, the European Commission (EC) has kicked off its enforcement role, sending requests for information (RFIs) and opening investigations into various platforms.

You can filter the table by platform, authority, type of enforcement action, and the DSA articles concerned.

VLOPs & VLOSEs Risk Assessments

Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are required by Article 34 of the DSA to conduct annual Risk Assessments to identify and mitigate systemic risks associated with their services.

While VLOPs and VLOSEs must submit these Risk Assessments to the European Commission, they are also obligated to publish publicly accessible summaries. These public versions may differ from the full reports provided to regulators, primarily for safety and confidentiality reasons.

In 2024, some platforms published their 2023 risk assessments, but only the 2024 systemic risk assessments had to be accompanied by audit and implementation reports.

If you notice missing risk assessment information for 2023, here’s why: VLOPs were designated in April 2023 and had to conduct their first risk assessments by August/September 2023. However, Article 42(4) of the DSA requires the first publicly available risk assessment and audit reports to cover the 2024 cycle, with publication expected in late 2024, not 2023.

VLOPs & VLOSEs Transparency Reports

The DSA requires Very Large Online Platforms and Very Large Online Search Engines to publish Transparency Reports every six months. The aim? Monitoring and documenting the internal operations carried out to ensure the safety of their services.

Read our blog to learn more, and see our checklist on how to get your systems ready for the new transparency reporting template, mandatory since July 2025.

Please note that Pornhub, XVideos, Shein, and Temu were designated as VLOPs later, which is why they did not have to publish transparency reports in 2023 or early 2024.

Digital Services Coordinators

DSCs oversee and enforce the DSA within their Member States while also contributing to its supervision and enforcement across the entire Union. Each Member State appoints one DSC from among its competent authorities. The DSC serves as the first point of contact for enforcement, coordinates national authorities, and ensures cooperation at the EU level.

Our tracker provides a manageable list of all these entities, their contact details, and their main areas of competence.

You can filter the information by competence areas and VLOPs in their jurisdiction.

Trusted Flaggers

Under the DSA, Trusted Flaggers have a special status that allows them to flag illegal content to online platforms with priority.

As Digital Services Coordinators begin designating Trusted Flaggers, our tracker offers an overview of who these actors are, where they are based, the content areas they focus on, and the VLOPs and VLOSEs under their jurisdictions.

Filter the information by Area of Expertise, Country, and Digital Services Coordinator in scope.

Out-of-court dispute settlement bodies

Article 21 of the DSA establishes “out-of-court dispute settlement bodies”: private actors to which users can bring content moderation decisions.

These bodies are responsible for either helping to resolve disputes or issuing non-binding decisions for platforms to consider.

With out-of-court dispute settlement bodies being designated by Digital Services Coordinators, our tracker helps you monitor who these bodies are, where they are based, and how to contact them.

Filter the table by country, languages, and the Digital Services Coordinator in charge.

Are we missing something? Let us know if any data needs to be reviewed.