
Peeling back the screen: lessons from the first DSA transparency reports

The nineteen designated very large online platforms (VLOPs) and very large online search engines (VLOSEs) have published their first six-monthly transparency reports under Articles 15, 24 and 42 of the EU’s Digital Services Act (DSA). These reports provide valuable insight into how intermediary services can understand and deliver on their respective reporting obligations.

Transparency reporting recap: The why, who, when, and what?

Why: The DSA requires intermediary services to make detailed disclosures and report on complex metrics relating to their content moderation activities. Articles 15, 24 and 42 weave together a web of reporting obligations that have now, for the first time, been put into practice by VLOPs and VLOSEs. The new reports form part of the DSA’s ambition to increase platform accountability to users through transparency and, in turn, to strengthen user trust in platforms.

Who: Under the DSA, transparency reporting obligations follow the regulation’s broader risk-based framework: VLOPs and VLOSEs, the largest platforms and search engines, have the most rigorous obligations, reflecting their elevated risk profile, while less burdensome obligations apply to the remaining medium and large online services.

When: VLOPs and VLOSEs need to publish transparency reports “at least every six months”, and the first of these were published over the past fortnight. Other services (excluding small and micro enterprises) must publish reports “at least once a year” from 17 February 2024, meaning their first reports are due by 17 February 2025 at the latest.

What: Each of the DSA’s reporting articles (15, 24 and 42) contains specific requirements that apply to different categories of intermediary services.

Key insights: What can we learn from the first VLOP and VLOSE transparency reports?

  1. Locating and adjusting standard metrics was not necessarily easy, even for VLOPs

Surveying the reports reveals a good deal of variance in reporting practices between VLOPs and VLOSEs. Reporting on DSA-specific instruments, such as the notice-and-action mechanism and complaints under Article 20, required the implementation of novel reporting practices. In some cases, platforms noted that data was not yet available to provide the full metric breakdowns required by the DSA. Most VLOPs and VLOSEs had also never scoped their pre-DSA transparency reports to the EU, which created an additional challenge in isolating EU-specific data.

While this may no longer be a concern for VLOPs, all platforms should keep in mind that, for reporting purposes, they will need to distinguish user notices submitted under the DSA notice-and-action mechanism from any other user reporting channels they operate. This requires the capacity to identify which notices allege illegal content and which were submitted by EU users.
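As a rough illustration of that last point, the sketch below shows one way an intake pipeline might tag incoming reports so that Article 16 notices from EU users can be isolated later. The channel names, schema, and helper function are assumptions made for illustration, not a structure prescribed by the DSA.

```python
# Minimal sketch of intake tagging so that Article 16 notices from EU users can
# be isolated for transparency reporting. Channel names and schema are
# illustrative assumptions, not a structure prescribed by the DSA.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class ReportChannel(Enum):
    DSA_ARTICLE_16 = "dsa_article_16"   # notice alleging illegal content
    COMMUNITY_GUIDELINES = "community"  # ordinary in-app report
    TRUSTED_FLAGGER = "trusted_flagger"


@dataclass
class UserNotice:
    notice_id: str
    channel: ReportChannel
    alleged_illegal: bool   # whether the notifier cited a legal basis
    reporter_country: str   # ISO 3166-1 alpha-2, e.g. "DE"
    submitted_at: datetime


EU_COUNTRIES = {"AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
                "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
                "PL", "PT", "RO", "SK", "SI", "ES", "SE"}


def counts_towards_article_16_metrics(notice: UserNotice) -> bool:
    """True if the notice should enter the DSA notice-and-action figures."""
    return (notice.channel is ReportChannel.DSA_ARTICLE_16
            and notice.alleged_illegal
            and notice.reporter_country in EU_COUNTRIES)


# Example: only the first notice below enters the Article 16 counts.
notices = [
    UserNotice("n1", ReportChannel.DSA_ARTICLE_16, True, "FR",
               datetime.now(timezone.utc)),
    UserNotice("n2", ReportChannel.COMMUNITY_GUIDELINES, False, "FR",
               datetime.now(timezone.utc)),
]
print(sum(counts_towards_article_16_metrics(n) for n in notices))  # -> 1
```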

  2. Transparency reporting can be used to communicate compliance and T&S efforts

Far from treating it as a simple compliance exercise, most VLOPs and VLOSEs have taken their reports as an opportunity to communicate their commitment to transparency, user trust, and safe online spaces. Many have described policies and initiatives that extend beyond the strict requirements, such as additional language moderation capacity and information sharing between platforms. It is clear that transparency reporting will increase pressure on intermediary services to demonstrate a serious commitment to user safety.

  3. What should DSA transparency reports look like?

There are some specific formatting requirements under Articles 15, 24 and 42. All reports must be made “publicly available”, in a “machine readable format” and in an “accessible manner”. Most VLOPs and VLOSEs have sought to meet this requirement by providing a PDF file or webpage with qualitative descriptions and quantitative data in tables, with some also including CSV files. At this stage, it is unclear whether machine readability and accessibility call for two separate files or whether a single format can satisfy both requirements.
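Pending further guidance, one pragmatic reading is to publish the same figures twice: once in a machine-readable file and once in an accessible, human-readable summary. The snippet below is a minimal sketch of that approach; the metric names and values are placeholders, not real report data.

```python
# Minimal sketch: publish the same placeholder metrics twice, once as a
# machine-readable CSV and once as a human-readable text summary, pending
# Commission guidance on the expected format.
import csv

metrics = [
    {"metric": "article_16_notices_received", "value": 12345},
    {"metric": "own_initiative_removals", "value": 6789},
]

# Machine-readable output
with open("dsa_report.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["metric", "value"])
    writer.writeheader()
    writer.writerows(metrics)

# Accessible, human-readable summary
with open("dsa_report_summary.txt", "w") as f:
    for row in metrics:
        label = row["metric"].replace("_", " ").capitalize()
        f.write(f"{label}: {row['value']:,}\n")
```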

  4. Be prepared to calculate the accuracy of automated tools

VLOPs and VLOSEs adopted varying techniques to calculate the accuracy of their automated tools. Many used successful appeals against automated decisions as a proxy for accuracy. A different methodology applied by Google involved comparing human evaluations of a random sample of user content against decisions made by its automated systems on that same content, with the human evaluation assumed as the ground truth. Alternatively, Meta described a methodology for measuring random samples of automated decisions against pre-established expectations for policy enforcement.

Companies have generally reported accuracy metrics only for automated enforcement tools. In many cases, however, automated tools are limited to detecting content for human review, and there is little established practice on how to report the accuracy of such detection tools. In these scenarios, there is no appeal-based metric to operationalise as a proxy for accuracy.

A key takeaway for all services is to ensure that they store the necessary data to provide an indication of accuracy rates, and develop a thoughtful explanation of decisions made in their accuracy methodology. Beyond the DSA, strong accuracy reporting practices can support better business decisions regarding investments in and quality assurance of automated tools. 
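As a minimal sketch of the sampling approach described above (human evaluation of a random sample treated as ground truth, as in the methodology Google describes), the snippet below estimates accuracy as the agreement rate between automated decisions and human labels. All data and function names here are illustrative assumptions.

```python
# Minimal sketch of a sampling-based accuracy estimate: human reviewers label a
# random sample of items and their labels are treated as ground truth for the
# automated system's decisions. All names and data below are illustrative.
import random


def accuracy_on_sample(automated_decisions: dict[str, bool],
                       human_labels: dict[str, bool]) -> float:
    """Share of sampled items where the automated decision matched the human label."""
    agreements = sum(automated_decisions[item_id] == label
                     for item_id, label in human_labels.items())
    return agreements / len(human_labels)


# Toy data: 100,000 automated decisions (True = actioned), a 1,000-item sample,
# and synthetic human labels that agree with the automation ~95% of the time.
automated = {f"item-{i}": random.random() < 0.1 for i in range(100_000)}
sample_ids = random.sample(list(automated), k=1_000)
human = {item_id: automated[item_id] if random.random() < 0.95
         else not automated[item_id]
         for item_id in sample_ids}

print(f"Estimated accuracy: {accuracy_on_sample(automated, human):.1%}")
```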

  5. Who should be counted as “human resources” in content moderation?

The novelty of reporting on content moderation “human resources” led to a variety of approaches, ranging from the inclusion to the exclusion of ML scientists, data analysts, software developers, legal teams, and policy specialists alongside human reviewers. Most VLOPs and VLOSEs have sought to satisfy this requirement by providing relatively expansive figures covering the full range of human resources involved in content moderation-adjacent activities, with a sub-category for those engaged in human content review.

While other online platforms are not obliged to share breakdowns of human resources by Member State, it is noteworthy that many VLOPs and VLOSEs share the challenge of covering the full range of official Member State languages in their content moderation teams. To overcome this challenge in the near term, many have integrated language translation capabilities into their content moderation processes.

  6. Prepare for large volumes of user notices

The reports reveal that hosting services will need the capacity to process, and report on, a potentially high volume of user reports under the notice-and-action mechanism. Across its five-month reporting period, Meta processed 524,821 Article 16 notices on Facebook and 351,403 on Instagram. Amazon processed 417,846 across six months. TikTok processed 36,165 during its one-month reporting period, and YouTube processed 42,090 notices in its two-week reporting period alone.

While many platforms have experience implementing and managing a user reporting mechanism, the DSA requires hosting services to accept and process Article 16 notices “in a timely, diligent, non-arbitrary and objective manner”. This is a rather strict requirement that, in combination with large volumes of notices, warrants early preparation by platforms.   
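The DSA attaches no numeric deadline to “timely”, but Article 15 also asks for the median time needed to act on notices, so platforms will want to track handling times either way. The sketch below assumes a purely illustrative internal 48-hour target and computes the median handling time plus the share of notices resolved within that target; the target and timestamps are hypothetical.

```python
# Minimal sketch for tracking handling times of Article 16 notices. The DSA sets
# no numeric deadline, so the 48-hour SLA below is a purely illustrative
# internal target; the notice timestamps are toy data.
from datetime import datetime, timedelta
from statistics import median

SLA_TARGET = timedelta(hours=48)


def median_handling_time(log: list[tuple[datetime, datetime]]) -> timedelta:
    """Median time between receipt and resolution of a notice."""
    return median(resolved - received for received, resolved in log)


def share_within_sla(log: list[tuple[datetime, datetime]]) -> float:
    """Fraction of notices resolved within the internal SLA target."""
    return sum((resolved - received) <= SLA_TARGET
               for received, resolved in log) / len(log)


# Example with two toy notices: one resolved in 12 hours, one in 72 hours.
log = [
    (datetime(2023, 10, 1, 9, 0), datetime(2023, 10, 1, 21, 0)),
    (datetime(2023, 10, 2, 9, 0), datetime(2023, 10, 5, 9, 0)),
]
print(median_handling_time(log), f"| {share_within_sla(log):.0%} within target")
```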

  7. What can intermediary services expect from government orders?

The DSA requires intermediary services to receive and process orders from Member State authorities either to act against illegal content or to provide information about specific users of the service. The reports reveal that VLOPs and VLOSEs received a much smaller volume of government orders than user notices, as might be expected. Where orders were made, they tended to be information requests rather than removal orders. Nonetheless, marketplace platforms such as Amazon still processed a significant volume of removal orders, largely related to product compliance issues (e.g. product safety and legality). Notably, some Member States, namely Germany and France, accounted for a disproportionate share of orders.

What’s next?

The Commission is likely to publish an implementing act before the end of the year that will provide more instruction on how to format reports, what data needs to be reported, and how to report it. 

Meanwhile, intermediary services can take some operational steps in preparation: 

  • Considering how to communicate and explain their content moderation processes, keeping in mind that the report is a communication tool not only for regulators but also for civil society, researchers, and users;
  • Generating and storing the data needed to fulfil DSA reporting requirements;
  • Time-stamping content moderation actions (see the sketch after this list);
  • Isolating EU-specific data (e.g. a dedicated channel for DSA user notices);
  • Preparing to process a high volume of user notices;
  • Establishing a credible methodology for calculating the accuracy of automated systems and for counting content moderation human resources; and
  • Ensuring sufficient resources to process user notices and government orders.
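On the time-stamping and EU-isolation points, the sketch below shows one possible shape for an internal moderation action log. The schema and field names are illustrative assumptions rather than a format required by the DSA; the aim is simply to capture, at the moment of action, the attributes that later feed the report’s breakdowns.

```python
# Minimal sketch of an internal moderation action log entry. Field names are
# illustrative assumptions, not a schema required by the DSA; the aim is to
# capture at the moment of action the attributes later needed for breakdowns
# (time stamp, EU relevance, legal vs. terms-of-service basis, automation).
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class ModerationAction:
    action_id: str
    content_id: str
    action_type: str    # e.g. "removal", "demotion", "account_suspension"
    taken_at: datetime  # time-stamped when the action is taken
    eu_relevant: bool   # needed to isolate EU-specific figures
    basis: str          # "illegal_content" or "terms_of_service"
    automated: bool     # whether the decision was fully automated
    triggered_by: str   # "article_16_notice", "own_initiative", "authority_order"

    def to_json(self) -> str:
        record = asdict(self)
        record["taken_at"] = self.taken_at.isoformat()
        return json.dumps(record)


action = ModerationAction(
    action_id="a-001", content_id="c-42", action_type="removal",
    taken_at=datetime.now(timezone.utc), eu_relevant=True,
    basis="illegal_content", automated=False, triggered_by="article_16_notice",
)
print(action.to_json())
```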

How can Tremau help?

The DSA transparency reporting process is one part of an increasingly complex global regulatory landscape for online platforms. This is where Tremau steps in—we offer comprehensive compliance solutions not only for the regulatory obligations of today, but for new and emerging requirements such as the UK’s Online Safety Act. Backed by a team of regulatory experts, Tremau simplifies your transition to global compliance, ensuring the safety of your users, the efficiency of your content moderation systems, and your organisation’s good reputation.  Get in touch!
