
Across the Channel: How can companies efficiently comply with Europe’s DSA & the UK’s OSA?

The Online Safety Bill has now received Royal Assent, meaning Ofcom’s powers as online safety regulator have officially commenced. On the other side of the Channel, Europe’s Digital Services Act (DSA) has checked off some key deadlines for very large online platforms (VLOPs) and very large online search engines (VLOSEs), with all the first DSA transparency reports now published.

Online services, regardless of where they are based, will have to comply with the DSA and the OSA if they conduct – or plan to conduct – business in the EU and the UK respectively. Despite certain exemptions for small and micro enterprises in the DSA, these regulations cast a wide net that captures most online services and set a baseline of obligations that must be met to avoid heavy enforcement.

The bad news?
You’re looking at some dense and lengthy texts: the DSA stands at 102 pages and the OSA, a whopping 302.

The good news?
There is ample convergence between these two regulations, which promises a less bumpy road to compliance.

The better news?
We have done the analysis for you! Read on to find out how your teams can most efficiently comply with both the DSA & the OSA, while also preparing your systems for the regulations of tomorrow.

Back to the basics

Perhaps the most confusing part of these regulations is figuring out if they apply to you. The geographic scope of both is defined by the location of the users of your service, rather than your place of establishment. The OSA regulates services with “links with the UK”, which can mean that you have a significant number of users in the UK, that the UK is one of your target markets, or that your service can be accessed by individuals in the UK and poses a material risk of significant harm through content or search results present on your service. The DSA applies similarly to all online services with a connection to the EU, i.e. services with users in the Union or whose activities target the Union.

When should I be looking into these laws?

Both regulations define different types of services to which varying levels of obligations apply. The OSA tackles services that allow users to produce content that can then be encountered by other users – referred to as a ‘user-to-user online service’ (U2U service) – as well as search engines. The DSA has a broader definition that captures all types of online intermediary services – from those that only allow content to be transmitted or hosted, to those that also allow user-generated content to be publicly shared (referred to as online platforms).

Does size matter?

While size does matter, both laws impose a significant layer of obligations even on the smallest services. At the same time, both regulations introduce special categories for the largest services – while the DSA has already defined and designated its VLOPs/VLOSEs, the thresholds for Category 1 and Category 2A services are yet to be determined for the OSA. Nevertheless, it is clear that these services, similar to VLOPs under the DSA, will have to meet additional obligations, in proportion to their size and societal impact.

What do you need to do?

Both the DSA and the OSA contain obligations relating to processes, as opposed to the removal of individual pieces of content. While both laws outline rules related to illegal content in their respective jurisdictions, they touch on all aspects of the content moderation funnel, with the aim of protecting users from harm and empowering them to hold platforms accountable. Below we highlight the key obligations that apply to all services, regardless of size.

  1. Terms & Conditions

DSA Article 14 asks online services to be transparent about their moderation processes in their Terms & Conditions (T&Cs), and to present them in user-friendly and unambiguous language. If your service is directed at minors, you are obliged to ensure that your T&Cs can be understood by them. Your T&Cs should clearly state the policies, procedures, measures, and tools used for content moderation, and disclose to what extent algorithmic decision-making or human review is involved. If you offer the possibility of appealing a moderation decision, this should also be detailed. Moreover, you should inform your users of any significant changes to your T&Cs.

Section 10 of the OSA lays out duties similar to those in DSA Article 14, and further requires all U2U services to separately address terrorism content, child sexual exploitation and abuse (CSEA) content, and other priority illegal content, as defined by the OSA. Further, Section 73 emphasises that the T&Cs should include provisions about a user’s right to bring a claim for breach of contract if your content moderation decisions breach your terms of service.

TLDR?

  • DSA: Be transparent with your users about your content moderation in a way that is accessible and understandable.
  • OSA: Everything the DSA said, as well as specific provisions for different types of illegal content and information on a user’s right to sue the service.

  2. User reports

Under DSA Article 16, all online services need to have an easy-to-access and user-friendly notice and action mechanism in place for illegal content. Following a user report, you will also have to send a confirmation of receipt and, once a decision is taken, notify the user of the decision and any possibilities of redress. Crucially, Article 16(6) clarifies that these notices should be processed in a timely, diligent, non-arbitrary, and objective manner, which suggests that dedicated processes will be needed to ensure reports are handled to the standard the DSA requires. This obligation is largely the same as the requirements set out in Section 20 of the OSA, which specifies that all U2U services must allow users to easily report content that they consider to be illegal. For services that are likely to be accessed by children, the option to report content that is harmful to children should also be available.
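To make this concrete, below is a minimal Python sketch of the record-keeping such a notice-and-action flow implies: capturing a report, acknowledging receipt, and storing the decision and redress options that later need to be communicated back to the reporter. All class and function names are hypothetical; this is an illustration of the workflow, not a reference implementation of either law.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional
import uuid


class NoticeStatus(Enum):
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    DECIDED = "decided"


@dataclass
class IllegalContentNotice:
    """A user report ('notice') flagging allegedly illegal content."""
    content_id: str
    reporter_contact: str
    explanation: str                      # why the reporter considers the content illegal
    notice_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: NoticeStatus = NoticeStatus.RECEIVED
    decision: Optional[str] = None
    redress_options: list[str] = field(default_factory=list)


def acknowledge_receipt(notice: IllegalContentNotice) -> str:
    """Build the confirmation of receipt owed to the reporter."""
    return (f"Your report {notice.notice_id} was received on "
            f"{notice.received_at:%Y-%m-%d %H:%M} UTC and will be reviewed.")


def record_decision(notice: IllegalContentNotice, decision: str,
                    redress_options: list[str]) -> str:
    """Store the outcome and build the notification of the decision and available redress."""
    notice.status = NoticeStatus.DECIDED
    notice.decision = decision
    notice.redress_options = redress_options
    return (f"Decision on report {notice.notice_id}: {decision}. "
            f"Available redress: {', '.join(redress_options)}.")
```

The key point the sketch makes is that each report needs an audit trail of its own: when it arrived, what was decided, and what the reporter was told.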

TLDR?

  • DSA: All services need to implement a user reporting mechanism and inform the reporter of the decision taken on the report.
  • OSA: All services have to allow users to easily report illegal content. If the service is likely to be accessed by children, users should also be able to report content that is harmful to children.

  3. Statements of reasons

Under DSA Article 17, online services need to send a clear and specific statement of reasons (SoR) to a user impacted by moderation decisions, ranging from any restriction on the visibility of information – including the demotion of content – to the suspension or termination of a user’s monetary payments or their account. Article 17 details the information that will need to be included in this SoR, and more examples are available in the DSA Transparency Database, where VLOPs and VLOSEs have already begun sending their statements. If you are an online platform, from February 17th, 2024 you will also have to submit your SoRs to this database, ensuring that they do not contain personal data. Furthermore, complying with DSA Article 17 will likely bolster your compliance with the duties towards freedom of expression that are scattered across the OSA.

While both regulations aim to empower users, they adopt different regulatory solutions towards this objective. As such, complying with these requirements could mean a few tweaks to your current processes or require a complete overhaul.
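By way of illustration, the sketch below models the kind of SoR record your tooling might need to produce to cover the elements Article 17 lists – the type of restriction, the facts relied on, whether automated means were involved, the legal or contractual ground, and the available redress – together with a serialisation step that keeps personal data out of anything destined for the Transparency Database. The field names are assumptions, and the actual submission to the database is deliberately left out.

```python
from dataclasses import dataclass, asdict
from enum import Enum
from typing import Optional


class RestrictionType(Enum):
    """Decision types that typically trigger an SoR (illustrative list)."""
    VISIBILITY_RESTRICTION = "visibility_restriction"   # includes demotion of content
    MONETARY_SUSPENSION = "monetary_suspension"
    SERVICE_SUSPENSION = "service_suspension"
    ACCOUNT_TERMINATION = "account_termination"


@dataclass
class StatementOfReasons:
    """Hypothetical SoR record covering the kinds of elements Article 17 lists."""
    decision_id: str
    restriction: RestrictionType
    facts_and_circumstances: str        # what the decision was based on
    automated_detection: bool           # was the content flagged by automated means?
    automated_decision: bool            # was the decision itself taken by automated means?
    legal_ground: Optional[str]         # cited if the content was treated as illegal
    contractual_ground: Optional[str]   # cited if the content breached the T&Cs
    redress_options: list[str]          # e.g. internal complaint, out-of-court settlement


def prepare_for_database(sor: StatementOfReasons) -> dict:
    """Serialise an SoR into a plain dict, keeping personal data out of the payload.

    The actual submission to the DSA Transparency Database is not shown here;
    this only illustrates the kind of structured record your tooling needs to emit.
    """
    payload = asdict(sor)
    payload["restriction"] = sor.restriction.value
    return payload
```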

TLDR?

  • DSA: All services need to send users impacted by content moderation clear SoRs that meet Article 17 requirements.
  • OSA: No SoR obligations at the moment, but DSA compliance could potentially help with meeting the OSA’s freedom of expression duties.

  4. Complaints & appeals

Section 21 of the OSA clarifies that all U2U services need to have a procedure that allows users to complain about a variety of issues. This includes standard items such as content users consider to be illegal, appeals against content restrictions imposed on the basis of illegality, appeals against any restrictions on the use of the service on account of illegal content, as well as complaints against the use of proactive technology that users consider to be a breach of the terms of service. Interestingly, a user can also complain if they consider that the provider is not complying with certain OSA duties – giving users a role in assessing a platform’s compliance as well.

Similarly, DSA Article 20 specifies that online platforms (unless they are small or micro enterprises) need to implement an internal complaint-handling system. This needs to be available both to a user who is impacted by a moderation decision and to anyone who has submitted a notice, for at least six months following the decision. Here too, the ability to appeal should be available for any moderation decision, from the removal of content and the banning of users to any restriction on the visibility of a user’s content. Furthermore, you will need to inform complainants of your reasoned decision as well as other possibilities of redress. While the OSA does not explicitly touch upon this, Section 21 states that the complaints procedure should provide for “appropriate action to be taken” in response to complaints of a relevant kind. Ofcom may later provide guidance on the parameters of ‘appropriate action’ and specify whether it includes the need to provide a reasoned decision, indicating that complaints systems may need to evolve with regulatory guidance.

Although the DSA only requires medium-sized and larger online platforms to set up complaints procedures, the OSA obliges all U2U services to set out these procedures. Therefore, platforms that were able to breathe easy on that obligation under the DSA may need to invest in the process for the UK market.

One thing is clear: these obligations spell out the need for tools and processes that can keep track of user reports and complaints, and ensure timely communication of your decisions to your users. On the bright side, this investment can have positive effects – research shows that users’ perception of how fairly a case is handled during policy enforcement influences their future rule-following behaviour more strongly than their agreement with the decision itself. This suggests that rolling out features such as complaint procedures and SoRs globally – measures that increase user agency and perceived fairness – can be a best practice that improves user experience and ultimately brings business value.
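As a rough illustration of the admissibility logic an internal complaint-handling system implies, the Python sketch below checks that a complainant is either the affected user or someone who submitted the original notice, and that the complaint arrives within an at-least-six-month window after the decision was communicated. The parameter names and the exact window are assumptions for illustration; your own eligibility rules and retention periods may differ.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Assumed minimum window: complaints must be accepted for at least six months
# after the moderation decision was communicated to the user.
COMPLAINT_WINDOW = timedelta(days=183)


def can_file_complaint(decision_communicated_at: datetime,
                       complainant_id: str,
                       affected_user_id: str,
                       notifier_ids: set[str],
                       now: Optional[datetime] = None) -> bool:
    """Return True if the complaint is admissible under this illustrative policy.

    Eligible complainants are the user impacted by the decision and anyone who
    submitted the original notice, for at least six months after the decision.
    """
    now = now or datetime.now(timezone.utc)
    within_window = now - decision_communicated_at <= COMPLAINT_WINDOW
    eligible = complainant_id == affected_user_id or complainant_id in notifier_ids
    return within_window and eligible


# Example: a notifier complaining two months after the decision is admissible.
decided = datetime(2024, 3, 1, tzinfo=timezone.utc)
print(can_file_complaint(decided, "user_42", "user_7", {"user_42"},
                         now=datetime(2024, 5, 1, tzinfo=timezone.utc)))  # True
```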

TLDR?

  • DSA: Medium-sized and larger online platforms need to implement an internal complaint-handling system.
  • OSA: All services have to meet duties regarding complaints procedures.

  5. Transparency reports

Under the DSA, depending on the type and size of your service, you will face different transparency reporting requirements. At a base level, all services – bar small and micro enterprises – will have to report at least annually on the orders for information and/or takedown they received from competent authorities, any content moderation decisions and appeals, and any use made of automated tools in the moderation process. Article 15 also discusses the ways in which these data points need to be organised, whereas Articles 24 and 42 detail the transparency reporting obligations for online platforms and VLOPs/VLOSEs respectively.

On the other hand, only Category 1, 2A, and 2B services under the OSA have to produce annual transparency reports. These reports will only follow an Ofcom notice, which will detail the information the report must include, its format, deadline, and mode of publication. It is worth noting that many upcoming regulations include mandatory transparency reporting obligations, so investing in tools and processes that can rapidly generate transparency reports may be worthwhile.
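To illustrate what that investment might look like, the sketch below aggregates logged moderation events into the kind of headline counts an annual report draws on. The event structure and breakdowns are assumptions; the data points you must actually publish, and how they are organised, are defined by Article 15 (and Articles 24 and 42 for larger services) or by the relevant Ofcom notice under the OSA.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date


@dataclass
class ModerationEvent:
    """One logged action, as your moderation tooling might record it (illustrative)."""
    kind: str          # e.g. "authority_order", "user_notice", "own_initiative", "complaint"
    automated: bool    # whether automated tools took or assisted the decision
    occurred_on: date


def build_annual_report(events: list[ModerationEvent], year: int) -> dict:
    """Aggregate logged events into headline counts for a yearly report.

    The breakdowns here are illustrative only; the applicable regulation
    determines what must be published and in what form.
    """
    in_scope = [e for e in events if e.occurred_on.year == year]
    return {
        "year": year,
        "totals_by_kind": dict(Counter(e.kind for e in in_scope)),
        "automated_decisions": sum(e.automated for e in in_scope),
        "human_decisions": sum(not e.automated for e in in_scope),
    }
```

The practical takeaway is that transparency reporting is far easier if every moderation action is logged in a structured way from the start, rather than reconstructed at reporting time.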

TLDR?

  • DSA: Except for small and micro enterprises, all online services need to publish at least annual transparency reports.
  • OSA: Category 1, 2A, and 2B services need to publish annual transparency reports following an Ofcom notice.

How much time do you have left?

Not much. While the designated VLOPs and VLOSEs have already passed their compliance deadlines, all online services need to be compliant with the DSA by February 17th, 2024 – a short four months away, and even less if you would like to have a well-deserved end-of-year break!

The OSA is more generous: the most immediately enforceable duties will be those relating to illegal content, with Ofcom set to publish draft codes of practice this month and the duty to conduct illegal content risk assessments likely kicking in within the first half of next year. Child safety duties will come into force over a longer time frame, with Ofcom planning consultations on draft codes of practice for next spring. The additional duties applying to the largest services will be slower still to come into effect, as they depend on Ofcom categorising services – a stated third priority, after the consultations on child safety codes of practice.

So, what now?

While the convergence between these new regulations promises a smoother road to compliance, navigating their combined effect is not without its challenges. That said, compliance with the DSA and the OSA can be an avenue to health-check your current moderation processes, allowing you to find areas where effectiveness can be improved – both in moderating violative content and in providing a better experience for your users and bringing value to your business.

Confused and overwhelmed by how these regulations apply to you and the impact they could have on your service? Tremau’s expert advisory team can help you assess your obligations and prepare an operational roadmap to compliance.

Not sure whether you have the capacity to build out these tools yourself? Tremau offers a unified Trust & Safety content moderation platform that prioritizes compliance as a service. We integrate workflow automation and AI tools, helping online service providers meet the regulatory requirements of tomorrow while boosting trust & safety metrics and reducing administrative burden.
