
Navigating the Online Safety Act: Ofcom maps the road ahead

Breaking Down the Jargon

Pursuant to its mandate, Ofcom has opened a consultation on its recently published documents setting out the dos and don’ts for platforms falling under the Online Safety Act, in particular with respect to the duty to carry out an illegal content risk assessment and implement safety measures to protect users.

We, at Tremau, did the heavy lifting for you, sifting through over 1,700 pages. Here’s what you need to know:

Diverse Services, Diverse Challenges

From social media hangouts to gaming hubs, dating platforms, and beyond, Ofcom estimates that over 100,000 companies will be in scope of the Online Safety Act. But it’s not a one-size-fits-all deal. Ofcom has categorised services based on size, risk, and impact, and has noted that proportionality will be a key principle, both in terms of the scope of duties and of enforcement. Some key things to keep in mind when embarking on your compliance journey:

Size Matters: Big vs Small Services

If your user base averages more than 7 million per month in the UK, you will be categorised as a Large Service; every service that does not meet this threshold is a Small Service. Seems easy at a glance, although what constitutes a ‘user’ may not always be obvious and will require an assessment.

Levels of Risk 

Ofcom’s focus is on ensuring services reduce and manage the risk of harm stemming from illegal content. The scope of duties and compliance obligations services will have to meet will depend on their level of risk. The illegal risk assessment (touched on further below) should help services determine whether they are:

  • Low risk 🙂
  • Subject to a specific risk 😐
  • Subject to multi-risks 😨

Impact of the service

A further categorisation based on ‘impact’ will also have to be made. Impact refers to the consequences, in terms of harm to users, once illegal content has been encountered or illegal activity has taken place on the service. Ofcom has divided impact into High, Medium and Low and has linked this categorisation to user numbers: High when the user base is more than 7 million monthly UK users, Medium for a user base between 700,000 and 7 million, and Low for services falling below that level.
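To make these bands concrete, here is a minimal Python sketch of how a service might encode the size and impact thresholds in its own compliance tooling. The 7 million and 700,000 figures are the user-number bands from Ofcom’s draft as summarised above; the function names and the handling of boundary values are our own illustrative assumptions, not Ofcom terminology.

```python
# Illustrative sketch only: the 7,000,000 and 700,000 figures come from
# Ofcom's draft guidance as summarised above; function names and the
# treatment of boundary values are our own assumptions.

LARGE_SERVICE_THRESHOLD = 7_000_000   # average monthly UK users
MEDIUM_IMPACT_THRESHOLD = 700_000

def size_category(avg_monthly_uk_users: int) -> str:
    """'Large' above the 7 million threshold, otherwise 'Small'."""
    return "Large" if avg_monthly_uk_users > LARGE_SERVICE_THRESHOLD else "Small"

def impact_category(avg_monthly_uk_users: int) -> str:
    """High / Medium / Low impact, linked to the same user-number bands."""
    if avg_monthly_uk_users > LARGE_SERVICE_THRESHOLD:
        return "High"
    if avg_monthly_uk_users >= MEDIUM_IMPACT_THRESHOLD:
        return "Medium"
    return "Low"

print(size_category(8_500_000), impact_category(8_500_000))  # Large High
print(size_category(1_200_000), impact_category(1_200_000))  # Small Medium
```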

This proportionate approach, taking into account the size, level of risk, and impact of the platform, is helpful in ensuring both that the measures taken make sense for individual platforms and that the burden of compliance is heaviest where necessary. It also, however, presents an operational and compliance challenge given the wide range of classifications and considerations. It is clear that, in order to assess the applicable obligations and meet Ofcom’s expectations, platforms will need to take an approach customised to their particular service.

Illegal Content Risk Assessment

As is no doubt evident, risk governance is a major component of the OSA and of Ofcom’s mandate. To this end, and in contrast to the DSA, all services within scope of the OSA will have to conduct an illegal content risk assessment. While proportionality is also a guiding theme here, even smaller services will have to start planning for carrying out a risk assessment and prepare for the inherently ongoing nature of this obligation, taking steps to ensure re-evaluation at least annually!

Let’s break down the risk assessment duty.

What risks am I assessing?

Firstly, when Ofcom refers to risk of illegal content it actually means the following three types of activity:

  • The risk of illegal content being published or encountered on a service
  • The risk of an offence being committed on a service
  • The risk of an offence being facilitated on a service

Services will be responsible for assessing the likelihood of these activities taking place on their platforms as well as the impact of harm to users if they do.

Linked to this activity is the obvious precondition that the content or conduct in question be illegal. Illegality is extremely broad, but thankfully the OSA narrows its scope to 15 priority areas.

How do I carry out a risk assessment?

To conduct the risk assessment, services will have to follow a four-step process: identify their risk profile and risk factors, assess these risk factors, plan to mitigate them, and ensure future review and accuracy.

To start the process, services will have to consult Ofcom’s Guidance – mainly focusing on the Risk Profiles and Risk Register – and attempt to contextualise and ascertain the types of risk that are relevant to them. A relevant risk is referred to by Ofcom as a Risk Factor and relates to a service’s particular characteristics, such as the functionalities it offers users, its business model, and its use of recommender systems.

Next, they will have to adjudicate on these risk factors and score each of them as Low, Medium or High. To complete this step effectively, platforms will have to gather and consult a number of inputs.

Ofcom has clarified that all services will have to consider what it terms ‘core inputs’. These are (in addition to Ofcom’s documents) user reports and complaints, relevant user data, and analysis of past incidents. If a service is unable to effectively assess and score risk solely on the basis of its core inputs, Ofcom notes that enhanced inputs should be considered. Examples of enhanced inputs include consulting with users and commissioning external online governance experts for their insights on the likelihood and impact of illegal content on the service.
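As a rough illustration of what this scoring step could look like operationally, the sketch below records each risk factor alongside its score and the inputs consulted. Ofcom does not prescribe any particular format or data structure; everything here beyond the Low/Medium/High scores and the core/enhanced input examples mentioned above is our own assumption.

```python
# Illustrative sketch only: Ofcom does not prescribe a format for recording
# risk factor assessments. The structure, field names and input labels below
# are our own assumptions for record-keeping purposes.
from dataclasses import dataclass, field
from typing import List

CORE_INPUTS = {"ofcom_risk_profiles", "user_reports", "user_data", "past_incidents"}
ENHANCED_INPUTS = {"user_consultation", "external_expert_review"}

@dataclass
class RiskFactorAssessment:
    name: str                                   # e.g. "direct messaging"
    score: str                                  # "Low", "Medium" or "High"
    inputs_consulted: List[str] = field(default_factory=list)
    rationale: str = ""                         # why this score was given

    def relied_only_on_core_inputs(self) -> bool:
        """True if the score was reached using core inputs alone."""
        return all(i in CORE_INPUTS for i in self.inputs_consulted)

    def used_enhanced_inputs(self) -> bool:
        return any(i in ENHANCED_INPUTS for i in self.inputs_consulted)

# Example record for a hypothetical service:
dm_risk = RiskFactorAssessment(
    name="direct messaging",
    score="Medium",
    inputs_consulted=["user_reports", "past_incidents"],
    rationale="Recurring user complaints about unsolicited contact.",
)
print(dm_risk.relied_only_on_core_inputs())  # True: no enhanced inputs needed
```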

Next Step…Mitigation

The result of this assessment will bring clarity on the categorisation of the service as low risk, subject to a specific risk, or subject to multi-risks. It will also clear the way for the selection and implementation of the mitigation measures necessary to tackle the risk factors scored Medium or High.

Thankfully, platforms will be able to take inspiration from the measures in Ofcom’s Code of Practice, which was put together based on current industry best practice from around the world. Yet it is also open to services to introduce their own measures, which Ofcom refers to as ‘alternative measures’. The disadvantage of taking such an approach, however, is that additional compliance duties (such as documenting and justifying the selection of these alternative measures) will apply.
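Pulling the previous steps together, the sketch below shows one way a service might derive its overall categorisation from its scored risk factors and pick a measure for each factor scored Medium or High, flagging the extra documentation duty that comes with alternative measures. The categorisation rule used here (no Medium or High factors means low risk, one means a specific risk, two or more means multi-risk) and the measure names are simplified illustrations, not Ofcom’s exact definitions.

```python
# Illustrative sketch: the categorisation rule and the measure names below are
# simplified assumptions, not Ofcom's exact definitions or recommended measures.
from typing import Dict, List, Tuple

def categorise_service(factor_scores: Dict[str, str]) -> str:
    """Derive an overall categorisation from scored risk factors."""
    elevated = [f for f, s in factor_scores.items() if s in ("Medium", "High")]
    if not elevated:
        return "low risk"
    return "specific risk" if len(elevated) == 1 else "multi-risk"

def select_measures(
    factor_scores: Dict[str, str],
    code_of_practice: Dict[str, str],   # factor -> Code of Practice measure (placeholder)
    alternatives: Dict[str, str],       # factor -> service-designed measure (placeholder)
) -> List[Tuple[str, str, bool]]:
    """Pick a measure for each Medium/High factor; alternatives need extra documentation."""
    plan = []
    for factor, score in factor_scores.items():
        if score not in ("Medium", "High"):
            continue
        if factor in code_of_practice:
            plan.append((factor, code_of_practice[factor], False))
        else:
            # Alternative measures trigger additional duties to document and justify.
            plan.append((factor, alternatives[factor], True))
    return plan

scores = {"direct messaging": "Medium", "recommender system": "High", "search": "Low"}
cop = {"direct messaging": "example Code of Practice measure"}
alt = {"recommender system": "example service-designed alternative measure"}

print(categorise_service(scores))  # multi-risk
for factor, measure, needs_extra_documentation in select_measures(scores, cop, alt):
    print(factor, "->", measure, "| document & justify:", needs_extra_documentation)
```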

Recurring assessments 

The duty to carry out an illegal content risk assessment and introduce mitigation measures is not a one-off. At a minimum, it is an annually recurring duty, but the obligation to reassess particular risks can also arise if a service experiences a significant change.

A significant change can result from an Ofcom amendment to the risk factors that apply to a service or from an internal change to the functionalities or characteristics of a service. For example, if you are a social media service that did not allow child users and you amended that policy, or if you were a dating service that decided to remove an unmatch functionality, these would qualify as significant changes and would necessitate an update to the risk assessment.

You complied, now make sure you can prove it…

One additional takeaway that is readily apparent from Ofcom’s approach is the importance placed on governance procedures and on recording compliance. Robust and effective governance and documentation structures are viewed as integral to a platform’s overall management and reduction of risk and harm moving forward.

All services will have to nominate a Senior Accountable Officer for matters relating to online safety under the OSA. Ofcom wants this person to be accountable in the sense that they must be able to explain and justify actions or decisions regarding online safety risk management and mitigation to the leadership/Board of the company.

Additional duties will apply to larger and multi-risk services, such as the introduction of codes of conduct and regular training for staff dealing with content moderation and trust & safety.

Documenting and recording the steps taken or the decisions made concerning compliance with the OSA will be required of all services with no real derogation or distinction applied based on the size of the platform. 

For early-stage services in this space, this may present operational difficulties, as they will have to formalise existing channels of decision-making and create procedures to capture risk management moving forward. Larger and more established services, on the other hand, will need to consider updating their current systems and processes, particularly when it comes to keeping the company’s leadership and upper management informed of the identification, mitigation and management of the risk of illegal content on the service.

Keep an eye on our website and social channels as we will be releasing more information on the duty to carry out a risk assessment over the coming months!

Have your say

As a final takeaway, it is important to emphasise that Ofcom’s recent publications are drafts only, pending the conclusion of this consultation phase. If you wish to provide input or evidence agreeing with or contesting Ofcom’s proposed approach, submissions are being accepted until 23 February 2024.

After that, all received submissions will be reviewed and the Guidance on the risk assessment procedure and the Code of Practice will be finalised, thus kickstarting the obligation to comply with the relevant legal duties. See our article on how companies can comply with the DSA and the OSA, which provides an overview of the upcoming compliance challenges.

How Can Tremau help you?

Tremau’s expert advisory team, with extensive experience in compliance with the Digital Services Act, can help you assess your obligations and prepare an operational roadmap to compliance with the UK Online Safety Act.

In addition, Tremau offers a unified Trust & Safety content moderation platform that prioritises compliance as a service. We integrate workflow automation and AI tools, helping online service providers meet the U.K.’s Online Safety Act requirements while boosting trust & safety metrics and reducing administrative burden.  
To find out more, contact us at info@tremau.com

