
What does Canada’s proposed Online Harms Act mean for your platform?

In the last three years, roughly 540 million people in some of the world’s most lucrative markets have come under the protection of next-generation online safety laws in the European Union (EU), United Kingdom (UK), and Australia. The dominoes are falling, and Canada is poised to be next.

Canada’s Bill C-63, otherwise known as the Online Harms Act (OHA), is part of a continuing shift from specific notice-and-takedown obligations to general-scope regulatory models based on platform responsibility and transparency. Introduced in the Canadian Parliament in late February 2024, the Bill is currently before parliamentary committees, where it may be amended before facing a vote.

Am I in scope?

The OHA applies to social media services. These are defined as “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.” 

This appears to be a narrower category of online services than those addressed by the Digital Services Act (DSA) in the EU and the Online Safety Act (OSA) in the UK. However, the scope may be interpreted broadly. For instance, the Act expressly expands the definition beyond mainstream understandings of social media to include “adult content services” (e.g. Pornhub) and “live streaming services” (e.g. Twitch).

Not all social media services will fall into scope; they must either meet a minimum user threshold or be “designated”. The minimum user threshold is set by the Government and may vary between different types of social media services. Designation is also at the Government’s discretion: a service can be designated irrespective of its user-base size if there is “a significant risk that harmful content is accessible.”
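
To make the scope test concrete, here is a minimal Python sketch of how a provider might model the two routes into scope described above. The thresholds and service categories are hypothetical placeholders: the real figures will be set by the Government through regulation and may differ by type of service.

```python
from dataclasses import dataclass

# Hypothetical placeholder thresholds: the actual figures will be set by
# regulation and may vary between types of social media services.
USER_THRESHOLDS = {
    "social_media": 1_000_000,
    "adult_content": 500_000,
    "live_streaming": 500_000,
}

@dataclass
class Service:
    name: str
    category: str              # e.g. "social_media", "adult_content", "live_streaming"
    monthly_users_canada: int  # users accessing the service in Canada
    designated: bool = False   # designated by the Government despite user numbers

def in_scope(service: Service) -> bool:
    """Return True if the service would fall under the OHA in this simplified model."""
    threshold = USER_THRESHOLDS.get(service.category)
    meets_threshold = threshold is not None and service.monthly_users_canada >= threshold
    # A service can also be designated irrespective of its user base if there is
    # "a significant risk that harmful content is accessible".
    return meets_threshold or service.designated

# Example: a mid-sized live streaming service that has been designated.
print(in_scope(Service("example-live", "live_streaming", 200_000, designated=True)))  # True
```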

Four key duties

Bill C-63 will make regulated services accountable at a systemic level for protecting their users from harmful content. This accountability is imposed through four principal duties:

  1. To act responsibly by implementing measures to adequately mitigate the risk that users will be exposed to harmful content;
  2. To protect children by integrating design features respecting the protection of children; 
  3. To make non-consensually distributed intimate images and child sexual abuse material inaccessible within 24 hours (see the sketch after this list); and
  4. To keep all records that are necessary to determine compliance.
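
The third duty carries the only hard deadline in the Act. As a purely illustrative sketch (the function names below are our own, not statutory terms), a moderation pipeline could track that window as follows:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Statutory window for making flagged child sexual abuse material or
# non-consensually distributed intimate images inaccessible.
REMOVAL_WINDOW = timedelta(hours=24)

def removal_deadline(flagged_at: datetime) -> datetime:
    """Return the time by which the flagged content must be made inaccessible."""
    return flagged_at + REMOVAL_WINDOW

def is_overdue(flagged_at: datetime, now: Optional[datetime] = None) -> bool:
    """Check whether the 24-hour window has already elapsed."""
    now = now or datetime.now(timezone.utc)
    return now > removal_deadline(flagged_at)
```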

Seven categories of harmful content

Unlike the DSA, which imposes responsibilities relating to all forms of illegal content alongside risks related to disinformation and electoral integrity, obligations under the OHA apply only in relation to seven explicit forms of “harmful content”:

  1. Intimate content communicated without consent;
  2. Content that sexually victimises a child or revictimises a survivor;
  3. Content that induces a child to harm themselves;
  4. Content used to bully a child;
  5. Content that incites hatred;
  6. Content that incites violence; and
  7. Content that incites violent extremism or terrorism.
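
For trust & safety teams mapping an internal moderation taxonomy to the statute, these categories can be represented as a simple enumeration. The identifier names below are our own shorthand, not terms from the Bill, and the internal labels in the mapping are hypothetical examples.

```python
from enum import Enum

class OHAHarmCategory(Enum):
    """The seven categories of harmful content under Bill C-63 (labels are illustrative)."""
    NON_CONSENSUAL_INTIMATE_CONTENT = 1
    CHILD_SEXUAL_VICTIMISATION = 2
    CHILD_SELF_HARM_INDUCEMENT = 3
    CHILD_BULLYING = 4
    INCITEMENT_TO_HATRED = 5
    INCITEMENT_TO_VIOLENCE = 6
    INCITEMENT_TO_VIOLENT_EXTREMISM_OR_TERRORISM = 7

# Hypothetical mapping from an internal policy taxonomy to the statutory categories,
# so that moderation metrics can be broken down per category of harmful content.
INTERNAL_TO_OHA = {
    "ncii": OHAHarmCategory.NON_CONSENSUAL_INTIMATE_CONTENT,
    "csam": OHAHarmCategory.CHILD_SEXUAL_VICTIMISATION,
    "hate_speech": OHAHarmCategory.INCITEMENT_TO_HATRED,
}
```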

Risk mitigation and the digital safety plan

Through its duty to “act responsibly”, the OHA takes an approach comparable to the DSA’s systemic risk management model. Under this duty, social media services must implement “adequate measures” to mitigate the risk of harm. Some of these measures are expressly prescribed, such as tools for users to report harmful content, while others may need to be identified and implemented by service providers themselves in order to meet the adequacy test.

Services will also be required to prepare a digital safety plan in order to comply with their duty to “act responsibly”. This plan has the combined effect of a risk assessment, self-audit, and transparency report. It must:

  • Assess the risk that users will be exposed to harmful content;
  • List and describe mitigation measures;
  • Assess the effectiveness of these measures; 
  • Describe the indicators used to assess effectiveness; 
  • Provide a range of information relating to content moderation resources and practices (e.g. volume and type of harmful content moderated); and
  • Disclose any internal research conducted in relation to harmful content and design features. 
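
As a rough illustration of what that combined document could look like in structured form, here is a hedged sketch of a digital safety plan record. The field names are our own shorthand for the requirements listed above; they are not prescribed by the Act, and the final content requirements will depend on regulations.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MitigationMeasure:
    description: str                     # what the measure is and how it works
    effectiveness_assessment: str        # how well it mitigates exposure to harmful content
    effectiveness_indicators: List[str]  # indicators used to assess effectiveness

@dataclass
class DigitalSafetyPlan:
    """Illustrative structure mirroring the plan's required contents (field names are not statutory)."""
    reporting_period: str                # e.g. "2025-01-01 to 2025-12-31"
    risk_assessment: str                 # risk that users will be exposed to harmful content
    mitigation_measures: List[MitigationMeasure] = field(default_factory=list)
    moderation_stats: Dict[str, int] = field(default_factory=dict)  # e.g. volume moderated, per harm category
    internal_research_disclosures: List[str] = field(default_factory=list)
    publicly_available_url: str = ""     # the plan must be made publicly accessible
```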

The OHA gives the Government the power to establish how frequently plans must be submitted, as well as the reporting period to which they apply. While further clarity will be required, it is an explicit requirement that the digital safety plan be made publicly available in an accessible format.

Child safety is a priority

Alongside the broadly defined duty to “act responsibly”, social media services will also be subject to a specific duty to “protect children”. Under this duty, service providers must implement dedicated measures to protect children within their broader risk management strategy and integrate features such as “age appropriate design”.

The child safety duty in the Online Harms Act, with its emphasis on “design features”, suggests that this law may impose regulatory scrutiny on so-called “addictive design”, a subject of growing legislative and judicial attention in both the US and EU. Given the broad scope and lack of definitional clarity around “addictive design”, regulatory scrutiny of this issue could leave service providers with broader responsibilities under the Online Harms Act than under its regulatory peers in the EU and UK.

A new regulator—the Digital Safety Commission

The OHA will establish a new regulator, the Digital Safety Commission, and equip it with extensive powers to monitor and enforce the regulation. Some of its powers appear to be modelled on those of the Australian eSafety Commissioner, including the ability to issue removal notices against child sexual abuse or non-consensual intimate content. Other powers to investigate, audit and penalise social media services for non-compliance are comparable to the European Commission’s enforcement functions under the DSA.

How can Tremau help?

A new era of online safety regulation is upon us. Online platforms are increasingly subject to strict online safety obligations in key markets across the globe. The compliance challenge will only grow more complex and harder to avoid, and the consequences of failure include heavy fines and significant reputational harm. Our advisory team can help you decipher and disentangle the complex web of online safety regulation, and ensure efficient and functional compliance. Send us an email at info@tremau.com to get in touch!
