Over the last three years, roughly 540 million people in some of the world’s most lucrative markets have come under the protection of next-generation online safety laws in the European Union (EU), the United Kingdom (UK), and Australia. The dominoes are falling, and Canada is next in line.
Canada’s Bill C-63, also known as the Online Harms Act (OHA), is part of a continuing shift from specific notice-and-takedown obligations to general-scope regulatory models built on platform responsibility and transparency. Introduced in the Canadian Parliament in late February 2024, the Bill is currently at the committee stage, where it may be amended before facing a vote.
Am I in scope?
The OHA applies to social media services. These are defined as “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.”
This appears to be a narrower category of online services than those addressed by the Digital Services Act (DSA) and the Online Safety Act (OSA) in the EU and UK respectively. However, scope may be interpreted broadly. For instance, the Act expressly expands this definition beyond mainstream understandings of social media to include “adult content services” (e.g. Pornhub) and “live streaming services” (e.g. Twitch).
Not all social media services will fall into scope; a service must either meet a minimum user threshold or be “designated”. The minimum user threshold is set by the Government and may, at its discretion, vary between different types of social media services. Designation is likewise at the Government’s discretion: a service can be designated irrespective of the size of its user base if there is “a significant risk that harmful content is accessible.”
Four key duties
Bill C-63 will make regulated services accountable at a systemic level for protecting their users from harmful content. This accountability is imposed through four principal duties:
- To act responsibly by implementing measures to adequately mitigate the risk that users will be exposed to harmful content;
- To protect children by integrating design features respecting the protection of children;
- To make non-consensually distributed intimate images and child sexual abuse material inaccessible within 24 hours; and
- To keep all records that are necessary to determine compliance.
Seven categories of harmful content
Unlike the DSA, which imposes responsibilities relating to all forms of illegal content alongside risks related to disinformation and electoral integrity, obligations under the OHA only apply in relation to seven explicit forms of “harmful content”:
- Intimate content communicated without consent;
- Content that sexually victimises a child or revictimises a survivor;
- Content that induces a child to harm themselves;
- Content used to bully a child;
- Content that incites hatred;
- Content that incites violence; and
- Content that incites violent extremism or terrorism.
Risk mitigation and the digital safety plan
The OHA’s duty to “act responsibly” is comparable to the DSA’s systemic risk management approach. Under this duty, social media services must implement “adequate measures” to mitigate the risk of harm. Some of these measures are expressly prescribed, such as tools for users to report harmful content, while others may need to be identified and implemented by service providers to meet the adequacy test.
Services will also be required to prepare a digital safety plan in order to comply with their duty to “act responsibly”. This plan has the combined effect of a risk assessment, self-audit, and transparency report. It must:
- Assess the risk that users will be exposed to harmful content;
- List and describe mitigation measures;
- Assess the effectiveness of these measures;
- Describe the indicators used to assess effectiveness;
- Provide a range of information relating to content moderation resources and practices (e.g. volume and type of harmful content moderated); and
- Disclose any internal research conducted in relation to harmful content and design features.
The OHA gives the Government the power to establish how frequently plans must be submitted, as well as the reporting period to which they apply. While further clarity will be required on these points, the Act explicitly requires that the digital safety plan be made publicly available in an accessible format.
Child safety is a priority
Alongside the broadly defined duty to “act responsibly”, social media services will also be subject to a specific duty to “protect children”. Under this duty, service providers must implement dedicated measures to protect children within their broader risk management strategy and integrate features such as “age appropriate design”.
The child safety duty in the Online Harms Act, with its emphasis on “design features”, suggests that the law may bring regulatory scrutiny to so-called “addictive design”. This is a subject of growing legislative and judicial attention in both the US and the EU. Given the broad scope and lack of definitional clarity around “addictive design”, regulatory scrutiny of this issue may magnify service provider responsibilities under the Online Harms Act relative to its existing regulatory peers in the EU and UK.
A new regulator—the Digital Safety Commission
The OHA will establish a new regulator, the Digital Safety Commission, and equip it with extensive powers to monitor and enforce the regulation. Some of its powers appear to be modelled on the role of the Australian eSafety Commissioner, including the ability to issue removal notices against child sexual abuse or non-consensual intimate content. Other powers to investigate, audit and penalise social media services for non-compliance are comparable to the European Commission’s enforcement functions under the DSA.
How can Tremau help?
A new era of online safety regulation is upon us. Online platforms are increasingly subject to strict online safety obligations in key markets across the globe. The compliance challenge will only grow more complicated and more inescapable, and the consequences of failure include heavy fines and significant reputational harm. Our advisory team can help you decipher and disentangle the complex web of online safety regulation and ensure efficient, functional compliance. Send us an email at info@tremau.com to get in touch!