Online platforms already risk hefty fines or exclusion from big markets such as the EU for non-compliance with strict new online safety laws. Increasingly, they are also exposed to reputational risks from transparency measures in smaller jurisdictions such as Australia.
Smaller markets like Australia have traditionally struggled to enforce compliance against online services that have a large local user base, but lack a local physical presence and are globally oriented. However, where Australia once defaulted to self-regulation, it is implementing powerful transparency measures that apply public scrutiny to motivate compliance.
The Online Safety Act
Under Australia’s Online Safety Act (OSA), the eSafety Commissioner can require online service providers to produce reports on their compliance with Basic Online Safety Expectations (BOSEs). BOSEs describe the “systems, policies and processes” that online service providers are “expected to employ to prevent harm and respond to harm when it occurs”.
Reports required by the Commissioner may be "periodic" (i.e. regular reporting as frequently as every six months) or non-periodic (i.e. one-off). They can be required in relation to compliance with all relevant BOSEs or with specific Expectations, such as those regarding child safety.
The Commissioner is already making targeted transparency interventions
The Commissioner has been busy wielding these new transparency powers to access information and scrutinise online service providers’ safety performance. In 2022 and 2023, the Commissioner ordered non-periodic reports from Apple, Meta, Microsoft, Omegle and Snap, Google, X, Twitch, TikTok, and Discord on their efforts to meet Expectations relating to child sexual exploitation and abuse. More recently, eSafety required X to report on its efforts to satisfy Expectations regarding online hate.
Importantly, these reports are not just for the Commissioner’s private reading pleasure. Following each report, eSafety has published extensive summaries of the information received from service providers and commentaries on their performance. This has included information about a provider’s human resources dedicated to online safety, content moderation processes, competency in localised safety issues (e.g. online hate targeting First Nations Australians), use of automated tools and recommender systems, enforcement of its terms and conditions, and median time to respond to user reports, among other disclosures.
Australia’s transparency measures may apply even more pressure than the DSA
Transparency is already an established feature of online safety regulation in large markets such as the EU. Last October, very large online platforms (VLOPs) and very large online search engines (VLOSEs) published their first annual transparency reports under the Digital Services Act (DSA), covering categories of information similar to those disclosed under the Australian regime.
The key difference under the DSA is that transparency reports are drafted and published by the platforms themselves. In Australia, the information is provided to eSafety and then presented to the public by the Commissioner. This reduces “narrative control” for platforms and increases pressure to demonstrate strong trust and safety processes that satisfy the Commissioner.
Still room for collaboration between service providers and regulators
Nonetheless, Australia’s OSA does not treat online service providers as inherently harmful or problematic. In fact, collaboration between eSafety and service providers is a defining characteristic of the law.
The OSA is designed to adapt to the complexity and continuing evolution of online services through iteration and co-regulation. While BOSEs ensure the most urgent and obvious safety guarantees are fixed and non-negotiable, the OSA creates a flexible process whereby service providers can formulate enforceable industry codes under the oversight of the Commissioner. To a degree, this resembles the EU Digital Services Act’s risk mitigation process for VLOPs and VLOSEs, under which service providers determine measures subject to the oversight of regulators.
Important takeaways for online service providers
Recent negative findings made against platforms following transparency disclosures to the Commissioner illustrate the reputational risks from poor online safety performance.
It is not sufficient to limit compliance to primary jurisdictions such as the EU or UK. Online service providers should monitor regulatory developments and familiarise themselves with their obligations in smaller markets such as Australia. The active implementation of transparency measures by the eSafety Commissioner demonstrates that non-compliance risks significant reputational harm.