Nowhere to hide: smaller markets such as Australia are wielding transparency to enforce online safety

Online platforms already risk hefty fines or exclusion from big markets such as the EU for non-compliance with strict new online safety laws. Increasingly, they are also exposed to reputational risks from transparency measures in smaller jurisdictions such as Australia.

Smaller markets such as Australia have traditionally struggled to enforce compliance by globally oriented online services that have large local user bases but no local physical presence. However, where Australia once defaulted to self-regulation, it is now implementing powerful transparency measures that apply public scrutiny to motivate compliance.

The Online Safety Act

Under Australia’s Online Safety Act (OSA), the eSafety Commissioner can require online service providers to produce reports on their compliance with Basic Online Safety Expectations (BOSEs). BOSEs describe the “systems, policies and processes” that online service providers are “expected to employ to prevent harm and respond to harm when it occurs”. 

Reports required by the Commissioner may be “periodic” (i.e. regular reporting as frequently as every six months) or “non-periodic” (i.e. one-off). They can be required in relation to compliance with all relevant BOSEs or with specific Expectations, such as those regarding child safety.

The Commissioner is already making targeted transparency interventions 

The Commissioner has been busy wielding these new transparency powers to access information and scrutinise online service providers’ safety performance. In 2022 and 2023, the Commissioner ordered non-periodic reports from Apple, Meta, Microsoft, Omegle, Snap, Google, X, Twitch, TikTok, and Discord on their efforts to meet Expectations relating to child sexual exploitation and abuse. More recently, eSafety required X to report on its efforts to satisfy Expectations regarding online hate.

Importantly, these reports are not just for the Commissioner’s private reading pleasure. Following each report, eSafety has published extensive summaries of the information received from service providers, along with commentary on their performance. This has included information about a provider’s human resources dedicated to online safety, content moderation processes, competency in localised safety issues (e.g. online hate targeting First Nations Australians), use of automated tools and recommender systems, enforcement of its terms and conditions, and median time to respond to user reports, among other disclosures.

Australia’s transparency measures may apply even more pressure than the DSA

Transparency is already an established feature of online safety regulation in large markets such as the EU. In October 2023, very large online platforms (VLOPs) and very large online search engines (VLOSEs) published their first transparency reports under the Digital Services Act (DSA), covering categories of information similar to those sought by Australia’s eSafety Commissioner.

The key difference under the DSA is that transparency reports are drafted and published by the platforms themselves. In Australia, the information is provided to eSafety and then presented to the public by the Commissioner. This reduces “narrative control” for platforms and increases pressure to demonstrate strong trust and safety processes that satisfy the Commissioner.

Still room for collaboration between service providers and regulators

Nonetheless, Australia’s OSA does not treat online service providers as inherently harmful or problematic. In fact, collaboration between eSafety and service providers is a defining characteristic of the law.

The OSA is designed to adapt to the complexity and continuing evolution of online services through iteration and co-regulation. While the BOSEs ensure the most urgent and obvious safety guarantees are fixed and non-negotiable, the OSA creates a flexible process whereby service providers can formulate enforceable industry codes under the oversight of the Commissioner. To a degree, this resembles the DSA’s risk mitigation process for VLOPs and VLOSEs, under which service providers determine measures subject to regulators’ supervision.

Important takeaways for online service providers

Recent negative findings against platforms following transparency disclosures to the Commissioner illustrate the reputational risks of poor online safety performance.

It is not sufficient to limit compliance to primary jurisdictions such as the EU or UK. Online service providers should monitor regulatory developments and familiarise themselves with their obligations in smaller markets such as Australia. The active implementation of transparency measures by the eSafety Commissioner demonstrates that non-compliance risks significant reputational harm.  
