
Nowhere to hide: smaller markets such as Australia are wielding transparency to enforce online safety

Online platforms already risk hefty fines or exclusion from big markets such as the EU for non-compliance with strict new online safety laws. Increasingly, they are also exposed to reputational risks from transparency measures in smaller jurisdictions such as Australia.

Smaller markets like Australia have traditionally struggled to enforce compliance against online services that have a large local user base, but lack a local physical presence and are globally oriented. However, where Australia once defaulted to self-regulation, it is implementing powerful transparency measures that apply public scrutiny to motivate compliance.

The Online Safety Act

Under Australia’s Online Safety Act (OSA), the eSafety Commissioner can require online service providers to produce reports on their compliance with Basic Online Safety Expectations (BOSEs). BOSEs describe the “systems, policies and processes” that online service providers are “expected to employ to prevent harm and respond to harm when it occurs”. 

Reports required by the Commissioner may be “periodic” (i.e. regular reporting as frequently as every six months) or “non-periodic” (i.e. one-off). They can be required in relation to compliance with all relevant BOSEs or with specific Expectations, such as those regarding child safety.

The Commissioner is already making targeted transparency interventions 

The Commissioner has been busy wielding these new transparency powers to access information and scrutinise online service providers’ safety performance. In 2022 and 2023, the Commissioner ordered non-periodic reports from Apple, Meta, Microsoft, Omegle, Snap, Google, X, Twitch, TikTok, and Discord on their efforts to satisfy Expectations relating to child sexual exploitation and abuse. More recently, eSafety required X to report on its efforts to satisfy Expectations regarding online hate.

Importantly, these reports are not just for the Commissioner’s private reading pleasure. Following each report, eSafety has published extensive summaries of the information received from service providers and commentaries on their performance. This has included information about a provider’s human resources dedicated to online safety, content moderation processes, competency in localised safety issues (e.g. online hate targeting First Nations Australians), use of automated tools and recommender systems, enforcement of its terms and conditions, and median time to respond to user reports, among other disclosures. 

Australia’s transparency measures may apply even more pressure than the DSA

Transparency is already an established feature of online safety regulation in large markets such as the EU. Last October, very large online platforms (VLOPs) and very large online search engines (VLOSEs) published their first annual transparency reports under the Digital Services Act (DSA), covering categories similar to those in the Australian reports.

The key difference under the DSA is that transparency reports are drafted and published by the platforms themselves. In Australia, the information is provided to eSafety and then presented to the public by the Commissioner. This reduces “narrative control” for platforms and increases pressure to demonstrate strong trust and safety processes that satisfy the Commissioner.

Still room for collaboration between service providers and regulators

Nonetheless, Australia’s OSA does not treat online service providers as inherently harmful or problematic. In fact, collaboration between eSafety and service providers is a defining characteristic of the law.

The OSA is designed to adapt to the complexity and continuing evolution of online services through iteration and co-regulation. While BOSEs ensure the most urgent and obvious safety guarantees are fixed and non-negotiable, the OSA creates a flexible process whereby service providers can formulate enforceable industry codes under the oversight of the Commissioner. To a degree, this mirrors the DSA’s risk mitigation process for VLOPs and VLOSEs, under which service providers determine measures subject to the oversight of regulators.

Important takeaways for online service providers

Recent negative findings made against platforms following transparency disclosures to the Commissioner illustrate the reputational risks from poor online safety performance.

It is not sufficient to limit compliance to primary jurisdictions such as the EU or UK. Online service providers should monitor regulatory developments and familiarise themselves with their obligations in smaller markets such as Australia. The active implementation of transparency measures by the eSafety Commissioner demonstrates that non-compliance risks significant reputational harm.  
