On Wednesday evening (Sydney time), Australia’s Federal Court extended an interim injunction mandating X Corp to hide posts containing copies of a live-streamed stabbing attack in a Sydney church.
The injunction followed an official removal notice issued by the eSafety Commissioner, ordering that X remove the posts from its platform globally. Having already geo-blocked the content in Australia, X has contested the legal grounds for the removal notice. The issue will be ruled on in a final hearing scheduled for May 10.
This case coincides with heightened scrutiny of online platforms in Australia. Lawmakers are considering changes to the Online Safety Act (OSA) that would increase obligations on service providers and sharpen enforcement mechanisms. Simultaneously, a specialised task force on algorithmic harms has been convened, and a misinformation bill is expected to be introduced soon. The opposition in Parliament has also called for social media to be blocked for children and for age verification to be made mandatory.
New regulatory currents are gathering pace in Australia. Here’s what online service providers should know about how things sit currently and what may lie ahead.
What are “removal notices” under Australia’s Online Safety Act?
The eSafety powers that X is challenging come from Australia’s Online Safety Act 2021 (OSA). Under the OSA, the Australian regulator—the eSafety Commissioner—can issue removal notices that require in-scope service providers to remove certain material from their services.
In-scope providers include:
- Social media services;
- Relevant electronic services;
- Designated internet services; and
- Hosting service providers.
eSafety is empowered to assess whether online content qualifies for a removal notice under one of four categories:
- Cyber-bullying material targeted at an Australian child;
- Cyber-abuse material targeted at an Australian adult;
- Intimate images non-consensually shared on the service; and
- Class 1 and class 2 material under the Online Content Scheme (including material related to child exploitation, terrorism, and revolting or abhorrent phenomena).
Notices must be complied with within 24 hours, or such longer period as the Commissioner allows. Non-compliance can result in court injunctions and ongoing daily fines. In the case of repeated non-compliance within 12 months, eSafety may obtain a court order requiring the service provider to cease providing its service in Australia.
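To make the timing mechanics concrete, here is a minimal sketch of how a provider might track a notice against the statutory window. The class, field names, and helper are assumptions for illustration only, not a real eSafety schema; the 24-hour default and the Commissioner's discretion to extend it simply mirror the rule described above.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class RemovalNotice:
    """Hypothetical record of an OSA removal notice (illustrative only)."""
    notice_id: str
    received_at: datetime
    extension_hours: int = 0  # any longer period the Commissioner allows

    @property
    def deadline(self) -> datetime:
        # Default statutory window is 24 hours from receipt of the notice.
        return self.received_at + timedelta(hours=24 + self.extension_hours)

    def is_overdue(self, now: datetime | None = None) -> bool:
        return (now or datetime.now(timezone.utc)) > self.deadline

# Usage: log the deadline on receipt and alert before it lapses.
notice = RemovalNotice("OSA-2024-0001", datetime.now(timezone.utc))
print(notice.deadline.isoformat(), notice.is_overdue())
```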
What could the Federal Court showdown mean for Australia’s removal notice regime?
There are two key issues at play in the eSafety vs X case in the Federal Court that may fundamentally redefine Australia’s notice-and-takedown regime.
The first issue is jurisdictional. X says that it has geo-blocked the relevant content in Australia. However, eSafety argues that X must remove the content globally. The Commissioner is concerned about risks of radicalisation associated with the content, and that Australian users can easily circumvent the geo-block by using a VPN. Whether eSafety has the power to enforce global content removals will raise a number of compliance issues for online service providers.
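The practical gap between the two positions can be sketched in a few lines of Python. Everything here (function names, the country-attribution step) is an assumption for illustration, not a description of X's systems: a geo-block filters visibility per request, while a global removal takes the item down for everyone.

```python
# Illustrative moderation state: which posts are hidden where.
BLOCKED_REGIONS: dict[str, set[str]] = {}  # post_id -> ISO country codes
REMOVED_GLOBALLY: set[str] = set()

def geo_block(post_id: str, country: str) -> None:
    BLOCKED_REGIONS.setdefault(post_id, set()).add(country)

def remove_globally(post_id: str) -> None:
    REMOVED_GLOBALLY.add(post_id)

def is_visible(post_id: str, viewer_country: str) -> bool:
    if post_id in REMOVED_GLOBALLY:
        return False
    return viewer_country not in BLOCKED_REGIONS.get(post_id, set())

geo_block("post-123", "AU")
print(is_visible("post-123", "AU"))  # False: hidden for Australian viewers
print(is_visible("post-123", "US"))  # True: still visible elsewhere
# A VPN changes the apparent viewer_country, which is why eSafety argues
# a geo-block alone is easily circumvented.
```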
The second issue is definitional. This case may test eSafety’s definition of class 1 content under the Online Content Scheme. X argues that the content does not violate its terms of service. However, eSafety claims it is “gratuitous or offensive violence with a high degree of impact or detail”. If and how the court rules on this issue will have consequences for the Commissioner’s scope to issue removal notices. It will also influence online service providers’ content moderation practices by more clearly defining what is lawful content under the Australian regime.
Are there any general-scope obligations in Australia?
Whereas Australia’s removal notice regime addresses online harm “downstream” at the content level, general-scope regulations impose broad-based responsibility “upstream” on service providers regarding the design and functioning of their services.
Australia’s 2021 reforms introduced Basic Online Safety Expectations (BOSEs) as a new general-scope instrument.
BOSEs are set by the Government and require in-scope service providers to take a series of broadly defined systemic actions to protect their users from harm (a sketch of how such steps might be documented follows the list below). This includes taking “reasonable steps” to:
- Ensure safe use (including conducting safety risk and impact assessments);
- Prevent access by children to age-restricted material;
- Minimise the provision of illegal content; and
- Ensure mechanisms to report about illegal content.
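As a rough illustration only, a provider might document its “reasonable steps” against each expectation along the following lines. The structure and field names are assumptions, not an eSafety-prescribed format; the expectation labels paraphrase the list above.

```python
from dataclasses import dataclass, field

@dataclass
class ExpectationRecord:
    """Hypothetical evidence record for one BOSE expectation."""
    expectation: str
    steps_taken: list[str] = field(default_factory=list)
    evidence_links: list[str] = field(default_factory=list)

report = [
    ExpectationRecord("ensure safe use", ["annual safety risk assessment"]),
    ExpectationRecord("prevent child access to age-restricted material"),
    ExpectationRecord("minimise illegal content", ["hash-matching pipeline"]),
    ExpectationRecord("provide illegal-content reporting mechanisms"),
]

# Flag expectations with no documented steps before eSafety asks.
for rec in report:
    status = "documented" if rec.steps_taken else "GAP"
    print(f"{rec.expectation}: {status}")
```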
The BOSEs are non-mandatory, but eSafety can order online platforms to report on their compliance. Non-compliance carries reputational risk, as eSafety can publish a report detailing a service provider’s failures.
Next steps for Australia: an overarching duty of care?
The ongoing review of Australia’s OSA was brought forward following a parliamentary inquiry into social media and online safety in 2023. The review has been tasked with considering, among other things, the “introduction of a duty of care requirement towards users”. This would “place broad obligations on platforms to focus on their systems and mitigate against harms before they happen”, according to think tank Reset Tech Australia.
The nature of these obligations might closely reflect the risk assessment and mitigation requirements of the European Union’s Digital Services Act (DSA). Rigorous record-keeping and documentation would be necessary for affected platforms to demonstrate compliance.
In the Australian context in particular, a duty of care would shift the focus of regulatory exposure from reactive to systemic responsibilities. Currently, compliance centres on cooperation with eSafety’s removal notices regarding specific items of content. Under a duty of care, platforms would instead be required to deploy significant resources to mitigate harms across their services before they occur.
The bottom line: expect new systemic obligations and stronger enforcement
The trend in Australia indicates a shift towards general-scope obligations with stronger enforcement. In 2021, Australia updated its regime by expanding eSafety’s notice-and-takedown powers and adding the non-mandatory BOSEs. The ongoing review is now considering replacing or augmenting the BOSEs with a new mandatory general-scope instrument, such as an overarching duty of care. This would mean mandatory responsibilities for platforms to address systemic risks, and tougher enforcement.
How can Tremau help?
A new era of online safety regulation is upon us. Online platforms are increasingly subject to strict obligations in key markets across the globe. The compliance challenge is only going to grow more complicated and inescapable, and the consequences of failure are significant fines and reputational harm. Our advisory team can help you decipher and disentangle the complex web of online safety regulation, and ensure efficient and functional compliance. Send us an email at info@tremau.com to get in touch!