
Regulating Online Gaming: Challenges and Future Landscape

The online gaming industry has experienced significant growth in recent years. In the EU alone, the sector generated €23.48 billion in 2022, four times the revenue of digital music and almost twice that of video-on-demand.


In addition, the number of European gamers increased during the COVID-19 pandemic, and over half of the European population now plays video games regularly. While this growth is beneficial for the sector, it also brings pressing and growing safety challenges for users and gamers, almost 20% of whom are under 14 years old and 22% of whom are between 15 and 24.

Indeed, a significant part of the community reports having encountered online harassment, grooming and sexual abuse while gaming. More worryingly, extremist content finds new forums for propagation and mobilisation in these channels, as seen in recreations of mass shooting scenes involving multiple connected players. It is thus more important than ever to pay attention to online trust & safety in the gaming industry.

Challenges to online gaming regulations

Effective strategies for regulating user-generated or interactional content are largely missing in the traditionally self-regulated online gaming industry. Conventional regulations on video games – such as age-rating systems based on the degree of violence, strong language, sexual content, and other illicit practices – only apply to content released by developers and are yet to extend to user-generated content. However, for games that involve multiple connected players and allow real-time interaction among them, an ex-ante rating system cannot evaluate the risk of exposure to harmful or illegal content created by other gamers. 

Lists of banned words and user report systems are widely implemented across games, but both have limits. Banned-word lists are easily circumvented by invented slang and can snowball into censoring content that is not harmful or illegal. Report systems, in turn, often suffer from overburdened queues, inconsistent decisions by human moderators, and algorithms that fail to understand nuanced cases.
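
To illustrate why static word lists are brittle, the minimal Python sketch below implements a naive substring filter. The banned terms and example messages are hypothetical, not any platform's actual list: a trivial character substitution slips past the filter, while an innocent word containing a banned term gets blocked.

    # Minimal sketch of a static banned-word filter (hypothetical word list, for illustration only).
    BANNED_WORDS = {"scam", "cheat"}  # assumed example terms, not any platform's real list

    def is_blocked(message: str) -> bool:
        """Naive check: flag the message if any banned word appears as a substring."""
        text = message.lower()
        return any(word in text for word in BANNED_WORDS)

    # Evasion: trivial character substitution slips past the static list.
    print(is_blocked("that trade is a sc4m"))    # False - harmful intent, not caught
    # Over-blocking: substring matching flags harmless words containing a banned term.
    print(is_blocked("watch him scamper away"))  # True - innocent message, blocked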

Apart from specific technical implementation issues, business considerations also affect content moderation in online gaming. A key problem is that gaming platforms demonstrate very different standards in content moderation and are not governed by a clear and consistent regulatory framework. For example, Nintendo is famous for its particularly strict content moderation policy due to its family-friendly brand, whereas other studios that produce mature-rated or adult-only games hold a relatively tolerant attitude towards deviant speech and conduct.

Trends in regulating online gaming

To address the unique “interactional risk” of online games with social features, a major regulatory trend is to combine child protection law with legal requirements for online content moderation, since minors have long been active participants in the industry. In this blog, we will quickly explore how and why online gaming platforms should be concerned about the EU Digital Services Act and the UK Online Safety Act.

The impact of the EU Digital Services Act

In the European Union, the online gaming industry falls under the scope of the general rules for online services. The recent Regulation on preventing the dissemination of terrorist content online and the Digital Services Act (DSA) are therefore already impacting the online gaming industry, irrespective of gaming companies’ countries of establishment.

Indeed, according to the DSA, gaming companies are now obliged to:

      • Set up user report or flagging systems that enable users to submit detailed and precise information about the flagged content (a simplified data sketch follows this list);

      • Set up complaint-handling systems for processing complaints against their content moderation decisions (small and micro enterprises are exempt from this duty);

      • Disclose information about their content moderation policies, procedures, measures, and tools to users;

      • Design interfaces that do not deceive, manipulate or impair a user’s decision-making;

      • Publish transparency reports at least once a year, including the number of cases processed, the number of complaints received, the types of measures taken against flagged content, etc.
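
As a rough illustration of the first obligation in the list above, the sketch below shows the kind of structured information a flagging form could capture, loosely modelled on the DSA’s notice requirements (location of the content, a substantiated explanation, the notifier’s contact details, and a good-faith statement). The field names, identifier scheme, and example values are our own assumptions, not a prescribed format.

    # Simplified sketch of the data a user report / flagging form could capture.
    # Field names and the identifier scheme are illustrative assumptions, not a DSA-mandated format.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class ContentNotice:
        content_location: str          # exact location of the flagged content, e.g. a URL or in-game chat/session ID
        explanation: str               # substantiated explanation of why the content is considered illegal
        reporter_name: Optional[str]   # notifier identity (may be omitted for certain offences)
        reporter_email: Optional[str]
        good_faith_statement: bool     # confirmation that the notice is submitted in good faith
        submitted_at: datetime

    notice = ContentNotice(
        content_location="game://lobby/4521/chat/message/98021",   # hypothetical identifier scheme
        explanation="Player repeatedly shared links to terrorist propaganda during the match.",
        reporter_name="Jane Doe",
        reporter_email="jane.doe@example.com",
        good_faith_statement=True,
        submitted_at=datetime.now(timezone.utc),
    )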

More importantly, when illegal content is detected, gaming companies are now expected to assume further responsibilities, including:

      • Remove illegal content from their platforms once they become aware of it;

      • Suspend the accounts of frequent offenders and of those who frequently submit unfounded reports of illegal content or complaints;

      • Promptly inform and respond to authorities, providing all relevant information, if they become aware of any serious criminal offence;

      • Process notices from trusted flaggers with priority and without delay, as sketched below.
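
As a rough illustration of the last point, the sketch below shows one possible way to prioritise notices from trusted flaggers in a moderation queue. The queueing approach, function names, and identifiers are hypothetical; the DSA requires priority treatment but does not prescribe any particular implementation.

    # Hypothetical sketch: trusted flaggers' notices jump ahead of ordinary reports.
    import heapq
    import itertools

    _arrival = itertools.count()   # tie-breaker that preserves arrival order within a priority class
    queue: list = []

    def enqueue(notice_id: str, from_trusted_flagger: bool) -> None:
        # Lower number = higher priority; trusted-flagger notices are processed first.
        priority = 0 if from_trusted_flagger else 1
        heapq.heappush(queue, (priority, next(_arrival), notice_id))

    enqueue("notice-1001", from_trusted_flagger=False)
    enqueue("notice-1002", from_trusted_flagger=True)

    print(heapq.heappop(queue)[2])   # "notice-1002" - the trusted flagger's notice comes out first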

For “very large online platforms”, there will be additional requirements for risk assessments, independent audits, and transparency reporting, which may affect major players in the market, such as Microsoft, Sony, Nintendo, and Steam, if the industry keeps expanding at the current rate.

Furthermore, in recent years, with the growing number of minors playing video games daily, child protection has become a key focus. The European gaming industry has responded by advocating for several measures, including the development of technologies to tackle CSAM, the use of parental controls for age verification, and increased global cooperation to harmonise regulations.

The path forward seems clear: as attention on online safety within the gaming sector continues to grow in the EU, future efforts will aim to protect all users, with a particular emphasis on safeguarding children by leveraging a combination of policy measures and technological tools.

The impact of the UK Online Safety Act

The UK Online Safety Act (OSA) will extend its reach to gaming platforms, with a particular emphasis on protecting children from harmful online content. The Act outlines specific responsibilities for gaming platforms that host user-generated content, requiring them to:

      • Minimise the presence of harmful material

      • Report abusive content targeting children

      • Conduct regular risk assessments to identify potential illegal harms

      • Ensure players have an easy way to report illegal or harmful content

      • Keep End User License Agreements (EULAs) and Terms of Service constantly updated

Specifically, the OSA will target key features of online gaming platforms: user-to-user communication (such as text or voice chats), the creation of user-generated content (like avatars or other visual elements), and virtual reality spaces where users can interact with one another.

Tremau Policy Research Team

Are you a gaming service looking to ensure compliance with upcoming regulations? Get ahead of the curve by contacting us at info@tremau.com.

Our team of policy experts, former regulators, and skilled engineers is ready to craft a custom Trust & Safety solution designed specifically for your platform.
