
Regulating Online Gaming: Challenges and Future Landscape

The online gaming industry is booming. With an annual growth rate estimated at 12.1%, the global gaming market is expected to reach USD 435 billion by 2028. While the video game industry has been a vibrant market since the 1990s, the COVID-19 pandemic brought unprecedented change to the industry. During lockdowns, online gaming became a major channel for people to connect with friends and strangers, transforming gaming from mere entertainment into a social experience. However, serious problems have also emerged in these new social spaces.

An overwhelming majority of gamers report having encountered online harassment while playing. More dangerously, extremist content finds new forums for propagation and mobilization in these channels, as seen in recreations of mass shooting scenes staged across multiple connected players. It is therefore more important than ever to pay attention to online trust & safety in the gaming industry.

Challenges to online gaming regulations

Effective strategies for regulating user-generated or interactional content are largely missing in the traditionally self-regulated online gaming industry. Conventional regulations on video games – such as age-rating systems based on the degree of violence, strong language, sexual content, and other illicit practices – only apply to content released by developers and are yet to extend to user-generated content. A rating system works well for console games that usually do not have user-interaction features. However, for games that involve multiple connected players and allow real-time interaction among them, an ex-ante rating system cannot evaluate the risk of exposure to harmful or illegal content created by other gamers. 

Lists of banned words and user report systems are widely implemented across games, but both have considerable limits. Banned-word lists can be easily circumvented by invented slang and can snowball into censoring content that is not necessarily harmful or illegal. Report systems, meanwhile, often suffer from overburdened queues, inconsistent decisions by human moderators, and algorithms that fail to understand nuanced cases.
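To make these limits concrete, here is a minimal sketch of a naive banned-word filter in Python; the word list, messages, and function names are all hypothetical, not any studio's actual system. Whole-word matching misses invented spellings, while substring matching starts censoring harmless messages:

```python
import re

# Hypothetical banned-word list; production lists are far larger and localized.
BANNED_WORDS = {"rat"}

def blocked_by_tokens(message: str) -> bool:
    """Whole-word matching: easy to evade with invented spellings."""
    tokens = re.findall(r"[a-z0-9]+", message.lower())
    return any(token in BANNED_WORDS for token in tokens)

def blocked_by_substring(message: str) -> bool:
    """Substring matching: catches more variants but over-blocks harmless words."""
    text = message.lower()
    return any(word in text for word in BANNED_WORDS)

print(blocked_by_tokens("you filthy rat"))       # True:  exact term is caught
print(blocked_by_tokens("you filthy r4t"))       # False: invented spelling slips through
print(blocked_by_substring("congratulations!"))  # True:  a harmless word is censored
```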

Apart from specific technical implementation issues, business considerations also affect content moderation in online gaming. A key problem is that gaming platforms demonstrate very different standards in content moderation and are not governed by a clear and consistent regulatory framework. For example, Nintendo is famous for its particularly strict content moderation policy due to its family-friendly brand, whereas other studios that produce mature-rated or adult-only games hold a relatively tolerant attitude towards deviant speech and conduct.

Future trends in regulating online gaming

Given the unique “interactional risk” of online games with social features, a major trend is to combine child protection law with legal requirements for online content moderation, since children – who are especially vulnerable to these risks – have long been active participants in the industry.

Germany’s youth protection law, amended in April 2021, now integrates the in-game communication environment into the reformed age-rating standard for video games: titles with unrestricted chat functions receive a higher age rating. Similarly, the UK Draft Online Safety Bill published in May 2022 gives special attention to online content accessed by children, stating that platforms hosting user-generated content have tailored duties to minimize the presence of harmful content, to report abusive content targeting children, and to assist law enforcement where needed.

In the European Union, another crucial change is placing the online gaming industry under the general regulations that apply to online platforms providing hosting services. The recent European Regulation on preventing the dissemination of terrorist content online and the upcoming Digital Services Act (DSA) will also impact the online gaming industry, irrespective of where gaming companies are established.

Indeed, under the DSA, gaming companies will be obliged to:

  • Set up user report or flagging systems that enable users to submit detailed and precise information about the flagged content (see the sketch after this list);
  • Set up complaint-handling systems for processing complaints against their content moderation decisions (small and micro enterprises are exempt from this duty);
  • Disclose information about their content moderation policies, procedures, measures, and tools to users;
  • Publish transparency reports at least once a year, including the number of cases processed, the number of complaints received, the types of measures taken against flagged content, etc.
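As an illustration of what these duties could look like in practice, below is a minimal data-model sketch in Python. The field names and structure are assumptions: the DSA prescribes what a notice, a statement of reasons, and a transparency report must contain, not how platforms implement them.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class UserNotice:
    """A user report ('notice') with the level of detail platforms are expected to collect."""
    content_id: str          # exact location of the flagged content, e.g. a chat message ID
    reason: str              # why the submitter considers the content illegal or harmful
    explanation: str         # sufficiently substantiated explanation from the submitter
    submitter_contact: str   # contact details of the person submitting the notice
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class ModerationDecision:
    notice: UserNotice
    action: str                # e.g. "removed", "restricted", "no_action"
    statement_of_reasons: str  # explanation communicated to the affected user

def transparency_counts(decisions: list[ModerationDecision]) -> dict[str, int]:
    """Aggregate decisions by action type for a yearly transparency report."""
    counts: dict[str, int] = {}
    for decision in decisions:
        counts[decision.action] = counts.get(decision.action, 0) + 1
    return counts
```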

More importantly, with respect to illegal content, gaming companies are now expected to assume greater responsibilities, including to:

  • Process notices from trusted flaggers with priority and without delay (a prioritization sketch follows this list);
  • Suspend the accounts of frequent offenders, as well as of users who frequently submit unfounded reports of illegal content or complaints;
  • Promptly inform the authorities and provide all relevant information whenever they become aware of a serious criminal offense.
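One plausible way to operationalize the first two duties is a priority queue keyed on flagger status, combined with a simple counter for unfounded notices. The sketch below is an assumption about one possible design, not a mechanism spelled out in the DSA.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedNotice:
    priority: int                            # 0 = trusted flagger, 1 = ordinary user
    notice_id: str = field(compare=False)
    submitter_id: str = field(compare=False)

class NoticeQueue:
    """Hypothetical moderation queue: trusted-flagger notices are processed first."""

    def __init__(self) -> None:
        self._heap: list[QueuedNotice] = []
        self._unfounded: dict[str, int] = {}  # submitter_id -> count of unfounded notices

    def submit(self, notice_id: str, submitter_id: str, trusted: bool) -> None:
        priority = 0 if trusted else 1
        heapq.heappush(self._heap, QueuedNotice(priority, notice_id, submitter_id))

    def next_notice(self) -> QueuedNotice:
        """Pop the highest-priority notice (trusted flaggers before ordinary users)."""
        return heapq.heappop(self._heap)

    def record_unfounded(self, submitter_id: str, threshold: int = 3) -> bool:
        """Track unfounded notices; True means a temporary suspension is due."""
        self._unfounded[submitter_id] = self._unfounded.get(submitter_id, 0) + 1
        return self._unfounded[submitter_id] >= threshold
```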

For “very large online platforms”, there will be extra requirements such as risk assessments, independent audits, and enhanced transparency reporting, which may affect major players in the market, such as Microsoft, Sony, Nintendo, and Steam, if the industry keeps expanding at its current rate. In response to the DSA, the European gaming industry is calling for more detailed and nuanced regulations to address the complex and diverse services in the ecosystem. However, one key trend is certain: online gaming platforms will no longer remain self-regulated without direct government intervention, and they will be held accountable for not investing enough effort in combating their users’ illegal speech and conduct.

Tremau Policy Research Team
