The online gaming industry has experienced significant growth in recent years. In the EU alone, the sector generated €23.48 billion in 2022, four times the revenue of digital music and almost twice that of video-on-demand.
In addition, the number of European gamers increased during the COVID-19 pandemic, and over half of the European population now plays video games regularly. While this growth is beneficial for the sector, it also brings pressing and growing safety challenges for users and gamers, of whom almost 20% are under 14 years old and 22% are between 15 and 24.
Indeed, a significant share of the gaming community reports having encountered online harassment, grooming and sexual abuse while gaming. More worryingly, extremist content finds new forums for propagation and mobilisation in these channels, as seen in multiplayer recreations of mass shooting scenes. It is therefore more important than ever to pay attention to online trust & safety: the field and practices that manage content- and conduct-related risks, from safety-by-design and product governance to risk assessment, detection, response, quality assurance, and transparency.
Challenges in regulating online gaming
Effective strategies for regulating user-generated or interactional content are largely missing in the traditionally self-regulated online gaming industry. Conventional regulations on video games – such as age-rating systems based on the degree of violence, strong language, sexual content, and other illicit practices – only apply to content released by developers and are yet to extend to user-generated content. However, for games that involve multiple connected players and allow real-time interaction among them, an ex-ante rating system cannot evaluate the risk of exposure to harmful or illegal content created by other gamers.
Lists of banned words and user report systems are widely implemented across games, but both have limits. Banned-word lists can easily be circumvented with invented slang and can snowball into censoring content that is neither harmful nor illegal. Report systems, meanwhile, often suffer from overburdened queues, inconsistent decisions by human moderators, and automated tools that fail to understand nuanced cases.
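To make the circumvention and over-blocking problems concrete, here is a minimal, purely illustrative Python sketch (the banned terms and normalisation rules are hypothetical assumptions, not any platform's actual blocklist). An exact-match filter misses simple leetspeak variants, while a filter that aggressively normalises text starts flagging benign messages that merely contain a banned string: the cat-and-mouse dynamic described above.

```python
import re

# Purely illustrative: a tiny banned-word filter and a common evasion trick.
BANNED = {"noob", "scammer"}          # placeholder terms, not a real blocklist

def naive_filter(message: str) -> bool:
    """Flag a message only if it contains an exact banned token."""
    tokens = re.findall(r"[a-z]+", message.lower())
    return any(token in BANNED for token in tokens)

def normalised_filter(message: str) -> bool:
    """Check again after undoing leetspeak and spacing, using substring matching."""
    leet = str.maketrans("10345$", "loeass")        # map common digit substitutions to letters
    cleaned = re.sub(r"[^a-z]", "", message.lower().translate(leet))
    return any(word in cleaned for word in BANNED)

print(naive_filter("you are a $c4mm3r"))                 # False - trivially evaded
print(normalised_filter("you are a $c4mm3r"))            # True  - caught, but only for known tricks
print(normalised_filter("read the anti-scammer guide"))  # True  - benign message over-blocked
```

Every new slang spelling requires another normalisation rule, and every broader rule widens the net of false positives, which is why static lists alone do not scale.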
Apart from specific technical implementation issues, business considerations also affect content moderation (the review of user-generated content for compliance with a platform's terms and conditions as well as with legal requirements) in online gaming. A key problem is that gaming platforms apply very different standards in content moderation and are not governed by a clear and consistent regulatory framework. For example, Nintendo is famous for its particularly strict content moderation policy, which reflects its family-friendly brand, whereas studios that produce mature-rated or adult-only games take a relatively tolerant attitude towards deviant speech and conduct.
Trends in regulating online gaming
Given the unique “interactional risk” in online gaming with social features, a major trend is to combine child protection law with legal prescriptions for online content moderation, since young and otherwise vulnerable gamers have long been active participants in the industry. In this blog post, we briefly explore how and why online gaming platforms should be concerned about the EU Digital Services Act and the UK Online Safety Act.
The impact of the EU Digital Services Act
In the European Union, the online gaming industry falls within the scope of the general regulations for online services. As a result, the recent European Regulation on addressing the dissemination of terrorist content online and the Digital Services Act (DSA) are already impacting the online gaming industry, irrespective of where gaming companies are established.
Indeed, according to the DSA, gaming companies are now obliged to:
- Set up user report or flagging systems that enable users to submit detailed and precise information about the flagged content (a sketch of such a notice record follows this list);
- Set up complaint-handling systems for processing complaints against their content moderation decisions (small and micro enterprises are exempt from this duty);
- Disclose information about their content moderation policies, procedures, measures, and tools to users;
- Design interfaces that do not deceive, manipulate or impair a user's decision-making;
- Publish transparency reports at least once a year, which should include the number of cases processed, the number of complaints received, the types of measures taken against flagged content, etc.
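To ground the first and last items above, here is a hypothetical Python sketch of the kind of structured fields a notice ("flag") record might capture, and of how stored notices could be rolled up into yearly transparency figures. The field names, categories, and decision labels are illustrative assumptions, not terminology prescribed by the DSA.

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """Hypothetical record for a single user report ("flag")."""
    content_url: str                  # exact location of the flagged content
    category: str                     # e.g. "harassment", "grooming", "terrorist_content"
    explanation: str                  # why the reporter considers the content illegal or harmful
    reporter_contact: str | None      # optional contact details of the person flagging
    trusted_flagger: bool = False     # whether the notice comes from a trusted flagger
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    decision: str | None = None       # e.g. "removed", "restricted", "no_action"

def transparency_summary(notices: list[Notice]) -> dict:
    """Aggregate stored notices into the kind of counts a yearly report lists."""
    decided = [n for n in notices if n.decision is not None]
    return {
        "notices_received": len(notices),
        "notices_processed": len(decided),
        "measures_taken": dict(Counter(n.decision for n in decided)),
        "notices_by_category": dict(Counter(n.category for n in notices)),
    }
```

Capturing these fields at submission time is what makes the later obligations (complaint handling, transparency reporting) straightforward to fulfil rather than a retrospective reconstruction exercise.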
More importantly, where illegal content is detected, gaming companies are now expected to assume further responsibilities, including:
- Removing illegal content from their platforms once they become aware of it;
- Suspending the accounts of frequent offenders, as well as of users who frequently submit unfounded notices of illegal content or complaints;
- Promptly informing the relevant authorities and providing all relevant information if they become aware of any serious criminal offence;
- Processing notices from trusted flaggers, i.e. entities awarded an official status by a Digital Services Coordinator under the DSA, with priority and without delay, as illustrated in the sketch below.
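As one possible illustration of the priority-handling obligation in the last item, the sketch below uses a simple priority queue so that notices from trusted flaggers are pulled ahead of ordinary user reports. The queueing scheme and identifiers are assumptions made for the example, not a mechanism spelled out in the DSA.

```python
import heapq
import itertools

_order = itertools.count()            # tie-breaker keeps equal-priority notices FIFO

def enqueue(queue: list, notice_id: str, from_trusted_flagger: bool) -> None:
    """Add a notice to the moderation queue, prioritising trusted flaggers."""
    priority = 0 if from_trusted_flagger else 1   # lower value = handled first
    heapq.heappush(queue, (priority, next(_order), notice_id))

def next_notice(queue: list) -> str:
    """Pop the notice that should be reviewed next."""
    _, _, notice_id = heapq.heappop(queue)
    return notice_id

queue: list = []
enqueue(queue, "report-001", from_trusted_flagger=False)
enqueue(queue, "report-002", from_trusted_flagger=True)
enqueue(queue, "report-003", from_trusted_flagger=False)

print(next_notice(queue))   # report-002 - the trusted-flagger notice jumps the queue
print(next_notice(queue))   # report-001 - ordinary reports then follow in arrival order
```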
For “very large online platforms”, there will be additional requirements covering systemic risk assessments, independent audits, and more detailed transparency reporting, which may affect major players in the market, such as Microsoft, Sony, Nintendo, and Steam, if the industry keeps expanding at its current rate.
Furthermore, in recent years, with the growing number of minors playing video games daily, child protection has become a key focus. The European gaming industry has responded by advocating for several measures, including the development of technologies to tackle child sexual abuse material (CSAM), the use of parental controls and age verification, and increased global cooperation to harmonise regulations.
The path forward seems clear: as attention to online safety within the gaming sector continues to grow in the EU, future efforts will aim to protect all users, with a particular emphasis on safeguarding children by leveraging a combination of policy measures and technological tools.
The impact of the UK Online Safety Act
The UK Online Safety Act (OSA) will extend its reach to gaming platforms, with a particular emphasis on protecting children from harmful online content. The Act outlines specific responsibilities for gaming platforms that host user-generated content, requiring them to:
- Minimise the presence of harmful material;
- Report abusive content targeting children;
- Conduct regular risk assessments to identify potential illegal harms (a sketch of a simple risk register follows this list);
- Ensure players have an easy way to report illegal or harmful content;
- Keep End User License Agreements (EULAs) and Terms of Service up to date.
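As a purely illustrative aid for the risk-assessment item above, the sketch below shows one lightweight way to keep a risk register: each entry pairs a platform feature with a potential harm, scores it, and records the mitigation in place. The features, harms, scoring scale, and controls are hypothetical examples, not an official OSA methodology.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One line of a hypothetical risk register."""
    feature: str        # e.g. "voice chat", "custom avatars"
    harm: str           # e.g. "grooming", "extremist imagery"
    likelihood: int     # 1 (rare) to 5 (frequent)
    severity: int       # 1 (minor) to 5 (severe)
    mitigation: str     # current or planned control

    @property
    def score(self) -> int:
        return self.likelihood * self.severity

register = [
    RiskEntry("voice chat", "grooming of minors", 3, 5, "real-time moderation and easy reporting"),
    RiskEntry("custom avatars", "extremist imagery", 2, 4, "image review queue"),
    RiskEntry("text chat", "harassment", 4, 3, "keyword filters and user blocking"),
]

# Review the highest-scoring risks first at each assessment cycle.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    print(f"{entry.feature}: {entry.harm} (score {entry.score}) -> {entry.mitigation}")
```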
Specifically, the OSA targets key features of online gaming platforms, such as user-to-user communication (text or voice chat), the creation of user-generated content (avatars and other visual elements), and virtual reality spaces where users can interact with one another.
Tremau Policy Research Team
Are you a gaming service looking to ensure compliance with upcoming regulations? Get ahead of the curve by contacting us at info@tremau.com.
Our team of policy experts, former regulators, and skilled engineers is ready to craft a custom Trust & Safety solution designed specifically for your platform.