Online content moderation, the review of user-generated content to ensure that it complies with a platform's terms and conditions as well as with legal requirements, has become an increasingly important and debated topic, and new regulations such as the EU's Digital Services Act (DSA) are expected to further reinforce this trend. These regulations will create more legally binding obligations for online platforms with respect to content moderation, in order to improve users' online well-being and the functioning of the online world.
Challenges of Content Moderation
However, while millions of posts appear every day on social platforms, only a few hundred thousand people work in the content moderation industry. Despite platforms' plans to recruit more moderators, the workload of each moderator remains very large: moderators often have to review thousands of posts every day, leaving them a very narrow (and stressful) window in which to decide whether an online post should be removed. This raises concerns about the accuracy, consistency, and fairness of a company's content moderation, and about its impact on free speech.
In addition to the very limited time available for each decision, the quality of moderation can be affected by the AI tools platforms deploy, the highly contextual nature of many online posts, and the large quantity of content that falls into the grey zone between harmful and safe. Moderators' own biases further exacerbate the issue. For example, some moderators may be too lenient or too strict relative to company guidelines, and their judgement can be affected by fatigue over the course of a working day; some are accurate on certain categories of content but lack the expertise or training for others; and some may be biased towards particular categories of content (e.g., culturally or politically).
Importance of Quality Assurance
Ensuring the quality of content moderation is a challenge with important implications for the proper functioning of social media and for freedom of expression online. Quality assurance (QA) for content moderation is essential to ensure that the right balance between safety and freedom of expression is struck in a fair and effective manner. Poor content moderation also creates reputational, regulatory, and other business risks for online platforms, including a possible loss of users. QA becomes even more challenging, and more important, as companies outsource content moderation to external providers, whose quality also needs to be continuously monitored. In this context, online platforms are looking for ways to monitor and improve the quality of their moderation processes. Quality can be measured with metrics such as accuracy, consistency, and fairness (e.g., similar cases receive similar decisions). Consistency is critical both over time for each moderator and across moderators.
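To make these metrics concrete, here is a minimal sketch of how accuracy and cross-moderator consistency could be computed, assuming decisions are stored as (moderator, item, label) records. The moderator names, labels, and reference labels are illustrative; a real platform would use its own schema and weighting.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical decision log: (moderator, item, label) records.
decisions = [
    ("mod_a", 1, "remove"), ("mod_b", 1, "remove"),
    ("mod_a", 2, "keep"),   ("mod_b", 2, "remove"),
    ("mod_a", 3, "keep"),   ("mod_b", 3, "keep"),
]
gold = {1: "remove", 2: "keep", 3: "keep"}  # reference labels

def accuracy(decisions, gold):
    """Per-moderator accuracy against reference labels."""
    hits, totals = defaultdict(int), defaultdict(int)
    for mod, item, label in decisions:
        if item in gold:
            totals[mod] += 1
            hits[mod] += (label == gold[item])
    return {m: hits[m] / totals[m] for m in totals}

def cross_moderator_agreement(decisions):
    """Consistency across moderators: raw pairwise agreement on
    items reviewed by more than one moderator."""
    by_item = defaultdict(dict)
    for mod, item, label in decisions:
        by_item[item][mod] = label
    agree = total = 0
    for labels in by_item.values():
        for (_, l1), (_, l2) in combinations(labels.items(), 2):
            total += 1
            agree += (l1 == l2)
    return agree / total if total else float("nan")

print(accuracy(decisions, gold))             # mod_a: 1.0, mod_b: ~0.67
print(cross_moderator_agreement(decisions))  # ~0.67
```

Consistency over time for a single moderator can be measured analogously, by comparing that moderator's decisions on similar or near-duplicate items across time windows.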
The typical quality assurance process for online content moderation is based on regular (for example, weekly) controlled evaluations: after carefully labelling a number of content items (e.g., users' posts), managers distribute them to multiple moderators, which makes it possible to compute a score for each moderator based on how they perform relative to each other as well as relative to the labels the company selected for these items.
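As an illustration, the sketch below scores a hypothetical weekly test batch in this way: each moderator gets an absolute accuracy against the gold labels, plus a relative score expressed in standard deviations from the group mean. All names and data are made up, and actual QA pipelines will aggregate differently.

```python
import statistics
from collections import defaultdict

# Hypothetical weekly test batch: each gold-labelled item is
# routed to several moderators.
gold = {101: "remove", 102: "keep", 103: "remove", 104: "keep"}
test_decisions = [
    ("mod_a", 101, "remove"), ("mod_a", 102, "keep"),
    ("mod_a", 103, "remove"), ("mod_a", 104, "keep"),
    ("mod_b", 101, "remove"), ("mod_b", 102, "remove"),
    ("mod_b", 103, "remove"), ("mod_b", 104, "keep"),
]

def weekly_scores(test_decisions, gold):
    hits, totals = defaultdict(int), defaultdict(int)
    for mod, item, label in test_decisions:
        totals[mod] += 1
        hits[mod] += (label == gold[item])
    acc = {m: hits[m] / totals[m] for m in totals}
    # Relative score: distance from the group mean, so moderators are
    # compared against each other, not only against the gold labels.
    mean = statistics.mean(acc.values())
    spread = statistics.pstdev(acc.values()) or 1.0
    return {m: (a, (a - mean) / spread) for m, a in acc.items()}

print(weekly_scores(test_decisions, gold))
# {'mod_a': (1.0, 1.0), 'mod_b': (0.75, -1.0)}
```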
However, this common QA practice does not leverage all the data available, and because evaluations are run only once in a while, potential QA issues cannot be detected in real time, for example when a moderator drifts, even temporarily. An important challenge for quality and consistency evaluation is therefore the ability to use many, if not all, past decisions from all moderators, so as not to be limited by a small number of weekly test instances. Crucially, this can remove the need for additional evaluation processes entirely, while improving the reliability of the evaluation and enabling continuous monitoring.
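One way such continuous monitoring could be operationalised, sketched below under assumptions, is a sliding-window check over the full decision stream: whenever some reference signal exists for past decisions (an audited sample, an appeal outcome, etc.), a moderator's recent agreement with that signal can be tracked and an alert raised when it drops, catching the temporary drift that weekly tests would miss. The window size, threshold, and the availability of such a reference signal are assumptions of this sketch, not details from the paper.

```python
from collections import deque

def rolling_drift_monitor(decision_stream, window=500, floor=0.9):
    """Yield an alert whenever rolling agreement with a reference
    signal drops below `floor` over the last `window` decisions.

    `decision_stream` yields (decision_id, correct) pairs, where
    `correct` is whatever reference signal is available for past
    decisions -- an assumption of this sketch.
    """
    recent = deque(maxlen=window)
    for decision_id, correct in decision_stream:
        recent.append(int(correct))
        if len(recent) == window and sum(recent) / window < floor:
            yield decision_id, sum(recent) / window

# Hypothetical stream where roughly 1 in 7 decisions disagrees with
# the reference signal, so rolling agreement sits near 0.86.
stream = ((i, i % 7 != 0) for i in range(5_000))
for decision_id, rate in rolling_drift_monitor(stream, window=100):
    print(f"possible drift at decision {decision_id}: agreement {rate:.2f}")
    break
```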
Managing/Improving QA
In our study, we discuss approaches for managing content moderation quality in real time, without the need to run regular (and costly!) tests or to have multiple moderators handle the same cases. We develop a new method for comparing content moderators' performance even when there is no overlap in the content they manage (i.e., each instance is handled by a single moderator), using the data from moderators' previous decisions. To this end, we also discuss how crowd labelling algorithms can be adapted to perform QA in content moderation, an approach that we believe is promising to explore further.
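For reference, below is a minimal sketch of the classical Dawid-Skene estimator, a canonical crowd-labelling algorithm: an EM procedure that jointly estimates the true labels and a confusion matrix per moderator, without any gold labels. Note that it relies on overlapping reviews; the study's contribution is precisely to handle the single-label setting where no such overlap exists, and this sketch does not reproduce that method.

```python
import numpy as np

def dawid_skene(labels, n_classes=2, n_iter=50):
    """Classical Dawid-Skene EM on an (n_items, n_moderators) label
    matrix, with -1 meaning "not reviewed by this moderator".
    Returns posterior probabilities over the true labels and one
    confusion matrix per moderator, estimated without gold labels."""
    n_items, n_mods = labels.shape

    # Initialise true-label posteriors with a smoothed majority vote.
    T = np.ones((n_items, n_classes))
    for k in range(n_classes):
        T[:, k] += (labels == k).sum(axis=1)
    T /= T.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class priors and per-moderator confusion matrices
        # conf[j, true_label, observed_label].
        priors = T.mean(axis=0)
        conf = np.full((n_mods, n_classes, n_classes), 1e-6)
        for j in range(n_mods):
            seen = labels[:, j] >= 0
            for k in range(n_classes):
                conf[j, :, k] += T[seen].T @ (labels[seen, j] == k)
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: recompute the true-label posteriors from the model.
        logT = np.log(priors) + np.zeros((n_items, n_classes))
        for j in range(n_mods):
            seen = labels[:, j] >= 0
            logT[seen] += np.log(conf[j][:, labels[seen, j]].T)
        T = np.exp(logT - logT.max(axis=1, keepdims=True))
        T /= T.sum(axis=1, keepdims=True)
    return T, conf

# Toy example: 4 items, 3 moderators, -1 = not reviewed.
labels = np.array([[0, 0, -1],
                   [1, 1, 1],
                   [0, 1, 0],
                   [-1, 1, 1]])
posteriors, confusion = dawid_skene(labels)
print(confusion[1])  # moderator 1's estimated confusion matrix
```

In this classical setting, the estimated confusion matrices double as per-moderator quality profiles; the study's question is how to recover comparable profiles when each item is reviewed only once.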
To find out more about building an accurate and efficient content moderation system, contact us at info@tremau.com.
To download Improving Quality and Consistency in Single Label Content Moderation, please fill out the form below.
Tremau Policy Research Team