

Making the Right Choice: Buy or Build Your Trust & Safety Software?

In software development, the age-old question of whether to build software in-house or buy it from a vendor comes up constantly. It is not surprising that the same question arises for enterprise-level Trust & Safety (T&S) tools. Luckily, there is a long history of research on this question, starting from […]



Fighting terrorist content online: insights from the FRISCO Project

In today’s digital age, there has been a troubling rise in terrorists’ exploitation of the internet, which has become a major concern for both online and offline security (OECD, 2022). Facebook alone removed a record 16 million pieces of terrorist propaganda in the first quarter of 2022, along with over 13 million instances of hate speech.


What does it take to make your business LLM and GenAI proof?

Theodoros Evgeniou* (Tremau), Max Spero* (Checkfor.ai)

Arguably, the “person of the year” for 2023 has been AI. We have all been taken by surprise by the speed of innovation and the capabilities of Large Language Models (LLMs) and, more generally, generative AI (GenAI). At the same time, many, particularly at online platforms, raise questions about potential


Content Moderation: Key Practices & Challenges

Content moderation has become increasingly important for online platforms seeking to protect their users from potential abuses. The evolving regulatory landscape has also placed growing responsibilities on platforms for how user-generated content is moderated online. Notably, the upcoming Digital Services Act (DSA), which affects almost every online service provider active in the EU, will bring unprecedented obligations to online services in


How can you make content moderation processes more efficient?

The growing regulatory spotlight on content moderation, shorter deadlines for content removal, the rising volume of potentially illegal or harmful content flagged for review, and the pressing need to protect both users’ safety and their freedom of expression have increased the urgency of enhancing existing online moderation practices. With these practices becoming widespread, it is important


Deconstructing the DSA: How will the DSA impact online platforms’ policies on minors?

A 2022 Pew Research survey found that 95% of teenagers (aged 13-17) use YouTube and 67% use TikTok, with nearly one in three reporting near-constant use. Screen time has also increased in recent years, hovering around five and a half hours on average. With a greater number of underage users and increasing opportunities to


Gonzalez, Taamneh, and the Future of Content Moderation

The US “may be about to change the law on this massively complex question about human rights on the Internet through the backdoor”, tweeted Daphne Keller, Platform Regulation Director at the Stanford Cyber Policy Center, in a thread detailing the Gonzalez and Taamneh cases appearing before the Supreme Court this week. While the aforementioned cases raise questions on platform liability with


Regulating Online Matchmaking: Trends & Challenges

Online dating platforms have exploded in popularity over the past decade, with their combined global user bases topping 323 million and the industry earning $5.61 billion in 2021. However, this exponential growth in users has brought enduring challenges in creating an accessible virtual dating space where everyone feels safe and included. With a projected 15%


Image Copy Detection: A Key Problem in Online Trust & Safety

Online platforms largely rely on content moderation to remove illegal or harmful content, such as terrorist content or child abuse images and videos. More often than not, detected and removed illegal content reappears, possibly multiple times, as manipulated copies: for example, cropped, rotated,

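The excerpt above names the core technical difficulty: recognising re-uploads that have been manipulated rather than copied byte-for-byte, so exact file hashes no longer match. As an illustration of one common approach (not necessarily the one discussed in the full post), here is a minimal sketch of near-duplicate detection using perceptual hashing with the open-source Pillow and ImageHash libraries; the file names and the distance threshold are assumptions for demonstration.

```python
# Minimal sketch: flag uploads that are near-duplicates of previously
# removed images, using a perceptual hash (pHash). Unlike cryptographic
# hashes, perceptual hashes change only slightly under small edits, so a
# small Hamming distance between hashes suggests a manipulated copy.
from PIL import Image
import imagehash

# Hashes of images already removed by moderators (hypothetical store;
# file name is a placeholder for this demo).
removed_hashes = [
    imagehash.phash(Image.open("removed_example.jpg")),
]

# Maximum Hamming distance still treated as "same image" -- an assumed
# demo value; real systems tune this against false-positive rates.
MATCH_THRESHOLD = 8

def is_known_copy(upload_path: str) -> bool:
    """Return True if the upload looks like a copy of removed content."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # ImageHash overloads subtraction to return the Hamming distance.
    return any(upload_hash - h <= MATCH_THRESHOLD for h in removed_hashes)

if __name__ == "__main__":
    print(is_known_copy("new_upload.jpg"))
```

A threshold on perceptual-hash distance catches many lightly edited or re-encoded copies, but heavier manipulations such as large crops or overlays can defeat it, which is part of why image copy detection remains an open Trust & Safety problem.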

Quality Assurance for Content Moderation

Online content moderation has become an increasingly important and debated topic, and new regulations, such as the EU’s Digital Services Act (DSA), are expected to further reinforce this trend. These regulations will create more legally binding obligations for online platforms with respect to content moderation, in order to improve users’ online well-being and the functioning of the online

