
Challenges in content moderation

Content Moderation: Key Practices & Challenges

Content moderation has become increasingly important for online platforms seeking to protect their users from abuse. The evolving regulatory landscape has also placed growing responsibilities on online services for how user-generated content is moderated. Notably, the upcoming Digital Services Act (DSA), which affects almost every online service provider active in the EU, will bring unprecedented obligations to online services in …

Content Moderation: Key Practices & Challenges Read More »

How can you make content moderation processes more efficient?

The growing regulatory spotlight on content moderation, shorter deadlines for content removal, the rising volume of potentially illegal or harmful content flagged for review, and the pressing need to protect both the safety and the freedom of expression of users have increased the urgency of enhancing existing online moderation practices. With these practices becoming widespread, it is important …

How can you make content moderation processes more efficient? Read More »

Deconstructing the DSA: How will the DSA impact online platforms’ policies on minors?

A 2022 Pew Research survey found that 95% of teenagers (aged 13-17) use YouTube and 67% use TikTok, with nearly one in three reporting near-constant use. Screen time has also increased in recent years, now hovering around five and a half hours on average. With a greater number of underage users and increasing opportunities to …

Deconstructing the DSA: How will the DSA impact online platforms’ policies on minors? Read More »

Gonzalez, Taamneh, and the Future of Content Moderation

The US “may be about to change the law on this massively complex question about human rights on the Internet through the backdoor”, tweeted Daphne Keller, Platform Regulation Director at the Stanford Cyber Policy Center, in a thread detailing the Gonzalez and Taamneh cases appearing before the Supreme Court this week. While the aforementioned cases raise questions on platform liability with …

Gonzalez, Taamneh, and the Future of Content Moderation Read More »

Regulating Online Matchmaking: Trends & Challenges

Online dating platforms have exploded in popularity over the past decade, with their combined global user bases topping 323 million and the industry earning $5.61 billion in 2021. However, this rapid growth has brought several enduring challenges in creating an accessible virtual dating space where everyone feels safe and included. With a projected 15% …

Regulating Online Matchmaking: Trends & Challenges Read More »

Image Copy Detection: A Key Problem in Online Trust & Safety

Online platforms largely rely on content moderation to remove illegal or harmful content, such as terrorist content or child abuse images and videos. More often than not, detected and removed illegal content reappears, possibly multiple times, as manipulated copies – for example, cropped, rotated, …

Image Copy Detection: A Key Problem in Online Trust & Safety Read More »
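The excerpt above notes that removed content often resurfaces as manipulated copies, which exact byte-for-byte matching cannot catch. A minimal, purely illustrative sketch of one common building block, perceptual hashing (here a toy "average hash" over a hypothetical 4x4 grayscale image), shows why copy detection compares hash distances rather than testing equality; all names and data below are illustrative assumptions, not any platform's actual method:

```python
# Toy "average hash" (aHash) sketch: each pixel maps to a bit depending on
# whether it is brighter than the image's mean. Uniform manipulations such as
# a slight brightness shift leave the bit pattern largely unchanged.

def average_hash(pixels):
    """Return a bit list: 1 where a pixel exceeds mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A hypothetical 4x4 "image" flattened to 16 grayscale values (0-255).
original = [10, 200, 30, 190, 15, 210, 25, 180,
            12, 205, 28, 195, 9, 198, 31, 185]
# A manipulated copy: every pixel brightened by 5 (simulating re-encoding).
copy = [p + 5 for p in original]

print(original == copy)  # False: exact matching misses the copy
print(hamming(average_hash(original), average_hash(copy)))  # 0: hashes match
```

A real system would use robust descriptors that also survive cropping and rotation, but the principle is the same: near-duplicates are flagged when the distance between their fingerprints falls below a threshold.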

Quality Assurance for Content Moderation

Online content moderation has become an increasingly important and debated topic, and new regulations, such as the EU’s Digital Services Act (DSA), are expected to further reinforce this trend. Regulations will create more legally binding obligations for online platforms with respect to content moderation, in order to improve users’ online well-being and the functioning of the online …

Quality Assurance for Content Moderation Read More »

Content moderators: How to protect those who protect us?

Content moderators have become indispensable to online platforms’ everyday operations. However, major platforms that outsource their content moderation to contractors around the world face an increasingly pressing challenge: employee turnover at these sites is high, with most moderators staying no more than two years on average. Poor mental health is one of the major reasons behind …

Content moderators: How to protect those who protect us? Read More »

Are Our E-Commerce Platforms Safe and Sustainable?

Since the birth of the commercial internet in the mid-1990s, e-commerce platforms have radically transformed the ways we transact and consume, with significant impacts on the economy and on all affected industries. Between 2018 and 2020, major economies saw a 41% rise in online retail sales. This is currently forecast to grow to $7.5 trillion, which …

Are Our E-Commerce Platforms Safe and Sustainable? Read More »

WeProtect Global Alliance: Application 3018: An app to report harmful content online and protect children

On 8 February 2022, Safer Internet Day, e-Enfance – a French NGO fighting bullying and online harassment of children – launched a nationwide app, Application 3018, to facilitate the reporting of cyber harassment. The application is combined with a dedicated online trust & safety platform that enables faster victim support and more efficient removal of harmful content by …

WeProtect Global Alliance: Application 3018: An app to report harmful content online and protect children Read More »

Join our community

Stay ahead of the curve – sign up to receive the latest policy and tech advice impacting your business.