
Content Moderation and Governance: Principles, Misconceptions and Challenges

Understanding the complexities of online moderation and its governance beyond censorship debates



Online content moderation is at the center of numerous contemporary debates surrounding free speech, safety, business, democracy, and even national security and geopolitics.

To better understand the current implications of content moderation, we hosted the webinar “Content Moderation and Governance: Principles, Misconceptions, and Challenges” in collaboration with the MIT Alumni Association of the UK and the INSEAD AI Alumni-led Community.

Our speakers included Ludo Van der Heyden, the INSEAD Chaired Professor Emeritus of Corporate Governance, and Agne Kaarlep, the Managing Director of Policy and Advisory at Tremau. The discussion was moderated by Theodoros Evgeniou, Co-founder of Tremau and a professor at INSEAD.

Here’s what we learned:

We challenged some common misconceptions around moderation:

“Content moderation necessarily restricts freedom of expression.”
→ It doesn’t have to. In fact, effective content moderation can enable and support free expression. Without it, online spaces risk becoming neither free nor safe. This is why frameworks like Section 230 have existed since 1996: to balance moderation with an open internet.

“The DSA dictates what content to remove and censors speech.”
→ It doesn’t. The DSA focuses on how content moderation should be conducted, ensuring transparency, accountability, and a fair process rather than mandating specific removals. Platforms must now inform users of moderation actions, provide reasons, enable appeals, and offer out-of-court dispute resolution. These safeguards help correct inevitable mistakes and rebalance power between platforms, users, regulators, and the public.
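To make these process safeguards more concrete, here is a minimal, illustrative sketch in Python of what a “statement of reasons” record and the resulting user notice might look like. All class, field, and function names are hypothetical illustrations of the obligations described above, not the DSA’s wording or any platform’s actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical record mirroring the DSA-style safeguards described above:
# the user is told what action was taken, why, and how to contest it.
@dataclass
class StatementOfReasons:
    content_id: str
    action: str                       # e.g. "removal" or "visibility restriction"
    policy_ground: str                # the rule or legal basis relied on
    facts_and_circumstances: str      # why the content was found to violate it
    automation_used: bool             # whether automated means were involved
    appeal_channel: str               # internal complaint-handling mechanism
    out_of_court_body: Optional[str]  # out-of-court dispute settlement option
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def build_user_notice(sor: StatementOfReasons) -> dict:
    """Assemble the notice a user would receive after a moderation action."""
    return {
        "content": sor.content_id,
        "action_taken": sor.action,
        "reason": f"{sor.policy_ground}: {sor.facts_and_circumstances}",
        "automation_used": sor.automation_used,
        "how_to_appeal": sor.appeal_channel,
        "out_of_court_option": sor.out_of_court_body,
        "issued_at": sor.issued_at.isoformat(),
    }
```

The point of the sketch is simply that every action produces a traceable record, a stated reason, and a route to contest the decision, which is what rebalances power between platforms, users, regulators, and the public.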

And on content moderation...

Fairness in Content Moderation: it’s about the process, not just the outcome

Fair content moderation isn’t just about achieving the “right” outcome – it’s about ensuring a just and fair process. While distributive justice focuses on fairness in results (equality, merit, and need), procedural justice ensures that decisions are made through clear, consistent, and accountable methods.
A fair moderation process must include:

✅ Clarity – Clear rules and expectations for users
✅ Transparency – Open communication on policies and decisions
✅ Consistency – Unbiased and reliable enforcement across cases
✅ Willingness to change – Adapting to data and evolving legislation

Content moderation at scale is inherently imperfect. That’s why it requires continuous learning and adaptation

Building an effective moderation infrastructure requires identifying risks, leveraging technology to manage them, and implementing robust systems and workflows, all while keeping in mind that online spaces and the threats within them continually evolve. For platforms, strong internal governance is therefore essential to align all the moving parts and ensure that moderation processes improve and adapt over time, drawing on data gathered from day-to-day operations.
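As a rough illustration of that feedback loop, the sketch below shows how day-to-day moderation data (decisions and appeal outcomes) might be aggregated to flag policies whose enforcement needs review. It is a simplified, hypothetical example; the names and the 20% threshold are assumptions for illustration only.

```python
from collections import Counter

class ModerationFeedbackLoop:
    """Illustrative only: aggregate moderation outcomes so that rules and
    workflows can be adjusted as online spaces and their threats evolve."""

    def __init__(self) -> None:
        self.decisions = Counter()   # actions taken, per policy ground
        self.overturned = Counter()  # actions reversed on appeal, per policy ground

    def record_decision(self, policy_ground: str) -> None:
        self.decisions[policy_ground] += 1

    def record_appeal_outcome(self, policy_ground: str, upheld: bool) -> None:
        if not upheld:
            self.overturned[policy_ground] += 1

    def review_candidates(self, threshold: float = 0.2) -> list[str]:
        """Flag policy grounds whose decisions are frequently overturned,
        so the rules or the enforcement workflow can be re-examined."""
        return [
            ground
            for ground, total in self.decisions.items()
            if total and self.overturned[ground] / total > threshold
        ]
```

A loop like this is one way to operationalise “continuous learning and adaptation”: mistakes surfaced through appeals feed directly back into how policies are written and enforced.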

Did you miss the webinar? Watch the recording: