
Tremau and Pangram Labs partner to take on AI-generated content

As we stand on the cusp of the biggest election year in history, the intersection of technology and democracy takes centre stage once again. More than 50 countries around the world, with a combined population of around 4.2 billion, will hold national and regional elections in 2024, including seven of the ten most populous nations. With the rapid advance of generative AI, which allows anyone to create realistic images, video, audio, or text from user-provided prompts, electoral processes face new challenges.

Generative AI has garnered attention for its potential to influence public opinion and therefore impact debates and decisions. From deepfake videos to “smart” targeted AI-generated campaigns at scale, the deployment of generative AI techniques can pose significant threats to the integrity of democratic processes. These risks take many shapes and forms, including last-minute attempts to deter people from voting, fabricated events featuring generated depictions of candidates that are difficult to debunk, and targeted false stories spread at scale.

What does this mean for online platforms? Simply put: to avoid uncomfortable questions about accountability for questionable AI-generated content spreading on your platform, particularly in the face of potential scandals, you need to strengthen your trust & safety operations to handle these new threats. Pangram Labs is developing the most accurate AI-generated content detection methods to automate the identification and moderation of AI content. Combined with the human-in-the-loop technologies enabled by Tremau, this provides an effective process to control and moderate AI content before it threatens the integrity of a platform or an election.
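To make that combined workflow concrete, here is a minimal sketch of how a platform might wire a detection score into a human-in-the-loop routing step. The function names, thresholds, and the stand-in detector are illustrative assumptions, not the actual Pangram Labs or Tremau APIs.

```python
# A minimal sketch of detection plus human-in-the-loop routing.
# The function names, thresholds, and stand-in detector are hypothetical
# illustrations, not the actual Pangram Labs or Tremau APIs.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ModerationDecision:
    action: str           # "publish", "human_review", or "auto_flag"
    ai_likelihood: float  # detector score between 0.0 and 1.0

def route_content(
    text: str,
    detect: Callable[[str], float],       # returns an estimated P(AI-generated)
    review_threshold: float = 0.70,       # illustrative threshold
    auto_action_threshold: float = 0.95,  # illustrative threshold
) -> ModerationDecision:
    """Score a piece of user-generated content and decide what happens next."""
    score = detect(text)
    if score >= auto_action_threshold:
        # Very likely AI-generated: label or restrict automatically.
        return ModerationDecision("auto_flag", score)
    if score >= review_threshold:
        # Borderline: escalate to a human moderator queue.
        return ModerationDecision("human_review", score)
    # Likely human-written: publish as usual.
    return ModerationDecision("publish", score)

# Example with a stand-in detector; a real deployment would call a
# detection service here instead of returning a constant.
decision = route_content("Sample post text", detect=lambda text: 0.82)
print(decision)  # ModerationDecision(action='human_review', ai_likelihood=0.82)
```

In practice, the two thresholds let a platform trade off moderator workload against the risk of AI content slipping through, and they would be tuned to each platform's own risk profile.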

When the European Commission first released its proposal for an AI Act in April 2021, generative AI was far from being an urgent concern for regulators. That all changed with recent advances in AI such as GPT-4. Because of this, the European Parliament substantially amended the European Commission’s initial proposal, notably introducing specific rules that apply to generative AI systems (the Parliament Proposal). Generative AI falls under the category of “General Purpose AI Systems”, which must comply with transparency requirements, including disclosing that content was generated by AI. Meeting these regulatory obligations can be made simple with a compliant-by-design content moderation platform such as Tremau’s.
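As an illustration of what such a disclosure could look like operationally, the sketch below attaches a machine-readable AI label to a content record. The field names and schema are hypothetical assumptions: the transparency requirement calls for disclosure but does not prescribe this particular format.

```python
# A minimal sketch of recording an AI disclosure alongside a piece of
# user-generated content. Field names are hypothetical; the transparency
# requirement mandates disclosure, not this specific schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContentRecord:
    content_id: str
    body: str
    ai_generated: bool = False
    disclosure: dict = field(default_factory=dict)

def apply_ai_disclosure(record: ContentRecord, detector_score: float) -> ContentRecord:
    """Attach a user-facing disclosure label and an audit-trail entry."""
    record.ai_generated = True
    record.disclosure = {
        "label": "This content was generated with AI.",
        "detector_score": detector_score,
        "labelled_at": datetime.now(timezone.utc).isoformat(),
    }
    return record

# Example: label a post that the detector flagged with high confidence.
record = apply_ai_disclosure(
    ContentRecord(content_id="post-123", body="Sample post text"),
    detector_score=0.97,
)
```

Keeping the label and the detector score together in one record also gives platforms an audit trail they can point to when demonstrating compliance.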

Many businesses have recognized that preparing for the challenges and risks of AI is not only a regulatory issue. The fast development of AI poses challenges for brand safety and platform health; as such, it is essential to understand how bad actors use AI-generated content for spam and disinformation, and to use tools that keep it off your platform.

For all these reasons, and as part of our mission to build a safe and beneficial digital world for all, Tremau and Pangram Labs are partnering to provide AI-generated content detection and AI disclosures for user-generated content. This way, platforms can stay compliant and keep their user-generated content authentic.

How can we help you?

At Tremau, we work to help you navigate the new world of powerful AI and new regulations.

Pangram Labs is building tools to automate the detection of AI-generated content, starting with text and speech. Learn more at pangramlabs.com.

To find out more, contact us at info@tremau.com and info@pangramlabs.com.
