Meet Julie de Bailliencourt

Julie de Bailliencourt
Director of T&S
Julie de Bailliencourt has spent over fifteen years shaping Trust & Safety at global platforms, including as Global Head of Product Policy at ByteDance (TikTok) and, before that, at Meta, working across youth safety, policy and cross-market operations. She now joins Tremau as Director of Trust & Safety to deepen our in-house expertise, support clients on complex safety challenges, provide additional advisory services, and help shape how Nima, our AI-powered T&S orchestration platform, enables more efficient and transparent moderation operations. We sat down with Julie for a Q&A on the state of the industry.
The T&S Mission
Q1. “Julie, you’ve seen Trust & Safety evolve from a ‘back-office cleanup crew’ to a headline-dominating business-critical function. Why is this work more important now than ever, especially given the current state of online polarisation, and what continues to draw you in this field?”
Three things draw me in: the mission of keeping people safe, the T&S community, and the ever-changing nature of the challenges faced by platforms. The work is never over: problems evolve, bad actors adapt, new solutions emerge to deter abuse and detect violative content. Even if the T&S community has grown significantly, I always feel a deep sense of connection with everyone. Honestly, it’s in my blood: I just love the work. It is stressful and intense, but it is never boring and constantly requires good problem-solving skills.
It has been fascinating to watch this modest ‘back office’ function develop into a fully-fledged industry, reflecting the huge complexity of the work. While early tech platforms were questioning the need to further invest in T&S, the past few years have demonstrated its business-critical nature. A weak T&S structure leads to dipping engagement metrics and user churn, advertiser boycotts, negative media coverage, as well as concerns from parents. Ultimately this can lead to regulatory inquiries, fines or even platform blocks. Platforms rarely want to be seen as toxic cesspools and it’s absolutely fair to expect them to do their best for user safety; that’s why I am excited to see how we can help them succeed.
Q2. “Why leave the ‘Big Tech’ giants now? What are the specific T&S gaps you see for startups and online services that motivated your move to Tremau?”
My move wasn’t about company size, but about the specific challenges Tremau is solving. All big tech platforms started somewhere: while their scale today is incredible, their T&S infrastructure represents years of iteration that required them to be scrappy, pragmatic and fast. They have had to make significant investments and innovate – none of this is easy. Regardless of a platform’s size, the core challenges remain the same: building policies and processes from zero to one, identifying gaps, and quickly iterating without breaking anything.
What attracted me to Tremau is the opportunity to support diverse companies – from startups to established services – through their T&S and regulatory challenges. This includes advisory and product support, such as implementing our Nima platform. I like the mental challenge of helping T&S teams support their ecosystem despite different constraints, finite resources, and the need to anticipate scalability issues. Ultimately, all I want to do is to work with good people on interesting problems, and make the internet a safer place.
The Current Landscape
Q3. “For a global Head of Trust & Safety, the regulatory landscape has become dizzying. We are seeing conflicting demands across jurisdictions. How do leaders build a cohesive safety strategy today without getting paralyzed by compliance? Is it possible to have a global standard anymore?”
The regulatory landscape is challenging due to its complexity and fragmentation, including the DSA in Europe, the OSA in the UK, Australian regulation, and a patchwork of US state laws. But compliance is non-negotiable. All platforms need to adapt their processes, also taking into account additional regulatory requirements around data privacy; the demands are complex and require tight cross-functional collaboration across the business.
Regulations also apply across a wide range of platforms, all of which differ in shape, size, content delivery mechanisms, and feature configuration, making comparison difficult for regulators.
Regardless of regulation, putting user safety at the heart of the work provides a great compass for prioritisation. Going back to the mission, starting with user safety makes sound business sense and aligns with most regulatory requirements. Platforms usually already have policies covering illegal and harmful content. Core T&S activities include enforcing these community guidelines at scale, understanding coverage gaps, ensuring reasonable review times, providing users clarity on content removals and appeals, and investing in safety-by-design reviews for new features. Regulation has simply brought these activities to the fore.
For the heavy lift of risk assessments and compliance reporting, as a T&S leader I would have sought expert advice, such as that provided by Tremau.
Q4. “There is immense pressure from Boards right now to ‘automate everything’ with AI. Drawing on your experience running operations: Where is the line? Where is AI a superpower for T&S, and where is it dangerous to remove the human from the loop?”
We have not reached peak AI yet, and the drive for efficiency gains through automation will likely continue.
T&S teams have always been under pressure to scale efficiently: avoiding throwing people at problems, while maintaining speed and quality of moderation as volumes grow. In that sense, AI is simply a new powerful tool. It performs well at tasks like scoring content egregiousness, prioritising reports within a queue, or outright moderating high-probability violative reports. Beyond content moderation itself, AI can accelerate policy development and iteration, shortening the overall policy launch cycle. We’ve seen that AI gains – when well orchestrated – have enabled platforms to significantly improve moderation efficiency and reduce overall costs. As a consequence, we have seen waves of industry layoffs and ‘right-sizing’.
However, AI is not a replacement for humans: it cannot yet manage nuanced market knowledge, context-dependent abuse or similarly complex issues. Completely removing the human from the loop is dangerous: the effort to drive moderation quality would simply become the effort to drive AI moderation quality. In the same way, relying solely on AI for risk mapping and risk prevention would lead to poor outcomes.
In short, AI is an incredibly effective tool working alongside T&S teams. The right approach is not to ‘automate everything’, but ‘automate deliberately’. T&S leaders must assess the right type of AI implementation for their organisation based on their maturity stage, user growth forecasts, report volumes, and the specific problems they are trying to solve (proactive detection, moderation, policy development etc). It is also important to be realistic about the engineering effort needed to integrate, deploy and maintain these models.
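The ‘automate deliberately’ approach Julie describes – auto-actioning only high-probability violative reports, triaging the rest by egregiousness, and keeping humans on everything in between – can be sketched in a few lines. This is a minimal illustration with hypothetical thresholds, names and data shapes, not any platform’s actual implementation:

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values would be tuned per policy area
# and validated against human-review quality data.
AUTO_ACTION_THRESHOLD = 0.98  # near-certain policy violation
AUTO_CLOSE_THRESHOLD = 0.02   # near-certain benign report

@dataclass
class Report:
    report_id: str
    violation_score: float  # model-estimated probability of a violation

def route_report(report: Report) -> str:
    """Decide whether a report is auto-actioned, auto-closed,
    or escalated to a human moderator."""
    if report.violation_score >= AUTO_ACTION_THRESHOLD:
        return "auto_action"   # remove content, notify user, log decision
    if report.violation_score <= AUTO_CLOSE_THRESHOLD:
        return "auto_close"    # close the report without action
    return "human_review"      # everything ambiguous keeps a human in the loop

def prioritise(queue: list[Report]) -> list[Report]:
    # Most likely violative content is surfaced to moderators first.
    return sorted(queue, key=lambda r: r.violation_score, reverse=True)
```

The point of the sketch is the shape, not the numbers: the thresholds make the automation boundary explicit and auditable, which is what distinguishes ‘automate deliberately’ from ‘automate everything’.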
Q5. “Youth safety is arguably the single biggest pressure point for platforms right now. We know there is no silver bullet when it comes to age-verification. If you are advising a platform on how to truly protect teens, what does a ‘Safety by Design’ approach actually look like? How do we move from compliance to genuine care?”
Effective ‘safety by design’ requires a layered approach, strong processes and cross-functional collaboration. Platforms must cover all angles: preventing underage users from accessing their service, swiftly removing the underage users they have identified, and ensuring eligible teens have a safe and age-appropriate experience.
As you said, there is no silver bullet when it comes to age verification, but combining multiple measures can offer an effective age-assurance strategy. This includes new user registration flows, deterrence mechanisms, parental controls, age-estimation technology, requests for additional information such as government ID, feature blocks, user education, reporting options, detection and enhanced moderation processes, and age appropriate controls.
A core element of ‘safety by design’ is continuous risk assessments across all product areas. T&S must be an early stakeholder in product reviews to mitigate concerns, with clear launch-blocking criteria, such as NCMEC reporting, or specific settings turned on by default for minors.
The challenge is balancing this with avoiding cumbersome friction, ensuring accessibility and fairness, and maintaining data minimisation while capturing sensitive age information. Complacency and carelessness are the enemy here. This is where I wish T&S teams always had a strong voice within their organisation. The hard part comes from balancing competing business interests; platforms want teens to be safe, but they may view some suggested approaches as too restrictive, fearing an impact on growth or retention. When platforms are fighting for advertising revenue, should they, for example, allow ads for perfectly legal products that could have a negative impact on young people down the line? Where to draw the line? Navigating this tension is the true test of a platform’s commitment to ‘genuine care’ over mere compliance.
Building for the Future
Q6. “We’ve talked about Regulations, AI, and Youth Safety. The reality is, you can’t manage any of that with a spreadsheet. What are the technical ‘non-negotiables’ a T&S team needs in their infrastructure to handle this complexity? Is the ‘build it yourself’ era coming to an end?”
A few years ago T&S teams often had to build their own solutions, leading to a patchwork of tools and clunky, manual processes. These in-house systems need constant updates, bug fixes and configuration changes, which is hard to sustain without ring-fenced engineering investment. Frankly, those tools can quickly become inflexible and unable to scale.
I believe life should be simpler for T&S teams today. We start with a strong content policy foundation – this sets the tone for the community, and grounds all enforcement. From there, my technical must-haves are:
- A robust content review tool optimised for moderators, fed by comprehensive user-reporting flows
- The ability to intake all content and report types, including proactively detected abuse and specialised flows, as the platform grows
- Flexible queue configuration and routing
- Mandatory features, like the ability to report CSAM to NCMEC
- Automation capability to apply a range of AI models and rules to mitigate spikes and scale enforcement
- A complete user-feedback loop to handle user appeals and provide real-time notifications
- Built-in reporting and analytics for operational metrics, including the data needed for easy transparency and regulatory compliance (especially for EU/UK)
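To make the ‘flexible queue configuration and routing’ item above concrete, here is a minimal first-match rule table of the kind such systems use. The rule tuples, queue names and content types are all hypothetical, chosen purely for illustration:

```python
# Hypothetical routing rules: each rule matches a report's content type
# and source ("*" is a wildcard) and names the review queue it lands in.
# Rules are evaluated top to bottom; the first match wins.
ROUTING_RULES = [
    # (content_type, source, queue)
    ("csam", "*", "escalation_csam"),           # specialised mandatory flow
    ("*", "proactive_detection", "proactive"),  # model-flagged content
    ("*", "user_report", "standard_review"),    # default user-reporting flow
]

def route(content_type: str, source: str) -> str:
    """Return the review queue for a report, falling back to a
    catch-all queue if no rule matches."""
    for rule_type, rule_source, queue in ROUTING_RULES:
        if rule_type in ("*", content_type) and rule_source in ("*", source):
            return queue
    return "fallback_review"
```

Keeping routing as declarative data rather than hard-coded logic is what lets a T&S team reconfigure queues as report volumes and regulatory obligations change, without engineering work on every adjustment.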
For most T&S teams, securing long-term engineering resources for all this is simply unrealistic. That’s why vendor solutions, like Nima, may be ideal for many teams, allowing platforms to build with scale and regulation in mind from the start.
Q7. “If you could leave every Trust & Safety leader with one guiding principle to help them navigate the year ahead, what would it be?”
The last few months have been hard, and I know T&S teams may feel unseen and stuck between competing priorities. My guiding principle remains the same: keep focusing on what really matters, embedding user safety (and child safety) at the heart of the work.
This focus is not just a core T&S principle, it is the actual foundation of business growth and resilience. In a heavily polarised world, T&S teams must provide a principled voice, prioritising user-facing harm mitigation over short-term optimisation. They must continuously educate their cross-functional partners on the undeniable link between safety and platform success.
I am delighted that by joining Tremau, I can help support T&S leaders in this mission: from establishing processes that fit their growing organisation, to advising on strategy, and of course, implementing our moderation orchestration platform – Nima, which provides the customisation and flexibility a growing T&S team needs.


