What is the TAKE IT DOWN Act?
The TAKE IT DOWN Act, signed into law in May 2025, is a U.S. federal law aimed at protecting individuals from non-consensual sharing of intimate images online. The new law adds to a growing body of global regulation requiring online platforms to establish operational moderation processes and maintain records.
Under the Act, covered platforms operating in the U.S. are required to establish clear reporting and removal systems for non-consensual intimate images (NCII), including AI-generated or manipulated content, and to remove such content within 48 hours of a valid request. Platforms now have one year to implement these systems and comply fully.
For many online services, this means adapting existing reporting and prioritisation workflows, providing relevant training, and setting up processes to maintain detailed compliance records.
Who must comply and what content is covered?
The TAKE IT DOWN Act applies to “covered platforms,” broadly defined as websites, apps, or online services that:
- Are publicly accessible; AND
- Primarily provide forums for user-generated content such as images, videos, audio, messages, or games; OR
- Regularly host, publish, or distribute non-consensual intimate visual depictions as part of their business.
In short, this is a very expansive law, covering services such as: social media platforms, content sharing and hosting services, video sharing platforms, adult content websites, gaming platforms with user interaction, messaging apps with public channels, cloud storage services with public sharing features, and blogging platforms.
Excluded from the Act are broadband internet providers, email services, and platforms that mainly offer preselected (non-user-generated) content.
The types of content covered by the Act include:
- Sexually explicit images or videos shared or created without the subject’s consent,
- AI-generated or manipulated content indistinguishable from authentic imagery,
- Graphic or explicit depictions of nudity, sexual acts, or simulated sexual content.
Non-compliance may trigger civil penalties of up to approximately $51,000 per violation. Since the term “violation” is not clearly defined, it could be interpreted broadly – such as per user, per image, or per day – potentially exposing platforms to cumulative fines in the millions for unresolved cases: at that rate, just 100 unresolved images counted as separate violations would already exceed $5 million.
What operational steps platforms need to take
To comply with the TAKE IT DOWN Act, covered platforms will be required to implement a comprehensive set of measures and adapt internal operations, including:
- Provide a clear and accessible reporting process that allows impacted users or authorized representatives to submit removal requests. Platforms must support electronic signature capture, file uploads for evidence such as screenshots or URLs, and a good-faith statement. When implementing this reporting flow, platforms should ensure the design captures all necessary information upfront, minimizing delays and invalid reports (a minimal intake sketch follows this list).
- If reported content is classified as NCII, remove it within 48 hours of receiving a valid request. This entails implementing timestamp logging and tiered review workflows, potentially combining automated and human moderation. Platforms also need clear policies defining workflow structures and escalation paths, including Quality Assurance processes, and must ensure that moderation staff receive appropriate training (see the deadline-queue sketch below).
- Prevent re-uploads and duplicate content by embedding hash-matching into upload pipelines for real-time blocking, adding behavioral analysis to detect systematic re-upload attempts, and applying progressive enforcement for repeat offenders. Platforms should define precise criteria distinguishing identical from near-identical content and conduct regular auditing of filter effectiveness to minimize false negatives and positives (see the hashing sketch below).
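To make the first item concrete, here is a minimal sketch of what a removal-request intake model could look like. The field names and validation rules are illustrative assumptions; the Act requires a signature, identification of the content, and a good-faith statement, but prescribes no particular schema.

```python
# Minimal sketch of a removal-request intake model. Field names are
# assumptions, not statutory terms.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class RemovalRequest:
    requester_name: str            # victim or authorized representative
    electronic_signature: str      # typed name or reference to a signature blob
    content_urls: list[str]        # where the reported NCII appears
    evidence_files: list[str] = field(default_factory=list)  # e.g. screenshots
    good_faith_statement: bool = False  # requester affirms the report is truthful
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_valid(self) -> bool:
        """Reject incomplete reports upfront so the 48-hour clock starts
        only on actionable requests."""
        return bool(
            self.requester_name
            and self.electronic_signature
            and self.content_urls
            and self.good_faith_statement
        )
```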
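For the 48-hour deadline, one simple pattern is a queue ordered by each report's removal deadline, so the case closest to breaching the window is always reviewed first. This is a sketch under assumed names, not a prescribed architecture:

```python
# Sketch of 48-hour SLA tracking for validated NCII reports.
import heapq
from datetime import datetime, timedelta, timezone

SLA = timedelta(hours=48)

class TakedownQueue:
    """Priority queue keyed on removal deadline: the report closest to
    breaching the 48-hour window always surfaces first."""

    def __init__(self):
        self._heap: list[tuple[datetime, str]] = []

    def add(self, report_id: str, received_at: datetime) -> None:
        heapq.heappush(self._heap, (received_at + SLA, report_id))

    def next_due(self) -> tuple[datetime, str] | None:
        return self._heap[0] if self._heap else None

    def breaching_soon(self, within: timedelta = timedelta(hours=6)) -> list[str]:
        """Report IDs whose deadline falls inside the escalation window,
        e.g. to trigger alerts to senior moderators."""
        cutoff = datetime.now(timezone.utc) + within
        return [rid for deadline, rid in self._heap if deadline <= cutoff]
```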
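For re-upload prevention, a common two-tier approach pairs exact cryptographic hashing with a perceptual hash for near-duplicates. The sketch below uses SHA-256 plus a simple average hash (aHash) purely for illustration and assumes the Pillow imaging library is installed; production systems typically rely on dedicated perceptual hashing such as PDQ or PhotoDNA.

```python
# Two-tier re-upload filter sketch: exact matching with SHA-256 and
# near-duplicate matching with a simple average hash (aHash).
import hashlib
from PIL import Image  # assumes Pillow is installed

def exact_hash(path: str) -> str:
    """Byte-identical re-uploads: one SHA-256 per file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def average_hash(path: str) -> int:
    """64-bit perceptual fingerprint: downscale to 8x8 grayscale and
    threshold each pixel against the mean brightness."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / 64
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def is_near_duplicate(h1: int, h2: int, max_distance: int = 5) -> bool:
    """Hamming distance between fingerprints; a small threshold catches
    re-encodes and minor edits while limiting false positives."""
    return bin(h1 ^ h2).count("1") <= max_distance
```

Tuning the distance threshold is exactly the "identical vs. near-identical" policy decision mentioned above, and is worth revisiting during regular filter audits.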
Finally, platforms should maintain detailed documentation of all removal requests, actions taken, and communications (a logging sketch follows). While not explicitly required, doing so is a critical part of demonstrating good-faith compliance and may be important for qualifying for safe harbor protections or responding to future regulatory scrutiny.
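As an illustration of such record-keeping, the sketch below appends each moderation action to a JSON Lines file and chains entries with hashes so after-the-fact tampering is detectable. The schema, file name, and action labels are assumptions made for the example.

```python
# Append-only compliance log sketch (JSON Lines with a hash chain).
import hashlib
import json
from datetime import datetime, timezone

LOG_PATH = "ncii_audit.jsonl"  # illustrative file name

def log_event(report_id: str, action: str, actor: str, prev_hash: str = "") -> str:
    """Append one event and return its hash, to be passed as prev_hash
    for the next event on the same log."""
    entry = {
        "report_id": report_id,
        "action": action,          # e.g. "received", "removed", "notified"
        "actor": actor,            # moderator ID or "system"
        "at": datetime.now(timezone.utc).isoformat(),
        "prev": prev_hash,         # links each entry to the one before it
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry["hash"]
```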
All these measures will have to be fully operational by May 19, 2026.
How Nima enables compliance with the TAKE IT DOWN Act
Dealing with the fast-moving regulatory landscape, with new laws like the TAKE IT DOWN Act, can be overwhelming, especially when it means building new processes, coordinating across teams, and responding to time-sensitive reports under pressure.
Here is where Nima comes in.
Tremau’s Trust & Safety orchestration platform takes the weight off by letting you translate complex legal requirements into clear, scalable workflows.
It’s built to reduce manual effort, avoid delays, and give legal, T&S, and operations teams the tools to act quickly, document every action, and stay compliant across multiple regulations.
Here’s how platforms can adapt their internal operations using Nima:
- Customizable reporting form: A user-friendly, flexible, and complete reporting form that can be embedded directly into a platform’s interface, guiding victims or authorized representatives through submitting valid removal requests, with support for multiple languages, file uploads, and accessibility standards.
- Intelligent workflow automation: Automated prioritization of NCII takedown requests to meet the 48-hour removal deadline, with efficient routing to appropriate moderation queues and real-time tracking and escalation alerts.
- Advanced content identification with hash-matching: Integration of image and video hashing alongside AI-powered similarity detection to identify exact and near-duplicate content, fulfilling the law’s requirements for duplicate removal and re-upload prevention.
- Appeal mechanism: While the TAKE IT DOWN Act does not require a complaint mechanism, it is best practice to allow for feedback loops with users. Nima allows platforms to manage appeals by sending user communications and setting up dedicated queues.
- Comprehensive audit trails: Logging of every action – report submissions, content removals, appeals, and communications – providing a transparent audit trail for regulatory reporting and internal quality assurance.
Not entirely sure if and how rules apply or struggling with the increasing number of overlapping processes and governance requirements? Reach out to our T&S and compliance experts.