On May 19, 2026 — now just 46 days away — the compliance deadline for the TAKE IT DOWN Act arrives, requiring all "covered platforms" to have established clear, accessible processes for individuals to report and request removal of nonconsensual intimate visual depictions, including AI-generated deepfakes. Platforms must remove flagged content within 48 hours of a valid request and take down all known identical copies.
Why It Matters
The TAKE IT DOWN Act represents the first federal law specifically criminalizing nonconsensual deepfake pornography and mandating platform takedown processes. Its May 19 compliance deadline arrives amid an unprecedented wave of deepfake regulation: the DEFIANCE Act passed the Senate unanimously in January, the EU banned nudification apps in March, and multiple state laws have been enacted. For adult content platforms, the 48-hour takedown requirement with FTC enforcement creates a compliance burden that could reshape operational models. For sex tech companies and creator economy platforms, the Act's broad definition of "covered platforms" means even peripheral players may need takedown infrastructure.

The TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act) was signed into law by President Trump on May 19, 2025, with a one-year runway for platform compliance. The Act's criminal provisions — making it a federal crime to knowingly publish or threaten to publish nonconsensual intimate imagery, punishable by fines and up to two years' imprisonment — took effect immediately upon signing. The May 2026 deadline applies specifically to the platform-side notice-and-removal obligations enforced by the FTC.
"Covered platforms" is broadly defined to include any public website, online service, application, or mobile app that either primarily provides a forum for user-generated content or is primarily designed to publish nonconsensual intimate visual depictions. This sweeps in not just social media giants and adult content platforms but potentially any site with user uploads — dating apps, messaging services, image hosting sites, and AI-generated content platforms.
The 48-hour removal window is aggressive by industry standards. Content moderation experts note that identifying "known identical copies" across a platform requires hash-matching infrastructure that many smaller platforms have not yet built. For adult content platforms already under Mastercard and Visa content compliance requirements, the TAKE IT DOWN Act adds yet another layer of mandatory content moderation with federal enforcement teeth.
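The hash-matching approach these experts describe can be illustrated with a minimal sketch. The class name and methods below are hypothetical, and this version uses exact cryptographic hashing, which only catches byte-identical copies; production systems typically layer on perceptual hashing so re-encoded or resized variants also match:

```python
import hashlib


class TakedownHashIndex:
    """Hypothetical index of hashes for content removed under a takedown request.

    Exact-match only: SHA-256 over raw bytes flags byte-identical copies.
    Real deployments add perceptual hashes to catch re-encoded variants.
    """

    def __init__(self):
        self._blocked = set()

    @staticmethod
    def fingerprint(data: bytes) -> str:
        # Deterministic fingerprint of the file's raw bytes.
        return hashlib.sha256(data).hexdigest()

    def register_takedown(self, data: bytes) -> str:
        # Called when a valid removal request is honored.
        h = self.fingerprint(data)
        self._blocked.add(h)
        return h

    def is_blocked(self, data: bytes) -> bool:
        # Screen new uploads (or a back-catalog scan) against the index.
        return self.fingerprint(data) in self._blocked


index = TakedownHashIndex()
index.register_takedown(b"original image bytes")
print(index.is_blocked(b"original image bytes"))    # identical copy is caught
print(index.is_blocked(b"re-encoded image bytes"))  # exact matching misses this
```

The sketch shows why smaller platforms face a build-out: even this trivial exact-match index implies persistent storage of fingerprints, a hook in the upload path, and a scan of existing content, before any perceptual-hashing sophistication is added.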
Sources
- 'Take It Down Act' Requires Online Platforms To Remove Unauthorized Intimate Images and Deepfakes — Skadden
- TAKE IT DOWN Act: What Creators and Platforms Must Know — CopyrightShark
Update — 2026-04-03
Initial entry — story first created.