On April 8, 2026, the Department of Justice announced that James Strahler II, 37, of Columbus, Ohio, had pleaded guilty to cyberstalking, producing obscene visual representations of child sexual abuse material, and publishing digital forgeries, making him the first person convicted under the TAKE IT DOWN Act, the landmark federal deepfake law signed by President Trump on May 19, 2025.

Why It Matters

The first TAKE IT DOWN Act conviction establishes that the law's criminal provisions are actively being used — not just sitting on the books as a deterrent. With 15 deepfake-related bills enacted at the state level in 2026 alone and the federal platform compliance deadline 31 days out, enforcement of nonconsensual intimate image law is accelerating on multiple fronts simultaneously. For any platform hosting user-generated content with intimate imagery, the time to build compliant takedown workflows is now.

The case details are disturbing and precisely illustrate what the law was designed to address. From December 2024 through June 2025, Strahler used more than two dozen AI platforms and over 100 web-based AI models accessed from his phone to create nonconsensual explicit imagery of at least six adult female victims, using their real faces, and to generate AI-made CSAM from images of minor boys in his community. He posted these materials to a website dedicated to child sexual abuse and sent harassing messages to his victims. In total, he created more than 700 images.

The conviction is a milestone for the TAKE IT DOWN Act's criminal provisions, which took effect immediately upon the law's signing. The notice-and-removal provisions — requiring covered platforms to take down nonconsensual intimate images within 48 hours of receiving a valid request — have a separate compliance deadline of May 19, 2026, now just 31 days away. The FTC is responsible for enforcement of those platform obligations.

Rep. Maria Salazar (R-FL), who shepherded the House version of the bill, issued a statement: "This conviction sends a clear message — creating and sharing these vile images is a federal crime with serious consequences." The case was prosecuted in the Southern District of Ohio; sentencing is pending.

For the adult content and sex tech industries, the conviction underscores the enforcement velocity building around AI-generated intimate imagery. The same statute that reached a convicted CSAM producer is also the framework under which platforms in the nonconsensual intimate image space must have compliance systems ready by May 19.

Update — 2026-04-16

Initial entry — story first created.