On March 26, 2026, Minnesota's House Commerce Finance and Policy Committee unanimously advanced HF1606, a bill sponsored by Rep. Jessica Hanson (DFL-Burnsville) that would ban accessing, downloading, or using any website, app, or software to create AI-generated nude alterations of a person's image without their consent. The bill also prohibits performing nudification on behalf of others and bans advertising or promoting nudification tools.

Why It Matters

Minnesota's bill represents a growing U.S. state-level trend of targeting nudification technology at the tool level rather than just criminalizing output. Combined with the EU's outright ban, the Dutch court's injunction against Grok, and South Dakota's new deepfake felony law, the regulatory noose around AI nudification is tightening rapidly. For AI companies developing image-generation tools, the compliance landscape now requires active prevention of non-consensual intimate image generation, not just post-hoc takedowns.

HF1606 now heads to the House Judiciary Finance and Civil Law Committee. A previous version of the legislation was introduced last session but failed after being rolled into a larger omnibus bill that did not advance. The current bill includes civil lawsuit provisions; the prior version allowed injured parties to seek damages of no less than $500,000 per violation, which made it one of the most aggressive anti-nudification measures proposed in any U.S. state.

Minnesota already has a 2023 law addressing deepfake distribution, with criminal penalties of up to five years in prison or civil penalties of up to $10,000 per instance. HF1606 goes further by targeting the creation tools themselves rather than just the distribution of the resulting images, following the EU Parliament's recent vote to ban AI nudification apps entirely under an AI Act amendment.

Update — 2026-03-28

Initial entry — story first created.