Ars Technica | May 1, 2026 10:36 AM
This week, Minnesota became the first state to pass a law banning "nudification" apps that make it easy to "undress" or sexualize images of real people. Under the law, developers of websites, apps, software, or other services designed to "nudify" images risk extensive damages, including punitive damages, if a victim sues. Their offending products could also be blocked in the state.

Additionally, Minnesota's attorney general could impose fines of up to $500,000 per fake AI nude flagged. Any fines collected would fund services for victims of "sexual assault, general crime, domestic violence, and child abuse."

On Wednesday, the Minnesota Senate voted unanimously, 65-0, to pass the law. Gov. Tim Walz is expected to sign it when it reaches his desk, and the state will begin enforcing the ban this August.

Democratic Senator Erin Maye Quade introduced the bill after residents discovered that one man had nudified images of more than 80 women from his social circles. RAINN, the national nonprofit that runs the National Sexual Assault Hotline, also helped get Minnesota's bill passed. To head off industry lobbying against it, RAINN consulted with tech companies while drafting the law. As a result, the law exempts products or services that require "the technical skill of a user to nudify an image or video."

Even if Walz signs the law, tensions remain around enforcement. Notably, DeepSwap, the service used to attack the Minnesota women, is operated overseas, at times claiming bases in Hong Kong and Dublin. The difficulty states are expected to face in regulating foreign apps is one reason a federal ban would be preferable. Additionally, if Donald Trump revives an effort to deregulate the AI industry by blocking state laws like Minnesota's from requiring safeguards, the law could become toothless.