First Conviction Under 'Take It Down Act' Fails to Stop AI Nude Creator, Who Continued After Arrest
The first person convicted under the new 'Take It Down Act' continued to create and distribute AI-generated non-consensual intimate imagery even after his arrest, exposing a stark enforcement gap. Steven Anderegg, a 30-year-old from Wisconsin, was found guilty of using AI to create nude images of a 15-year-old girl and sharing them online. Even after he was charged and the legal process was underway, his activities reportedly persisted, challenging the assumption that arrest alone halts such digital exploitation.
Anderegg targeted a minor he knew from school, using a publicly available AI image generator to create the explicit content and then sending it to the victim via Instagram. The case, prosecuted by the U.S. Attorney’s Office for the Western District of Wisconsin, marks the first application of the 'Take It Down Act,' a federal law enacted in 2025 to combat non-consensual intimate imagery, including digitally fabricated sexual abuse material. The conviction carries a mandatory minimum sentence, but the continuation of his conduct after arrest points to real limits on monitoring and restraining defendants in the digital realm.
This precedent-setting case puts the practical efficacy of new laws against AI-facilitated sexual exploitation under intense scrutiny. It raises critical questions about what mechanisms exist to prevent re-offending during prosecution, and whether current bail and pre-trial conditions are adequate for crimes conducted entirely online. The outcome pressures law enforcement and judicial systems to adapt their strategies, as the ease of generating such material with AI tools could outpace the legal and procedural frameworks meant to contain it.