Wisconsin man arrested for allegedly creating AI-generated child sexual abuse material

A Wisconsin software engineer was arrested on Monday for allegedly creating and distributing thousands of AI-generated images of child sexual abuse material (CSAM). 

Court documents describe Steven Anderegg as “extremely technologically savvy,” with a background in computer science and “decades of experience in software engineering.” Anderegg, 42, is accused of sending AI-generated images of naked minors to a 15-year-old boy via Instagram DM. Anderegg was put on law enforcement’s radar after the National Center for Missing & Exploited Children flagged the messages, which he allegedly sent in October 2023.

According to information law enforcement obtained from Instagram, Anderegg posted an Instagram story in 2023 “consisting of a realistic GenAI image of minors wearing BDSM-themed leather clothes” and encouraged others to “check out” what they were missing on Telegram. In private messages with other Instagram users, Anderegg allegedly “discussed his desire to have sex with prepubescent boys” and told one Instagram user that he had “tons” of other AI-generated CSAM images on his Telegram.

Anderegg allegedly began sending these images to another Instagram user after learning he was only 15 years old. “When this minor made his age known, the defendant did not rebuff him or inquire further. Instead, he wasted no time in describing to this minor how he creates sexually explicit GenAI images and sent the child custom-tailored content,” charging documents claim.

When law enforcement searched Anderegg’s computer, they found over 13,000 images “with hundreds — if not thousands — of these images depicting nude or semi-clothed prepubescent minors,” according to prosecutors. Charging documents say Anderegg made the images on the text-to-image model Stable Diffusion, a product created by Stability AI, and used “extremely specific and explicit prompts to create these images.” Anderegg also allegedly used “negative prompts” to avoid creating images depicting adults and used third-party Stable Diffusion add-ons that “specialized in producing genitalia.”

Last month, several major tech companies including Google, Meta, OpenAI, Microsoft, and Amazon said they’d review their AI training data for CSAM. The companies committed to a new set of principles that include “stress-testing” models to ensure they aren’t creating CSAM. Stability AI also signed on to the principles. 

According to prosecutors, this is not the first time Anderegg has come to law enforcement’s attention over alleged possession of CSAM. In 2020, someone using the internet connection at Anderegg’s Wisconsin home attempted to download multiple files of known CSAM via a peer-to-peer network, prosecutors claim. Law enforcement searched his home that year, and Anderegg admitted to running a peer-to-peer network on his computer and frequently resetting his modem, but he was not charged.

In a brief supporting Anderegg’s pretrial detention, the government noted that he’s worked as a software engineer for more than 20 years, and his CV includes a recent job at a startup, where he used his “excellent technical understanding in formulating AI models.”

If convicted, Anderegg faces up to 70 years in prison, though prosecutors say the “recommended sentencing range may be as high as life imprisonment.” 
