Researchers warn of rise in AI-created non-consensual explicit images
UF cybersecurity experts lead study into the risks, ease and lack of regulation around
AI-generated “nudification” tools
A team of researchers, including Kevin Butler, Ph.D., a professor in the Department of Computer and Information Science and Engineering at the University of Florida, is sounding the alarm on a disturbing trend in artificial intelligence: the rapid rise of AI-generated sexually explicit images created without the subject’s consent.
With funding from the National Science Foundation, Butler and colleagues from UF, Georgetown University and the University of Washington investigated a growing class of tools that allow users to generate realistic nude images from uploaded photos — tools that require little skill, cost virtually nothing and are largely unregulated.
“Anybody can do this,” said Butler, director of the Florida Institute for Cybersecurity Research. “It’s done on the web, often anonymously, and there’s no meaningful enforcement of age or consent.”
The team has coined the term SNEACI, short for synthetic non-consensual explicit AI-created imagery, to define this new category of abuse. The acronym, pronounced “sneaky,” highlights the secretive and deceptive nature of the practice.
“SNEACI really typifies the fact that a lot of these are made without the knowledge of the potential victim and often in very sneaky ways,” said Patrick Traynor, a professor and associate chair of research in UF’s Department of Computer and Information Science and Engineering and co-author of the paper.

In their study, which will be presented at the upcoming USENIX Security Symposium this summer, the researchers conducted a systematic analysis of 20 AI “nudification” websites. These platforms allow users to upload an image, manipulate clothing, body shape and pose, and generate a sexually explicit photo — usually in seconds.
Unlike traditional tools like Photoshop, these AI services remove nearly all barriers to entry, Butler said.
“Photoshop requires skill, time and money,” he said. “These AI application websites are fast, cheap — from free to as little as six cents per image — and don’t require any expertise.”
According to the team’s review, women are disproportionately targeted, but the technology can be used on anyone, including children. While the researchers did not test tools with images of minors due to legal and ethical constraints, they found “no technical safeguards preventing someone from doing so.”
Only seven of the 20 sites they examined included terms of service that require image subjects to be over 18, and even fewer enforced any kind of user age verification.
“Even when sites asked users to confirm they were over 18, there was no real validation,” Butler said. “It’s an unregulated environment.”
The platforms operate with little transparency, using cryptocurrency for payments and hosting on mainstream cloud providers. Seven of the sites studied used Amazon Web Services, and 12 were supported by Cloudflare — legitimate services that inadvertently support these operations.
“There’s a misconception that this kind of content lives on the dark web,” Butler said. “In reality, many of these tools are hosted on reputable platforms.”
Butler’s team also found little to no information about how the sites store or use the generated images.
“We couldn’t find out what the generators are doing with the images once they’re created,” he said. “It doesn’t appear that any of this information is deleted.”
High-profile cases have already brought attention to the issue. Celebrities such as Taylor Swift and Melania Trump have reportedly been victims of AI-generated non-consensual explicit images. Earlier this year, Melania Trump voiced support for the Take It Down Act, which targets these types of abuses and was signed into law this week by President Donald Trump.
But the impact extends beyond the famous. Butler cited a case in South Florida where a city councilwoman stepped down after fake explicit images of her — created using AI — were circulated online.
“These images aren’t just created for amusement,” Butler said. “They’re used to embarrass, humiliate and even extort victims. The mental health toll can be devastating.”
The researchers emphasized that the technology enabling these abuses was originally developed for beneficial purposes — such as enhancing computer vision or supporting academic research — and is often shared openly in the AI community.
“There’s an emerging conversation in the machine learning community about whether some of these tools should be restricted,” Butler said. “We need to rethink how open-source technologies are shared and used.”
Butler said the published paper — authored by student Cassidy Gibson, who was advised by Butler and Traynor and received her doctorate this month — is just the first step in their deeper investigation into the world of AI-powered nudification tools and an extension of the work they are doing at the Center for Privacy and Security for Marginalized Populations, or PRISM, an NSF-funded center housed at the UF Herbert Wertheim College of Engineering.
Butler and Gibson recently met with U.S. Congresswoman Kat Cammack for a roundtable discussion on the growing spread of non-consensual imagery online. In a newsletter to constituents, Cammack, who serves on the House Energy and Commerce Committee, called the issue a major priority. She emphasized the need to understand how these images are created and their impact on the mental health of children, teens and adults, calling it “paramount to putting an end to this dangerous trend.”
"As lawmakers take a closer look at these technologies, we want to give them technical insights that can help shape smarter regulation and push for more accountability from those involved," said Butler. “Our goal is to use our skills as cybersecurity researchers to address real-world problems and help people.”