Award-winning UF team targets nonconsensual nude photo apps to protect privacy

  • Researchers from UF, Georgetown University and the University of Washington examined 20 websites that use AI to generate nude images from a photo in under 30 seconds for only a few cents. 
  • The concern is AI-based nudification applications, which can turn images of clothed people — easily and convincingly — into naked pictures. 
  • The research earned the team the NYU Center for Cybersecurity Awareness Week’s Social Impact award. 

A University of Florida research team with an eye on privacy and real-world impacts is working with government leaders and other universities to combat AI-based platforms that turn personal images into nude photos without consent. 

 The research recently earned international recognition with the NYU Center for Cybersecurity Awareness Week’s Social Impact award. The work is intended to inform discussions around state and international legislation and to engage industry partners in combating the serious privacy invasions posed by generative AI. 

Their paper, “Analyzing the AI Nudification Application Ecosystem,” was one of 11 finalists selected from 189 submissions to the Applied Research Competition. The research examined 20 popular websites that use artificial intelligence to generate nude or sexualized images from a single photo in under 30 seconds for only a few cents. The paper finds that most sites explicitly target women. 

The paper was authored by UF Graduate Research Assistant Cassidy Gibson, computer science Ph.D. candidate Daniel Olszewski, their UF faculty advisors and researchers from Georgetown University and the University of Washington.  

AI-based nudification applications can turn images of clothed people into nude images without consent. The team noted there has been no prior systematic study of this ecosystem, leaving victims vulnerable to abuse. 

“This research is impactful because AI has become an everyday tool,” explained Olszewski. “Few people stop to ask what harm these technologies can cause. Our research resonates with people because it demonstrates how personal and widespread the risks can be.” 

Among the conclusions, researchers noted the harmful effects of these websites “are not merely emotional, but pervasive throughout victim-survivors’ lives.” Companies, they said, may refuse to interview or hire people because applicants’ search results include nude images or deep-fake sex videos. Further, creators of nude images may attempt to extort or harass the victims. 

The research identified the stakeholders in the nudification pipeline: the creators of the models that power the nudification websites, the websites themselves, the payment processors that serve those websites, single-sign-on providers and the marketing channels that promote the sites. 

“Future work,” they wrote, “should consider what steps for responsible use should be taken by open-source model providers, including but not limited to tracking downstream uses of their models for abuse and governing access to model features, as well as technical interventions for identifying ‘responsible’ model providers whose features underlie downstream abusive websites or copycat models.” 

The team contends this work will empower data-informed conversations between the scientific community and policymakers on ways to better protect rights and minimize harm. 

UF researchers collaborated with U.S. and international policymakers, including Rep. Kat Cammack of Florida’s third congressional district and Australia’s eSafety Commissioner. The research helped inform new policies and defenses against technology-enabled abuse, established best practices for detecting nude content and offered guidance on protecting image owners’ rights. 

“Understanding the potential harms related to the abuse of AI is critical as we move forward. Work such as this not only helps technologists defend against such abuses but also creates opportunities for industry and government to work together to solve these problems as well,” said Professor Patrick Traynor, Ph.D., interim chair of the Department of Computer & Information Science & Engineering, known as CISE.  

Olszewski credited his advisors, Traynor and Kevin Butler, Ph.D., along with the University of Washington’s Tadayoshi Kohno, Ph.D., and Georgetown’s Elissa Redmiles, Ph.D., with showing him how to approach research to make a real-world impact. 

“I am immensely grateful I have had the opportunity to work with world-class researchers on this project,” Olszewski said. “I would love to continue to tackle problems that make the world a better place for the everyday user.”  

The Applied Research Competition at NYU’s Center for Cybersecurity Awareness Week is a premier contest for security research. The award highlights UF’s leadership in cybersecurity and its commitment to addressing the societal challenges of emerging technologies. 

“We were competing with over 1,600 papers that were published at top security venues this year,” said Olszewski. “It was an honor to receive the recognition. It reminds us that research has a real impact on society and can shape safer, more responsible use of emerging technologies.”