Thermal cameras used in drones and robots can be tricked by heat sources, study finds

  • UF researchers discovered three vulnerabilities in thermal cameras that could cause drones or autonomous vehicles to miss real obstacles or detect ones that aren’t there
  • The flaws can be triggered by environmental heat sources without hacking the device
  • The team also developed real-time defenses to detect and filter misleading thermal signals

As thermal cameras become commonplace on autonomous drones and vehicles, a University of Florida engineering professor is working to make sure they can’t be maliciously tricked into “seeing” things that aren’t there.

Work by UF's Sara Rampazzi, Ph.D., and her research group reveals that thermal-based perception systems may be far less reliable and secure than previously assumed, especially for safety‑critical tasks like obstacle avoidance in autonomous robots and aerial drones.  

Thermal cameras “see” in conditions where normal cameras fail (night, fog, smoke, rain) by detecting heat differences rather than visible light. These sensors help machines identify people, animals and obstacles when visibility is poor.

Rampazzi is an assistant professor in the Department of Computer & Information Science & Engineering, known as CISE. The work was presented at the 2026 Network and Distributed System Security Symposium by her Ph.D. student Sri Hrushikesh Varma Bhupathiraju. 

The study identifies three previously unknown vulnerabilities in the way thermal cameras process images, specifically in image equalization, sensor calibration and lens behavior. These vulnerabilities can be triggered by heat sources naturally present or maliciously placed in the environment, altering the perceived relative temperature or generating misleading data that can undermine correct obstacle avoidance. 

By exploiting the camera’s specialized optics and proprietary signal-processing algorithms, an attacker can hide real obstacles or people from a robot’s or drone’s perception system, or even create phantom obstacles that do not exist, essentially fooling the camera.

No hacking or physical access to the system is required. 

These attacks exploit vulnerabilities baked into the camera’s sensors and the way they construct images, rather than manipulating the camera’s output.
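The equalization flaw can be illustrated with a toy model. Many thermal cameras map raw temperature readings onto an 8-bit image using a form of automatic gain control, so a single extreme heat source in the frame can compress the contrast of everything else. The sketch below is illustrative only, with made-up scene values and a simplified min-max mapping standing in for the proprietary onboard algorithms; it shows a warm pedestrian nearly vanishing from the normalized image once a very hot object enters the scene:

```python
import numpy as np

def normalize_to_8bit(frame):
    """Simplified automatic gain control: map the frame's raw
    temperature range onto 0-255, as many thermal cameras do."""
    lo, hi = frame.min(), frame.max()
    return ((frame - lo) / (hi - lo) * 255).astype(np.uint8)

rng = np.random.default_rng(0)

# Scene: ~20 C background with a ~35 C pedestrian region.
frame = rng.normal(20.0, 0.5, size=(64, 64))
frame[20:40, 28:36] = 35.0

img = normalize_to_8bit(frame)
pedestrian_contrast = int(img[30, 30]) - int(img[5, 5])

# Attacker places a ~300 C heat source (e.g., a burner) at the frame edge.
attacked = frame.copy()
attacked[0:4, 0:4] = 300.0
img_atk = normalize_to_8bit(attacked)
attacked_contrast = int(img_atk[30, 30]) - int(img_atk[5, 5])

print(pedestrian_contrast, attacked_contrast)
```

In the first frame the pedestrian stands out sharply from the background; after the hot source appears, the same pedestrian occupies only a handful of gray levels, the kind of contrast collapse a downstream obstacle detector can miss.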

“Everything that we discovered is internal to the sensor, so the data are pretty much already manipulated when they are used by the drone or the car,” Rampazzi said. “We evaluate state-of-the-art algorithms and software running inside the cameras that are deployed by the manufacturers, and we’re basically saying that they need to be safer.”

To counter the risks, the team developed defensive signal-processing techniques that detect and suppress malicious or misleading thermal signatures, heat patterns that make the camera “see” something that isn’t there.

These strategies are lightweight enough to work in real time. By accessing and modifying the device’s internal algorithms, Rampazzi’s tools detect and exclude sensor readings caused by suspicious heat sources. The system then tests the effectiveness of the altered algorithm.
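A defense along these lines can also be sketched. One simple filtering idea, purely illustrative since the team’s actual techniques are not detailed here, is to compute the gain-control statistics only from statistically plausible pixels, so a small, implausibly hot region cannot dominate the temperature-to-pixel mapping:

```python
import numpy as np

def robust_normalize(frame, z_thresh=6.0):
    """Map a raw thermal frame to 8-bit, deriving the scaling range
    only from inlier pixels (median/MAD outlier test) and clipping
    everything else.  Illustrative sketch, not the deployed defense."""
    med = np.median(frame)
    mad = np.median(np.abs(frame - med)) + 1e-9  # avoid div-by-zero
    z = np.abs(frame - med) / (1.4826 * mad)     # robust z-score
    inliers = frame[z < z_thresh]
    lo, hi = inliers.min(), inliers.max()
    scaled = np.clip((frame - lo) / (hi - lo), 0.0, 1.0) * 255
    return scaled.astype(np.uint8)

rng = np.random.default_rng(0)
frame = rng.normal(20.0, 0.5, size=(64, 64))   # ~20 C background
frame[20:40, 28:36] = 35.0                     # warm pedestrian
frame[0:4, 0:4] = 300.0                        # attacker's heat source

img = robust_normalize(frame)
print(int(img[30, 30]) - int(img[5, 5]))       # pedestrian vs. background
```

Compared with naive min-max scaling, where the 300 C source would squeeze the pedestrian into a few gray levels, the robust statistics keep the pedestrian clearly separated from the background and merely saturate the attack region; a real defense would additionally flag the excluded pixels as suspicious.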

Characterizing these effects required running extensive experiments on real-world thermal datasets, far beyond what conventional hardware could support, so the researchers turned to UF’s supercomputer, HiPerGator.

“HiPerGator’s parallel processing capabilities enabled us to efficiently run large batches of experiments, simulate diverse attack scenarios and analyze model behavior at scale,” Rampazzi said. “HiPerGator was crucial for understanding and quantifying the vulnerabilities and validating their significance in practical settings.” 

For Rampazzi, the work is about more than just finding faults in systems and products. The team is diligent about disclosing their findings to the relevant manufacturers in hopes they will modify their proprietary algorithms to make them safer.  

Whether they do or not, Rampazzi isn’t expecting to hear from the companies, which are often secretive about their proprietary technology.

The work was co-authored by Bhupathiraju; Qi Alfred Chen and Shaoyuan Xie of the University of California, Irvine; Michael Clifford of Toyota InfoTech Labs; and Takeshi Sugawara of the University of Electro-Communications.

The work is supported in part by grants from the National Science Foundation and the U.S. Department of Transportation, and by research funds from Toyota InfoTech Labs.