UF/IFAS researcher helps test new way to probe remote ecosystems with satellite imagery

October 25, 2012

GAINESVILLE, Fla. — For scientists, making field observations of organisms and ecosystems can be a daunting challenge.

Travel to remote locations is costly and difficult. Observation methods are limited and must be carefully devised to capture accurate, relevant data.

Satellite imagery is one alternative for assessing wild places, and it has some advantages over boots-on-the-ground observations, said Matteo Convertino, a research scientist with the University of Florida’s Institute of Food and Agricultural Sciences.

“There’s currently not a lot of satellite imagery used in ecological studies,” said Convertino, with UF’s agricultural and biological engineering department. “Part of the reason is, there’s a strong need to improve mathematical formulas for analyzing the data, and that’s what we’re doing here.”

In the current issue of the journal PLoS ONE, Convertino and colleagues outline a new method for extracting information from digital images quickly and efficiently. The system identifies the components of photos based on their appearance, and pinpoints similar features or objects.

The research team hit accuracy levels as high as 98 percent with analyses of satellite photos showing Everglades wilderness. The team used this method to estimate the number of different plant species in the photos. Those results were compared with field observations.

“This method provides three benefits: improved accuracy, higher speed and reduced costs,” said Convertino, who is also a contractor at the Risk and Decision Science Team of the U.S. Army Corps of Engineers and part of the Florida Climate Institute.

Digital photos taken far above Earth can provide information that covers long periods of time and large tracts of land, with great clarity, he said. Satellites can also provide more thorough coverage of an area, compared with on-the-ground observation.

Add to that the fact that there are decades of satellite images available through digital archiving, and there’s a treasure trove of data for ecologists, biologists, foresters and others.

To unlock it, the research team has harnessed a probability formula called Kullback-Leibler divergence.
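Kullback-Leibler divergence measures how much one probability distribution differs from a reference distribution. As an illustrative sketch (not the team's actual software), it can be computed over normalized pixel-intensity histograms like this:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    probability distributions, e.g. normalized pixel-intensity
    histograms from two image regions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over bins where p > 0; by convention 0 * log(0) = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two toy intensity histograms (already normalized to sum to 1):
p = [0.1, 0.4, 0.5]
q = [0.2, 0.3, 0.5]
d = kl_divergence(p, q)  # 0 when p == q; larger as they diverge
```

The divergence is zero only when the two distributions match, which is what makes it useful as a "distance-like" score for deciding whether two image regions contain the same kind of cover.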

Computer software developed by the team can gauge the intensity of the light reflected off objects in a photo. Then the software notes the frequencies of the most prevalent light waves. Finally, the software classifies the objects into two or more groups, based on the amount and type of light they reflect.
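The three steps above (measure intensities, summarize them, then group by similarity) can be sketched in miniature. This is a hypothetical illustration, not the team's code: the region names, reference histograms, and intensity ranges are invented for the example, and the classification simply assigns each region to whichever reference class has the smallest KL divergence from its intensity histogram.

```python
import numpy as np

def intensity_histogram(pixels, bins=16):
    """Step 1-2: summarize a region's reflected-light intensities
    as a normalized histogram (intensities assumed scaled to [0, 1])."""
    hist, _ = np.histogram(pixels, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def kl(p, q, eps=1e-10):
    """KL divergence with small-value clipping to avoid log(0)."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def classify(region_pixels, references):
    """Step 3: assign the region to the reference class whose
    histogram is closest in KL divergence."""
    h = intensity_histogram(region_pixels)
    return min(references, key=lambda name: kl(h, references[name]))

# Invented reference classes with distinct intensity ranges:
rng = np.random.default_rng(0)
refs = {
    "vegetation": intensity_histogram(rng.uniform(0.2, 0.5, 1000)),
    "open_water": intensity_histogram(rng.uniform(0.6, 0.9, 1000)),
}
# A new region whose intensities overlap the "vegetation" range:
label = classify(rng.uniform(0.2, 0.5, 500), refs)
```

A nearest-reference rule like this is only one way to turn a divergence score into class labels; the published method is more sophisticated, but the same intensity-histogram comparison sits at its core.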

The system could not tell researchers which plant species they were looking at, but it did reveal how many plant species were in an image, where they were, and how numerous they were. It also provided information about landscape features.

The study involved satellite images showing a part of the Florida Everglades known as Water Conservation Area 1. There, standard on-the-ground observations have been sparsely recorded. The Everglades and other wetlands need close monitoring because they are sensitive to rainfall, water management and other external factors that affect overall ecosystem health.

Ultimately, the analytical method may prove useful for other image-retrieval challenges, Convertino said. It has already been used to classify stem cells found in photos taken with microscopes, and can be used to analyze surface water and soil shown in satellite images.

“More work is needed,” he said. “But the first results are surprisingly definite and encouraging.”

The research team included Convertino, Igor Linkov of the U.S. Army Corps of Engineers and Carnegie Mellon University, and Rami Mangoubi, Nathan Lowry and Mukund Desai of Charles Stark Draper Laboratory in Cambridge, Mass.