For The Blind, A Computer Navigation System With Its Own "Map"

September 3, 2002

GAINESVILLE, Fla. — University of Florida researchers have combined speech-recognition software, wearable computers, satellite positioning technology and other emerging technologies into a 21st-century navigational aid for the blind.

Composed of a waist-worn computer and headset connected remotely to a map database server, the prototype delivers and responds to instructions verbally. It keeps track of the user’s location while giving directions to a destination – and may even warn the user against veering off a sidewalk or stepping into a road.

“When we started this project, we were looking for a compelling mobile application of wearable computing that would be not just for fun from a research perspective, but also useful to society,” said Steve Moore, who designed the system for his master’s degree in computer science and engineering.

Computer engineering professor Sumi Helal and civil and coastal engineering doctoral student Balaji Ramachandran also helped with the project, which the researchers named DRISHTI, after the Sanskrit word for vision. While still in its early stages, the system is a promising attempt to address the difficult problem of helping the blind get around in a world designed for sighted people. Nationwide, about 1.1 million people suffer from blindness.

Speaking into the microphone, the user tells the system his or her location and destination – for example, from the UF student union to the computer science building. The system responds with directions based on the user’s starting point, telling the user, for example, to turn 15 degrees and walk along a sidewalk for 230 feet. If the user veers off the sidewalk or travels too far, the system provides a verbal correction. It also may warn against impediments or hazards, such as picnic tables or streets.
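For a sense of the geometry behind such an instruction, here is a minimal sketch – not the researchers’ code – of how a turn angle and walking distance could be derived from the user’s GPS fix, compass heading and the next waypoint’s coordinates. All names and signatures here are hypothetical.

```python
import math

# Illustrative only, not the DRISHTI code: compute the turn angle and walking
# distance behind a spoken instruction such as "turn 15 degrees and walk 230 feet",
# given a GPS fix, a compass heading and the next waypoint.

EARTH_RADIUS_FT = 20_902_000  # mean Earth radius, roughly, in feet


def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two lat/lon points, in feet."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_FT * math.asin(math.sqrt(a))


def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from the first point toward the second, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    return math.degrees(math.atan2(y, x)) % 360


def spoken_instruction(user_lat, user_lon, user_heading_deg, wp_lat, wp_lon):
    """Build a sentence of the same shape as the system's spoken directions."""
    dist = distance_ft(user_lat, user_lon, wp_lat, wp_lon)
    # Signed turn in the range [-180, 180): positive means turn to the right.
    turn = (bearing_deg(user_lat, user_lon, wp_lat, wp_lon) - user_heading_deg + 540) % 360 - 180
    side = "right" if turn >= 0 else "left"
    return f"Turn {abs(round(turn))} degrees to the {side} and walk {round(dist)} feet."
```

At campus scale, where legs run a few hundred feet, the great-circle formulas and a simpler flat-earth approximation agree closely; either would serve for directions of this kind.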

To achieve such contextual real-time directions, the system relies on numerous hardware and software components, both mobile and fixed.

In addition to the headset, a blind person using the system carries a commercially available personal computer about half the size of an egg carton. Workers building jet aircraft use such systems to access wiring schematics, as do workers conducting large-scale inventories. In this case, however, the wearable computer contains voice-recognition and other software. The user also carries a cell phone for wireless communication, an antenna and a backpack containing a Global Positioning System receiver, batteries and other equipment.

Housed in a lab in UF’s computer engineering building, the database server holds a Geographic Information System, or GIS, database of the UF campus. Far too immense to fit onto the wearable computer, the database contains the latitudes and longitudes of thousands of points of reference on campus, from sidewalks to buildings to streets. It also can be easily updated to include construction activities or other temporary landscape changes on campus.

The system matches the user’s location – obtained using Global Positioning System technology – with the information provided by the database server in real time. The voice-recognition software on the wearable computer provides the user interface to the data.
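As an illustration of that matching step, here is a minimal sketch in which a short list of sidewalk vertices stands in for the campus GIS database; in the prototype itself the data lives on the remote server, but an off-course test could look much like this. The function names, the flat-earth projection and the 10-foot tolerance are assumptions for the example, not details of the actual system.

```python
import math

# Minimal sketch, assuming an in-memory stand-in for the campus GIS server:
# each sidewalk is a polyline of (latitude, longitude) vertices, and the user
# counts as having veered once the GPS fix strays more than a tolerance from
# the nearest segment. Names and the 10-foot tolerance are hypothetical.

FT_PER_DEG_LAT = 364_000  # rough feet per degree of latitude


def to_local_ft(lat, lon, origin_lat, origin_lon):
    """Project a lat/lon point to local x/y feet around an origin (flat-earth approximation)."""
    x = (lon - origin_lon) * FT_PER_DEG_LAT * math.cos(math.radians(origin_lat))
    y = (lat - origin_lat) * FT_PER_DEG_LAT
    return x, y


def dist_to_segment(p, a, b):
    """Distance from point p to the segment a-b, all given in local feet."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))


def veer_warning(fix, sidewalk, tolerance_ft=10.0):
    """Return a spoken correction if the GPS fix is too far from the sidewalk polyline."""
    origin_lat, origin_lon = sidewalk[0]
    p = to_local_ft(fix[0], fix[1], origin_lat, origin_lon)
    verts = [to_local_ft(lat, lon, origin_lat, origin_lon) for lat, lon in sidewalk]
    nearest = min(dist_to_segment(p, verts[i], verts[i + 1]) for i in range(len(verts) - 1))
    if nearest > tolerance_ft:
        return f"You are about {round(nearest)} feet off the sidewalk. Move back toward it."
    return None  # still on course
```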

A demonstration revealed the promise of the system as well as some of its challenges. On one hand, the system provided specific directions as requested, and communicating with the computer by voice was surprisingly easy despite its limited vocabulary. On the other hand, its style – more command-and-response than conversational – took some practice. Additionally, because the GIS database covers only the UF campus, the current system cannot be used outside the university. In the future, Moore says, similar GIS databases could be made accessible for use in many other locations.

“What you would like is to be able to offer this as a service,” Moore says. “You go to a city, and say, ‘OK, I need to be navigated,’ and it taps into the GIS database for that city.”

Moore embarked on the project in part because his father, a UF math professor, is blind. Theral Moore, who tested the system, said he found it very helpful as an orientation tool.

“If you’re out there all alone with only a cane, you can make a little turn here and there, and first thing you know you have no idea which way is west,” he said. “But if you get directions from the voice telling you what direction you’re going, how far off course you are, and/or if you are leaving the middle of the sidewalk, you just feel much more comfortable.”

Pat Maurer, director of community relations for the National Federation of the Blind, said no similarly comprehensive electronic navigation systems are currently on the market, and that as such systems become more practical they could be helpful to some blind people. UF’s Moore hopes to develop the system into a commercial product within the next two years.