Computer engineers: Virtual patients also experience racial bias

June 26, 2008


GAINESVILLE, Fla. — For black people, it doesn’t matter whether their color shows up in pigments or pixels. Doctors may be less likely to heed their complaints either way.

So suggests a new study that used virtual patients — computer-generated people able to carry on a limited conversation with human counterparts — to test how medical students respond to white- and dark-skinned patients. The study found that white third-year medical students were less empathetic toward dark-skinned than toward light-skinned virtual patients during brief one-on-one interviews, suggesting racial bias extends from real people to their virtual representations.

“You are seeing a transfer of bias come through the screen,” said Benjamin Lok, an assistant professor of computer and information science and engineering at the University of Florida.

Lok is one of five authors of a paper on the study set to appear this fall in the proceedings of the Intelligent Virtual Agents conference.

He also is the lead investigator of a four-year-old research project, funded in part by the , aimed at using racially diverse virtual patients as a new tool to train medical students to identify and avoid racial bias — a kind of human-relations equivalent to the equipment used to train pilots.

“We’re hoping that in the future, we will be able to automatically detect bias and, then and there, help medical students out,” Lok said. “That’s really our goal: An interpersonal simulator, just like a flight simulator, to help people get better at this skill of interacting with people who come from different racial and ethnic backgrounds.”

Deeply embedded racial bias against minorities has been a hot issue in medicine since at least 2002, when “Unequal Treatment,” a national study of the phenomenon commissioned by Congress, was released.

The study found that racial and ethnic minorities consistently receive lower quality health care than their white counterparts and attributed the problem in part to bias or prejudice — bias all the more pernicious because it may be unconscious. Such unequal treatment is thought to be one reason that minorities, especially blacks, consistently suffer higher rates of many serious diseases.

Lok said medical schools use actors or actresses in role-playing scenarios for a wide range of medical training, including helping future doctors combat racial bias. While the actors can be highly effective, they also come with logistical challenges, he said. For one, actors may not be available, especially in states or cities with low minority populations. They are expensive. And they cannot offer all the students exactly the same experience, he said.

Virtual patients pose none of these problems, Lok said. But before educators can embrace them, they must know whether practitioners respond to virtual people as they do to real people.

The researchers used virtual patient technology developed in Lok’s UF laboratory. The technology taps voice recognition software to allow people to speak normally with the virtual patient, who appears on a projection screen. The human subject wears a hat studded with special tape that reflects infrared light. Detectors pick up the reflections, informing the computer of the motions of the subject’s head. This allows the virtual patient to move his or her head in response and maintain eye contact during conversation.
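The article does not include the lab’s code, but the eye-contact step can be pictured with a short sketch: each frame, given the tracked position of the user’s head, compute the yaw and pitch angles that aim the virtual patient’s head at the user. In the Python sketch below, the function name, coordinate convention and positions are all illustrative assumptions, not the UF system’s actual implementation.

    import math

    def look_at_angles(avatar_head, user_head):
        """Yaw/pitch (degrees) that aim the avatar's head at the user's
        tracked head position. Points are (x, y, z) in meters: x to the
        avatar's right, y up, z out of the screen toward the user."""
        dx = user_head[0] - avatar_head[0]
        dy = user_head[1] - avatar_head[1]
        dz = user_head[2] - avatar_head[2]
        yaw = math.degrees(math.atan2(dx, dz))                    # turn left/right
        pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # tilt up/down
        return yaw, pitch

    # Each frame, the infrared detectors report the reflective markers'
    # (and thus the user's head's) position; the avatar's head joint is
    # then steered toward the resulting angles.
    yaw, pitch = look_at_angles(avatar_head=(0.0, 1.6, 0.0),
                                user_head=(0.4, 1.7, 2.0))
    print(f"yaw={yaw:.1f} deg, pitch={pitch:.1f} deg")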

The researchers divided a group of almost two dozen third-year medical students at a medical school in the Southeast into two groups: half the students interviewed a light-skinned virtual woman, the others a dark-skinned one. But for the skin tone, the virtual women had the same voice, animation and appearance. Medical faculty and other observers watched recorded videos of the students in the interviews, blinded to the skin color of the virtual patient each student had seen. They then rated the students’ empathy toward the woman’s medical complaints using a standardized scale.

The observers rated the students interviewing the dark-skinned woman as consistently less empathetic. The ratings also correlated with the students’ scores on standard psychological tests, which showed they held an unconscious bias against minorities.
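The article does not name the statistic the researchers used, but a Pearson correlation is the standard way to quantify such a relationship between two per-student scores. A minimal sketch, with explicitly made-up numbers standing in for the study’s data:

    import math
    import statistics

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical values for illustration only; not the study's data.
    bias_scores     = [0.1, 0.3, 0.5, 0.6, 0.8]  # per-student implicit-bias scores
    empathy_ratings = [4.6, 4.1, 3.9, 3.5, 3.2]  # observers' empathy ratings (1-5)
    print(f"r = {pearson_r(bias_scores, empathy_ratings):+.2f}")  # negative: more bias, less empathy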

The technology is so new, Lok said, that it “barely works”; the virtual patient may, for example, misunderstand the human user and respond with a non sequitur. But Lok said he is confident it will improve rapidly.
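The article does not describe the dialogue engine, but many research virtual patients of that era matched recognized speech against a script of trigger phrases, which is one way such non sequiturs arise: when the speech recognizer garbles a question, the match lands on the wrong entry or on a generic fallback. The matching rule and script contents below are invented for illustration, not taken from the UF system.

    def respond(question, script):
        """Return the scripted answer whose trigger words best overlap the
        recognized question; fall back to a stock line when nothing matches."""
        words = set(question.lower().split())
        best_answer, best_overlap = None, 0
        for triggers, answer in script:
            overlap = len(words & triggers)
            if overlap > best_overlap:
                best_answer, best_overlap = answer, overlap
        return best_answer or "I'm sorry, could you say that again?"

    script = [
        ({"pain", "hurt", "hurts", "where"}, "It's a sharp pain in my lower right side."),
        ({"long", "when", "started"}, "It started about two days ago."),
    ]
    print(respond("where does it hurt", script))  # clean recognition: sensible answer
    print(respond("wear does it heard", script))  # garbled recognition: fallback line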

Brent Rossen, another author of the paper and a UF graduate student, said most of the medical students in the trial were keen to use the technology because they recognized its intrinsic value.

“These are people who want to treat others equally, but racial bias is a very subconscious thing, and in the end it really has to be trained out,” Rossen said. “One way to do that is through repeated exposures to the subject of the bias, which is where this research comes in.”

Lok, Rossen and Kyle Johnsen, a graduate student at the time of the study who has since earned his Ph.D., collaborated on the study. The two other authors were Dr. Adeline Deladisma, a medical resident, and Dr. Scott Lind, chief of surgical oncology, both of the Medical College of Georgia.