NIH's new program expands the use of AI in research

The Bridge2AI program will help develop tools, resources and rich data to improve AI research and healthcare. Photo credit: Shutterstock

Welcome to From Florida, a podcast that showcases the student success, teaching excellence and groundbreaking research taking place at the University of Florida.

The National Institutes of Health announced this fall a new program to expand the use of artificial intelligence in biomedical and behavioral research. In this episode, Azra Bihorac and Barbara Evans discuss the program and the University of Florida's involvement. Produced by Nicci Brown, Brooke Adams, James Sullivan and Emma Richards. Original music by Daniel Townsend, a doctoral candidate in music composition in the College of the Arts.

Azra Bihorac is the Director of the Intelligent Critical Care Center and Senior Associate Dean for Research in the College of Medicine. Photo Credit: Azra Bihorac.

Nicci Brown: The National Institutes of Health announced this fall a new program to expand the use of artificial intelligence in biomedical and behavioral research. The program is called Bridge2AI and NIH plans to invest $130 million over the next four years, with the hope of generating tools, resources and rich data about how best to use AI to improve research and, in so doing, healthcare. The University of Florida is one of the institutions participating in this exciting program. And we're going to hear today about what UF's role is, who's on the team and how they're approaching their assignment. Our guests today are Azra Bihorac and Barbara Evans. Azra is the Director of the Intelligent Critical Care Center and Senior Associate Dean for Research in the College of Medicine. Barbara is the Steven C. O'Connell Chair in the Levin College of Law and holds a joint appointment as a professor in the Herbert Wertheim College of Engineering. Welcome, Azra and Barbara.

Azra Bihorac: Thank you.

Nicci Brown: Let's start with the goal of the Bridge to Artificial Intelligence, or Bridge2AI program. Can you tell us a little bit more about that?

Azra Bihorac: Yes. So I think that NIH understood that with the rise of AI, artificial intelligence, one of the main obstacles investigators are facing is the lack of data that is already ready for the community. Not only does the data need to be ready, but it also needs to be sampled, or sourced, in an ethical and representative manner. And it has to be accessible for investigators. So the idea was to step in, really go to the drawing board and start creating these flagship datasets, with the idea that once the project is over, the scientific community will have open access not only to the data, but also to the tools, standards and training materials for workforce development.

Nicci Brown: And Barbara?

Barbara Evans: Looking to the future, it's going to be so important to make sure that the systems developed for use in healthcare provide equitable results that will be useful for all patients. We in the United States have a very diverse patient population, and it's just crucial to have datasets that represent everybody.

Nicci Brown: I understand we're part of a multi-center team. Who are our collaborators in this endeavor?

Azra Bihorac: Well, this is a really large group of investigators, mainly from the background of critical care medicine. It's a very unique group because, of all the projects that were awarded, this is really the one that is truly clinical. And the partners are intensivists, specialists trained in critical care medicine, who operate intensive care units throughout the United States. The partners include the University of Florida, Massachusetts General Hospital, Duke University, UPenn, Emory and UCLA, among others, as well as Children's National Hospital and NVIDIA as an industry partner.

Nicci Brown: So there's some really heavy hitters involved in this group.

Azra Bihorac: Yes.

Nicci Brown: Not to mention us, of course, but yes.

Azra Bihorac: Yes. Yes.

Nicci Brown: And in its announcement, NIH emphasized that teams with members from diverse disciplines and backgrounds would be participating in the program, as you just said. Can you tell us a little bit more about the multidisciplinary team that UF has created?

Azra Bihorac: I think UF has put forward a really strong team. The project is constructed so that it has six modules. It's a very unique program. The center of the program is the data generation module. That is the team that will go and work with hospitals to gather patient data, very rich, very high-frequency data from intensive care units, and then de-identify it and send it to a central cloud, where data from many different institutions serving very different populations will be merged together. Around this module there are other modules: a module for teaming science, how to put people together; a module for ethics in AI; a module for standards and tool development; and then workforce and research skill development.

So UF actually has leaders for all six modules. We have probably the biggest team in that regard. From our university, we have three MPIs. With me is Dr. Parisa Rashidi, who is an engineer in the college of engineering. She's also co-director of the center I direct. Dr. Yulia Strekalova, who is a professor in the PHHP college, leads the teaming module. Dr. Evans is co-lead on the ethics module for the entire grant. Dr. Betsy Shenkman, who is also in the College of Medicine, participates in the ethics module as well. Dr. Benjamin Shickel and Dr. Baslanti, who work in the College of Medicine, also work on the data generation module. And Yi Guo and Serena Guo are from PHHP and the College of Pharmacy. So as you can tell, this is a really big group.

Nicci Brown: And a diverse group as well. A lot of different viewpoints there, which I guess is the point. Can you tell us a little bit more about how everyone will come together? Will this be something where you have regular meetings, or a once-a-semester meeting with check-ins at regular points throughout the semester?

Barbara Evans: There already are meetings occurring by Zoom. Each focus area, such as the ethics module, is having meetings. And then there's a structure across all of the different Bridge2AI programs: there will be a core, so that discoveries made in one of the projects can be shared and used by all.

Nicci Brown: You mentioned earlier, too, about merging the data together. And of course we are talking about real people here at the essence of it all. So can you share a little bit about the privacy issues and how you're approaching that?

Barbara Evans: Yes, that is of course key to having people trust participating in these AI datasets, which are so important. The way it is, if you are not represented in the datasets, you may not be able to receive the benefits of this technology in five or 10 years. So it's very important for people to be included and to have diverse datasets, but people want their privacy protected. And one of the things AI in healthcare is forcing us to do is revisit the methods we've been using to protect privacy for the past 40 years and ask, "Are they really sufficient?" So one of the first things we're engaging in is a very hard look at all of the privacy laws and all of the ethics codes we have been using since the late 1970s, and asking, "Are they adequate given this new technology?" It's a challenging problem, and it's going to require not just legal and ethical expertise, but information science expertise. How can we make these data systems reliably safe for privacy?

Nicci Brown: And how are you going about doing that? Are you looking at other nations? Are you looking at other examples?

Barbara Evans: Yes, the first step is of course to cover all of the legal codes. We've looked at all of the United States federal laws, of course, but state law has a very important role here, and with multiple sites, it requires close interaction at each site. Then we look at the European Union's GDPR, the framework in place in Europe, and get ideas from all of these, as well as a lot of things that are just soft law, private codes. Fair information practices are important, and so is empirical work: asking people what they want for their privacy protection. We have that built in to give people a voice.

Nicci Brown: On that note, how and what are you doing to help patients feel safe, feel reassured? Are there protocols that you're following?

Azra Bihorac: Of course. Well, all the protocols and all the research are done with the IRB, the Institutional Review Board, which protects the privacy and confidentiality of the patients. Just to make it clear, all of this data will be completely de-identified, so you will not be able to tell who the patient is, given the way it's done. But I think what Barbara is alluding to is that at some point, for some other types of research, there might be a need for data that is not completely de-identified. That's where we want to explore our research: Where do patients want to put boundaries, and what options do they want to have if they are willing to share more than just de-identified data? Because there are pros and cons for everything. And I think this is a very important topic moving forward.

But I also want to comment a little bit on something else, in terms of the uniqueness of this program. Another interesting thing is that we engage the community not only through the research in the ethics core, but also in our workforce development core. That program is actually led by UF groups. Dr. Rashidi and I co-lead skill and workforce development, and we are planning several unique, innovative activities, such as microlearning, experiential learning and forming partnerships with scientific communities and the media.

However, we are also going to engage with the community. The Citizen Scientist Program is something we have at the University of Florida. We are going to build a special module for AI, allowing our citizen scientists to really get knowledgeable and trained in AI so they can be the voice of the community. And likewise, we will have outreach to high school teachers to help them gain that knowledge. For all of this, this data source will serve as educational material. The modules we create, the data, it will be real data. Like, go there and get your hands dirty. Play with the data, understand the real problems, see what we are facing. So that's very unique about this program.

Nicci Brown: Could you tell us a little bit more about the citizen science aspect of this program?

Azra Bihorac: The Citizen Scientist Program was developed by the CTSI, the Clinical and Translational Science Institute, at the University of Florida. It's a very unique program that helps the community engage in research, not only as participants, but also as creators. The idea is that regular citizens can get enough training in science that they can participate in grant development, research question development and development of a plan for how to execute research, rather than only being participants in the research. As of now, there is no AI training for them, so we will create a completely separate curriculum to train our citizen scientists to become AI experts, enough so they can be part of our grant development process and ask questions that are relevant for the community.

Nicci Brown: Can you tell us a little bit more about the timeline that people might start to see these things actually in the schools and in their communities?

Azra Bihorac: Well, the project started on Sept. 1. As of now, there are four data generation projects in the United States selected, and there is one coordinating center that will be led by UCLA, UCSD and Colorado. Each of these modules has a coordinating center, and we already have team building and onboarding going on, in terms of creating a presence, propagating the message, letting the community know about this and getting their input. We expect by the end of this year to have some of our educational modules already at UF, through our team on Canvas. We are also going to have the first-ever UF CME conference, AI for Health, organized at Disney in April by the College of Medicine. My office is leading that effort. For this conference, we'll have a small joint session, and the first teaming session will be at Florida.

So it's going to be really exciting announcing to the community and doing the first training session, where NVIDIA is going to help us train other investigators to become trainers, and they can go to their community and train their investigators.

Barbara is the Steven C. O'Connell Chair in the Levin College of Law and holds a joint appointment as a professor in the Herbert Wertheim College of Engineering. Photo Credit: Barbara Evans

Nicci Brown: And where do you expect attendees to come from for this conference?

Azra Bihorac: We are having investigators, speakers from all over the country who are very well known in applied AI in medicine. And we really want physicians in Florida to become proficient, and eventually very good, in AI. We think it's a language of the future that everybody will have to know. So this is primarily starting with Florida, but we hope it will be a nationwide conference.

Nicci Brown: It sounds to me, too, that you are building a framework that can be replicated in other places, that's part of the goal.

Azra Bihorac: Absolutely.

Barbara Evans: Yes.

Nicci Brown: Can you tell us a little bit more about the specific tasks for the team at UF?

Barbara Evans: Well, I'll speak to the ethical and legal team. It's a multidisciplinary team here at UF, and we have counterparts at some of the other participating sites in our project. It includes ethicists, physicians, attorneys and some empirical scientists. The goal here is to not just say what the law requires, but to ask what it takes to get the public to trust this technology. We realize that sometimes you have to go beyond just the minimum the law requires, and we hope to inform that not just with theoretical ethics, but with some empirical studies. So we have one specialist who looks at issues for Native Americans. We have others, a very diverse group, looking at specific subpopulations, trying to make sure we're aware of the concerns people have, and then developing strategies for addressing them.

Nicci Brown: I was going to follow up on that because we know that there are certain groups of our populations that have greater concerns when it comes to ethics, and very real concerns that are rooted in past issues. So do you have a diverse group of researchers that are helping you with this project as well?

Barbara Evans: Yes, we do. And we're acutely aware that it will be so important to make sure the AI tools that move into the clinical setting, for use in decisions affecting people's healthcare, have been trained on data that are truly reflective of the entire diverse patient population. There's early empirical evidence that sometimes that's not true, and that's a major challenge that must be overcome to get people comfortable having their data inform these systems. And that means looking at different concerns. People have different attitudes about whether research is a valuable thing, different willingness to participate and different worries about privacy. People may be concerned that their data will be used in a way that's harmful to them, and those concerns differ across demographic groups. So it's crucial to make sure we're not just designing ethics for one population subgroup, and to really find out what people care about.

Nicci Brown: The project really does seem to play to the university's strengths in AI, in that we are taking a holistic approach across the curriculum here at UF. Can you talk a little bit more about that?

Azra Bihorac: Yes, absolutely. So, as you pointed out, the University of Florida, with its AI Initiative, has recruited or plans to recruit 100 AI faculty. The Health Science Center and College of Medicine have recruited almost 35 of them, and we onboarded close to 20 AI tenure-track faculty at the College of Medicine alone. This is the workforce that will absolutely be a crucial part of this project, not only in sourcing the data, taking data from our patient population and creating a cloud replica of the datasets here locally at the University of Florida College of Medicine, but also in creating computational tools that can be shared not just across the entire University of Florida, but also across the nation. Some of the tools we are working on with Dr. Shenkman's team are social determinants of health tools, so our datasets also include all those important determinants of health status and access, merging all existing public datasets about social determinants into the patient datasets.

We are also building unique computational tools for how to analyze this data. In the workforce development modules, we are building on our AI curriculum development across UF. As you know, we are very strong in that. With Dr. Rashidi, we are building micro modules and experiential learning modules where physicians and computer scientists can get together, play together, develop new tools, exchange perspectives and create this unique transdisciplinary team.

And that is a new concept this grant has brought: transdisciplinary. It's not inter, it's not cross, it's trans. What does it mean? Rather than putting together two different points of view, you become a bigger point of view. I have worked with Dr. Rashidi for 10 years. She's an engineer and I'm a physician, trained also in computational science. Together, we become something different.

Barbara Evans: Like aliens.

Azra Bihorac: So I think that works really well, and I think that's the model. You will see that medicine will be transformed by technology, and we need to be ready for that. We need a different type of workforce for that. So that, I think, is one of the biggest things for the University of Florida from this grant.

Nicci Brown: Are there any other things that UF is doing to integrate AI into healthcare?

Azra Bihorac: Oh, a lot. I can speak about the College of Medicine. We have many, many initiatives in that area. Our big ambition is to be number one in biomedical AI in the U.S., and for that, we are investing heavily in great people. We are creating an AI curriculum for medical students. We are creating AI boot camps for the entire workforce, from medical students, Ph.D. students and residents to department chairs and staff. Everybody will be trained for the new future.

Nicci Brown: Could you tell us a little more about the Intelligent Critical Care Center?

Azra Bihorac: Yes, of course. So one thing we are very, very proud of is that we have almost 10 years of AI research at the University of Florida. The Intelligent Critical Care Center was formed based on that large portfolio of research and investment we have made in bringing AI tools into clinical care, implementing them to answer questions that are relevant for physicians and patients. For example, how can we do better monitoring of patients in the intensive care unit? Using things like sensors, [inaudible 00:24:04] sensing and environmental sensors, how can we monitor exposure to light and noise, and how does it affect patients in the ICU? How can we use video of patients' movement to help them not fall in the room, monitoring them autonomously? And how can we integrate all that information to track how a patient is doing in real time, so we can always tell their relatives, "Hey, the patient is doing great," or not, in a very simple way?

And the center, we call it UF IC3, pronounced "ice cube," is really there to provide a community for AI researchers who are focused on the implementation of AI in clinical care. We have a lot of grants, a lot of research, but also a lot of educational and community engagement activities.

Nicci Brown: Is that being used as a model for anywhere else?

Azra Bihorac: I think this is very unique to the University of Florida. We are probably most developed in that sense. And that was the reason why we are part of this big initiative, because of our experience and expertise in this.

Nicci Brown: What about NVIDIA's role? Can you tell us a little bit more? You touched upon it earlier. What else does that role entail?

Azra Bihorac: So with NVIDIA, we developed a really complex and comprehensive partnership for this grant. Not only will they provide a classic industry innovation collaboration, but they will also be very intimately engaged in workforce development. They will provide support for several postdoctoral fellowships in very technical areas, but also in the areas of bioethical AI and equitable AI. They will also provide their team to train the trainers. So NVIDIA's employees will work with our workforce development crew, travel to different conferences of the national societies, and basically train physicians to become AI experts.

Nicci Brown: Barbara, NIH specifically referenced its interest in ensuring the tools and data that will result from this program don't perpetuate inequities or ethical issues. And you've done a lot of research in this area. Can you tell us how that work provides a foundation for what you're doing now?

Barbara Evans: There's been an ongoing concern for several years, and a lot of people have been reconsidering whether the ethical frameworks we developed 40 years ago for clinical research will necessarily produce equitable AI in healthcare. Because we're transitioning to a world where a lot of medical knowledge will be gleaned from processing people's data, rather than from doing experiments on their bodies. People have great concerns about this, and the key thing going on is a general questioning of what is the best way to use data so that privacy is strongly protected.

And one of the trends emerging in the scholarship is that protecting privacy is becoming more and more a computer science and engineering problem. It isn't just philosophizing about ethics anymore. You need the transdisciplinary team that was discussed earlier to bring to bear this synthesis: What can we do through computer science? What can we do through law? Is there a need to train state regulators to understand the nature of the new issues they're facing? The work of the past has highlighted this need for synthesis, and the program we're doing now is bringing that transdisciplinary team together to do the work.

Nicci Brown: Which I guess also underscores the need to have AI across the curriculum, so that we are graduating people who truly understand the power here. Azra and Barbara, thank you so much for joining us today. It's been a real pleasure.

Azra Bihorac: Thank you.

Barbara Evans: Thank you.

Nicci Brown: Listeners, thank you for joining us. Our executive producer is Brooke Adams. Our technical producer is James Sullivan. And our editorial assistant is Emma Richards. I hope you tune in next week.