Minnesota Now with Nina Moini

Experts discuss ethics as funding cuts and A.I. reshape scientific research

A Green Line train passed research labs at the U.
Research labs at the University of Minnesota on May 22, 2015.
Riham Feshir | MPR News

Audio transcript

NINA MOINI: Scientific research in the United States is going through big changes right now. Over the past year, thousands of federal science grants have been cut or reshaped by the Trump administration. At the same time, artificial intelligence is rapidly transforming how studies are designed, analyzed, and written.

So what happens to the ethical guardrails meant to protect patients and public trust when both the funding and the technology behind scientific research are shifting at the same time? Tomorrow, national experts are convening to answer that question at a research ethics conference hosted by the University of Minnesota. Joining me now is a conference organizer, Susan Wolf. She's a Regents Professor of Law and Medicine at the University of Minnesota. Professor Wolf, thank you so much again for being with us.

SUSAN WOLF: My pleasure.

NINA MOINI: For those who may not be familiar, why is it so important to focus on research ethics, such that there would be an annual conference about it? How does this impact your work?

SUSAN WOLF: You really can't do research responsibly without paying very close attention to how you're designing the research and how you're treating the human beings who are generous enough to participate in it. And even if it's research with animals, are you treating them responsibly?

Modern research ethics, Nina, grows out of horrors committed in the name of science in the 20th century, dating from the concentration camps in World War II and then the Tuskegee syphilis trials. Sadly, there's a long list of terrible atrocities committed in the name of science. And that's really the root of modern research ethics. It's a kind of never again, building a system of guardrails to prevent those horrors from happening.

NINA MOINI: That is so true. I think sometimes we're thinking about research tables and maybe spreadsheets when we're thinking about what research is. But really, at the heart of it, a lot of times, are people or beings, to your point. What are some of the areas that you typically focus on at this conference?

SUSAN WOLF: This is really a pioneering annual conference that the University of Minnesota first created 11 years ago. And every year at the beginning of March, we pull together the top experts across the country to focus in with our community-- actually, with people all over the country, who tune in through Zoom-- on the current challenges and opportunities to improve research ethics. This year, the focus is on the future of research ethics in the face of the huge federal shifts we see and the emergence of brand-new technologies, where we need to figure out how to ethically use them in research and analyze them.

NINA MOINI: We have seen some really big shifts to funding and scientific research over the past year, to your point. The Trump administration cut more than 2,000 National Institutes of Health (NIH) grants last summer. And the National Science Foundation awarded money to thousands fewer projects last year than it did in years prior. Are you mostly seeing these impacts in the sciences? Are there other types of research areas that you're also concerned about?

SUSAN WOLF: We are very concerned across the board-- science of all sorts, medicine. What we're seeing is some studies with human participants stopped midstream, leaving the participants high and dry sometimes. We're seeing new questions about how researchers can recruit the kind of inclusive and diverse samples that would allow them to learn about medical breakthroughs, for example, that need to apply to a broad population. We even see debates at the federal level about whether research oversight at institutions like universities will continue to be federally funded.

After the atrocities that I mentioned that emerged in the name of science in the 20th century, one of the linchpins of safety for research participants was creation of local Institutional Review Boards, IRBs. But they need funding. They need resources. IRBs review proposed research so that when human beings are actually involved, the research protocol has already been vetted. Everybody knows that their safety and their rights are secure.

But we've seen more shifts, too, Nina, at the federal level. We've seen some key federal oversight bodies either abolished or sharply reduced. So this is a really important time to take stock and figure out where to go from here.

NINA MOINI: Let's talk about where AI and research can get ethically complicated. Where are the places that you think people are really still needed?

SUSAN WOLF: Oh, people are needed to oversee the use of AI, to evaluate it, to make sure it's not hallucinating, to make sure that it's being properly used, to interpret its output. AI is transforming scientific research. It's really enabling enormous breakthroughs. But it has to be used responsibly.

One of the issues that's arisen in AI-powered research is, what should you tell the human beings that you recruit to be your research participants? Do you disclose the use of AI? How do you explain it? How do you make sure that the AI was trained on the right data sets so that it's really applicable to the population that you're using it in? There are all sorts of questions. AI has, at this point, enabled Nobel prizes in science. But challenging issues, including public trust, have to be addressed.

NINA MOINI: Yeah, and what about bias in AI? Who is, I guess, creating the artificial intelligence that would be setting up the research or involved in the research?

SUSAN WOLF: Bias is a really big issue. That's where you get into questions about, was this AI trained on people, all of whom have health insurance, or who are very homogeneous? But now you want to use it in research with people who don't have health insurance, with people who are different from those it was trained on. And you want to ultimately use it in a broad population. So researchers really have to think about all of that. Actually, one of the challenges of AI, too, is whether our research oversight bodies, like institutional review boards, know enough about AI to really evaluate protocols that propose to use it.

NINA MOINI: And just lastly, I do want to touch on, Susan, some areas, like you mentioned, where AI can strengthen the process in research, or perhaps even strengthen ethical protections or consistency of guardrails in research. Do you have an example of how that can be?

SUSAN WOLF: Well, research oversight bodies themselves, Nina, are thinking, can we use AI? Would that help us? Would it speed up our process? One frustration I've always had with the research oversight system, where you have thousands of local institutional review boards reviewing protocols, is that we don't really have a system for them to share what they decided on a particular protocol.

But this is a way you could build a data set that would allow an IRB facing a new, challenging problem to ask: what have other IRBs that have faced this kind of problem seen as the issues? What have they decided? That could potentially be a way to use AI to build a system of precedent.

So there are all kinds of possibilities for using AI to spot ethical issues. But again, it needs to be carefully designed and overseen. We hope that tomorrow, people will join us to debate all of these issues. This is a free public conference on Zoom, zero barriers, where we really want to hear all points of view. We've got 10 great experts from around the country. And they're going to debate these issues.

NINA MOINI: How critical is it that these conversations take place now? It just feels like everything is moving really fast when it comes to AI. And people are having trouble keeping up. What happens 10 years from now, 20 years from now, if folks don't get a handle on this?

SUSAN WOLF: I mean, if this gets away from us, we're really going to face a crisis of public trust. In biomedicine and across the sciences, we're asking people to participate in our research altruistically. Unless they have confidence that they're going to be protected, that their rights and their ability to consent are secure, and that they're going to understand what's being proposed, they're not going to participate in research. We're going to grind to a halt.

And unless the public has confidence in the technologies that we're developing, confidence that we're thinking ahead about what can go wrong and how to avoid it, we really risk losing their trust. So ethics, the responsible conduct of research, is crucial to making progress.

NINA MOINI: Thank you so much for your time, Susan. Really appreciate it.

SUSAN WOLF: Thank you.

NINA MOINI: That was Susan Wolf, Regents Professor of Law and Medicine at the University of Minnesota.

Transcription services provided by 3Play Media.