Paige Nong standing in a classroom, lecturing to students seated around tables

Separating hype from reality: New course trains students to analyze AI’s role in healthcare and health policy

Virgil McDill | April 28, 2025

When students walked into their Artificial Intelligence in Health and Health Care class (PUBH6570) in March, the first thing they saw was…a bingo machine. As the students filed in, Associate Professor Hannah Neprash and Assistant Professor Paige Nong handed them bingo cards emblazoned with terms not found at your typical bingo hall—think Turing Test, Techno-Optimism, Grok, and P(Doom).

The game was a fun way to break the ice and get to know one another, sure, but it also had a specific educational objective. “The point of the AI vocabulary bingo game was to take this amorphous blob of a term—AI—and put some structure on it,” says Neprash. “AI can refer to so many things, so this helped us formulate a shared vocabulary that we could use going forward as we discussed specific AI applications and tools.”

Hannah Neprash makes a point during a recent class

Developed jointly by Nong and Neprash, the seven-week elective class is the first School of Public Health (SPH) course to focus exclusively on AI’s role in healthcare and health policy. With just under two months to cover such a vast and rapidly evolving topic, the instructors packed a wide range of topics into the syllabus, including health policy, emerging AI products, clinical applications of AI, ethics, and AI’s potential to exacerbate bias and health inequalities.

Given the ubiquity of new AI technology, the pace at which these tools are becoming available, and the rapidly changing policy and regulatory landscape, Nong said that she and Neprash felt an urgency to develop the course. “We both wanted to move quickly because our students need this kind of AI literacy,” she noted.

“The field is moving so fast that the minute we started the class, some of the syllabus was already out of date,” Neprash observed, adding that in one recent session, Nong introduced research that had been published just a few hours earlier.

It’s all part of one of Nong and Neprash’s key goals for the course: ensuring that SPH students are grounded in AI and have the latest information they need to cut through the hype and rhetoric and critically assess various AI products and tools. “If you listen to the rosiest optimists, AI is the solution to all our problems—it can solve our spending challenges, improve healthcare quality, and reduce clinician burnout. It’s portrayed as the silver bullet for anything,” Neprash says.

Paige Nong lecturing about AI in healthcare

Nong echoes this, noting that the course provides students with the tools they will need to be informed and critical decision makers. “These students are going to be in positions where they’re making decisions about which vendors to contract with and which AI tools to deploy in healthcare systems. AI is often made to sound like magic, so it’s important to give students information to fully evaluate a product or tool and ask critical questions of a vendor when they’re listening to a pitch.”

To that end, a recent guest speaker from Medtronic, the Minnesota-based medical technology company, delivered a pitch to the class about an AI-enabled clinical tool that assists with colonoscopies. Students got to pose questions and concerns directly to the tool’s manufacturer, a conversation that revealed what the students believed to be one of the tool’s shortcomings: the device isn’t connected to the Internet and so, contrary to what people might assume about an AI product, is not constantly updating and relearning. “It was a good reminder that some of the assumptions we make about what AI means might not hold, or be true in all cases,” Nong says.

In fact, Nong and Neprash said that a key lesson the class has discussed is that many AI tools, rather than learning and reflecting data from local patient populations, are developed elsewhere, purchased off the shelf, and come with built-in assumptions about patient populations—a situation that can perpetuate bias and exacerbate health inequities.

Another recent guest speaker, SPH alumnus Eric Maurer, emphasized this point. Maurer, the chief innovation strategy officer at the Community-University Health Care Center (CUHCC), a Federally Qualified Health Center located in south Minneapolis’ Phillips neighborhood, said that half of CUHCC’s patients prefer a language other than English, 50% are on Medicaid, and 33% are uninsured.

Eric Maurer speaks to the class about his experiences with AI at the Community-University Health Care Center

“That tells you how we’re different from the academic medical centers like Mayo or Duke that are creating these AI tools,” Maurer said, noting the stark differences in patient populations. The question then, Maurer explained, was how these models could be brought to CUHCC “without doing harm or perpetuating bias for the communities of patients we serve.” To guide that process, he helped develop a policy framework with some guiding principles, including local validation of all data, thorough training of staff, and other measures to ensure that the tools did no harm and did not introduce bias.

That kind of thorough process and critical analysis of AI tools is what students will take away from the class. As the bingo game underscored, AI is an overly broad term that serves as a catchall for hundreds of tools and technologies. “Not all AI is created equal,” Nong said, “so if we’re talking about one particular radiology application, it should be considered and evaluated as what it is, rather than as part of blanket statements of ‘AI can do this or AI can do that.’ The goal for this course is to move past broader ideas about how AI will transform healthcare and have students really think specifically about particular tools and how they might impact health outcomes in specific settings and among a specific population,” she said.
