Tech

Why nurses are protesting AI

Continuous data collection and analysis is hardly a replacement for knowledge, nursing unions say, and sometimes gets in the way of care.

The largest nursing union in the US, National Nurses United (NNU), is sounding the alarm about the use of AI in healthcare. In April, the union’s affiliate California Nurses Association (CNA) protested an AI conference helmed by managed care consortium Kaiser Permanente.

Like workers in other sectors who are worried about AI encroachment, the nurses fear that the tech is contributing to the devaluation of their skills amid what they say is already a chronic understaffing crisis, according to an NNU survey of 2,300 registered nurses and members conducted in early 2024.

But the NNU, which represents approximately 225,000 nurses across the country, also claims healthcare operators are using AI hype as a pretext to rush half-baked and potentially harmful technologies into service, according to Michelle Mahon, NNU’s assistant director of nursing practice. Mahon warns that continuous data collection and analysis is not a substitute for nursing knowledge or physical resources.

“The most harmful thing we’re seeing is the way it’s being used to redesign care delivery and usurp the skill of decision-makers,” Mahon told Healthcare Brew.

In an emailed statement to Healthcare Brew, Kaiser spokesperson Kathleen Campini Chambers wrote the company had provided nurses with “state-of-the-art tools and technologies that support our mission of providing high-quality, affordable healthcare to best meet our members’ and patients’ needs.”

“At Kaiser Permanente, AI tools don’t make medical decisions, our physicians and care teams are always at the center of decision making with our patients,” Chambers continued, adding the company makes sure AI outputs are “correct and unbiased” and do not “replace human assessment.”

Unforeseen consequences

Cathy Kennedy, CNA president, an NNU vice president, and a registered nurse in a neonatal intensive care unit (NICU) at Kaiser-run Roseville Medical Center, told Healthcare Brew that real-time tracking of nurse activity can draw out procedures that need to be completed quickly, such as NICU exams that can expose premature babies to cold and pathogens.

“We wash our hands before we go to the patient, put gloves on, do what we need to do, take your gloves off, wash your hands, go to the computer, put the information in, and then you go back and forth,” Kennedy said. “You see how inefficient that is.”

In some cases, automated systems may also be limiting patients’ ability to communicate directly with their doctors. One test of a Kaiser AI system found that nearly 32% of patient messages were never seen by a human physician.

With AI systems increasingly handling processes like predictive staffing, there’s also more room for minor discrepancies like data-entry omissions or mistakes that throw off results, Kennedy added. CNA members have said Kaiser’s implementation of Epic contributes to understaffing, in part because the electronic health record (EHR) system does not account for some time-consuming duties like chemotherapy prep.

Nurses also say they often can’t override decisions made by AI. The NNU survey found 40% of respondents aren’t able to override predictions made by algorithms used to determine patient outcomes, with 29% reporting they couldn’t override or alter algorithmically generated data on wounds or pain levels in EHRs.

The Wall Street Journal recently reported that nurses at UC Davis Medical Center aren’t able to question the results of the Epic Sepsis Model, a prediction algorithm, without a doctor’s approval. A 2021 study published in JAMA found the system’s performance was “substantially worse than the performance reported by its developer,” raising “fundamental concerns about sepsis management on a national level.”

Untested and unregulated

Many AI tools entering widespread use in healthcare are not regulated. That includes not just administrative software that summarizes doctors’ notes or manages staffing levels, but also predictive AI that offers recommendations and diagnoses. (The latter category is subject only to non-binding FDA guidance, which lobbyists are challenging.)

Mahon says they’re also untested, and that AI hype has given administrators an excuse to rush slapdash automation into service at an unprecedented pace.

“What’s different about this moment is the claim of intelligence,” Mahon said. “It’s not just data mining anymore. It is, really, a lot of these technologies are claiming to or seeking to displace skilled nursing judgment.”

Kennedy is quick to point out nurses aren’t opposed to technology that makes their jobs easier.

“Part of me likes the idea of being able to have my blood pressure, my temperatures, my heart rate, and all of that stuff being auto-populated into the medical record,” Kennedy said, citing use cases like managing ventilatory support equipment.

“Those things, that’s great,” she added. “But it’s the other stuff that really becomes a burden.”
