Patients want clarity on the use of AI in healthcare

BOSTON – Transforming the way healthcare providers communicate with patients using artificial intelligence is not just a matter of accuracy, transparency, fairness and well-maintained data models; it also requires finding ways to address the challenge of personalization.

What patients want to know, and when, adds a layer of complexity, one that is forcing the healthcare AI sector to consider both expected and unexpected patient perspectives, panelists said Thursday at the HIMSS AI in Healthcare Forum.

Artificial intelligence can provide context and free doctors from data entry, allowing them to interact with patients in a more human way. In doing so, it can transform the doctor-patient relationship.

“To some extent, these impressive tools, which are developing much faster than a health system can even think about how to embed them, really do offer a great opportunity to personalize that dialogue around what matters to that person, and to advise and support them in making the decisions that are relevant to them,” said Anne Snowdon, chief research officer at HIMSS, the parent company of Healthcare IT News.

While the utility of AI technologies is an important part of the trust conversation, navigating transparency, choice, autonomy and decision-making is critical for patients.

“From that perspective, it starts to redefine and rethink care,” said Snowdon, the panel’s moderator, who holds a doctorate in nursing.

Improving communication with patients

Snowdon was joined by Alexandra Wright, patient advocate and director of research at HIMSS; Dr. Chethan Sarabu, director of clinical innovation at Cornell Tech’s Health Tech Hub; Mark Polyak, president of analytics at Ipsos; and Dr. Lukasz Kowalczyk, a physician at Peak Gastroenterology Associates, for a deeper conversation about what patients want from an AI-powered healthcare experience.

“While healthcare is still navigating the challenges of AI hallucinations, AI can elevate conversations and build trust,” said Sarabu, who is also a board member of The Light Collective, a nonprofit organization dedicated to advocating for the collective rights, interests and voices of patient communities in health technology.

Sarabu said that during a patient panel discussion, he heard from a patient who believed she had been messaging through the patient portal with a very helpful nurse at her clinic named Jessica. She said she lost trust when she asked about the nurse in person at her doctor’s office.

“She just said she wished they had told her ahead of time it was a chatbot,” he said.

“You shouldn’t fool your patients,” joked Kowalczyk, a practicing gastroenterologist and consultant at Denver-based Cliexa, a digital health platform.

But when a patient knows that a healthcare chatbot like Jessica is not a real person, an AI that communicates compassionately can still improve the patient’s difficult circumstances.

“Compassion is a real thing in healthcare,” Kowalczyk said. “Sometimes it’s really hard, especially when you’re going through a day and it only takes one or two patients to really make it hard to start the next one.”

Large language models are excellent for transforming and translating information and describing patients’ concerns to clinicians, giving them “a moment to catch their breath” and regain empathy, he said.

“I think these are the moments where patients feel like the AI is acting as their advocate. It helps me understand them better as a person.”

The dynamics of personalization

AI may not always align with the patient’s vision of care. In some scenarios, such as predictive analytics, it may surface information that patients do not want.

“Some patients may want more information, some may want less, or maybe someone wants less information during an in-person visit but wants more material to review afterward,” Sarabu said.

From a physician’s perspective, “it’s difficult to really personalize all the information, context and content for each individual patient,” he added.

According to Polyak, three elements define the care experience: access to care, access to accurate information, and the speed of that information.

He noted that 16% of patients who used ChatGPT asked questions about their care in order to reduce their healthcare costs.

“[They] asked ChatGPT to give them different scenarios of how their physicians should approach their care, based on what they had, in order to reduce costs.”

“I actually didn’t expect that, but it was mainly about generating scenarios, which they printed out and took with them to appointments,” he said.

The sense of control also varies from patient to patient.

For patients and their families facing a health crisis, “information is actually power,” Wright said.

“When you’re in these kinds of situations, you often feel like you’re losing control,” she said.

“And if you don’t fully understand your condition or what’s going on, it can really feel like you have no control over what’s happening to you.”

When the doctor is no longer in the room and patients have questions, they turn to search engines and ChatGPT for information, she said.

Context also plays a role in what information patients want to know.

“When I first got to the hospital, would I have wanted them to tell me what my chances of survival were? Probably not, because I don’t think it would have improved the situation,” Wright said.

“But now that I think about it, if someone were to tell me about my risk of, say, a future cancer, would I want to know if there was anything I could do to prevent that? Probably.”

For Snowdon, the deeper discussion about the use of AI in healthcare comes down to this: “How do we help people make their own decisions, inform themselves with a sense of confidence and [discover] what is most meaningful to them?”

Andrea Fox is Editor-in-Chief of Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a publication of HIMSS Media.