AI can help improve equity in pain treatment, MGB study finds

Researchers at Mass General Brigham wanted to use artificial intelligence to address the undertreatment of pain in certain patient groups. They tested whether large language models could help reduce racial disparities in pain assessment and drug prescribing.

The LLMs did not discriminate based on race or gender and could be useful tools in pain management, helping to ensure equal treatment for all patient groups, MGB researchers said Monday.

“We believe our research provides important data showing how AI can reduce bias and improve health equity,” Dr. Marc Succi, strategic innovation leader at Mass General Brigham Innovation and one of the study authors, said in a statement.

WHY IT MATTERS

Health system researchers tasked the GPT-4 and Gemini LLMs with providing subjective pain assessments and comprehensive pain management recommendations for 480 representative pain cases they had prepared.

To generate the dataset, researchers took 40 cases reporting various types of pain, such as back pain, abdominal pain and headaches, and removed race and gender identifiers. They then created a variant of each case for every combination of the six U.S. Centers for Disease Control race categories (American Indian or Alaska Native, Asian, Black, Hispanic or Latino, Native Hawaiian or Other Pacific Islander, and White) and two genders, yielding the 480 cases.
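As an illustration of that expansion step, here is a minimal Python sketch. The placeholder template, case text and function names are hypothetical, not taken from the study.

```python
from itertools import product

# The six CDC race categories and two genders described above
RACES = [
    "American Indian or Alaska Native",
    "Asian",
    "Black",
    "Hispanic or Latino",
    "Native Hawaiian or Other Pacific Islander",
    "White",
]
GENDERS = ["man", "woman"]

def expand_cases(base_cases):
    """Expand de-identified base cases into every race-gender variant.

    40 base cases x 6 races x 2 genders = 480 variants,
    matching the case count reported by the researchers.
    """
    return [
        case.format(demographics=f"a {race} {gender}")
        for case, race, gender in product(base_cases, RACES, GENDERS)
    ]

# Hypothetical de-identified base case with a demographics placeholder
base_cases = ["The patient is {demographics} presenting with chronic lower back pain."]
print(len(expand_cases(base_cases)))  # 12 variants for this single base case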

For each patient case in the dataset, the LLMs evaluated and graded subjective pain before making recommendations for pain management, including pharmacological and nonpharmacological interventions.
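The article does not reproduce the researchers' prompts, but a query of this kind might look like the following rough sketch using the OpenAI Python client; the prompt wording, system message and response handling are assumptions.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def assess_case(case_text: str) -> str:
    """Ask the model for a subjective pain rating plus a management plan.

    The prompt below is illustrative only; the study's actual prompts
    are not published in this article.
    """
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a clinical decision-support assistant."},
            {"role": "user", "content": (
                f"{case_text}\n\nRate the patient's subjective pain "
                "(mild/moderate/severe) and recommend pharmacological and "
                "nonpharmacological pain management interventions."
            )},
        ],
    )
    return response.choices[0].message.content
```

The same prompt would presumably be sent to Gemini through Google's API for the head-to-head comparison.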

The researchers conducted univariate analyses to evaluate the association between race/ethnicity or gender and the outcome measures proposed by the LLMs: subjective pain rating, opioid name, and order and dosing recommendations, MGB said.
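The article does not name the specific statistical tests, but for a categorical outcome such as whether an opioid was recommended, one standard univariate approach is a chi-square test of independence, sketched here with fabricated illustrative data.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical results table: one row per generated case (480 total)
df = pd.DataFrame({
    "race": ["Black", "White", "Asian", "Black", "White", "Asian"] * 80,
    "opioid_recommended": [True, False, True, True, False, False] * 80,
})

# Univariate test: is the opioid recommendation independent of race?
table = pd.crosstab(df["race"], df["opioid_recommended"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3f}")
# A non-significant p-value would be consistent with the study's finding
# that race did not influence the models' recommendations.
```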

GPT-4 most often rated pain as “severe,” while Gemini’s most common rating was “moderate,” according to the research, published September 6 in PAIN, the journal of the International Association for the Study of Pain.

Notably, Gemini recommended opioids more often, suggesting that GPT-4 is more conservative in its opioid recommendations.

According to the researchers, additional analysis of both AI models could help determine which best matches clinical expectations. Notably, the study found no evidence that the LLMs overrode patients’ reported pain based on race.

“These results are reassuring, as patient race, ethnicity and gender did not influence recommendations, indicating that these LLMs have the potential to address existing biases in health care,” Harvard Medical School co-authors Cameron Young and Ellie Einchen said in a statement.

“I see AI algorithms in the near term as complementary tools that can essentially serve as a second set of eyes, parallel to those of medical professionals,” said Succi, who is also associate chair of innovation and commercialization for enterprise radiology and executive director of MGB’s Medically Engineered Solutions in Healthcare (MESH) Incubator.

According to the health system, future research should examine how race may influence LLM treatment recommendations in other medical fields and should evaluate gender as a non-binary variable.

THE LARGER TREND

Just as biased algorithms have magnified the disproportionate impact of COVID-19 on people of color, studies have shown that healthcare providers are more likely to underestimate and undertreat pain in Black and minority patients.

While AI has been shown to exacerbate racial bias in many areas of medicine and healthcare, the new findings suggest that LLMs can also help mitigate clinical bias and support equitable pain management.

After opioid prescribing surged in the 1990s and 2000s on false assurances of safety, the realities of dependence and addiction became clear when hundreds of local governments filed lawsuits in 2017 against Purdue Pharma, the maker of OxyContin.

Health systems began to recognize surgery as a key entry point for opioid use among patients who later became dependent. Intermountain Health and other providers then focused on reducing opioid prescriptions, educating caregivers, standardizing pain management techniques and using AI-driven analytics to inform practice changes and improve patient safety.

Technology developers are also using analytics in mobile care management to help physicians ensure the right amount of pain medication is administered and that patients adhere to treatment plans.

While AI does not directly advise patients, Steven Walther of Continuous Precision Medicine told Healthcare IT News in July that data-driven technologies can help both doctors and patients reduce dependence on opioids and other painkillers.

In a randomized controlled trial, patients who used the company’s mobile app were “92% more likely to adhere to their medication prescriptions,” Walther said.

ON THE RECORD

“There are a lot of elements we have to consider when integrating AI into treatment plans, such as the risk of over- or under-prescribing medications for pain management or whether patients are willing to accept treatment plans that are influenced by AI,” Succi said. “These are all questions we think about.”

Andrea Fox is Editor-in-Chief of Healthcare IT News.
Email address: afox@himss.org

Healthcare IT News is a publication of HIMSS Media.