Ethical AI-based writing technology helps WellPower reduce documentation time

A recent Gartner survey found that most customers are “AI-shy”: 64% say they would rather not see companies integrate AI in the customer experience. Customers also had concerns about AI and misinformation (42%), data security (34%), and bias/inequality (25%).

Ethical AI can help organizations create innovative, trusted user experiences, protecting brands and enabling them to maintain a competitive edge and foster better customer relationships. And ethical AI is part of the story at WellPower.

THE PROBLEM

In the mental health field, there are not enough therapists to help everyone who needs care. Community mental health centers like WellPower in Colorado serve some of the most vulnerable populations.

Due to the complex needs of those served, WellPower clinicians face more complex documentation rules than therapists in private practice. These additional rules create an administrative burden that takes time that could otherwise be spent on clinical care.

WellPower investigated how technology could expand access to mental health care.

The healthcare provider turned to the Iliff Innovation Lab to explore how AI-powered health IT could make it easier for people to access their care, such as via telehealth. They also wanted to explore how people could progress through treatment faster by enabling reliable, evidence-based practices and remote treatment monitoring. And they wanted to see how WellPower could reduce administrative burdens by helping clinicians generate high-quality, accurate documentation so they can focus more on delivering care.

“When used correctly, clinical documentation is a particularly promising area for AI implementation, especially in behavioral health,” said Wes Williams, CIO and vice president of WellPower. “Large language models have proven particularly adept at summarizing large amounts of information.

“In a typical 45-minute psychotherapy session, there is a lot of information to summarize to document the service,” he continued. “Staff often spend 10 minutes or more completing documentation for each service, adding up to hours that could otherwise be spent delivering clinical care.”

PROPOSAL

According to Williams, WellPower’s commitment to health equity drives the way the organization approaches technology implementation, and partnering with Iliff is essential to advancing that mission.

“AI tools are often black boxes that hide how they make decisions and can perpetuate biases that have led to the health care disparities faced by the people we serve,” he explained. “This puts us in a difficult position, because not using these emerging tools would deny their effectiveness to the people who need them most, but adopting them without evaluating for bias could exacerbate disparities if an AI system had historical health care biases baked into the system.”

“We found a system that used AI as a passive listening aid that could sit in on therapy sessions (both telehealth and in-person) and serve as a kind of digital scribe, generating draft notes that our clinicians could review and approve,” he added. “However, we had to ensure that the digital scribe was reliable, in order to generate summaries of the therapy sessions that were accurate, useful, and unbiased.”

Behavioral health data is among the most sensitive from a privacy and security standpoint; strong protections are necessary to ensure people feel comfortable seeking the help they need, he continued. That’s why it’s critical that WellPower thoroughly vet any new system, especially an AI-based one, he said.

RESULTS

To implement the digital AI writer, WellPower had to ensure that neither the privacy nor the security of the people it serves would be compromised.

“Many therapists were initially hesitant to try the new system, citing legitimate concerns,” said Alires Almon, director of innovation at WellPower. “We worked with the Iliff team to ensure the digital scribe was built ethically with a privacy-first mindset.

“For example, the system doesn’t record the therapy session, but encodes the conversation on the fly,” she continued. “This means that at the end of the session, only the metadata about what topics were covered during the session is stored. With the insights from the team at Iliff, we were able to ensure the privacy of our patients while freeing up more time for care.”
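The internals of the Eleos system are not public, but the privacy-first pattern Almon describes can be illustrated in miniature: process each utterance in memory, tag topics on the fly, and persist only aggregate metadata, never the conversation itself. The sketch below is hypothetical (the topic keywords and function names are invented for illustration, not taken from the actual product):

```python
# Illustrative sketch only: the real Eleos pipeline is proprietary.
# The pattern shown: utterances are processed in memory and discarded;
# only topic metadata about the session is retained.

TOPIC_KEYWORDS = {
    "sleep": {"sleep", "insomnia", "tired"},
    "anxiety": {"anxious", "worry", "panic"},
    "medication": {"medication", "dose", "prescription"},
}

def tag_topics(utterance: str) -> set:
    """Return the topic labels whose keywords appear in the utterance."""
    words = set(utterance.lower().split())
    return {topic for topic, kws in TOPIC_KEYWORDS.items() if words & kws}

def summarize_session(utterances) -> dict:
    """Encode the conversation on the fly.

    Raw text is dropped as soon as each utterance is tagged; what
    survives the session is only which topics came up, and how often.
    """
    topics = {}
    for utterance in utterances:
        for topic in tag_topics(utterance):
            topics[topic] = topics.get(topic, 0) + 1
        # the utterance itself is not stored anywhere past this point
    return {"topics_covered": topics, "utterance_count": len(utterances)}
```

In a design like this, the stored record answers “what was discussed and for how long” without ever containing a sentence the client actually said, which is the property that lets a draft note be generated without a recording existing.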

She added that the adoption of an AI-enabled platform to support transcription and drafting of progress notes has significantly improved the therapeutic experience for both staff and the people WellPower serves.

“Since implementing the Eleos system, WellPower has seen a significant improvement in the staff’s ability to complete their progress notes,” Almon reported. “Three out of four outpatient therapists use the system.

“For this group, the average time to complete documentation has improved by 75% and the total documentation time has decreased by 60% (time to write notes has decreased from 10 minutes to four minutes),” she said. “Our therapists are so excited to work with Eleos that some have indicated they would think twice about leaving WellPower because of their experience with Eleos.”

ADVICE FOR OTHERS

Artificial intelligence is a new and exciting development in healthcare IT, but it also brings its own unique baggage based on science fiction, media hype and the reality of its capabilities, Almon said.

“It’s important that your organization educates and defines AI for your workforce,” she advised. “Explain how it will be used and what processes and policies will be put in place to protect them and their clients. AI is not perfect and will continue to evolve.

“If possible, before deploying AI-enabled tools, take the pulse of your staff to assess their level of understanding of and thinking about AI,” she continued. “Partnering with a program like Iliff’s Trust AI framework not only helps in selecting ethical technology to deploy, but also communicates that your organization has assessed the harms that can occur because of AI-enabled platforms.”

That assessment is more important than the results themselves, she added.

“Finally, reassure your staff that they cannot be replaced by AI,” she concluded. “Human relationships are the most important relationships in the healing of individuals. AI is there to assist the human in their role; it is an assistive technology. AI can support and assist, but it never replaces a therapeutic connection.”

Follow Bill’s HIT reporting on LinkedIn: Bill Siwicki
Send him an email: bsiwicki@himss.org
Healthcare IT News is a publication of HIMSS Media.
