A recent Gartner survey found that the majority of customers are “AI shy”: 64% say they would prefer companies not to integrate AI in customer experience. Customers were also concerned about AI and misinformation (42%), data security (34%), and bias/inequality (25%).
Ethical AI can help businesses create innovative and trustworthy user experiences, protecting brands and enabling them to maintain a competitive advantage and foster better customer relationships. And ethical AI is part of WellPower’s story.
THE PROBLEM
In the mental health field, there aren’t enough therapists to help everyone who’s struggling. Community mental health centers like WellPower in Colorado serve some of the most vulnerable populations who need help.
Due to the complex needs of patients, WellPower clinicians face more complex documentation requirements than therapists in private practice. These additional requirements create an administrative burden that takes time away from clinical care.
WellPower explored how technology could serve as a workforce multiplier for mental health.
The provider organization turned to Iliff Innovation Lab, which works with AI, to explore three goals: making it easier for people to connect to their care, for example via telehealth; helping people move through treatment more quickly by facilitating high-fidelity evidence-based practices and remote treatment monitoring; and reducing administrative burden by making it easier for therapists to generate accurate, high-quality documentation while freeing more of their attention for delivering care.
“When used properly, clinical documentation is a particularly promising area for AI implementation, especially in behavioral health,” said Wes Williams, CIO and vice president of WellPower. “Large language models have proven particularly adept at summarizing large amounts of information.”
“In a typical 45-minute psychotherapy session, there is a lot of information to summarize to document the service,” he continued. “Staff often spend 10 minutes or more completing documentation for each service, which is hours that could otherwise be spent providing clinical care.”
PROPOSAL
WellPower’s commitment to health equity drives how it approaches technology implementation, making working with Iliff necessary to further the mission, Williams said.
“AI tools are often black boxes that obscure how they make decisions and can perpetuate the biases that have led to the health care disparities faced by the people we serve,” he explained. “This puts us in a difficult position because not using these emerging tools would deprive the people who need them most of their effectiveness, but adopting them without assessing bias could serve to increase disparities if an AI system is infused with historical health care biases.”
“We have found a system that uses AI as a tool for passive listening to generate summaries of therapy sessions,” he said. “However, we needed to ensure that the digital scribe was reliable, and that the summaries it generated were accurate, useful and unbiased.”
Behavioral health data is among the most sensitive from a privacy and security perspective, and these protections are necessary to ensure people feel comfortable seeking the help they need, he continued. For that reason, it's critical that WellPower carefully vet any new system, especially one powered by AI, he said.
RESULTS
To implement the AI digital scribe, WellPower needed to ensure it did not compromise the privacy or security of the people it serves.
“Many therapists were initially hesitant to try the new system, citing these legitimate concerns,” said Alires Almon, chief innovation officer at WellPower. “We worked with the Iliff team to ensure the digital scribe was built ethically, with privacy in mind.”
“One example is that the system doesn’t record the therapy session, but encodes the conversation on the fly,” she continued. “This means that at the end of the session, the only thing that is stored is the metadata about the topics discussed during the session. With the information provided by the Iliff team, we were able to ensure the confidentiality of our patients while freeing up more time for care.”
Applying an AI assistance platform to support transcription and progress note development has greatly improved the therapeutic experience for both staff and the people WellPower serves, she added.
“Since adopting the Eleos system, WellPower has seen a significant improvement in staff’s ability to complete their progress notes,” Almon said. “Three out of four outpatient therapists are using the system.”
“For this group, the average time to complete documentation improved by 75%, and total documentation time decreased by 60% (reducing note-writing time from 10 minutes to 4 minutes),” she said. “Our therapists were so excited to engage with Eleos that some said they would think twice about leaving WellPower because of their experience with Eleos.”
ADVICE FOR OTHERS
Artificial intelligence is a new and exciting venture in health information technology, but it comes with its own unique baggage that has been defined by science fiction, hype and the realities of its capabilities, Almon noted.
“It’s important for your organization to educate and define AI for your staff,” she advised. “Explain to them how it will be used and the processes and policies that will be in place to protect them and their customers. AI is not perfect and will continue to evolve.”
“If possible, before you start deploying AI-based tools, take a pulse to assess the level of understanding of AI and what people think about AI,” she continued. “Partnering with a program like Iliff’s Trust AI framework not only helps you select ethical technology to use, but also lets people know that your organization has considered the harms that can arise from AI-based platforms.”
This is more important than the results themselves, she added.
“Finally, reassure your staff that they cannot be replaced by artificial intelligence,” she concluded. “Human relationships are the most important relationships in healing individuals. Artificial intelligence is there to assist humans in their role; it is an assistive technology. AI can support and help, but it never replaces a therapeutic connection.”
Follow Bill’s coverage of HIT on LinkedIn: Bill Siwicki
Send him an email: bsiwicki@himss.org
Healthcare IT News is a publication of HIMSS Media.