A 988 operator, faced with a flood of calls, turns to AI to boost counselor skills

June 22, 2023


Over 1,000 times a day, distressed people call crisis support lines operated by Protocall Services. Its counselors are carefully trained for the sensitive and taxing conversations, but even with supervision on the job, major errors, like failing to screen for suicide, can go undetected.

So Portland, Ore.-based Protocall is working with a company called Lyssn to investigate if technology can help keep call quality high. Lyssn’s platform uses AI to analyze and review recordings of behavioral health encounters, and the two companies were recently awarded a $2 million grant from the National Institute of Mental Health to adapt the tech for use in crisis calls. If shown to be effective, it could pave the way for broader adoption among crisis lines at risk of buckling under the weight of demand for their services amid climbing suicide rates.

Right now, supervisors and Protocall’s dedicated quality support team review only a portion of call recordings, most of them selected at random. The company’s chief clinical officer, Brad Pendergraft, said that the labor-intensive process covers such a small slice of interactions — less than 3% of the company’s total volume — that it’s difficult to catch problems with how workers are handling calls, or to give them guidance on how to better manage crisis situations.

“For any individual person… it takes a really long time for that randomization to actually mean that they’re getting all the different kinds of feedback that is actually going to help them,” said Pendergraft.

The challenge may only get more difficult with surging demand for call takers following the national launch last year of the 988 Lifeline.

In May alone, the 988 system routed nearly 470,000 calls to hundreds of organizations like Protocall, which operates the crisis line for the state of New Mexico, serves as a back-up for 988 calls nationally, and operates lines for private customers like universities and employee assistance programs. In the last 12 months, the company has fielded about 560,000 calls.

Vibrant, the contractor hired by the Substance Abuse and Mental Health Services Administration to administer the 988 system, requires that 3% of crisis calls forwarded to the national backup system be reviewed.


“So many new people are being hired to do the work that the ability to easily — or at all — give people the constant feedback that they need to improve is going to be the difference between people getting really good care or potentially not getting good care and nobody realizing it,” Pendergraft said, adding: “People can burn out in this work. They can stop doing things that are more emotionally difficult for them… being able to catch that and pull them out of it is a difficult thing.”

That’s where Lyssn comes into the picture.

Founded in 2017 in Seattle, Lyssn, which serves over 70 customers including university training programs and digital health companies, grew out of co-founder Dave Atkins’ academic research into how to use tech to analyze talk therapy sessions. Standard methods require a trained evaluator to listen to an entire session and rate it according to established tools, like the cognitive therapy rating scale. The advent of natural language processing, machine learning, and cloud computing suddenly made evaluation at scale possible.

Sessions are transcribed and analyzed within minutes of being uploaded into Lyssn’s system. The platform looks at the content of the conversation to evaluate whether clinicians are sticking to techniques like motivational interviewing or cognitive behavioral therapy. Lyssn also analyzes characteristics like vocal tone and whether a clinician came across as empathic. The evaluations and summary dashboards are displayed in an easily navigable web-based application, which can generally be viewed by both individual clinicians and their supervisors. Lyssn estimates that all of the documentation its platform produces from a single phone call would take a trained person 5 to 10 hours of work to produce manually.

Regarding privacy, Atkins explained that Lyssn’s software never interacts with patients or callers. Providers are responsible for obtaining informed consent, and they must give permission before Lyssn uses recordings stripped of personal information for research purposes. Lyssn also offers its customers the option to remove data from the platform, and its systems comply with the patient privacy law HIPAA.

To develop the broader platform, Lyssn’s clinical team has manually evaluated and annotated more than 25,000 sessions, including 2.8 million individual statements, which served as the training data for the company’s artificial intelligence system. Atkins noted, however, that the AI is “never finished.”

“We’re proud of what we created and how it improves our customers’ ability to deliver great care, but we’re never satisfied,” he said.

As an example of its efforts to audit the system, Atkins said the company will this summer release a formal analysis of the accuracy of Lyssn’s AI across a diverse group of providers and plans to release updated reports every year.

The company also developed its own speech recognition technology rather than relying on off-the-shelf solutions. That allows Lyssn to troubleshoot if, for example, the system seems to be having trouble understanding a certain clinician.

“In health care where you have to have reliable valid information — these are people’s lives right? — it’s just not fast,” he said. Though advances in technology may appear to be moving quickly, Atkins is adamant that “you’ve got to do some really hard work if you want reliable, valid, high quality AI.”

As part of the NIMH grant, Lyssn’s clinical team spent several months developing a manual informed by SAMHSA’s guidelines for suicide risk assessment. The team then spent six months evaluating 500 crisis calls for 10 dimensions of suicide assessment, including whether a counselor asked about current suicidal ideation. These calls now serve as the training and testing data for the AI system that can evaluate how well a counselor assesses a caller’s suicide risk.

If a review of the AI technology shows good performance, the companies will this fall begin a randomized controlled trial that will test whether access to Lyssn assessments improves counselors’ performance over time.

Pendergraft said they hope to train the technology to have a nuanced understanding of suicide assessment and risk. For example, if a caller obliquely hints that sometimes they wish they’d never wake up and a counselor doesn’t follow up, the technology would flag that as a missed opportunity.

“We are measuring at the highest level, did they say the right things?” he said. “But the AI is also learning to give them feedback on the quality of what they did…. And that is where we think the long term biggest benefit will be.”

The study of the technology is expected to take 18 months. The companies are also conducting a parallel effort to develop AI to assess how well counselors engage in safety planning with clients at risk.

Virna Little, a psychologist who previously ran a crisis center line and has worked nationally on suicide prevention, said that technology like Lyssn’s is “a potential gamechanger.” It can help fill a data void both by identifying individual staff members who are underperforming and by highlighting what works at call centers with strong performance.

“I think it would hold people to some consistent quality standards,” said Little.

Pendergraft said that after the trial, he suspects more companies will adopt AI quality checks, but that SAMHSA is unlikely to require it because the technology lift is too much for some smaller providers. The agency said it is aware of the grant and supports the exploration of the technology, though it isn’t currently funding the effort.

Nevertheless, Pendergraft agreed with Little that the study is likely to push standards in a positive direction.

“It will quickly become two very different levels of quality review and those kinds of disparities often don’t last long,” he said.

This story is part of a series examining the use of artificial intelligence in health care and practices for exchanging and analyzing patient data. It is supported with funding from the Gordon and Betty Moore Foundation.
