Therapy by Chatbot

Seattle startups test promises, challenges

By Judy Temes August 3, 2023

Using your mobile to communicate with your therapist. Photo: Sycikimagery / Getty

This article originally appeared in the July/August 2023 issue of Seattle magazine.

Could a chatbot be more empathetic than your therapist? Could bots augment or even replace a trusted mental health professional?

Those questions lit up the internet when JAMA, the influential Journal of the American Medical Association, published a study in late April concluding that ChatGPT (generative pre-trained transformer) responds to patients with more empathy than doctors or therapists do. The results sparked excitement, fear, and curiosity about the potential of AI to fill the huge gap in mental health care, where the need far outstrips the supply of qualified therapists.

Like physicians, mental health providers are increasingly in short supply. During the height of the pandemic, 32.6% of adults in Washington state reported symptoms of anxiety or depression, according to research firm KFF. Research from the National Library of Medicine says the U.S. will be short between 14,000 and 30,000 psychiatrists in coming years.

Here in Seattle, at least two companies are betting that AI can help therapists do their jobs better.

Lyssn is an AI-powered software platform that grew out of 15 years of research at the University of Washington. It can train therapists and evaluate the quality of therapy across a range of metrics, including empathy. The tool provides what the company calls “gold-standard clinical evaluation” of statements a therapist makes in a counseling session and gives students in the field important, timely feedback.

“For counseling and psychotherapists, conversation is often the treatment, and the quality of that conversation matters a great deal,” says David Atkins, professor of psychology at the University of Washington and the founder and CEO of Lyssn. “In those instances, what is said and how it is said can determine if someone stays with therapy or drops out.”

Evaluating the quality of those conversations, however, is difficult. Professors and trainers have little time to review many hours of recorded therapy or their transcripts. Feedback can come weeks after the fact, when it is far less useful. “You might do 20 hours of therapy in a week of training and what is actually reviewed by a supervisor could be five to 10 minutes,” Atkins notes. “It’s not a criticism of supervisors or trainers. It’s the sheer amount of time it takes. They just don’t have it.”

Enter artificial intelligence. During the past 15 years, Atkins and his team have trained their AI on hundreds of thousands of hours of recorded, anonymized conversations between therapists and patients, then had human evaluators score those conversations for empathy. Leveraging that training data, the AI can now assess with a high degree of confidence how empathetic a therapist was, the quality of the questions asked, and the effectiveness of the statements made. It can even point therapists toward more effective words to help people achieve change.

“There is a massive undiscussed issue as to the quality of the care patients receive,” says Zac Imel, director of clinical training at the University of Utah and chief psychotherapy science officer and cofounder of Lyssn. “You can be assigned to an amazing therapist, but also to someone who is not so great. How would you know?”

Until now, he says, there has been no scalable way to improve the quality of the services people receive in therapy. “That is the core problem we are trying to solve.”

Lyssn, founded in 2017, has no venture capital behind it; Atkins and Imel’s work has been entirely funded by the National Institutes of Health and is supported by more than 60 peer-reviewed publications. Some 70 customers from universities to child welfare and human services agencies are using the AI. The company wants to work with the 988 national suicide crisis hotline to train its volunteers.

A big difference between Lyssn and ChatGPT is that Lyssn’s predictive data are derived from actual conversations, evaluated by clinicians on an empathy scale and fed into the AI, which, over time, learns patterns of speech indicative of empathy. “There is a lot of human labor that goes into generating the data to train the AI,” Atkins explains.
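To make the approach concrete, here is a minimal, purely illustrative sketch of the general idea the article describes: training a model on human-labeled therapist statements so it can score new utterances for empathy. This is not Lyssn’s actual system; the sample statements, the labels, and the scikit-learn model are all assumptions chosen only to show the shape of the workflow.

```python
# Illustrative sketch only, NOT Lyssn's system: a toy classifier trained on
# clinician-labeled therapist utterances, then used to score a new utterance.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: therapist statements paired with human-rated
# empathy labels (1 = rated empathic, 0 = rated not empathic).
statements = [
    "It sounds like this week has been really hard for you.",
    "Just stop worrying about it.",
    "I can hear how much that loss still hurts.",
    "We are out of time, let's move on.",
]
labels = [1, 0, 1, 0]

# Bag-of-words features and a linear classifier stand in for the far larger
# models and datasets described in the article.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(statements, labels)

# Score a new utterance: estimated probability it would be rated empathic.
print(model.predict_proba(["That must have felt overwhelming."])[0][1])
```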

While ChatGPT has shown impressive results, it is unpredictable because it uses data culled from the internet. “It can hallucinate, it can provide information that is plain wrong, but it cannot tell you that it’s wrong,” Imel says.

Another startup is Seattle-based Aiberry, which uses an AI-powered virtual therapeutic assistant to evaluate the likelihood of someone having depression or other mental health disorders. The “assistant” asks patients questions and looks for patterns in their responses to come up with a score that could indicate mild, moderate, or severe depression. Because the conversation is conducted via video, the AI promises to be able to analyze not just spoken words, but also audio biomarkers and facial features.

“With tools like this, we can prevent severe depression and get people the treatment they need.”

Braxton Carter, Aiberry investor

Aiberry has raised $10 million in venture capital. It features a friendly voice and an animated character designed to put people at ease when answering questions. A study to validate Aiberry’s ability to accurately predict mental illness is underway, though its results have not yet been published. Co-CEO Johan Bjorklund says he’s confident the study will prove the AI to be more accurate than paper-based questionnaires.

Futures Recovery Healthcare, a residential addiction and mental health treatment center in Tequesta, Fla., utilizes Aiberry as a self check-in for patients, who also use it to monitor their progress through treatment. “Weekly assessment can be very useful to see if current interventions are effective,” says Aiberry COO Tammy Malloy. The AI can help clients who have trouble identifying or expressing their emotions understand those feelings better. She cautions, however, that Aiberry alone should not be used to diagnose or treat.

While it is now being marketed as a clinical-grade tool for mental health providers, the company has plans to make Aiberry available to a broader user base. “We just signed on with a corporate wellness program with 2.2 million members,” says co-CEO Linda Chung.

While the need is enormous and the shortage of qualified therapists very real, it remains to be seen how useful these tools will be in the hands of consumers. Braxton Carter, the first person to invest in Aiberry, recalls how a loved one’s suicide from depression reinforced for him the need for predictive tools that can detect signs of depression early. “This is so easy to use,” he says. “With tools like this, we can prevent severe depression and get people the treatment they need.”

But the bots also have their critics. The wrong tool in the hands of a vulnerable person could do damage.

As of today, there are few, if any, guidelines for the claims that companies are making about efficacy, what data were used to train the AI, or how the bots should be regulated.

“This space is filled with people who exaggerate what their companies can do,” adds Seth Feuerstein, assistant clinical professor of psychiatry at Yale University, and an adviser to Lyssn. “It’s one thing if the product is an airport ride that gets you there five minutes late. When it collides with health care, it’s uniquely important to not exaggerate what the technology does. The data are what make the difference.”
