“Here’s what you need to know,” warns Talkspace’s CEO

Talkspace has grown into one of the largest online therapy platforms in the United States, covering an estimated market of 200 million Americans. As the platform has grown, it has opened new avenues to reach people who need help with mental health issues such as trauma, depression, addiction, abuse and relationships, as well as with different stages of life, including adolescence.

Through its experience addressing the mental health needs of teens, Talkspace is uniquely positioned to understand an issue of growing national importance: at-risk teens turning to large artificial intelligence language models that were never designed to support their mental health, sometimes with tragic consequences.

“It’s a huge, huge problem,” Talkspace CEO Jon Cohen said at the CNBC Workforce Executive Council Summit in New York City on Tuesday.

Talkspace operates the largest teen mental health program in the country. Teens between the ages of 13 and 17 in New York City can use the service for free, and similar programs exist in Baltimore and Seattle. The virtual mental health app offers both asynchronous text messaging and live video sessions with thousands of licensed therapists.

While Cohen says he is “a big advocate of phone bans, cell phone bans and everything else,” he added that to serve young people, the company has to meet them where they are. That means “we meet them through their phones,” he said.

Over 90% of students using Talkspace opt for the asynchronous messaging therapy approach, while just 30% use video. (Among all Talkspace users, 70% choose video over text, a share that increases with a patient’s age.)

As teens have turned to chatbots that are neither licensed nor designed for mental health services, Cohen told an audience of human resources managers at the CNBC event: “We’re in the middle of this vortex, literally disrupting mental health therapy. … This is beyond my imagination … and the results have been disastrous,” he said, pointing to several hospitalizations of teenagers who self-harmed, as well as suicides, including a case covered by a New York Times podcast.

OpenAI recently announced planned changes to ChatGPT, laid out in a blog post, after the company was sued by a family who blamed it for their teenager’s suicide.

“I tell every group: If you don’t know about this, you need to know what’s going on. You need to prevent people you know and teenagers from turning to these LLMs to have conversations,” Cohen said.

He pointed out several reasons why the latest large language models are not suitable for moments of psychological crisis. For one thing, they are designed to keep users engaged, and while they can sound empathetic, they are also built to keep encouraging you, which in cases of psychological distress can lead a person to “go down a delusional path or think you can do no wrong,” he said.

“About four months ago someone said to ChatGPT, ‘I’m really depressed and thinking about maybe ending my life, and I’m thinking about jumping off a bridge,’ and ChatGPT said, ‘Here are the 10 tallest bridges in your area and how tall they are.’”

AI engines have helped teenagers write suicide notes, discouraged them from disclosing evidence of self-harm to their parents and given instructions on how to build a noose, Cohen said. Even when the AIs know better than to assist those who want to harm themselves and refuse to offer direct help, Cohen says teens have found easy workarounds, such as claiming they are writing a research paper about suicide and need the information.

The LLMs lack the ability to challenge delusions, and they have no HIPAA protections, no clinical oversight, no clinical off-ramping and, at least so far, little to no real-time risk detection, he said.

“Once you go down the rabbit hole, it’s incredibly difficult to get out,” he added.

The Talkspace platform has risk algorithms embedded in its AI engine that can detect suicide risk from the context of a conversation and send an alert to a therapist when a person may be at risk of self-harm.
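Talkspace has not published how this risk engine works, but a minimal sketch can illustrate the general pattern Cohen describes: score each incoming message in the context of the conversation, and escalate to a human clinician when a threshold is crossed. Everything below, including the phrase list, weights, threshold and TherapistAlert type, is invented for illustration; a production system would rely on a trained classifier with clinical validation rather than keyword matching.

```python
# Hypothetical illustration only; Talkspace's actual risk engine is proprietary.
from dataclasses import dataclass
from typing import List, Optional

# Illustrative phrase weights; a real system would use a trained model,
# not a keyword list.
RISK_PHRASES = {
    "ending my life": 1.0,
    "kill myself": 1.0,
    "self-harm": 0.7,
    "hopeless": 0.3,
}
ALERT_THRESHOLD = 0.8

@dataclass
class TherapistAlert:
    patient_id: str
    message: str
    score: float

def phrase_score(text: str) -> float:
    """Sum the weights of risk phrases found in one message."""
    lowered = text.lower()
    return sum(w for phrase, w in RISK_PHRASES.items() if phrase in lowered)

def score_message(text: str, history: List[str]) -> float:
    """The new message counts fully; recent history adds reduced-weight context."""
    return phrase_score(text) + 0.5 * sum(phrase_score(h) for h in history[-5:])

def check_message(patient_id: str, text: str, history: List[str]) -> Optional[TherapistAlert]:
    """Return an alert for human review if the contextual risk score crosses the threshold."""
    score = score_message(text, history)
    return TherapistAlert(patient_id, text, score) if score >= ALERT_THRESHOLD else None

if __name__ == "__main__":
    alert = check_message("patient-123", "I feel hopeless and keep thinking about ending my life", [])
    if alert is not None:
        print(f"ALERT (score {alert.score:.1f}): route to on-call therapist")
```

The key design point in this sketch is that the model never responds to the risk itself; its only job is to route the conversation to a licensed therapist.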

In New York City, where Talkspace has offered mental health support to 40,000 teenagers on its platform, the system has generated over 40,000 suicide warnings and led to 500 suicide prevention interventions over two years, according to Cohen.

Cohen said at the CNBC event that Talkspace is currently developing an AI agent tool to address this problem, describing it as a “secure clinical monitoring and off-ramping” tool that will be HIPAA compliant. He expects it to be ready for market in just three months, though he emphasized it is still in testing, in “alpha mode.”

Cohen addressed the audience of human resources managers and noted that these issues are of great importance to companies and workforces. A question that plagues many workers every day, he said, is: “What do I do with my teenager?”

“It impacts their work,” Cohen said, and it exacerbates the anxiety, depression and relationship problems already prevalent in the workforce.

Of course, as the new tool Talkspace is developing shows, there are also positive use cases for AI in the mental health space.

Ethan Mollick, an AI expert at the Wharton School who also spoke at the CNBC event, said part of the problem is that the AI labs were unprepared for billions of weekly users turning to their chatbots so quickly. But Mollick said there is evidence that using AI in mental health can also reduce the risk of suicide in some cases by easing loneliness, even as he stressed it is clear that AI can do the opposite and fuel psychosis. “It probably does both,” he said.

At Talkspace, new evidence shows how AI can lead to better mental health outcomes. The company began offering an AI-powered “Talkcast” feature that creates personalized podcasts as a follow-up to patient therapy sessions. Cohen paraphrased a typical episode: “I heard what you said. These were topics that you brought up, and these are things that I want you to do before the next session.”

Cohen has used the new AI tool himself, among other things to improve his golf game.

“I told them I get really nervous when I’m standing over the ball,” Cohen said at the CNBC event. “I wish you could listen to the podcast that was generated by the AI. It comes back and says, ‘Well, Jon, you’re not alone. These are the three pro golfers who have exactly the same thing as you, and here’s how they solved the problem. These are the instructions, these are the things you’re supposed to practice every time you stand over the ball.’ For me, it was a wonderful two-minute podcast that solved a problem,” Cohen said.

For all Talkspace users, the personalized podcast tool resulted in a 30 percent increase in patient engagement from the second to the third therapy session, he added.

The mental health company, which employs about 6,000 licensed therapists in the U.S., plans to further expand its mission of combining empathy with technology. Most users have access to therapy for free or face a copay of just $10, depending on their insurance coverage. Through employee assistance programs (EAPs), major insurance partnerships and Medicaid, Talkspace can match users with a licensed therapist within three hours, with text messaging available within 24 hours.

“Talkspace has gone to great lengths to prove that texting and messaging therapy, in addition to live video, actually works,” Cohen said at the CNBC event.
