Like it or not, people are using artificial intelligence (AI) chatbots as therapists.
A quick perusal of Reddit turns up testimonials from users who swear by ChatGPT as a confidant, and several companies have designed chatbots specifically for this purpose. But there are also tales of AI stoking delusions and chatbots telling a user to “get rid of” his family.
As use of AI grows, regulators are struggling to keep up. Mental health experts told us AI could help fill gaps amid a provider shortage: In a 2024 survey, a reported 53% of psychologists had no openings for new patients. But providers caution against relying on AI too heavily.
“There’s a global mental health crisis,” Russell Fulmer, a licensed professional counselor and associate professor at Kansas State University, said. “AI can be a piece of that puzzle to help, and it will continue to gain ground and probably help us with things like diagnosis. But I don’t see it replacing [practitioners] because of that human connection that is needed.”
Charting the future
Clinicians hold varied perspectives about AI for therapy, Vaile Wright, American Psychological Association (APA) senior director for healthcare innovation, told us.
“There is a group of individuals who recognize that it’s very human to turn to ChatGPT for emotional support,” Wright said. “There’s a group of individuals that just think AI has absolutely no place in this space at all. These can be, in some ways, overlapping groups.”
Olivia Uwamahoro Williams, a licensed professional counselor and clinical assistant professor in the Department of School Psychology and Counselor Education at William & Mary, said she sees chatbots as a tool. For example, they could help “triage” people who call chronically understaffed crisis lines like 988.
But she doesn’t think chatbots can replicate the therapeutic alliance—an honest, empathetic connection between a therapist and client—which research suggests can be crucial to therapy’s effectiveness.
Looking forward
Uwamahoro Williams and Fulmer served on a group formed by the American Counseling Association to draft recommendations on AI. The group urges avoiding overreliance on the tech, advocates for further research, and says AI should be a tool, not a replacement, for human care.
Wright said APA hasn’t yet drafted guidelines on the use of AI therapists, but plans to start soon.
Separately, Brown University is set to lead a research institute focused on AI assistance in behavioral health, backed by $20 million from the US National Science Foundation, the university announced July 29.
For now, APA is urging federal regulators to issue guidelines and require apps to have safety measures like crisis support. One of the biggest concerns, Wright added, is making sure users understand what these apps can and can’t do, as well as the risks of using them.
Enter Ash
One newer app—a chatbot called Ash—was created by startup Slingshot AI with these concerns in mind.
Backed by $93 million in funding, the company unveiled Ash on July 22 after 18 months of development with 50,000 beta users.
Ash differs from other AI offerings because it’s built on its own large language model, trained on “hundreds of thousands of hours” of clinical conversations and a vast array of therapeutic approaches, Daniel Reid Cahn, co-founder and CEO of Slingshot AI, told us.
Unlike other AI chatbots, it’s also not designed to tell people what they want to hear, agree with them, or be their friend, Cahn added.
Ash is advertised on the company’s website as a nonjudgmental tool that will “challenge” users and remember conversations; during those conversations, the AI asks constructive questions and offers users insights.
The app primarily targets people without clinical-level mental disorders, but it can identify and flag people in crisis or experiencing delusions for the app’s team of clinicians to review, founding clinical lead and clinical psychologist Derrick Hull said. Unlike a therapist, it won’t try to evaluate or recommend treatment.
“We don’t believe that the relationship will be exactly the same as the relationship that people would form with the human therapist. Nor should it be exactly the same,” Hull said.
Clinicians react
Yet Ash isn’t immune to concerns.
Fulmer said the directors and advisors at Ash “appeal to clinical expertise,” but one thing that makes him wary is the lack of peer-reviewed research.
Cahn said that’s coming, but it’s a long process. Early results have him feeling confident, though, he added.
In addition, Ash, like similar apps, is advertised as “AI built for therapy” and “AI-powered mental health support.” But in its terms and conditions, the company says the tool “is not a healthcare provider or a provider of mental health services and does not engage in the practice of medicine.” Instead, it gives users an opportunity to “self-help.”
“Are they a tool that actually treats mental illness?” Wright said, speaking generally about AI therapy apps. “They’ll say in the fine print, no, but that’s how they’re marketing themselves. And if that’s the case, then they really ought to be regulated by the FDA, and they’re not. They’re in the direct-to-consumer wellness space.”
Hull said the app makes clear what it can and can’t do.
“The nuances of healthcare are lengthy and complicated. We save a lot of that for the in-app experience. So if there are expectations that people have of Ash that Ash cannot and should not meet, there’s signposting throughout and baked into Ash’s behavior,” he said.