We have written about initial research on AI therapy chatbots, and recent reports point to a drastic increase in their use. But how effective are AI therapists? And what should the mental health field do about their rise?
Some of these research findings have been surprising: people enjoy their interactions with these machines and feel heard! Clients even show improvement in specific difficulties, such as depression and anxiety.
Newer research suggests that AI therapists may even be more accurate than human therapists, showing a better ability to recognize and respond to the mental health difficulties clients experience. Wait…. Really??
There are a few possible reasons why AI therapists appear so effective. First, it has long been understood that statistical prediction is usually more accurate than clinician judgment at forecasting human behavior. AI is, more or less, a powerful probability machine, which appears to make its predictions more accurate.
There also appears to be a related role of bias. Human therapists are, of course, valuable. I mean, don’t we like to think so…. And research consistently shows that therapy helps people compared to no therapy. But we are also human.
As we all know, inherent human bias can lead to errors. Machines lack elements important to therapy (e.g., empathy), but they also lack that bias and therefore appear prone to fewer errors when assessing and pinpointing the specific difficulties clients experience.
The rise of AI therapy chatbots raises some tough questions for therapists. Will AI replace human therapists? Do we push back and lobby against it? And if we push back, how much of an impact could we have in that fight? Or is AI growth too strong and fast?
More realistically, the path forward is probably some integration of human and AI therapists, structured around how clients actually use AI. For example, therapists conduct therapy as usual while clients use AI as needed between sessions, so long as that use does not become a way to avoid real-life interactions or other therapy “homework.”
Or clients use AI as the main method to learn many coping strategies. Meanwhile, therapists guide clients through those strategies and help clients learn to apply skills to specific challenges that arise.
These are just a few examples. With time, human therapists will better understand how AI therapists can serve clients’ overall therapy goals. Current research only suggests potential human-AI integrations; studies directly testing combined human and AI therapy will clarify AI’s role in treatment.