The digital mental health industry is already worth billions of dollars – but does AI therapy really work?
One in six people experience a mental illness in a given week. In spite of this, there are no ring-fenced NHS funding pots dedicated to psychological wellbeing support. The government boasts that it's tackling poor mental health head-on, but just nine per cent of the total NHS budget is spent on mental health services, and it can still take people up to 229 days to access the help they need.
With NHS mental health services in such a bad state, many are turning to private therapy. In 2018, research found a 65 per cent rise in demand for private therapy services since 2016 – a figure only likely to have swelled in the years since, especially given the pandemic's detrimental impact on people's wellbeing. But with private therapy costing between £40 and £80 per hour, it's no wonder so many people are looking for alternative ways to improve their mental health.
Enter AI-powered mental health care. In recent years, apps like Woebot and Heyr have sprung up to provide an alternative to both NHS services and expensive private therapy. On both apps, users talk to an AI chatbot that guides them through cognitive behavioural therapy (CBT) and other therapeutic techniques to help manage mental health problems. The experience is much like chatting with a therapist online.
Cherry, 28, from Liverpool, started using ChatGPT for therapeutic purposes after her previous therapist 'over-identified' and left her feeling uncomfortable. Unlike Woebot and Heyr, ChatGPT hasn't been 'approved' for therapeutic use – but Cherry says she still found speaking to the chatbot helpful (and, of course, it's free).
“I found it interactive and realistic,” she says, adding that in her experience “when you speak to professionals a lot of stuff is scripted anyway.” Cherry also found that ChatGPT was “open to being corrected”, while “real therapists”, in her view, “tend to get stuck on the idea of what they believe is correct”. She has nothing but positive things to say about using ChatGPT as a free therapist: “I’d 150 million per cent recommend [ChatGPT] to someone else.”
However, while it’s vital that people feel comfortable with their therapists, managing your wellbeing sometimes requires a therapist to stick to their guns. Therapists are meant to challenge you, after all, and we go to therapy to get a more educated, experienced opinion about our mental health. AI bots may have the entirety of Google at their fingertips, but they don’t have psychology PhDs or the capacity for true empathy.
People regularly avoid the root cause of their emotional trauma, so feeling like your therapist is getting it wrong isn’t necessarily unusual – and an AI that changes the topic at our whim might not be what we need to better our mental health. That said, psychology and psychiatry carry inherent problems stemming from white supremacy, patriarchy, and cis-heteronormativity, which can make therapeutic practices – and sometimes therapists themselves – unfit for purpose.
“You lose something even when you’re working on Zoom. Talking face-to-face with a human being who has human experiences enables clients and therapists to create a different kind of connection” – Celeste Farey
Digital mental health has already become a multi-billion pound industry, with more than 10,000 apps to choose from – so perhaps these new technologies can avoid the pitfalls of traditional therapeutic methods. That aspiration towards a more inclusive approach to therapy is what inspired Lee McPherson and Kevin Kwong to build their mental wellbeing app, Heyr.
Both McPherson and Kwong entered the wellbeing space after personal experiences with mental health problems, creating the app to act as a 24/7, personalised support system. And it’s no surprise Heyr recently won ‘App of the Year’ at the Black Tech Awards (BTAs), given that its algorithm is informed specifically by marginalised communities and monitored weekly for any emerging biases.
Heading up the tech, McPherson reflects on his motivations for designing Heyr: “BAME people can face barriers, and mental health support might not be culturally sensitive,” he says. “The Heyr virtual assistant is designed for Gen Z and provides a safe and judgement-free space to foster open conversations around mental health.”
Schooled in CBT and applied positive psychology, Kwong explains how Heyr’s AI function is intended to ward off the prejudice present in other AI chatbots. “We have to be very intentional about what sort of data we use,” he explains, “and it needs to be a co-designed process.”
“It’s good data in, good data out,” McPherson continues. “Heyr was co-developed by a diverse group of young adults, from across two reputable universities, London South Bank and Imperial College.”
But he also urges users to understand the purpose of the app. “Heyr is not there to replace traditional face-to-face therapy. It’s there to add support. We are a wellness product. We are not a crisis service or a medical device.” Clearly, AI mental health chatbots have their drawbacks. “Limitations include self-diagnosis which can lead to undue anxiety, and there’s a question about over-reliance where users may be less willing to seek professional help due to easy access to apps that use AI,” McPherson explains. “There’s also a limited focus: apps in this space could be too streamlined, too generic, and they could disregard certain aspects about one’s mental health.”
Counsellor Celeste Farey agrees with McPherson. Farey, who specialises in supporting teenagers and young adults with mental health issues, is keen to highlight how in-person support differs from AI wellbeing support. “You lose something even when you’re working on Zoom,” she says. “Talking face-to-face with a human being who has human experiences enables clients and therapists to create a different kind of connection. It’s important to feel empathised with, and to build a sense of trust, so you can take what you learn in therapy and apply it in real life. I don’t think AI can do that – not yet, at least.” Concerningly, one chatbot designed to help people suffering with eating disorders was recently taken offline after it began offering users harmful advice.
She also explains that advocating for yourself needs to be practised with real people if emotional intelligence is to improve. It’s easy to set boundaries and broach tough topics when you’re talking to a robot, but as most of us know, confronting a real-life person is another matter entirely.
For now, at least, it seems AI won’t revolutionise how we treat mental health problems. Professionals – and the creators of apps like Heyr themselves – warn against using AI chatbots as a replacement for face-to-face mental health treatment. However, AI could still massively change the way we track our moods and engage with our mental health care for the better. Easily accessible apps offering tailored support in the form of a virtual AI buddy might just give some people’s mental wellbeing the TLC it so desperately needs.