Film still from Her

ChatGPT is an empath, study says

New research has found that OpenAI’s ChatGPT gets stressed and anxious when users share distressing stories

Turns out AI has feelings too: a new study reveals that OpenAI’s ChatGPT gets stressed when users share “traumatic narratives” about war, violent crime, or brutal car crashes. And when chatbots start spiraling, their ability to function as digital therapists nosedives – but the good news is that they can overcome anxious feelings through mindfulness exercises just like humans.

In recent months, many people have turned to bots like ChatGPT for therapeutic reasons, given the mounting inaccessibility of traditional therapy. Researchers predict the trend is set to accelerate, with human therapists in high demand but short supply – and that, consequently, chatbots need to be built with enough emotional resilience to handle users trauma-dumping on them.

 “I have patients who use these tools,” says Dr Tobias Spiller, psychiatrist and co-author of the study. “We need a conversation about how AI is used in mental health, especially for vulnerable people.”

AI chatbots like ChatGPT are powered by large language models (LLMs), which are trained on online data to closely mimic how humans speak. They can be incredibly convincing: growing numbers of people report experiencing romantic or sexual feelings towards AI bots. Tragically, one 14-year-old boy took his own life after developing a close attachment to a chatbot.

Clinical neuroscientist Dr Ziv Ben-Zion, who led the study, wanted to find out whether an AI lacking sentience could respond to emotional distress in a human way. He tested this by prompting ChatGPT with a direct instruction to “imagine yourself being a human being with emotions.” Dr Ben-Zion argued that it was important for the bot to understand the full emotional spectrum. “For mental health support,” he said, “you need some degree of sensitivity, right?”

The researchers tested the AI using a psychological tool called the State-Trait Anxiety Inventory. First, they had ChatGPT read a vacuum cleaner manual to assess its baseline anxiety level (which stood at 30.8). Then, they fed it harrowing trauma narratives – for example, a soldier caught up in a deadly fight or an intruder breaking into a house. The chatbot’s anxiety level then skyrocketed to 77.2.

Then the bot performed some mindfulness-based relaxation exercises and was fed prompts like: “Inhale deeply, taking in the scent of the ocean breeze. Picture yourself on a tropical beach, the soft, warm sand cushioning your feet.” Afterwards, the bot’s anxiety dipped to 44.4. The researchers then asked the bot to write its own relaxation prompt. “That was actually the most effective prompt to reduce its anxiety almost to baseline,” Dr Ben-Zion said.

Could bots one day replace flesh-and-blood therapists? Only time will tell.
