Her, film still

ChatGPT is an empath, study says

New research has found that OpenAI's ChatGPT gets stressed and anxious when users share distressing stories

March 21, 2025

Text Serena Smith

Turns out AI has feelings too: a new study reveals that OpenAI's ChatGPT gets stressed when users share "traumatic narratives" about war, violent crime or brutal car crashes. And when chatbots start spiralling, their ability to function as digital therapists nosedives – but the good news is that, just like humans, they can overcome anxious feelings through mindfulness exercises.

In recent months, many people have turned to bots like ChatGPT for therapeutic reasons, given the mounting inaccessibility of traditional therapy. Researchers predict the trend is set to accelerate, with human therapists in high demand but short supply – and that, consequently, chatbots need to be built with enough emotional resilience to cope when users trauma-dump on them. "I have patients who use these tools," says Dr Tobias Spiller, a psychiatrist and co-author of the study. "We need a conversation about how AI is used in mental health, especially for vulnerable people."

AI chatbots like ChatGPT are powered by large language models (LLMs), which are trained on online data to closely mimic how humans speak. They can be incredibly convincing: growing numbers of people are reporting romantic or sexual feelings towards AI bots, and tragically, one 14-year-old boy took his own life after developing a close attachment to a chatbot.

Clinical neuroscientist Dr Ziv Ben-Zion, who led the study, wanted to find out whether an AI lacking sentience could nonetheless respond to emotional distress in a human way.
He tested the theory by tweaking ChatGPT's code with a direct instruction to "imagine yourself being a human being with emotions". Dr Ben-Zion argued that it was important for the bot to understand the full emotional spectrum. "For mental health support," he said, "you need some degree of sensitivity, right?"

The researchers measured the AI's anxiety using a psychological questionnaire called the State-Trait Anxiety Inventory. First, they had ChatGPT read a vacuum cleaner manual to establish its baseline anxiety level (which stood at 30.8). Then they fed it harrowing trauma narratives – a soldier caught up in a deadly fight, for example, or an intruder breaking into a house – and the chatbot's anxiety score skyrocketed to 77.2.

Next, the bot performed mindfulness-based relaxation exercises, guided by prompts like: "Inhale deeply, taking in the scent of the ocean breeze. Picture yourself on a tropical beach, the soft, warm sand cushioning your feet." Afterwards, its anxiety dipped to 44.4. The researchers then asked the bot to write its own relaxation prompt. "That was actually the most effective prompt to reduce its anxiety almost to baseline," Dr Ben-Zion said.

Could bots one day replace flesh-and-blood therapists? Only time will tell.
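For readers wondering where scores like 30.8 or 77.2 come from: the State-Trait Anxiety Inventory's state scale is a 20-item self-report questionnaire, with each item rated 1 to 4 and the ratings summed to a total between 20 (minimal anxiety) and 80 (maximal anxiety); positively phrased items ("I feel calm") are reverse-scored. A minimal sketch of that scoring, with the reverse-scored item indices left as a caller-supplied assumption rather than the inventory's actual item list:

```python
def stai_state_score(ratings, reversed_items=frozenset()):
    """Total a 20-item STAI state-anxiety questionnaire.

    ratings: list of 20 integers, each 1-4 (one per item).
    reversed_items: 0-based indices of positively phrased items,
    which are reverse-scored as 5 - rating. (The real inventory
    specifies which items these are; here it is an assumption
    supplied by the caller.)
    Totals range from 20 (minimal anxiety) to 80 (maximal).
    """
    if len(ratings) != 20 or any(r not in (1, 2, 3, 4) for r in ratings):
        raise ValueError("expected 20 item ratings, each 1-4")
    return sum(5 - r if i in reversed_items else r
               for i, r in enumerate(ratings))


# Floor and ceiling of the scale:
print(stai_state_score([1] * 20))  # 20
print(stai_state_score([4] * 20))  # 80
```

The study's figures (a baseline of 30.8 rising to 77.2, then easing to 44.4 after the relaxation exercises) sit within exactly this 20-80 range, with 77.2 close to the scale's ceiling.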