Last autumn, 23-year-old Juan Cassanova was going through one of life’s most dreaded partings: a friendship breakup. The Brooklyn-based photographer knew his friend was hiding things from him, but didn’t know how to express how he felt. The fact that it was a long-distance friendship, with Cassanova’s friend living in Florida, only compounded their issues and made the erosion of the relationship even trickier to navigate. So he turned to AI for advice. “I’m a recovering people-pleaser, and I was afraid to sound a certain way,” he says. “I was relying on ChatGPT to tell me how I can tell my friend that she is upsetting me without sounding like I’m attacking her character.”

Cassanova fed ChatGPT a voice memo detailing how he felt, what he wanted to say to his friend, and a brief history of their relationship. In return, the chatbot offered pointers on how to approach the conversation. It suggested using the “sandwich method”: first expressing how he felt about the friendship, then addressing the issues, and finally offering reassurance. It also told him what not to do. “I wasn't expecting the AI to disagree with me, but it did,” he says. “It would tell me, ‘If you do this, it might be received this way. Address it after you tell her how you feel instead.’”

Cassanova isn’t the only one using artificial intelligence to navigate uncomfortable conversations. As AI becomes more commonplace, research suggests that a growing number of people are folding it into their interpersonal lives. In a study conducted by EVA AI, 28 per cent of respondents said they had used AI companions to rehearse emotionally difficult conversations before having them in real life. Meanwhile, a study by Resume.org found that 94 per cent of Gen Z workers surveyed use AI chatbots to help navigate workplace issues. From friendship fallouts and office drama to anti-ghosting texts, AI is quickly becoming some people’s go-to resource for getting through life’s most awkward conversations.

But why exactly are people turning to AI for guidance instead of, say, their friends? For Rebecca*, a 24-year-old living in New York, the answer is a blend of exasperation and embarrassment. Last summer, as Rebecca and her boyfriend approached their fourth anniversary, their relationship was crumbling. “My friends were all split; some were saying I should break up with him, some were saying I shouldn't,” Rebecca says. “I just felt so whiny; I wanted to talk to a non-biased third party.” Part of Rebecca also feared that, if she kept going to her friends with her relationship qualms and then didn’t break up with her boyfriend, their image of him would be ruined beyond repair. “I started thinking, ‘I don’t know who to turn to, so I need to start talking to Chat about this,’” she says.

ChatGPT became her sounding board as she navigated the relationship’s end. On long walks, Rebecca would think about what she wanted to say to her partner. Then, she’d take it back to the chatbot, writing, “This is what I want to say, help me say it better”. This helped her prepare for when she would finally meet up with her now-ex to talk. “I would show up and have this pretty long thing written out about how I was feeling, what had happened and what he had done, and a lot of it was based on what Chat and I had decided to say,” she says. “Chat really was my backbone, and because I was having such a hard time standing up for myself, it made the delivery so much easier.”

Many others are using AI to help find the right words, including Aniya*, a 24-year-old living in Maryland. At the peak of her dating app era, Aniya would routinely find herself needing to initiate a “this isn’t going to work” conversation with someone she had been seeing. To avoid sounding too detached or nonchalant, Aniya would often use ChatGPT to help craft these messages, giving it ample detail to generate the best response. “This probably isn’t good, but at this point, [ChatGPT] knows my past relationships and past situations,” says Aniya. “It can reference them and bring up themes. It’ll say things like, ‘I noticed you switched to the emotional part of your relationship fast again’”.

So, is rehearsing your conversations with AI actually helpful? Dorothy Leidner, a professor at the University of Virginia who studies the ethics of artificial intelligence, sees the value in using AI to work through difficult conversations. “I think it’s actually a very reasonable use of the language models,” she says. “It’s pretty clever, especially for people who have a little shyness or insecurity.” And for some, it really does seem to help. Rebecca, for example, felt more equipped to push back against the gaslighting she experienced from her ex because she had already worked through her feelings with AI. “I was looking at him and saying, ‘You can't tell me my emotions aren’t real,’” she says. “I wouldn’t even have been able to say that if I hadn’t had Chat behind me.”

It hasn’t worked out for everyone, however. Ftsum Michael, 26, went through a period of using AI to craft messages to people he was no longer interested in seeing, with the goal of landing on a message that was clear but not hurtful. “Oftentimes, because I have perfectionist tendencies and want to get things right, I’ll lean towards Claude or ChatGPT to express my thoughts in a ‘perfect’ way,” he says. In these scenarios, though, the technology fell flat. “When I was using it, I didn’t feel like it was really expressing my feelings in the way that I wanted,” he says. “It’s never going to be as poignant as when you go back and forth with someone to find exactly what you’re itching at.” He has resolved never to use AI in this way again.

Social scientist and researcher Julie Carpenter, who studies human-AI interactions, is one of the more sceptical experts. Carpenter argues that because social interactions are not fixed, scripts developed with AI may fall apart once you use them on a real person. This is because, as Carpenter points out, AI is an emerging social actor, but not a human one: it has no responsibility, accountability or understanding of the stakes. “It doesn’t understand risk, emotional or physical, and the context of gender, race or history. All of those things are what make you human and make you feel vulnerable.”

As our society becomes more reliant on AI, it’s easy to view this kind of emotional outsourcing as impersonal or even dystopian. But many people are acting not out of laziness but out of a sort of perfectionism. For those trying to handle delicate situations the “right” way, AI, with its resolute responses, can feel like a cheat code – so much so that some come to trust it more than their own intuition. The reality, of course, is that interpersonal relationships are always messy and unpredictable. To speak to one another is to constantly risk being misunderstood, no matter how many times a message is run through ChatGPT. As Carpenter puts it: “AI might be a form of self-soothing in the moment, but it’s a temporary salve.”