Do you turn to ChatGPT when you’re spiralling because your situationship left you on read? Bad news: Sam Altman, the CEO of OpenAI, has warned that the company cannot guarantee the privacy of sensitive, therapy-style conversations – and could even be forced to hand them over in court.

“If you go talk to ChatGPT about your most sensitive stuff, and then there’s like a lawsuit or whatever, we could be required to produce that. I think that’s very screwed up,” Altman told Theo Von on a recent episode of the podcaster’s show. “I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever – and no one had to think about that even a year ago.”

“People talk about the most personal shit in their lives to ChatGPT. People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’” he continued. “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s like legal privilege for it – there’s doctor-patient confidentiality, there’s legal confidentiality. And we haven’t figured that out yet for when you talk to ChatGPT.”

As ChatGPT conversations are not end-to-end encrypted – unlike messages on WhatsApp or Signal – OpenAI itself is able to read chats between users and the chatbot. This includes OpenAI staff reviewing conversations to fine-tune the AI model and to monitor for misuse.

When Altman asked Von about his own ChatGPT usage, the podcaster revealed that he rarely used the chatbot because of privacy concerns. Altman said he understood the reluctance: “I think it makes sense… to really want the privacy clarity before you use [ChatGPT] a lot – like the legal clarity.”

Relatedly, in June a court ordered OpenAI to retain all ChatGPT user logs – including deleted chats – indefinitely, after The New York Times and other news plaintiffs requested the preservation order as part of their ongoing lawsuit against the company. OpenAI is currently appealing the order.