ChatGPT Chats Are Not Private and Could Be Used in Court – OpenAI CEO Sam Altman Warns

Courtesy Image: OpenAI CEO Sam Altman
In a candid conversation on This Past Weekend with Theo Von, aired on July 23, 2025, OpenAI CEO Sam Altman dropped a surprising revelation: conversations with ChatGPT, the popular AI chatbot developed by OpenAI, are not as private as many users assume.

Speaking from OpenAI’s San Francisco headquarters, Altman noted that many users, especially younger ones, treat the AI chatbot as a therapist or life coach, sharing deeply personal information without realising these chats could be used as evidence in legal cases.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”

He emphasised that, unlike doctor-patient or attorney-client confidentiality, there’s no legal framework protecting AI conversations, meaning OpenAI could be compelled to provide chat records in lawsuits or investigations.

“I think that’s very screwed up,” he added, calling for new privacy protections similar to those granted to human professionals.

This warning isn’t theoretical. In June 2025, The New York Times, as part of a copyright lawsuit, sought a court order requiring OpenAI to retain all ChatGPT user chats worldwide, including deleted ones and API data, indefinitely, except for users on specific exempt plans. This request aims to preserve potential evidence of copyright infringement, such as instances where ChatGPT outputs closely resemble NYT articles. OpenAI is appealing, arguing that this demand threatens user privacy and oversteps legal bounds. This case exposes how vulnerable your data can be when using AI tools.

Beyond copyright disputes, experts warn that ChatGPT conversations could be subpoenaed in contract disputes, harassment claims, or criminal investigations, just like emails or texts. Even less obviously, these digital records might affect decisions in visa or border checks if they emerge during background investigations.

Altman’s warning comes as millions use ChatGPT for sensitive topics, from relationship advice to mental health support. The absence of a legal framework has sparked concern among privacy advocates and legal analysts alike.

Until new laws emerge, the safest approach is to treat AI chats like public documents: don’t share sensitive information unless you fully understand the risks.

Is it time for an “AI privilege” law? Altman thinks so. But until then, your safest move is to think before you type, especially if you’re discussing anything sensitive or potentially incriminating. Authorities in any country, Uganda included, could use your associated email address to request your ChatGPT data from OpenAI, that is, if they can’t first compel you to hand it over yourself.

What do you use ChatGPT for?

For more tech truths, join our WhatsApp channel.


Isaac Odwako O.

Isaac Odwako O., professionally known as Isaac Nymy, is a Ugandan digital designer and founder of Nymy Media and Nymy Net, a weblog and news network.
