
In a candid conversation on This Past Weekend with Theo Von, aired on July 23, 2025, OpenAI CEO Sam Altman made a surprising admission: conversations with ChatGPT, the company’s popular AI chatbot, are not as private as many users assume.
Speaking from OpenAI’s San Francisco headquarters, Altman noted that many users, especially younger ones, treat the AI chatbot as a therapist or life coach, sharing deeply personal information without realising those chats could be used as evidence in legal cases.
“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”
ChatGPT CEO Sam Altman says people share personal info with ChatGPT but don’t know chats CAN be used as court evidence in legal cases. #ChatGPT #NymyNet
— Nymy Net (@nymynet) July 26, 2025
He emphasised that, unlike doctor-patient or attorney-client confidentiality, there’s no legal framework protecting AI conversations, meaning OpenAI could be compelled to provide chat records in lawsuits or investigations.
“I think that’s very screwed up,” he added, calling for new privacy protections similar to those granted to human professionals.
This warning isn’t theoretical. In June 2025, as part of a copyright lawsuit, The New York Times sought a court order requiring OpenAI to retain all ChatGPT user chats worldwide indefinitely, including deleted conversations and API data, with exceptions only for users on certain exempt plans. The request aims to preserve potential evidence of copyright infringement, such as instances where ChatGPT outputs closely resemble NYT articles. OpenAI is appealing, arguing that the demand threatens user privacy and oversteps legal bounds. The case exposes how vulnerable your data can be when using AI tools.
recently the NYT asked a court to force us to not delete any user chats. we think this was an inappropriate request that sets a bad precedent.
we are appealing the decision.
we will fight any demand that compromises our users’ privacy; this is a core principle.
— Sam Altman (@sama) June 6, 2025
Beyond copyright disputes, experts warn that ChatGPT conversations could be subpoenaed in contract disputes, harassment claims, or criminal investigations, just like emails or texts. Even less obviously, these digital records might affect decisions in visa or border checks if they emerge during background investigations.
Altman’s warning comes as millions use ChatGPT for sensitive topics, from relationship advice to mental health support. The absence of a legal framework has sparked concern among privacy advocates and legal analysts alike.
Until new laws emerge, treat AI chats like public documents: don’t share sensitive information unless you fully understand the risks.
Is it time for an “AI privilege” law? Altman thinks so. Until then, your safest move is to think before you type, especially if you’re discussing anything sensitive or potentially incriminating. Because your ChatGPT account is tied to your email address, authorities in any country, Uganda included, could request your chat data from OpenAI, assuming they can’t first compel you to hand it over yourself.
What do you use ChatGPT for?
For more tech truths, join our WhatsApp channel.