No Legal Confidentiality with ChatGPT Therapy, Warns Sam Altman

Why You Shouldn’t Use ChatGPT as a Therapist, According to Sam Altman

Many ChatGPT users, especially younger generations, turn to the AI chatbot for emotional support, life coaching, and even pseudo-therapy. It feels convenient, fast, and private — or so it seems. But OpenAI CEO Sam Altman just delivered a wake-up call: conversations with ChatGPT don’t enjoy the same legal confidentiality as those with a licensed therapist or medical professional. This crucial insight has raised questions about AI privacy laws and how data shared with artificial intelligence may be used or disclosed in the future. If you're among the growing number of users relying on AI for mental health advice or relationship help, it’s time to reassess.

Image Credits: This Past Weekend w/ Theo Von #599

This blog explores Altman’s concerns, how legal confidentiality works (and doesn't) in AI contexts, why ChatGPT users should be cautious with personal data, and what OpenAI’s stance might mean for the future of AI-based emotional support. Here's everything you need to know about the risks of using ChatGPT as a therapist — and what’s being done to address them.

No Legal Confidentiality When Chatting with ChatGPT

During an interview on This Past Weekend with Theo Von, Sam Altman was candid about a major gap in how AI fits into the current legal system. One of the most concerning revelations? There’s absolutely no legal confidentiality when you talk to ChatGPT like you would a therapist, lawyer, or doctor.

Altman stated: “People talk about the most personal sh** in their lives to ChatGPT… And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… We haven’t figured that out yet for when you talk to ChatGPT.”

That means if a legal case ever arises, OpenAI may be compelled by a court order to hand over a user's conversations, including deeply personal topics like mental health, trauma, relationships, and even potential criminal disclosures. Unlike conversations with a therapist, doctor, or lawyer, which are shielded by psychotherapist–patient confidentiality, doctor–patient confidentiality, or attorney–client privilege, chats with ChatGPT carry no such legal protection.

For individuals who consider ChatGPT their go-to for emotional support or decision-making help, this could have real-life consequences. Your AI chat isn't a legally protected safe space; it's potentially discoverable evidence in a courtroom.

AI and Privacy Law: The Missing Framework in 2025

Despite massive strides in generative AI over the past few years, lawmakers still haven’t caught up. As of mid-2025, there's no robust legal framework outlining how AI conversations should be protected, stored, or disclosed. This is particularly alarming when considering how users treat ChatGPT as a digital confidante, asking questions like:

  • “How do I deal with anxiety about my family?”

  • “I think I’m depressed, what should I do?”

  • “I cheated in my relationship, do I tell them?”

  • “Can I be arrested for what I did when I was a teen?”

Altman’s comments highlight how vulnerable users are to unforeseen consequences. In the U.S., there are no AI-specific confidentiality laws, which means OpenAI can be required to comply with subpoenas and court-ordered disclosures. Business tiers such as ChatGPT Enterprise offer stronger data controls, for example keeping chats out of model training and allowing shorter retention, but those are contractual safeguards, not a legal privilege.

Meanwhile, OpenAI is fighting a legal battle with The New York Times, in which a court order could force the company to preserve chats from hundreds of millions of users worldwide, including conversations users thought they had deleted. This high-profile case is testing the limits of AI data privacy and could set a precedent that affects everyone who uses conversational AI tools.

Until governments create enforceable AI privacy policies, your sensitive interactions with tools like ChatGPT are legally exposed.

Why You Should Think Twice Before Using AI for Therapy

While ChatGPT can be a helpful first step for information gathering or even emotional clarity, it’s essential to understand the boundaries. It is not a licensed therapist, and more importantly, it is not bound by confidentiality laws. That difference can have serious implications for your privacy, legal security, and mental health outcomes.

Here’s what users should keep in mind:

  • ChatGPT doesn’t understand context like a human therapist. While its responses may seem insightful, it lacks empathy, nuance, and long-term care strategies.

  • Your data can be stored and reviewed. Unless you're using services like ChatGPT Enterprise with data controls, your conversations may be saved and used for training or could be disclosed if required by law.

  • There is no guarantee of privacy. What feels like a one-on-one therapy session may, in reality, be logged somewhere — potentially accessible to legal or corporate entities.

  • AI can’t handle crises. If you're experiencing suicidal thoughts, emotional trauma, or other mental health emergencies, AI is not a substitute for real help.

Even Altman agrees that this is “screwed up” — he believes AI chats should eventually have the same privacy standards as therapy sessions. But until that becomes law, users must protect themselves by knowing the limitations.

What OpenAI and the Industry Can Do Next

Altman’s comments might just be the catalyst needed to push governments and tech companies to close the AI-privacy gap. Moving forward, OpenAI and other developers can take several steps to align AI conversations more closely with human confidentiality standards:

  1. Advocate for AI-specific privacy legislation. Just as HIPAA protects medical records, we need a digital confidentiality law for AI interactions.

  2. Allow user-level privacy settings. Users should be able to opt out of having sensitive chats stored or used for training.

  3. Develop AI therapist tools with oversight. If AI is going to assist with mental health, it should be certified, regulated, and designed in collaboration with psychologists.

  4. Improve transparency around data handling. Users must be clearly informed about what data is stored, for how long, and for what purpose.

Until these solutions are in place, the burden is on the user to be cautious. Don’t treat ChatGPT as your therapist unless you’re fully aware of what that means. Whether you're discussing emotional struggles or sensitive life decisions, assume that your AI conversation is not private in the legal sense.
