OpenAI CEO Sounds Alarming Warning: Is Relying on AI for Therapy Putting You at Risk?
Ever find yourself chatting with ChatGPT about your day, only to wonder: am I venting to a therapist or just typing into the void? Lately, more folks are turning to AI not just for quick answers or dinner ideas, but for serious personal advice. Sam Altman, the big boss over at OpenAI, threw down a wild curveball recently: what if your heart-to-heart with ChatGPT isn't as private as a therapist's couch? Imagine spilling your deepest thoughts and then, bam, they get subpoenaed in a lawsuit. Creepy, right? Altman insists it's high time the law treats our AI convos with the same hush-hush respect as doctor-patient talks, especially as younger crowds lean on digital pals for emotional support. The whole thing makes you wonder: is your AI therapist secretly taking notes for the courtroom?
While most of us rely on ChatGPT for simple tasks like summarising reports or meal planning, a growing number of people are reportedly turning to the AI software as a form of therapy.
According to OpenAI CEO Sam Altman, more people are asking ChatGPT for advice on personal issues, whether related to health, relationships or anything else, but he wants users to be aware of the implications of doing so.
If a person goes to see an actual therapist, their conversations are protected against being used as evidence in legal proceedings, though there are exceptions to the rule. Altman believes this legal privilege should extend to personal conversations between users and ChatGPT, as he explained in a recent podcast appearance.
“So, if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” he told podcast host Theo Von in an episode of This Past Weekend.

Altman says the laws around ChatGPT logs need to change. (Justin Sullivan/Getty Images)
“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s like legal privilege for it — there’s doctor-patient confidentiality, there’s legal confidentiality. We haven’t figured that out yet for when you talk to ChatGPT.”
Altman believes this needs to change, as he calls for the ‘same concept of privacy for your conversations with AI’ as we currently have with therapists, adding that the issue should be ‘addressed with some urgency.’
Young people relying on tools like ChatGPT for personal issues is a relatively recent trend, and the tech mogul noted that the legal question it raises is a very new concern.
He said: “No one had to think about that even a year ago, and now I think it’s this huge issue of like, ‘How are we gonna treat the laws around this?'”

More young people are turning to AI for personal issues. (Getty Stock Images)
As it stands, ChatGPT conversations can be accessed by OpenAI in order to improve the service or to monitor for misuse. Deleted chats are permanently erased within 30 days, unless the company deems it necessary to keep them for 'legal or security reasons'.
So, in some cases, they're not really deleted at all.
Meanwhile, the New York Times and a number of other plaintiffs took OpenAI to court last month, requesting that the company retain all of its user logs indefinitely as part of a copyright lawsuit. The publication claims the tech company used its articles to train its AI models without permission.