The Shocking Truth Behind Sharing Your Darkest Secrets with ChatGPT—Experts Warn of Hidden Dangers

Ever found yourself spilling your deepest, darkest secrets to a chatbot like ChatGPT, thinking it's a digital therapist who won't judge? Hold up. Before you unload every cringeworthy detail about your love life or vent about your boss, consider this: an Oxford professor warns that the AI isn't your empathetic bestie, just a sophisticated parrot trained on internet scraps, repeating what you feed it without a hint of sympathy. And here's the kicker: what you share today could end up training future versions of the bot, turning your secrets into public fodder. The human cost behind the scenes isn't glamorous either, with workers exposed to seriously disturbing content to keep these systems humming. So maybe chatbots aren't the best pen pals for your soul after all.

Expert reveals disturbing reason why you should never tell your dark secrets to ChatGPT
Published 20:18 14 Jul 2025 GMT+1

Perhaps a set of code using data it's mined from the internet isn't your BFF?

When it comes to ChatGPT, there seem to be two camps: those who vow never to use it, and those who use it to outsource the most mundane, time-consuming and even human elements of life.

Some people, for whatever reason, use this form of artificial intelligence as a therapist or friend, confiding in it and asking it for personal advice.

But be careful about befriending ChatGPT and telling the tool too many of your secrets – you never know where they'll end up or how they could be exploited.

Don't forget that there's also a human cost to creating speedy software such as ChatGPT, not least for workers in the Global South, including Kenyan workers who were forced to read vivid descriptions of sexual abuse and violence as well as racist and hateful text.

AI can use the information you feed it to train future models (Getty Stock)

An Oxford University computer science professor has warned against making the AI platform your best friend. Mike Wooldridge told The Daily Mail: "It has no empathy. It has no sympathy.

“That’s absolutely not what the technology is doing and crucially, it’s never experienced anything. The technology is basically designed to try to tell you what you want to hear – that’s literally all it’s doing.”

And considering that human connection is built on compassion and empathy, especially when you're in need of interpersonal advice, perhaps a set of code behind a screen isn't the best option?

ChatGPT is not your friend (Getty Stock)

Not only that, Professor Wooldridge also warned about the data-breach risks of sharing sensitive information. In 2023, Italy became the first Western country to ban ChatGPT over concerns about user data being collected and reused for training. The Italian data-protection authority said the app had suffered a breach involving users' conversations and payment information.

The watchdog said OpenAI had no legal justification for 'the mass collection and storage of personal data for the purpose of "training" the algorithms underlying the operation of the platform'.

And although ChatGPT says users between the ages of 13 and 18 need to obtain parental consent before using it, the Italian watchdog claimed it still 'exposes minors to absolutely unsuitable answers compared to their degree of development and awareness'.

Professor Wooldridge echoed the concerns, saying: "You should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT.

“It’s extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions on ChatGPT.”

Featured Image Credit: (Getty Stock)

Topics: Artificial Intelligence, Mental Health, Technology, News, World News, Science

