Samsung leaked top secret information to ChatGPT, sparking investigation

Three Samsung employees unintentionally leaked top-secret information to OpenAI's ChatGPT, sparking an internal investigation and an immediate reaction.


Samsung has made itself an example of how not to use AI-powered chatbots such as OpenAI's ChatGPT, Microsoft's Bing Chat, or Google's Bard.


A new report from The Economist claims that at least three Samsung employees accidentally leaked sensitive information to OpenAI's ChatGPT. In one instance, source code from a confidential database was entered into the chatbot to check for errors; in another, an employee asked the chatbot to optimize code; and in the last, an employee asked it to convert a recording of an internal Samsung meeting into minutes.

Samsung's predicament is a real-world example of the risks some digital privacy experts have been sounding alarms about since the emergence of AI chatbots. Only yesterday, a law professor from George Washington University revealed that OpenAI's ChatGPT had falsely accused him of sexual assault, an instance of AI-generated defamation. The professor asked who is culpable when AI chatbots spew misinformation about individuals, misinformation that can have very real impacts on reputations and, by extension, careers and lives.

Samsung's issue has seemingly been contained, with reports indicating the company immediately rolled out a response that prevents each Samsung employee from uploading more than 1024 bytes to ChatGPT. Additionally, Samsung has launched an internal investigation into the people involved in the leak and will be creating its own internal chatbot to prevent any further embarrassment. However, this new limitation is more of a band-aid fix than a solution to the larger issue: individuals will continue to accidentally share sensitive information with AI chatbots as these tools become more accessible and more integrated into different facets of society and business.
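The reported 1024-byte cap is easy to reason about: it amounts to a check on the encoded size of whatever text an employee tries to submit. As a rough illustration only (the function name and a client-side check are assumptions, not details of Samsung's actual implementation), such a guard could look like this:

```python
MAX_PROMPT_BYTES = 1024  # the per-upload limit Samsung reportedly imposed

def within_upload_limit(prompt: str, limit: int = MAX_PROMPT_BYTES) -> bool:
    """Return True if the UTF-8 encoded prompt fits under the byte limit."""
    return len(prompt.encode("utf-8")) <= limit

# A short question passes; a pasted source file or transcript would not.
print(within_upload_limit("Summarize this meeting agenda."))  # True
print(within_upload_limit("x" * 2000))                        # False
```

Note that the check counts bytes, not characters, so multi-byte text (non-Latin scripts, emoji) hits the ceiling sooner, and a determined user could still leak data by splitting it across many small prompts, which is why a byte cap alone is only a partial safeguard.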

Privacy experts have already voiced concerns about hypothetical scenarios in which someone shares confidential legal documents, medical information, contact information, or other sensitive data with an AI chatbot. These concerns are grounded in the developers' own warnings to refrain from sharing sensitive information, as the developers cannot delete specific prompts from a user's history. OpenAI warns it is "not able to delete specific prompts from your history" and that the only way to remove information from ChatGPT is to delete the account that entered it, a process that can take up to four weeks to complete.

Notably, all data entered into ChatGPT is fed back into the model to improve its efficiency, response variety, accuracy, creativity, and more. So if you enter sensitive information into ChatGPT and don't delete your account, the AI will consume that information and add it to its training data, unless you have opted out of your chat history being used to train ChatGPT.

News Sources: mashable.com and engadget.com

Tech and Science Editor


Jak joined TweakTown in 2017 and has since reviewed hundreds of new tech products and kept us informed daily on the latest science, space, and artificial intelligence news. Jak's love for science, space, and technology, and, more specifically, PC gaming, began at 10 years old, the day his dad showed him how to play Age of Empires on an old Compaq PC. Ever since that day, Jak has been in love with games and the progression of the technology industry in all its forms.
