Samsung puts ChatGPT in its crosshairs after staff leak internal source code

After internal Samsung source code was leaked to ChatGPT, the company has banned employees from using all generative AI services.


Early last month, Samsung staff accidentally leaked internal source code to OpenAI's ChatGPT, sounding alarm bells both within the company and outside of it.


There were at least three instances of Samsung employees sharing confidential information with OpenAI's AI-powered chatbot, ChatGPT. In the first, a staff member pasted the source code of a confidential database into ChatGPT and asked the AI to check it for errors. In another, an employee shared internal code with ChatGPT for optimization, and in the last, an employee asked ChatGPT to convert a recording of an internal Samsung meeting into minutes.

Now, Bloomberg News has read a new memo issued to staff notifying them of a policy change. According to the memo, Samsung is deeply concerned about the rise of AI-powered tools and how they may affect intellectual property. In an effort to reduce leaks of confidential data that could harm the company, a widespread ban has been placed on generative AI tools on company-issued devices. Samsung staff are now prohibited from using generative AI tools or applications on company-owned computers, tablets, and phones, and on any of Samsung's internal networks.


Furthermore, Samsung has requested that all staff refrain from entering any personal or company-related information into any generative AI tool. An employee caught exposing company-related information that could reveal Samsung's intellectual property could have their employment terminated for breaching the newly implemented policy. It should be noted that Samsung's ban applies strictly to company-owned devices and not to personal devices.

Samsung employees also seem to understand that generative AI tools pose a risk to the company's IP: Samsung reports that in an internal survey, 65% of employees agreed that such tools are a security risk. So, what will be done? Samsung will build its own AI tools, which it's sure won't be a security risk.

According to the company, Samsung developers are already hard at work on AI tools capable of translating and summarizing internal documents and assisting with software development, as well as a way to stop employees from uploading confidential information to external AI tools such as ChatGPT. The internal memo states that Samsung recognizes that AI tools can increase productivity, but until they can be developed and used safely in a secure environment, they will remain banned.

The concerns surrounding digital privacy also affect the everyday person, with privacy experts warning that sensitive information shouldn't be shared with AI chatbots. This sentiment is shared by OpenAI itself, which warns users not to enter sensitive information into ChatGPT, as its developers are unable to delete specific prompts.


Jak joined the TweakTown team in 2017 and has since reviewed hundreds of new tech products and kept us informed daily on the latest science, space, and artificial intelligence news. Jak's love for science, space, and technology, and, more specifically, PC gaming, began at 10 years old, the day his dad showed him how to play Age of Empires on an old Compaq PC. Ever since that day, Jak has been in love with games and the progression of the technology industry in all its forms. Rather than the typical FPS, Jak holds a very special spot in his heart for RTS games.
