By Using ChatGPT to Generate Notes, Samsung Employees Leak Confidential Info to AI Chatbot

Three Samsung employees accidentally leaked their company's sensitive data to ChatGPT. According to the latest reports, the staffers used the AI chatbot to help with general office tasks. In some instances, however, they were unknowingly providing confidential information to the chatbot that it could share with other ChatGPT users.

(Photo by JUNG YEON-JE/AFP via Getty Images)

Samsung Staffers Feed Company's Source Code, Meeting Clip to ChatGPT

The Samsung employees who reportedly leaked secret information to ChatGPT appear to have had no intention of putting their company in jeopardy. According to a report from The Economist Korea via Mashable, the staffers were simply doing their jobs in the semiconductor division with the help of the AI chatbot.

In one instance, however, an employee used ChatGPT to debug confidential source code. Another pasted top-secret code into the chatbot for "code optimization." A third provided ChatGPT with a recording of an office meeting to generate notes.

On each of those occasions, the employees inadvertently exposed their company's secrets to other ChatGPT users. Samsung's semiconductor division apparently did not anticipate that allowing engineers to use ChatGPT for code debugging and optimization could endanger the company. The worst part of the fiasco is that Samsung has no way to erase the confidential information the employees had already shared with ChatGPT.

As per Engadget, ChatGPT's data policy states that information users submit can be used to train the AI model. It is therefore a bad idea to share anything top-secret with the chatbot, or else there is a good chance ChatGPT will relay that secret to another person using the chatbot for a related matter.

Users cannot delete individual prompts from their ChatGPT history; the only sure way to remove them is to delete the account entirely. Unfortunately, account deletion can take up to four weeks.

OpenAI advises the public to think carefully about what they share with ChatGPT before submitting it. While people can benefit from the AI model for work-related tasks, it is not advisable to use ChatGPT to summarize confidential work documents.

Data Privacy Concerns Plague ChatGPT

The Economist Korea reported that Samsung addressed the problem immediately after learning about the incidents. Aside from investigating the three employees involved in the data leak, Samsung also restricted employees' use of ChatGPT. Each prompt is now limited to one kilobyte, or 1,024 bytes, which works out to roughly 1,024 characters of plain text. Furthermore, Samsung is reportedly working on its own chatbot to prevent similar incidents from happening again.
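For readers wondering what a 1-kilobyte cap actually permits: 1,024 bytes equals 1,024 characters only for single-byte text such as ASCII, while multibyte scripts like Korean fit far fewer characters. The sketch below is purely illustrative (the `within_limit` helper is hypothetical and not Samsung's actual enforcement mechanism); it simply shows how such a byte limit could be checked.

```python
def within_limit(prompt: str, max_bytes: int = 1024) -> bool:
    """Check whether a prompt fits a byte budget once UTF-8 encoded.

    Hypothetical helper for illustration only; not Samsung's implementation.
    """
    return len(prompt.encode("utf-8")) <= max_bytes

# 1,024 ASCII characters occupy exactly 1,024 bytes, so they just fit.
assert within_limit("a" * 1024)
assert not within_limit("a" * 1025)

# Korean syllables take 3 bytes each in UTF-8, so 400 of them
# already use 1,200 bytes and exceed the cap.
assert not within_limit("한" * 400)
```

The practical point: a byte-based cap is stricter for non-ASCII text, so a 1-kilobyte limit constrains Korean-language prompts to only a few hundred characters.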

Samsung's recent experience with ChatGPT adds to the growing list of reasons why some entities want to ban the AI bot. In a separate report from Mashable, Italy banned ChatGPT, saying the service violates the General Data Protection Regulation (GDPR). Some data experts have also expressed concern about ChatGPT's use of private data, such as users' medical records and legal documents, to improve the AI model.

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
