Disclaimer: This post has been created by AI but moderated by a human editor to eliminate false information or hallucinations.

ChatGPT and other generative AI chatbots are proving immensely useful, but they are poor at keeping secrets. By default, anything entered into the chatbot may be used to train and refine it, which is a good reason not to trust it with any private or confidential work-related information. If you do, the chatbot may not only add it to its training data but could also inadvertently reveal it to other users.

Samsung's employees learned this the hard way. The company has identified three instances in which employees pasted sensitive source code from internal frameworks into ChatGPT to "check for errors" and optimize code, The Economist Korea reports.

Samsung Electronics had previously blocked the use of ChatGPT in the workplace over security concerns, but its Device Solutions (DS) division recently permitted the tool so that executives and employees could stay abreast of technological changes. The company instructed staff to be mindful of internal information security and not to enter private information.


Samsung Electronics is now preparing measures to prevent future leaks through ChatGPT. The company warned staff that access to the tool might be blocked on the company network if similar accidents occur. It also told its executives and employees: "As soon as content is entered into ChatGPT, the data is transmitted to and stored on an external server, making it impossible for the company to retrieve it."

In the first incident, an employee of Samsung Electronics' DS division encountered an error while executing the source code of a download program for the semiconductor facility measurement database (DB). They copied the problematic source code into ChatGPT and asked it for a solution.

In the second, an employee entered program code written to identify yield and defective equipment into ChatGPT, causing another leak. In the third, an employee converted a meeting recorded on their smartphone into a document file using the Naver Clova application and then entered it into ChatGPT to request the preparation of meeting minutes.

Samsung Electronics is taking the incidents seriously: it is investigating the executives and staffers involved in the leaks and will take disciplinary action if necessary. The company is also reminding staff to be careful when using ChatGPT, since data entered into the tool is transmitted to and stored on an external server as soon as it is submitted, making retrieval impossible.
