ChatGPT has become a popular tool for various work tasks due to its ability to process and summarize information quickly. However, it's essential to remember that anything shared with the chatbot may be used to train the system and could surface in its responses to other users. Samsung employees reportedly learned this lesson the hard way after sharing confidential information with ChatGPT.
According to a report by The Economist Korea, workers in Samsung's semiconductor division leaked secret information to ChatGPT on at least three occasions. One employee asked the chatbot to check sensitive database source code for errors, another requested code optimization, and a third fed a recorded meeting into the chatbot and asked it to generate minutes.
To prevent similar security breaches in the future, Samsung is said to have restricted the length of employees' ChatGPT prompts to a kilobyte. The company is also investigating the employees involved and building its own in-house chatbot to keep sensitive data under its control.
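A restriction like Samsung's reported one-kilobyte cap could be enforced with a simple pre-submission check. The sketch below is purely illustrative (the function name and limit are assumptions, not Samsung's actual implementation): it measures the UTF-8 byte length of a prompt and rejects anything over the cap before it ever leaves the company network.

```python
# Hypothetical guardrail: cap outbound chatbot prompts at 1 KB.
# This is an illustrative sketch, not Samsung's actual system.
MAX_PROMPT_BYTES = 1024  # the one-kilobyte limit reportedly imposed

def check_prompt(prompt: str) -> bool:
    """Return True if the prompt fits within the byte limit when UTF-8 encoded."""
    return len(prompt.encode("utf-8")) <= MAX_PROMPT_BYTES

# A short prompt passes; an oversized one (e.g., pasted source code) is rejected.
print(check_prompt("Summarize this meeting agenda."))  # True
print(check_prompt("x" * 2000))                        # False
```

Measuring bytes rather than characters matters here: multi-byte characters (such as Korean text) consume more of the quota than their character count suggests, so a character-based check would under-enforce the limit.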
ChatGPT Data Policy
It's important to note that ChatGPT's data policy states that unless users explicitly opt out, their prompts are used to train its models. OpenAI, the chatbot's developer, advises users not to share confidential information with ChatGPT, since it cannot delete specific prompts from a user's history. The only way to remove personally identifying information is to delete the account entirely, which can take up to four weeks.
ChatGPT’s Potential Risks
The Samsung incident serves as a cautionary tale for anyone using chatbots or sharing data online. It's difficult to predict where your data might end up, so be mindful of what you share. With sensible precautions, users can keep their data secure when using ChatGPT and other chatbots.