
ChatGPT: According to report, Samsung employees gave sensitive company data to chatbot

Three Samsung employees leaked sensitive company data to ChatGPT while using it. | Source: Hidden | Photographer: BABAK HABIBY/ZAKH WOLF


ChatGPT has changed the paradigm of what a mass-consumption artificial intelligence system can achieve, and it is extremely useful for a wide range of tasks. However, a significant portion of its millions of users forget that when they ask the chatbot to summarize important notes or check their work for errors, it can use that information to train its models and may even surface it in responses to other users. This became clear in the latest case involving Samsung workers.

According to the outlet The Economist Korea, several Samsung employees were unaware of this detail before sharing confidential company information with ChatGPT. Shortly after the company's semiconductor division allowed its engineers to use the chatbot, workers passed secret data to it on at least three occasions.

Issues with ChatGPT and company privacy

Apparently, one employee asked ChatGPT, OpenAI's chatbot, to check the source code of a confidential database for errors; another requested code optimization; and a third fed the recording of a confidential meeting into the platform, asking it to generate a document based on it.

According to reports, upon learning of these leaks, Samsung tried to limit the scope of future mistakes by capping the length of employees' ChatGPT prompts at one kilobyte, or 1,024 characters of text. The company is also reportedly investigating the three employees involved and has even decided to build its own chatbot to avoid incidents of this magnitude.

Delicate situation with ChatGPT

ChatGPT's data policy states that, unless users explicitly opt out, the system can use their messages to train its language models. OpenAI, the company behind the chatbot, urges users not to share sensitive information with ChatGPT in conversations because it "cannot remove certain messages from history."

The platform's support documentation indicates that the only way to remove personal information from ChatGPT is to delete the account, a process that can take up to four weeks. The Samsung case is another example of why it's important to be careful when using chatbots, a caution every user should extend to all of their online activity, since you never know where that data will end up.

We recommend METADATA, RPP's technology podcast. News, analysis, reviews, recommendations and everything you need to know about the tech world.

Source: RPP
