Microsoft has asked its employees not to share sensitive information with ChatGPT out of concern that it could be used to train the chatbot, Insider reported.

ChatGPT. Photo: Jonathan Raa / Zuma Press / Profimedia Images

The policy was laid out after an employee at the US company asked on an internal Microsoft forum whether ChatGPT or other products from OpenAI, the company that developed the chatbot, could be used at work.

“Please do not send sensitive data to an OpenAI endpoint, as it may be used to train future models,” the employee was told.

The world’s sudden obsession with ChatGPT seems to have prompted some tech companies to warn their employees about what they share with the technology.

Last December, weeks after ChatGPT launched, Amazon issued a similar warning to its employees after a group of them also asked on an internal company forum whether the chatbot could be used in the workplace.

Microsoft and Amazon caution employees about using ChatGPT at work

An Amazon lawyer warned employees not to share “any confidential information” with ChatGPT.

“This is important because the input data can be used to train a future ChatGPT model, and we wouldn’t want its output to include or resemble our sensitive information (and we’ve already seen cases where it closely resembled existing material),” the Amazon lawyer stressed.

Microsoft’s case stands out even more: the company founded by Bill Gates announced this month that it is in talks with OpenAI to invest further and to integrate ChatGPT into products such as Word, Outlook and PowerPoint, after already investing a billion dollars in OpenAI in 2019.

The new investment is reported to be worth 10 billion dollars.