Use of ChatGPT is restricted at Apple for fear of leaks

2023-05-19 12:13:49

Following in the footsteps of other companies, Apple restricted the use of ChatGPT and other external artificial intelligence tools for some employees, according to a document analyzed by The Wall Street Journal.

According to the report, Apple is concerned that employees who use these services might end up leaking confidential data. Apple also warned its employees about using GitHub Copilot, which automates the writing of software code.

Apple’s concern is well founded: when people use these tools, their data is sent back to the developer to improve the underlying models, creating the potential for sensitive information to be leaked — basically the antithesis of what Apple stands for.

As we mentioned, several organizations have taken similarly cautious steps as their employees adopt these tools for a variety of tasks — from writing emails and marketing materials to programming. They include JPMorgan Chase, Verizon, and Amazon, the last of which urged engineers using AI tools for programming to rely on its own in-house solution instead.

It is worth noting that Apple was one of the first companies to offer artificial intelligence technologies when it launched Siri, its virtual assistant, in 2011. But the company has lagged behind most of its rivals in generative technologies — about which Apple’s CEO, Tim Cook, has already expressed concerns.

Back to the ChatGPT case, the timing of the measure is intriguing to say the least, since OpenAI’s official app for iOS launched just yesterday — in the United States only, for now. However, as noted by journalist Mark Gurman (of Bloomberg), it is possible that this restriction has been in effect for some time within the Cupertino giant.

via The Verge
