OpenAI Says China-Linked Group Tried to Phish Its Employees

The ChatGPT virtual assistant logo on a smartphone arranged in Riga, Latvia, on Friday, Aug. 16, 2024. The public release of advanced generative AI tools such as Google’s Gemini, Meta AI, and OpenAI’s ChatGPT over the past two years has heightened fears that millions of workers could be displaced. (Andrey Rudakov/Bloomberg)

(Bloomberg) -- OpenAI said a group with apparent ties to China tried to carry out a phishing attack on its employees, reigniting concerns that bad actors in Beijing want to steal sensitive information from top US artificial intelligence companies.

The AI startup said Wednesday that a suspected China-based group called SweetSpecter posed as a user of OpenAI’s chatbot ChatGPT earlier this year and sent customer support emails to staff. The emails included malware attachments that, if opened, would have allowed SweetSpecter to take screenshots and exfiltrate data, OpenAI said, but the attempt was unsuccessful.

“OpenAI’s security team contacted employees who were believed to have been targeted in this spear phishing campaign and found that existing security controls prevented the emails from ever reaching their corporate emails,” OpenAI said.

The disclosure highlights the potential cybersecurity risks for leading AI companies as the US and China are locked in a high-stakes battle for artificial intelligence supremacy. In March, for example, a former Google engineer was charged with stealing AI trade secrets for a Chinese firm.

China’s government has repeatedly denied allegations by the US that organizations within the country perpetrate cyberattacks, accusing external parties of organizing smear campaigns.

OpenAI revealed the attempted phishing attack as part of its latest threat intelligence report, outlining its efforts to combat influence operations around the world. In the report, OpenAI said it took down accounts from groups with links to Iran and China that used AI for coding assistance, conducting research and other tasks.

--With assistance from Rachel Metz.

©2024 Bloomberg L.P.