
OpenAI shuts down influence networks in China and Russia

  • The covert campaigns used the ChatGPT-maker’s AI tools to try to manipulate public opinion or shape political outcomes
  • Artificial intelligence was used to generate text and images in greater volume and with fewer language errors than humans could have produced alone

OpenAI said that in all of the operations it identified, AI-generated material was used alongside more traditional formats, such as manually written texts or memes on major social media sites. Photo: AP

OpenAI said it has cut off five covert influence operations in the past three months, including networks in China, Russia, Iran and Israel that accessed the ChatGPT-maker’s artificial intelligence products to try to manipulate public opinion or shape political outcomes while obscuring their true identity.


The new report comes at a time of widespread concern about the role of AI in global elections slated for this year.

In its findings, OpenAI listed the ways in which influence networks have used its tools to more efficiently deceive people, including using AI to generate text and images in larger volume and with fewer language errors than would have been possible by humans alone.

But the company said that ultimately, in its assessment, these campaigns failed to significantly increase their reach as a result of using OpenAI’s services.

“Over the last year and a half there have been a lot of questions around what might happen if influence operations use generative AI,” said Ben Nimmo, principal investigator on OpenAI’s Intelligence and Investigations team, in a press briefing Wednesday. “With this report, we really want to start filling in some of the blanks.”
