Meta says it has taken down about 20 covert influence operations in 2024

Meta's Actions Against Covert Influence Operations

Meta has taken down around 20 covert influence operations worldwide in 2024. Russia remains the primary source of such operations, but the use of AI in these campaigns was surprisingly limited, according to Nick Clegg, Meta's president of global affairs.

AI's Limited Role in 2024 Elections

Despite warnings about AI-fueled disinformation, its impact on elections appears to have been modest. Meta received more than 500,000 requests to generate AI images of political figures in the lead-up to the US election, but said AI-generated content had little significant influence on its platforms.

Examples of Disinformation Campaigns

  • A Russian network used fake accounts and websites to target Georgia, Armenia, and Azerbaijan.
  • Another Russia-based operation used AI to create fake news websites mimicking reputable brands to undermine Western support for Ukraine and promote Russia's role in Africa.

Other sources of foreign interference include Iran and China. However, experts warn against complacency, as AI's capacity to manipulate content is likely to grow.

AI's Subtle Influence

While the direct impact of AI-generated disinformation was limited in 2024, research suggests AI may have subtly shaped election discourse. For instance, AI-assisted misleading claims and xenophobic memes circulated during the US election.
