Announcements

ASD would like your feedback! Please fill out our survey about ASD’s work and publications. (Estimated time to complete: one minute or less)

Our Takes

This week’s US Senate Intelligence Committee hearing showed bipartisan concern over foreign interference in US elections but underscored the need for clearer government guidelines on how to communicate with the public about a foreign operation. Read more takeaways from ASD Co-Managing Directors Rachael Dean Wilson and David Salvo here and what they were watching going into the hearing here.

The tactics behind Russian malign influence operations have remained “fairly static, but the technology is actually what has evolved”, allowing content to be spread more rapidly and convincingly, Co-Managing Director David Salvo said on an episode of the Center for a New American Security’s Brussels Sprouts podcast.

ICYMI: Foreign interference in US elections is part of authoritarian actors’ broader efforts to “subvert a rules-based order” and portray democracy as a failed model to the rest of the world, Senior Fellow David Levine said during a panel on countering foreign threats to the 2024 US election at the McCain Institute.

Hamilton 2.0 Analysis

Russian diplomats and state media focused on two main narratives this week:

  • Protests in Georgia: Russian state-backed media last week continued to portray protests in Georgia against a recently passed “foreign agents” law as “provocations … organized by foreign-funded political leaders and non-governmental organizations”. Russian state-funded outlet Life.ru suggested that protest organizers were attempting to provoke a “color revolution”, a standard Russian trope, and that the protests were being influenced by “Georgians fighting in Ukraine”. Russian Foreign Ministry Spokeswoman Maria Zakharova called the EU “a patient with a bipolar disorder” over its response to the protests, and Russian First Deputy Permanent Representative to the United Nations Dmitry Polyansky expressed hope that “[Russia’s] Georgian neighbors will have enough wisdom to resist this push to sell out their country to the [United States] and its allies as Ukraine did”.
  • Andrii Derkach’s Interview: Andrii Derkach, a former member of the Ukrainian Rada whom the US Department of Justice labeled an “active Russian agent” and indicted for his efforts to interfere in the 2020 US presidential election, gave an interview last week to a Belarusian state media outlet that was widely circulated by monitored Russian state media accounts. In the interview, Derkach called for an “international tribunal for NATO crimes”, claimed that Ukraine is “under the control of Western special services” and creating a “dirty bomb”, and alleged that US President Joe Biden’s family is “funding terrorism”. Russian state media’s coverage universally failed to mention Derkach’s alleged role as a Russian intelligence asset, instead presenting him as a “Ukrainian politician” and “public activist”.

The People’s Republic of China’s (PRC) diplomats and state media focused on two main narratives this week:

  • US Tariffs: The PRC Ministry of Foreign Affairs denounced the new tariffs on Chinese electric vehicles (EVs) and chips that the Biden administration imposed on Tuesday and accused the United States of seeking “to dismantle global trade”. State media concurred, with Xinhua calling the measure “protectionist” and the Global Times asserting that Washington was “adding mistakes to mistakes”. On a related note, the People’s Daily highlighted the German car industry’s opposition to “EU investigation and punitive tariffs” on Chinese EVs.
  • Xi in Europe: PRC leader Xi Jinping ended his week-long visit to Europe by traveling to Serbia and then to Hungary last Friday. PRC messaging highlighted the flag-waving crowds in Belgrade and the Serbian president’s statement that Xi was “one of the very rare people … who can really contribute to establishing a long lasting peace in the world”. Similarly, CGTN ran a 30-minute interview with Hungarian Prime Minister Viktor Orbán in which he heaped praise on the PRC. On X, PRC diplomats relayed Orbán’s rejection of his allies’ concerns about overcapacity and de-risking.

News and Commentary

Russian-linked network uses AI to doctor legitimate news content: An online influence group with suspected ties to Russia used generative artificial intelligence (AI) tools to plagiarize, translate, and add biased framing to legitimate mainstream media content about contentious issues, including Russia’s war in Ukraine, the Israel-Hamas war, and divisive US domestic political topics. The group reportedly published 19,000 such posts in a single month, according to research by Insikt Group. Co-Managing Director David Salvo told the Dispatch, “The industrialization of information manipulation through AI tools may be in its infancy, but content laundering has long been one of Russia’s key information manipulation tactics. Even prior to the introduction of ChatGPT, ASD research uncovered networks of faux local US ‘news’ outlets that laundered content from RT by changing enough words to avoid plagiarism and spam detectors. Using technology to churn out articles at a pace that far surpasses a human journalist’s ability, these sites could present pro-Kremlin propaganda to Americans as authentic US local news at rapid speed. Generative AI tools now make it even easier to manipulate articles at scale, including from legitimate news outlets, by instructing ChatGPT to add spin and make enough changes to avoid plagiarism detectors.”

House introduces bipartisan bill to prepare election workers for AI threats: A bipartisan group of four US House lawmakers introduced a bill that would require the Election Assistance Commission (EAC) and the National Institute of Standards and Technology to produce a joint report offering election administrators voluntary guidelines on preparing for threats posed by artificial intelligence (AI). Senior Fellow David Levine said, “This proposed legislation is an important bipartisan effort that could be strengthened by requiring the Cybersecurity and Infrastructure Security Agency (CISA), rather than the EAC, to develop additional voluntary guidance that addresses the use and risks of artificial intelligence technologies. CISA has established dedicated election security advisors in each region of the country to help counter similar threats and has provided important guidance on how generative AI-enabled capabilities could impact the security and integrity of election infrastructure, as well as on how to secure election infrastructure against foreign malign influence operations, including AI-enabled cyberattacks on election infrastructure and AI-powered disinformation. It makes more sense to allow the EAC to focus on bolstering its strengths—building up its clearinghouse function, bolstering its data collection and research, and dispersing grants—than to give [it] an added responsibility that CISA is likely better positioned to currently tackle.”

In Case You Missed It

  • TikTok will become the first social media platform to automatically label certain AI-generated content—including content generated by AI tools from Adobe, OpenAI, and TikTok itself.
  • A Russian-aligned threat actor has claimed responsibility for more than 50 cyberattacks against Moldovan websites since the beginning of March, according to NetScout.
  • OpenAI will share with a group of disinformation researchers a deepfake detection tool that identifies most images generated by the company’s DALL-E AI image creator.
  • Unidentified state-sponsored hackers are suspected to be behind three attempts to compromise government systems in the Canadian province of British Columbia.
  • Foreign cybercrime in Germany increased by 28% over the previous year, according to the German interior minister.

ASD in the News

Quote of the Week

“Where threat actors use advanced technology—like artificial intelligence—to make their crimes more dangerous and more impactful, the Department of Justice will seek enhanced sentences.”

—Deputy Attorney General Lisa Monaco, speaking about AI-related threats to US elections at a meeting of the Election Threats Task Force in Washington, DC on May 13.

The views expressed in GMF publications and commentary are the views of the author alone.