Thank you, Chairman Burr, Vice Chairman Warner, and Distinguished Members of the Committee.  I have submitted my full statement for the record, but let me highlight key points on the national security context of these activities and the steps we need to take to address them.

The health and strength of our democracy depends on Americans’ ability to engage freely in political speech, to hold vibrant debates free from manipulation, and to obtain reliable information about the issues of the day. 

I come at this issue as a national security professional who has watched social media and online platforms being weaponized to attack these foundations of our democracy. I watched from inside the National Security Council when Russia test-drove these approaches in Ukraine, as our government struggled to understand and respond. And I watched from the campaign trail in 2016 as our government was caught by surprise when these tools were turned against American democracy. 

The 9/11 Commission characterized the failures that preceded that attack as a “failure of imagination.”  I believe the failure to detect and disrupt the Russian government’s weaponization of online platforms was a similar failure of imagination – not just by the government, but also by those who ought to understand these tools best: their creators.

Thanks in part to the bipartisan work of this Committee, we now know that Russian government-linked actors use a range of means to manipulate the information space across nearly every social media and online platform — to amplify extreme content and promote polarization, manipulate search results, encourage action offline, undermine faith in institutions, insinuate themselves into target audiences to influence public opinion on geopolitics, and spread hacked information.  And it’s not just the Internet Research Agency; we know Russian military intelligence officers also used fake social media personas and websites.  And the United States is not the only target.

The Chinese government has also begun to use social media to manipulate conversation and public opinion outside its borders.  Our authoritarian adversaries are using these platforms because controlling the information space is a powerful means to undermine democratic institutions and alliances and advance their geopolitical goals. 

But meaningful action by both government and the private sector to close off these vulnerabilities is lacking.  And as we focus on the past, we are missing what is still happening, and what will happen again. What may once have been a failure to imagine is now a failure to act. 

Fundamentally, this is not a content problem. This is a deliberate manipulation of the information space by actors with malicious intent engaging in deceptive behavior. 

Transparency and exposure of manipulation are critical to reducing its effectiveness and deterring it.  But tech companies have remained defensive and reluctant to share information. Their focus cannot be on public relations campaigns – it needs to be on detailing the nefarious activity these companies are seeing and curtailing.  Facebook’s announcement yesterday is what we need more of.

Transparency is also critical for accountability, and outside researchers need greater access to data – in a manner that protects users’ privacy.  Users also need more context about the origin of information and why they see it, including disclosure of automated accounts while protecting anonymity. 

Identifying malicious actors and their patterns of activity requires new mechanisms for data sharing, both between the public and private sectors and among technology companies.  Nascent efforts along these lines are welcome but need to be streamlined and institutionalized, and protect privacy and speech. 

We also need to identify threats in new technology before they are exploited. AI presents new tools to combat the problem, as well as new ways to make it worse – such as “Deep Fakes.” 

Government and tech companies need to close off vulnerabilities that are being exploited, including by providing a legal framework such as the Honest Ads Act that applies the same standards to political ads online that apply offline.

Manipulation of social media is one part of a larger strategy to weaken our democracy.  My bipartisan program recently released a “Policy Blueprint for Countering Authoritarian Interference in Democracies” endorsed by a bipartisan and transatlantic group of former senior national security officials.  Our recommendations include sending clear deterrent warnings to foreign actors about the consequences for such activity, and identifying our own asymmetric advantages. 

Government also needs to expose foreign interference publicly, and legislated reporting requirements for the Executive Branch would ensure that politics is not a consideration. I hope the measures being considered in the Intelligence Authorization Act will be enacted and will address the full scope of such activities.

We also need to harden our electoral infrastructure through measures like the SECURE Elections Act, as cyberattacks remain a core part of Moscow’s arsenal.  More broadly, the government needs a unified and integrated approach – including through a counter-foreign interference coordinator at the National Security Council and a National Hybrid Threat Center.

Finally, this is a transnational challenge, and it is essential that we work more closely with allies and partners to share information about threats and collaborate on responses. 

Distinguished Members, robust action from tech companies, Congress, the Executive Branch, and civil society is required to counter these threats to our democracy. There are steps that we CAN take – today – to make our democracy more secure.  We need to come together – across party lines and between the public and private sectors – to address this challenge.  Putin’s strategy is to divide Americans from one another in order to weaken us as a country.  In the face of this threat, standing together as Americans has never been more important.


The views expressed in GMF publications and commentary are the views of the author alone.