Russian influence operations exploit the vulnerabilities of social media platforms to disseminate false narratives and amplify divisive content in order to undermine democracies, divide societies, and weaken Western alliances. In conducting these operations, the Kremlin employs a variety of tools across the social media space, including fake accounts/personas, political advertisements, bot networks, and traditional propaganda outlets. Additionally, Russian influence operations utilize a range of social media platforms, each with a different role, to distract public discussion, foment social unrest, and muddle the truth.
In order to successfully counter the Kremlin’s online offensive, Western policymakers will need to understand and address Moscow’s use of the social media ecosystem as a whole. Further, politicians will need to partner with the private sector and civil society organizations to help construct effective, sustainable, and forward-looking responses to Russian influence efforts.
Facebook CEO Mark Zuckerberg testified on Capitol Hill this week following the recent exposure of his company’s failures to protect up to 87 million Facebook users’ data.1 Facebook has been at the center of a whirlwind of revelations regarding the Kremlin’s manipulation of social media since the site first announced that it had shut down several hundred Kremlin-linked accounts in September 2017.2 Facebook announced last week that it deleted an additional 135 Facebook and Instagram accounts, as well as 138 pages, linked to Kremlin influence efforts.3 We have also learned that Russian trolls used the social media site to create events that reached over 300,000 Americans.4 Last fall, the House Intelligence Committee revealed a slew of Kremlin-linked political ads that were published on Facebook in an attempt to exacerbate divisions in American society.5 Investigations into the Russian cooptation of Facebook and Twitter have unveiled a network of social media influence efforts operating at the Kremlin’s behest. These efforts utilize a range of social media platforms to dispense Kremlin narratives and sow division and discord abroad.
As important as Facebook is in the information space, it is just one of many platforms actively being exploited by Russian information operations. As Putin seeks to project influence and weaken institutions abroad, Moscow makes use of a wide range of tools, including disinformation, cyber-attacks, illicit finance, support for fringe political groups, and the weaponization of energy.6 Putin’s foreign policy efforts are often characterized by decentralized ad-hocracy, encouraging Kremlin-affiliated actors to deploy whichever tools are applicable in pursuit of the state’s larger strategic goals.7 The flexibility and synthesis of these operations allow the Kremlin to seize on opportunities and adapt to local contexts in order to maximize the impact of its efforts.
Russia uses similar tactics in its social media campaigns, deploying simultaneous operations across a range of platforms in order to multiply effects and undermine Western efforts to counter and defend against this activity. As Clint Watts describes, “Within the Kremlin’s playbook, each social media platform serves a function, a role in an interlocking social media ecosystem where Russia infiltrates, engages, influences and manipulates targeted American audiences.”8 As new platforms and technologies emerge, the Kremlin will seek to coopt and integrate them into this network of online influence.
The Network
Social media platforms present a significant threat to the legitimacy of the Putin regime by challenging the Kremlin’s monopoly on information within Russia. The power of these sites to stir public unrest and foment resistance to government oppression during the Arab Spring made a deep impression on Moscow, prompting fears that similar instability might arise at home.9 In order to keep a lid on domestic unrest, Russian President Vladimir Putin maintains strict control over the country’s cyberspace, suppressing domestic internet freedom and constricting information flows to ensure the dominance of Kremlin-constructed narratives. The Internet Research Agency (IRA), although famous for its efforts targeting the West, focused the majority of its resources on distributing disinformation to Russia’s domestic population. However, recognizing that the openness of the internet can be used to sow chaos abroad and that cyberspace can supercharge the old active measures playbook, Putin has turned these tools of information control against the West. As instruments of disinformation, social media platforms offer the Kremlin a cheap investment that provides plausible deniability, direct access to audiences, easy amplification of narratives, and rapid tools for dissemination.10
Russian attempts to project influence and disruption through social media are not limited to a specific platform. Instead, the Kremlin pursues its agenda across a variety of mediums, seeking low-cost/high-reward methods for undermining democracy and unity in Western countries. The synthesis of these efforts presents a major threat and challenge for policymakers, as responses must address the wider vulnerabilities in the industry rather than the specific shortcomings of any one platform. Additionally, policymakers must walk a narrow line, combating the foreign manipulation of social networks without compromising the core democratic values that threaten Putin’s regime in the first place. This piece will highlight the necessity for comprehensive, forward-looking measures to combat online foreign influence campaigns by exposing the pervasive nature of Moscow’s efforts across the ever-growing social media ecosystem.
The Platforms
The Kremlin’s war of online influence has been most heavily exposed through recent revelations about the exploitation of Facebook, along with its photo-sharing subsidiary Instagram. Facebook, with over 2 billion monthly active users,11 is not only the largest of the social media networks, but also a key platform for Russian disinformation operations. Efforts by the Putin-linked IRA reached an estimated 150 million Americans across Facebook and Instagram in the lead-up to the 2016 presidential election.12
Kremlin-affiliated actors employ a range of tactics to spread influence throughout Facebook and Instagram, including the creation of fake accounts and pages, the promotion of fake events, and the purchase of political advertisements.13 Fake Facebook accounts, like that of “Melvin Redick” from Harrisburg, Pennsylvania, put a seemingly American face to Russian operations intended to destabilize U.S. democracy.14 Meanwhile, Kremlin-linked Facebook pages, such as “Secured Borders,”15 spread content to over 129,000 followers, with one post reaching 4 million users after accruing 80,000 shares and 300,000 likes.16 Similar Kremlin-linked pages created Facebook events for political rallies and protests in the United States.17 In Houston, IRA-linked Facebook groups organized simultaneous opposing rallies, drawing American citizens into the streets in direct opposition to one another.18 Kremlin-funded political ads, around a quarter of which were specifically geographically targeted,19 also spread hostile narratives to an estimated 23-70 million Facebook users.
Although it has received significantly less attention, research reveals that Instagram plays a similar role in the Kremlin’s information war. Facebook’s recent purge of Kremlin-linked accounts included 65 fake Instagram accounts, along with 70 Facebook profiles.20 Jonathan Albright of Columbia University writes that “Instagram is a major distributor and re-distributor of IRA [Internet Research Agency] propaganda,” with an effect that is potentially “far more impactful than Twitter.”21 An analysis of the posts from just 28 known Kremlin-linked Instagram accounts revealed 2.5 million recorded interactions with Instagram users, as well as 145 million likely interactions from people who passively viewed the posts.22 Many of these accounts may be directly linked to Kremlin-affiliated Facebook accounts, often sharing posts with similar language and themes, and indicating a concerted effort to disseminate divisive narratives across platforms.23
The content and focus of Kremlin-created ads, pages, and events indicate the larger role that Facebook and Instagram play in Moscow’s social media influence operations. Ads and posts from Internet Research Agency accounts focused heavily on the promotion of divisive social and political messages on both sides of the political spectrum.24 Kremlin accounts are not creating new disputes between groups, but rather inserting themselves into existing fractures in American society in an attempt to push people further from compromise. The overall goal of Kremlin activity on Facebook and Instagram is thus to seize on existing political and societal grievances to divide civil society and weaken faith in the American system.
Kremlin-linked operatives similarly manipulate Twitter to pursue their goals. In the ten weeks leading up to the 2016 U.S. presidential election, Twitter identified 3,814 accounts linked to the Internet Research Agency, which posted around 176,000 tweets.25 Additionally, in the three months before the election, 36,000 other Kremlin-linked automated bot accounts posted 1.4 million election-related tweets, accumulating an estimated 288 million views.26 Twitter has promised to contact 677,000 users who followed or engaged with these Kremlin-linked accounts.27 Although this was just a small portion of election-related tweets, the efforts illustrate a concerted campaign to manipulate online political discussion in the United States across several platforms.
As with Facebook, Russian operatives on Twitter utilized accounts that were intended to appear as legitimate American sources in order to build credibility. One account, @TEN_GOP, pretended to be an official representative of the Tennessee Republican Party, but was instead issuing IRA-created narratives to its more than 100,000 followers.28 Some accounts explicitly targeted specific groups, such as military veterans, for engagement.29 Along with Facebook and Instagram, Twitter also played host to Moscow’s efforts to organize and promote fake events, and to influence political perceptions through advertisements.30
Another key aspect of the Kremlin’s manipulation of Twitter has been its focus on utilizing networks. Decentralized networks play a major role in the Kremlin’s larger influence strategy, allowing Moscow to leverage proxies, cut-outs, and quasi-independent actors to execute policies directed toward Putin’s broader strategic objectives.31 On Twitter, networks that consist of Kremlin-affiliated actors, accounts sympathetic to Russian narratives, and automated bots are able to quickly promote and disseminate specific narratives across Twitter’s information space.32 These networks often pick up on false or sensationalized news stories propagated by the Kremlin’s more traditional media outlets, occasionally inciting considerable interest and activity among users with no connection to Russia.33 The promotion of a story or theme by a vast, interconnected network allows the Kremlin to manipulate Twitter’s trending algorithms and makes the story seem like a buzzing new issue to the average user.
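To see why coordinated volume is so effective, consider a toy model of a volume-based trending score. The sketch below is purely illustrative and is not Twitter’s actual, proprietary ranking system: it simply counts how many posts mention each hashtag within a recent time window, which is enough to show how a modest network posting in near-synchrony can outrank a much larger organic conversation.

```python
from collections import Counter
from datetime import datetime, timedelta

# Toy model of a volume-based trending score: hashtags are ranked by how many
# posts mention them inside a recent time window. Real platform ranking systems
# are proprietary and far more complex; this only illustrates why coordinated,
# near-simultaneous posting by a bot network can push a theme into view.

def trending(posts, now, window_minutes=60, top_n=3):
    """posts: iterable of (timestamp, hashtag) pairs."""
    cutoff = now - timedelta(minutes=window_minutes)
    recent = [tag for ts, tag in posts if ts >= cutoff]
    return Counter(recent).most_common(top_n)

if __name__ == "__main__":
    now = datetime(2018, 4, 10, 12, 0)
    # Organic chatter: hundreds of users mention a topic spread over five hours.
    organic = [(now - timedelta(minutes=i % 300), "#localnews") for i in range(400)]
    # Coordinated amplification: a smaller bot network posting the same hashtag
    # repeatedly inside the scoring window.
    botnet = [(now - timedelta(minutes=i % 50), "#divisivestory") for i in range(600)]
    print(trending(organic + botnet, now))
    # The coordinated hashtag outranks the organic one even though far fewer
    # real people are behind it.
```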
Online networks represent a rapid reaction capability in the Kremlin’s social media toolkit, allowing Moscow to seize on ongoing developments to attempt to shape political discourse in the United States. Reporters often use Twitter to identify leads for stories and as a source of public opinion, providing an inviting opportunity for foreign actors to influence media coverage.34 News outlets such as the Washington Post (on eight occasions), the Miami Herald, Buzzfeed, and Vox have cited tweets by IRA-linked Twitter accounts in their articles, inadvertently amplifying the Kremlin’s online voice.35
Further, Twitter networks are not just an offensive influence tool, but also a defensive one, as Moscow is able to muddle the true narrative of events by promoting an array of competing explanations. For example, during the recent crisis surrounding the poisoning of former Russian spy Sergei Skripal, Russian-controlled outlets published and promoted multiple potential narratives blaming various countries for the attack in order to blur and devalue the truth.36 Using online networks on Twitter and Facebook, Kremlin sources worked to drown out credible media outlets by flooding the information space.37 The Kremlin’s various stories, however implausible, are intended to provide news consumers who may be skeptical of the official account with alternative explanations that fit their preconceptions.38 In this way, Moscow can distract public discussion and weaken potential responses.
The Alliance for Securing Democracy’s Hamilton 68 Dashboard tracks the focus of one of these networks in real time.39 The Atlantic Council’s Digital Forensic Research Lab also offers in-depth research into Kremlin bot networks, as well as tips for identifying Russian troll accounts.40 Overall, the Kremlin’s manipulation of Twitter networks reflects an impressive degree of coordination and technical savvy within Moscow’s information operations that may be similarly applied to the greater social media sphere.
The content and focus of the Kremlin’s Twitter accounts reveal a role in Moscow’s larger social media efforts that is both similar to and different from that played by Facebook and Instagram. On one hand, Twitter accounts play a comparable part in hijacking online public discussion to disseminate divisive and often inaccurate political and social narratives. These accounts often impersonate real Americans or official organizations in order to seem credible to the average reader. On the other hand, Moscow’s Twitter networks also maintain a secondary duty of flooding the information space with false narratives, often created by the Kremlin’s more traditional media outlets. This method allows for quick reactions to developing events and gives Kremlin networks an opportunity to shape and distract political discussion. These two tasks illustrate the Kremlin’s sophisticated understanding of the internet media sphere, and serve the larger objective of undermining Western democracy.
YouTube
Another less discussed platform for Russian influence operations is Google-owned YouTube. YouTube, the number one site for video-sharing, is the second most popular site on the internet.41 Compared with Facebook, Instagram, and Twitter, the Kremlin’s investment in covert influence on YouTube seems minimal. In October 2017, Google reported that Kremlin-linked actors spent just $4,700 on advertising across the company’s various products, including YouTube, to influence the 2016 U.S. presidential election.42 According to Google, Russian groups also created 18 YouTube channels and uploaded around 1,100 English-language videos, amounting to over 300,000 views during the election cycle.43
However, YouTube’s main role in the Kremlin’s social media offensive is to provide a platform for spreading Moscow’s more overt propaganda to specifically targeted groups. In 2013, the Russian state-controlled news agency RT (formerly Russia Today) became the first news outlet on YouTube to accumulate 1 billion views.44 RT remains one of the most popular news providers on the site, with just shy of 2.5 million subscribers on its central channel, more than BBC, MSNBC, or Fox News.45 RT, along with its partner Sputnik, maintains a network of YouTube channels in various languages, including English, Arabic, German, and Mandarin. RT’s YouTube success allows the agency to spread Kremlin narratives to a wide audience online, often laying the groundwork for expansion to more traditional outlets. In December 2017, RT launched a French television channel, building on the foothold of the agency’s French-speaking YouTube audience.46 Additionally, RT’s YouTube presence targets specific audiences that it believes are susceptible to pro-Kremlin narratives. Examining the HTML source of RT’s main YouTube channel reveals that the company uses keyword tags to steer specific online traffic to its content.47 The channel’s tags include the words “protest,” “Snowden,” “NSA,” and “Assange.” The Kremlin may use similar tactics on many of its YouTube videos, manipulating keyword tags to push pro-Kremlin content to the top of search results for important geopolitical issues.
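For readers who want to reproduce this kind of check, the following is a minimal sketch of one way to do it, assuming the channel page exposes its keyword tags in a standard <meta name="keywords"> element; YouTube’s page markup is an implementation detail rather than a documented interface, so the exact tag and structure here are assumptions.

```python
import re
import urllib.request

# Minimal sketch of the check described above: fetch a YouTube channel page and
# pull the keyword tags out of its HTML source. This assumes the page exposes a
# <meta name="keywords" ...> tag; YouTube's markup is not a stable interface and
# can change at any time, so treat the tag name and structure as assumptions.

def channel_keywords(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r'<meta name="keywords" content="([^"]*)"', html)
    return [kw.strip() for kw in match.group(1).split(",")] if match else []

if __name__ == "__main__":
    # RT's main channel, as cited in the text.
    print(channel_keywords("https://www.youtube.com/user/RussiaToday"))
```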
YouTube plays an important and unique role in Moscow’s social media network. Rather than using the platform to covertly amplify divisive social and political perspectives, the Kremlin prefers to employ YouTube as the key online mouthpiece for its more overt propaganda. In doing so, it operates across an array of languages and uses targeted methods to reach its intended audiences.
Tumblr, Reddit, and Other Platforms
The Kremlin’s online offensive also utilizes smaller social media platforms to contribute to its network of influence, although research on many of these sites remains thin. Tumblr, a popular blogging platform, announced in March 2018 that it had deleted 84 IRA-linked accounts in cooperation with the U.S. Department of Justice.48 As with Facebook and Twitter, Kremlin-affiliated users reportedly created fake personas and pages on the site to target specific audiences, including minority groups.49 These accounts often published relatable posts to build credibility with their audience before pushing more politically divisive content.50
Forum-based sites like Reddit and 4chan also play an important role in the Kremlin’s social media campaigns. This week, Reddit CEO Steve Huffman announced that the site had identified and deleted nearly 1,000 accounts linked to the IRA.51 Earlier this year, Reddit removed several hundred accounts linked to Russian influence operations after a Daily Beast report revealed that Internet Research Agency operatives directly targeted sites like Reddit and 9gag (a viral meme site) to spread divisive narratives.52 Several of the fake accounts impersonated regular users and reportedly spammed the site with links to IRA-created websites, which Reddit has been slow to remove.53
The open, anonymous nature of sites like Reddit provides the perfect medium for the Kremlin to insert divisive content into online discussion (or simply to amplify existing content) that can later be laundered across the social media space to build credibility and amplify impact. Divisive posts on these sites “quickly proliferate to other social media platforms where they’re used to support anti-government narratives or enflame social divisions.”54 As Bret Schafer and Kirill Meleshevich describe, just as criminal enterprises launder money through intermediaries to hide its illicit origins, “disinformation is most powerful when a façade of legitimacy is created through ‘information laundering.’”55 Kremlin actors distance false narratives created on sites like Reddit and 4chan from their point of origin by “layering” them through more credible sources in order to trick social media users and news outlets into unwittingly adopting and propagating the message. The Kremlin’s use of these platforms for “information laundering” demonstrates a sophisticated understanding of the social media ecosystem and a complex level of coordination across various platforms.
These revelations illustrate the necessity for continued research into Moscow’s online influence to better uncover the scope of the Kremlin’s social media operations. Further research should focus on generating a better understanding of cross-platform coordination efforts. Although manipulation of any single platform may seem insignificant, the synthesis of Kremlin efforts across the social media ecosystem has a far greater effect in influencing foreign populations. Additionally, as social media platforms continue to grow, change, and evolve, further study will be necessary to understand the potential vulnerabilities inherent in future developments.
What Can We Do?
Moscow’s exploitation of social media platforms is expansive, pervasive, and ever-growing. It feeds on the vulnerabilities left by a burgeoning industry that is developing too rapidly for the policymaking bureaucracy to match. In order to insulate against future attempts to exploit these gaps, the U.S. government needs to take steps to address potential risks in the social media ecosystem while preserving the open and transparent platforms whose freedom threatened Moscow into reactionary measures in the first place. Countering this asymmetric threat to democracy requires policymakers, the private sector, and citizens alike to understand and acknowledge a few simple truths:
- This is not just a Facebook problem.
In the aftermath of the Facebook CEO’s congressional testimonies, it is important for observers to remember that Russian influence operations are not limited to Zuckerberg’s social media empire. Along with Facebook and Instagram, Kremlin-linked influence operations have utilized Twitter, YouTube, Tumblr, 4chan, Reddit, and likely many other sites not yet investigated. Moscow’s asymmetric assault on democracy does not discriminate in its manipulation of online media, and as new communication technologies continue to develop at a rapid pace, it is logical to assume that the Kremlin will continue to search for ways to manipulate those new platforms to benefit its strategic goals. Measures such as Facebook’s new requirement for political ad purchasers to verify their identity are a step in the right direction, but much more action is required to adequately defend against the Kremlin’s broader social media offensive.56 Additionally, given the widespread coverage of Russian influence operations, it is likely that other foreign actors will adopt similar tactics to exploit social media for their own purposes.
This presents a unique challenge for policymakers attempting to insulate U.S. democracy from hostile external influence. In order to be effective, policy solutions will need to address the full breadth of the social media sphere while also anticipating future risks.
- This is not just a government problem.
Policymakers alone are ill-equipped to combat the Kremlin’s social media network. The institutional bureaucracy of Washington will never be able to keep up with the pace of technological development emanating from Silicon Valley. Additionally, poorly constructed attempts to regulate tech companies, particularly social media and communications platforms, risk betraying the core democratic values of free speech and openness. While Facebook’s recent effort to facilitate research on the impacts of social media manipulation represents a promising step in the right direction, much greater future coordination is necessary.57 Creating an effective response to Moscow’s social media offensive requires a concerted effort by politicians, the tech community, and civil society organizations to limit vulnerabilities, build domestic resilience, and defend democratic institutions.
- This is not just an election problem.
Much of the controversy surrounding Moscow’s manipulation of Facebook centers on the 2016 U.S. presidential election. However, Russian influence efforts did not cease in November 2016, and will not cease any time in the near future. As described by the Office of the Director of National Intelligence, it is likely that “Moscow will apply lessons learned from its campaign aimed at the U.S. presidential election to future influence efforts in the United States and worldwide.”58 The Kremlin’s strategic goal to weaken and divide Western societies is constant, as is its asymmetric war on democracies. As such, policymakers should expect that the Kremlin will continue operations to manipulate social media in order to undermine democratic institutions regardless of election cycles.
- This is not just a social media problem.
While heavily covered in the news, the Kremlin’s exploitation of social media is just one of many tactics employed to sow discord and expand Russian influence abroad. Moscow utilizes a varied toolkit of “active measures” to assert its foreign policies, including malign finance, cyber operations, economic leverage, political alliances, corruption, and organized crime. Although the sheer range of these efforts presents an intimidating front, it is time for Western leaders to begin “‘fixing the roof’ rather than simply hoping the rain will stop.”59 Acknowledging the extent of the Kremlin’s asymmetric operations is a key step in developing the type of policy prescriptions that will holistically address this threat.
- John Hendel, “Zuckerberg Confirmed to Testify Next Week Before House Lawmakers,” Politico, April 4, 2018, https://www.politico.com/story/2018/04/04/zuckerberg-testify-congress-schedule-500201; Heather Kelly, “Facebook Says Cambridge Analytica May Have Had Data on 87 Million People,” CNN, April 4, 2018, http://money.cnn.com/2018/04/04/technology/facebook-cambridge-analytica-data-87-million/index.html.
- Scott Shane, “The Fake Americans Russia Created to Influence the Election,” The New York Times, September 7, 2017, https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html.
- Ashley Gold, “Facebook Removes Dozens of Russia-Linked Accounts,” Politico, April 3, 2018, https://www.politico.com/story/2018/04/03/facebook-russia-accounts-498888.
- Donie O’Sullivan, “Russian Trolls Created Facebook Events Seen By More Than 300,000 Users,” CNN, January 26, 2018, http://money.cnn.com/2018/01/26/media/russia-trolls-facebook-events/index.html.
- Scott Shane, “These Are the Ads Russia Bought on Facebook in 2016,” The New York Times, November 1, 2017, https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html.
- Clint Watts, “So What Did We Learn? Looking Back on Four Years of Russia’s Cyber-Enabled ‘Active Measures,’” The German Marshall Fund of the United States, January 16, 2018, https://securingdemocracy.gmfus.org/blog/2018/01/16/so-what-did-we-learn-looking-back-four-years-russias-cyber-enabled-active-measures; Mark Galeotti, “Controlling Chaos: How Russia Manages Its Political War in Europe,” European Council on Foreign Relations, August 2017, http://www.ecfr.eu/publications/summary/controlling_chaos_how_russia_man….
- Mark Galeotti, “What Exactly Are ‘Kremlin Ties’?,” The Atlantic, July 12, 2017, https://www.theatlantic.com/international/archive/2017/07/russia-trump-putin-clinton/533370/.
- Clint Watts, “Extremist Content and Russian Disinformation Online: Working with Tech to Find Solutions,” The German Marshall Fund of the United States, October 31, 2017, https://securingdemocracy.gmfus.org/publications/extremist-content-and-russian-disinformation-online-working-tech-find-solutions.
- Jim Rutenberg, “RT, Sputnik and Russia’s New Theory of War,” The New York Times, September 13, 2017, sec. Magazine, https://www.nytimes.com/2017/09/13/magazine/rt-sputnik-and-russias-new-theory-of-war.html; Stephen Blank and Carol Saivetz, “Russia Watches The Arab Spring,” RadioFreeEurope/RadioLiberty, June 24, 2011, https://www.rferl.org/a/commentary_russia_watches_arab_spring/24245990.html.
- “Brief of Former National Security Officials As Amici Curiae in Support of Neither Party” (ProtectDemocracy.org, December 8, 2017), https://protectdemocracy.org/wp-content/uploads/2017/12/36-1-Amicus-Brief-National-Security-Officials.pdf.
- Josh Constine, “Facebook Now Has 2 Billion Monthly Users… And Responsibility,” TechCrunch, June 27, 2017, http://social.techcrunch.com/2017/06/27/facebook-2-billion-users/.
- Sheera Frenkel and Katie Benner, “To Stir Discord in 2016, Russians Turned Most Often to Facebook,” The New York Times, February 17, 2018, sec. Technology, https://www.nytimes.com/2018/02/17/technology/indictment-russian-tech-facebook.html.
- Alicia Parlapiano, “The Propaganda Tools Used by Russians to Influence the 2016 Election,” The New York Times, February 16, 2018, sec. U.S., https://www.nytimes.com/interactive/2018/02/16/us/politics/russia-propag….
- Scott Shane, “The Fake Americans Russia Created to Influence the Election,” The New York Times, September 7, 2017, https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html.
- “Secured Borders | Facebook,” Internet Archive: Wayback Machine, December 17, 2016, https://web.archive.org/web/20161217011258/https://www.facebook.com/Secured.Borders/.
- Ben Collins, Kevin Poulsen, and Spencer Ackerman, “Russia’s Facebook Fake News Could Have Reached 70 Million Americans,” The Daily Beast, September 8, 2017, https://www.thedailybeast.com/russias-facebook-fake-news-could-have-reached-70-million-americans.
- Alicia Parlapiano, “The Propaganda Tools Used by Russians to Influence the 2016 Election,” The New York Times, February 16, 2018, sec. U.S., https://www.nytimes.com/interactive/2018/02/16/us/politics/russia-propag….
- Scott Shane, “How Unwitting Americans Encountered Russian Operatives Online,” The New York Times, February 18, 2018, https://www.nytimes.com/2018/02/18/us/politics/russian-operatives-facebook-twitter.html.
- Alex Stamos, “An Update On Information Operations On Facebook,” Facebook Newsroom, September 6, 2017, https://newsroom.fb.com/news/2017/09/information-operations-update/.
- Ali Breland, “Facebook Bans Over 200 New Russian Accounts,” The Hill, April 3, 2018, http://thehill.com/policy/technology/381499-facebook-bans-over-200-new-russian-accounts.
- Jonathan Albright, “Instagram, Meme Seeding, and the Truth about Facebook Manipulation, Pt. 1,” Medium, November 8, 2017, https://medium.com/berkman-klein-center/instagram-meme-seeding-and-the-truth-about-facebook-manipulation-pt-1-dae4d0b61db5.
- Sheera Frenkel, “For Russian ‘Trolls,’ Instagram’s Pictures Can Spread Wider Than Words,” The New York Times, December 17, 2017, https://www.nytimes.com/2017/12/17/technology/instagram-russian-trolls.html.
- Alex Pasternack, “Russia’s U.S. Propaganda Campaign Infiltrated Instagram, Too,” Fast Company, October 6, 2017, https://www.fastcompany.com/40478430/russia-linked-instagram-facebook-posts-ads-memes-propaganda.
- Dylan Byers, “Facebook Estimates 126 Million People Were Served Content From Russia-Linked Pages,” CNN, October 30, 2017, http://money.cnn.com/2017/10/30/media/russia-facebook-126-million-users/index.html.
- Eli Rosenberg, “Twitter to Tell 677,000 Users They Were Had by the Russians. Some Signs Show the Problem Continues.,” Washington Post, January 19, 2018, https://www.washingtonpost.com/news/the-switch/wp/2018/01/19/twitter-to-tell-677000-users-they-were-had-by-the-russians-some-signs-show-the-problem-continues/.
- Mike Isaac and Daisuke Wakabayashi, “Russian Influence Reached 126 Million Through Facebook Alone,” The New York Times, October 30, 2017, https://www.nytimes.com/2017/10/30/technology/facebook-google-russia.html.
- Eli Rosenberg, “Twitter to Tell 677,000 Users They Were Had by the Russians. Some Signs Show the Problem Continues.,” Washington Post, January 19, 2018, https://www.washingtonpost.com/news/the-switch/wp/2018/01/19/twitter-to-tell-677000-users-they-were-had-by-the-russians-some-signs-show-the-problem-continues/.
- Aaron Kessler, “Who Is @TEN_GOP in the Mueller Indictment?,” CNN, February 17, 2018, https://www.cnn.com/2018/02/16/politics/who-is-ten-gop/index.html.
- Greg Gordon and Peter Stone, “Russian Propaganda Engaged U.S. Vets, Troops on Twitter and Facebook, Study Finds,” McClatchy DC Bureau, October 9, 2017, http://www.mcclatchydc.com/news/nation-world/national/article177744986.html.
- Alicia Parlapiano, “The Propaganda Tools Used by Russians to Influence the 2016 Election,” The New York Times, February 16, 2018, sec. U.S., https://www.nytimes.com/interactive/2018/02/16/us/politics/russia-propag…; Cecilia Kang, Nicholas Fandos, and Mike Isaac, “Russia-Financed Ad Linked Clinton and Satan,” The New York Times, November 1, 2017, https://www.nytimes.com/2017/11/01/us/politics/facebook-google-twitter-russian-interference-hearings.html?mtrref=undefined&gwh=677ACF80609BE2487246EABB446EC73A&gwt=pay.
- “Brief of Former National Security Officials As Amici Curiae in Support of Neither Party” (ProtectDemocracy.org, December 8, 2017), https://protectdemocracy.org/wp-content/uploads/2017/12/36-1-Amicus-Brief-National-Security-Officials.pdf.
- “The Methodology of the Hamilton 68 Dashboard,” The German Marshall Fund of the United States, August 7, 2017, https://securingdemocracy.gmfus.org/publications/methodology-hamilton-68-dashboard.
- Andrew Weisburd and Clint Watts, “How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too),” The Daily Beast, August 6, 2016, https://www.thedailybeast.com/articles/2016/08/06/how-russia-dominates-your-twitter-feed-to-promote-lies-and-trump-too.
- Heidi Tworek, “Responsible Reporting in an Age of Irresponsible Information,” The German Marshall Fund of the United States, March 23, 2018, https://securingdemocracy.gmfus.org/publications/responsible-reporting-age-irresponsible-information.
- Tony Romm, “The Washington Post, Miami Herald, InfoWars and Other U.S. Sites Spread Russian Propaganda From Twitter,” Recode, November 3, 2017, https://www.recode.net/2017/11/3/16599816/washington-post-mcclatchy-miami-herald-ap-russian-propaganda-twitter.
- “Defensive Disinformation as Decoy Flare: Skripal and Flight MH17,” EU vs Disinfo, March 27, 2018, https://euvsdisinfo.eu/defensive-disinformation-as-decoy-flare-skripal-and-flight-mh17/.
- DFRLab, “#PutinAtWar: Social Media Surge on Skripal,” Medium, April 6, 2018, https://medium.com/dfrlab/putinatwar-social-media-surge-on-skripal-b5132db6f439.
- Elisabeth Braw, “How to Deal With Russian Information Warfare? Ask Sweden’s Subhunters,” Defense One, April 3, 2018, http://www.defenseone.com/ideas/2018/04/how-deal-russian-information-warfare-ask-sweden/147154/.
- “Hamilton 68: Tracking Russian Influence Operations on Twitter,” Alliance for Securing Democracy, https://dashboard.securingdemocracy.org/.
- For more on bot networks: DFRLab, “#BotSpot: Obituary of a Botnet,” Medium, March 29, 2018, https://medium.com/dfrlab/botspot-obituary-of-a-botnet-9d8e33963d66. For more on identifying Russian troll accounts: DFRLab, “#TrollTracker: How To Spot Russian Trolls,” Medium, March 30, 2018, https://medium.com/dfrlab/trolltracker-how-to-spot-russian-trolls-2f6d3d287eaa.
- Gagan Bhangu, “Top 15 Most Popular Websites In The World 2018,” OTechWorld.Com, June 10, 2017, https://otechworld.com/most-popular-websites-in-world/.
- Elizabeth Dwoskin and Adam Entous, “Google Says Russia Tried to Influence US Election Using Adverts on YouTube and Gmail,” The Independent, October 9, 2017, http://www.independent.co.uk/news/world/americas/google-russia-us-election-adverts-found-youtube-gmail-donald-trump-president-investigation-latest-a7990546.html.
- Jacob Kastrenakes, “Russian Groups Made 1,100 YouTube Videos During 2016 US Election,” The Verge, October 30, 2017, https://www.theverge.com/2017/10/30/16578810/google-russian-propaganda-disclosure.
- Daisuke Wakabayashi and Nicholas Confessore, “Russia’s Favored Outlet Is an Online News Giant. YouTube Helped.,” The New York Times, October 23, 2017, https://www.nytimes.com/2017/10/23/technology/youtube-russia-rt.html.
- For YouTube subscription numbers, see: https://www.youtube.com/user/RussiaToday; https://www.youtube.com/user/bbcnews; https://www.youtube.com/user/msnbcleanforward; https://www.youtube.com/user/FoxNewsChannel/featured
- “Russia’s RT Launches New French Channel Despite ‘Propaganda’ Charges,” RadioFreeEurope/RadioLiberty, December 19, 2017, https://www.rferl.org/a/russia-today-rt-launches-new-french-language-channel-paris-despite-propaganda-charges-macron/28926043.html.
- For the HTML source of RT’s YouTube page: view-source:https://www.youtube.com/user/RussiaToday/featured;
For an archived version: view-source:https://web.archive.org/web/20180409012702/https://www.youtube.com/user/…
- “Tumblr Says Russia Used It for Fake News During 2016 Election,” The Guardian, March 24, 2018, http://www.theguardian.com/technology/2018/mar/24/tumblr-says-russia-used-it-for-fake-news-during-2016-election.
- Issie Lapowsky, “Tumblr IDs 84 Accounts That Spread Propaganda,” Wired, March 23, 2018, https://www.wired.com/story/tumblr-russia-trolls-propaganda/.
- Josh Russell and Ben Collins, “Russians Used Reddit and Tumblr to Troll the 2016 Election,” The Daily Beast, March 1, 2018, https://www.thedailybeast.com/russians-used-reddit-and-tumblr-to-troll-the-2016-election.
- Jacqueline Thomsen, “Reddit Identifies Nearly 1,000 Accounts Linked to Russian Troll Farm,” The Hill, April 10, 2018, http://thehill.com/policy/technology/382558-reddit-identifies-nearly-1000-accounts-linked-to-russian-troll-farm.
- Josh Russell and Ben Collins, “Russians Used Reddit and Tumblr to Troll the 2016 Election,” The Daily Beast, March 1, 2018, https://www.thedailybeast.com/russians-used-reddit-and-tumblr-to-troll-the-2016-election; Issie Lapowsky, “Reddit Still Hosts Links to Russian Propaganda Sites,” Wired, March 9, 2018, https://www.wired.com/story/reddit-russian-propaganda/.
- Issie Lapowsky, “Reddit Still Hosts Links to Russian Propaganda Sites,” Wired, March 9, 2018, https://www.wired.com/story/reddit-russian-propaganda/.
- Clint Watts, “Extremist Content and Russian Disinformation Online: Working with Tech to Find Solutions,” The German Marshall Fund of the United States, October 31, 2017, https://securingdemocracy.gmfus.org/publications/extremist-content-and-russian-disinformation-online-working-tech-find-solutions.
- Kirill Meleshevich and Bret Schafer, “Online Information Laundering: The Role of Social Media,” The German Marshall Fund of the United States, January 9, 2018, https://securingdemocracy.gmfus.org/publications/online-information-laundering-role-social-media.
- Jack Nicas, “Facebook to Require Verified Identities for Future Political Ads,” The New York Times, April 6, 2018, https://www.nytimes.com/2018/04/06/business/facebook-verification-ads.html.
- “Facebook Launches New Initiative to Help Scholars Assess Social Media’s Impact on Elections,” Facebook Newsroom, April 9, 2018, https://newsroom.fb.com/news/2018/04/new-elections-initiative/.
- “Assessing Russian Activities and Intentions in Recent US Elections,” Office of the Director of National Intelligence, January 6, 2017, https://www.dni.gov/files/documents/ICA_2017_01.pdf.
- Mark Galeotti, “Controlling Chaos: How Russia Manages Its Political War in Europe,” European Council on Foreign Relations, August 2017, http://www.ecfr.eu/publications/summary/controlling_chaos_how_russia_man….