In between bites of stuffing and potatoes this Thanksgiving, “fake news” or “election meddling” is likely to come up at the table. Here is all you need to know to hold your own in a discussion of influence operations with your relatives.

When someone says “fake news,” what they are probably referring to is disinformation: the deliberate spreading of false information in order to influence public opinion or obscure the truth. Disinformation is a key part of broader online influence operations that aim to shape people’s perceptions of events and issues.

Why conduct an influence operation?

How do influence operations work?

When do influence operations occur?

Influence campaigns are continuous, long-term operations that are not confined to elections or specific geopolitical events. The Kremlin’s ongoing operation in the United States, for example, aims to divide society and inflame tensions over months and years in order to undermine American governance. Moscow does not care whether Trump builds the wall, whether NFL players kneel during the anthem, or whether “black lives matter.” It views these issues only as opportunities to exploit divisions and weaken the United States.

Who is carrying out influence operations?

Russia and Iran are the main actors perpetrating online influence operations targeting the United States. However, other authoritarian regimes, like China and North Korea, will likely follow suit. And these tactics are not exclusive to authoritarian regimes: non-state actors, such as corporations, domestic political parties, and lobbying firms may adopt these methods as well.

Where do influence operations take place?

Influence operations span the whole online information ecosystem, not just Facebook or Twitter. Narratives are often laundered from foreign media outlets through various social media platforms (from 4chan and Reddit to Tumblr) to hide the original source of the information. Foreign actors even seek to manipulate Google and YouTube search results to spread their content. The content then enters into public discussion and is circulated more widely through the information ecosystem.

How can I protect myself from influence operations?

The most effective way to protect yourself is to improve your media hygiene. No matter what regulations and platform practices are put in place, some disinformation will always slip through the cracks. That is why the responsibility falls on each of us to make sure we are contributing to a healthy digital democracy.

  • Look for the blue check: Only share information from sources you know and trust or that have been independently verified. Look for the blue check badge on Twitter, Facebook, and Instagram!
  • Do the legwork: Read the article – not just the title – before you like or share.
  • Trust, but verify: If you are not sure whether a source can be trusted, try using a site like mediabiasfactcheck.com to check the factual reporting and partisanship scores for that source.
  • Think before you share: Influence operations use emotionally charged content to keep us from questioning its authenticity and to push us to act and share out of passion.
  • Improve your digital “spidey sense”: Learn to spot the telltale signs of inauthentic accounts such as an abnormally high posting volume, a relatively new account creation date, basic language errors, and the use of stock photos. Sites like Botometer can help by predicting the likelihood that an account is a bot.

Frequently Asked Questions

What is “fake news”?

“Fake news” is a poorly defined and misleading term used to describe alleged inaccuracy and bias in the media. It is often deployed as shorthand to dismiss unfavorable coverage as false rather than to describe genuinely inaccurate information. British policymakers are no longer allowed to use it, and you should consider avoiding it too.

Can influence operations really change how I think?

Digital marketing is a multibillion-dollar industry for a reason: It works. Influence operations do not aim to change behavior overnight; they slowly harden and shape a target’s perceptions without them realizing it.

How is this any different from the propaganda of the past?

While the goals may be similar, new technologies have increased the reach, speed, and impact of influence operations while lowering their startup costs. In the past, such operations required extensive networks and a significant investment of resources. Now, basic language skills, a working knowledge of platform algorithms, and an internet connection are all it takes to send harmful disinformation directly to hundreds of millions of Americans.

The views expressed in GMF publications and commentary are the views of the author alone.