The four key ways disinformation is spread online

(Credit: Unsplash)

This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.

Author: Spencer Feingold, Digital Editor, Public Engagement, World Economic Forum


  • Social media has allowed online disinformation to flourish.
  • False and out-of-context information is largely propagated by people looking to distort public opinion and advance particular agendas.
  • Disinformation is often advanced in four key ways, according to two experts.

Social media has ushered in an era of unprecedented connectivity. Yet it has also allowed disinformation and so-called fake news campaigns to proliferate and flourish.

Disinformation—which includes false and out-of-context information spread with the intent to deceive or mislead—is largely propagated by people looking to distort public opinion and advance particular agendas.

“Key is how they exploit the inherent openness and opaqueness of the content ecosystem,” explained Doowan Lee and Adean Mills Golub, two experts on disinformation analysis and the co-founders of Veracity Authentication Systems Technology (VAST).

Disinformation can be propagated by a host of online actors, including governments, state-backed entities, extremist groups and individuals. For example, the World Economic Forum recently reported how one anonymous anti-Semitic account on the image board 4chan sparked a misinformation campaign that targeted the Forum.

At VAST, Lee and Mills Golub monitor content from over 10 billion websites in 75 languages to track how content spreads online. Disinformation campaigns, according to Lee and Mills Golub, are propagated in four key ways:

  • Social engineering: Providing a framework to mischaracterize and manipulate events, incidents, issues and public discourse. Social engineering is often aimed at swaying public opinion in favor of a certain agenda.
  • Inauthentic amplification: Using trolls, spam bots, false identity accounts known as sock puppets, paid accounts and sensational influencers to increase the volume of malign content.
  • Micro-targeting: Exploiting targeting tools designed for ad placements and user engagement on social media platforms to identify and engage the audiences most likely to share and amplify disinformation.
  • Harassment and abuse: Using a mobilized audience, fake accounts and trolls to obscure, marginalize and drown out journalists, opposing views and transparent content.

Examples of disinformation infecting online and offline discourse in recent years are plentiful.

During the 2016 US presidential election, for instance, Twitter identified over 50,000 Russian-linked spam accounts that were spreading divisive content related to the election. Climate change denial, the Russian invasion of Ukraine and war in Syria are other issues that have been steeped in disinformation.

The COVID-19 pandemic has also been plagued by disinformation. In fact, the issue has been so severe that pandemic-related disinformation was dubbed an “infodemic.” “There seems to be barely an area left untouched by disinformation in relation to the COVID-19 crisis,” Guy Berger, a top UNESCO official and one of the United Nations’ leading figures combating disinformation, said in 2020.

Experts note that one of the common facets of disinformation campaigns is discrediting authoritative voices.

Ruth Ben-Ghiat, a professor of history at New York University who studies authoritarian leaders and propaganda, explained that purveyors of disinformation often attempt to sow doubt about elites and trustworthy sources by connecting them to “supposed conspiracies to control and harm the population” and by portraying them as corrupt “cabals” associated with lewdness.

“Anti-science and anti-‘globalism’ are related,” Ben-Ghiat said.

The best way to combat disinformation remains a complicated topic of debate. Yet experts largely agree that cooperation between the public, regulators and social media companies is necessary—and that curbing the spread of disinformation is crucial.

As Lee and Mills Golub note, “the more of the content ecosystem [disinformation campaigns] occupy, the more challenging for trusted organizations to compete for audiences.”
