How to tackle mis/disinformation with a human centred approach

Written by Aaron Bailey-Athias, Abbie Richards

The mind-boggling amounts of online misinformation and disinformation that we saw in the 2020 US elections are not going away easily in 2021. (Misinformation is false and harmful information, usually spread without malicious intent. Disinformation is false information spread deliberately).

To some extent, mis/disinformation has always been an issue in US elections, but the problem is more complicated in our new information reality, and not only in the US. For example, prominent conspiracy theories such as QAnon are gaining ground in a fertile online environment and influencing politics in countries such as Brazil, Finland, Japan and the UK.

If we are to properly address the problem amidst the Sustainable Development Goals' (SDGs) Decade of Action, we need a better understanding of how these movements and individuals spread mis/disinformation and contribute to eroding public trust. Most importantly, we need to understand why their messages have created such a deep moral imperative to fight the mainstream narrative.

How mis/disinformation erodes public trust – and the offline consequences

In politics, mis/disinformation erodes public trust in democracy when false messages, especially those sent by influential political figures, affect citizens' trust in each other, their faith in elections, or their confidence in other democratic institutions and processes. These messages tend to reinforce pre-existing ideas in people who look for narratives that validate their worldview, hardening into 'truths' and alienating individuals further.

The immediate consequences of political mis/disinformation are visible, starting with increased polarisation and partisanship. In the US, thousands of angry voters mobilised online and offline to protest under the hashtag #StopTheSteal. Major US cities have seen protests raise tensions across communities, some sadly ending in violence. US poll officials have reported receiving online abuse and death threats.


The long-term consequences of political mis/disinformation are hard to foresee, given the uncertainties and mix of factors that can affect civic trust and participation in democratic processes. In the case of the US, however, Trump's legacy is one of mainstreaming once-fringe views and conspiracy theorists into politics. Even as Trump leaves office this month, he will likely continue to be a key political influencer.

How to tackle mis/disinformation

The question that remains is: what can be done to restore trust and tackle mis/disinformation? There is no single, cookie-cutter solution to a problem of this complexity. The rapid production and spread of mis/disinformation through online platforms is a novel phenomenon that requires creative and innovative counteraction.

But collective action, meaningful use of social media and citizen involvement can help.

Perhaps most importantly, we need to bridge polarised communities and put a human approach at the heart of any strategy.

Understand the human psyche behind content sharing

It’s important to understand the extent to which human emotion shapes how we consume and accredit information. Sensational information triggers the same neural pathways that signalled fear and anger in our hunter-gatherer ancestors. The human brain responds very similarly to “Run – there’s a bear behind you!” and “Globalists are paedophiles who prey on children!”. Our brains evolved not to ignore these danger signals; they were vital to staying alive.

These sensational stories, however, don’t just promote false information. They make us angry. And anger is what makes misinformation go viral. The more anger a piece of fake news incites, the more contagious it is. This tie between our emotional state and the spread of misinformation is a strong argument for more consideration of human psychology in digital platform design.

The way social media algorithms are designed doesn’t help either. Platforms have been built around drawing people’s attention to content that is popular, regardless of the quality of content. Platforms’ business models rely on exploiting human emotions to maximise screen time and increase profits.

Rethink and redesign our digital spaces

Social media platforms must redesign their digital spaces to prioritise high quality information. The major social media giants of 2020 have largely failed to remove misinformation, leaving 90% of posts reported for mis/disinformation up on their platforms. And misinformation, while false, generally promotes interesting fictitious narratives that have much higher virality than neutral or debunking information. Today’s social media platforms need a design overhaul which would restructure their feeds to ensure that curated knowledge reaches users before mis/disinformation does.

Additionally, while ‘social media echo chambers’ are often blamed for increased polarisation and the spread of misinformation, the power of ‘influencers’ in digital ecosystems is frequently overlooked. Within these vast communication networks, biased influencers have a disproportionate impact on the beliefs of their communities. They enable fringe conspiracies to be amplified into community accepted beliefs. Platforms that promote equity within social groups and deemphasise the influence of a select few could create less biased and more informed communities.

Rebuild consensus around authoritative sources of information

It’s crucial that trust is restored in authoritative, consensus-built sources of information. An authoritative source is, broadly speaking, a piece of information whose authenticity is widely recognised by experts within the field.

Often the words ‘authoritative source’ are associated with blind faith in distant experts. This perception is what degrades public trust in scientists during a pandemic and erodes trust in democratic systems. And to some extent this is the result of a communication failure. In the US, for instance, scientific literacy is low. Very low. Studies suggest 70% of US citizens cannot read and understand the science section of the New York Times.

While this is a problem that needs to be addressed through education, it also highlights the lack of accurate, digestible authoritative sources for many. We need to create scientific sources that give people accurate information and enable learning without belittling their intelligence.

The authoritative sources of the 21st century must evolve into engaging platforms that prioritise accessibility, encourage curiosity and scepticism, and promote scientific literacy and critical thinking.

Address the problem together

Combatting a problem this vast and complex requires communication and action across all layers of civil society with a cross-pollination of expertise. We need researchers to make their knowledge as easily accessible as possible. Experts need to be in contact with communicators, activists, policy-makers and concerned citizens.

Some networks are already making these connections, such as the research/activist network Kollektiivi. And we’ve seen successful volunteer-based misinformation tracking and reporting programmes such as the Youth Against Misinformation initiative and the UN HerStory network.

As we go forward and address this global problem, it’s crucial to remember the humanity behind it. Every day, people are falling down these dark rabbit holes. You may have people in your life who’ve fallen into the endless whirlpool of misinformation. Or perhaps you feel you have too; everyone is susceptible to it.

Every day, people are going to have to learn both how to recognise misinformation and how to discuss it with friends and family who believe it. We need to let go of condescension and approach this problem with patience and compassion for friends, family, neighbours and fellow citizens.