In an era where digital information moves faster and spreads further than ever before, distinguishing fact from fiction has become a defining challenge of the online environment. Recent research by The Alan Turing Institute shows that more than 90% of adults in the UK report encountering information disorder on social media - with the remainder almost certainly unwittingly exposed.

The study highlighted not only the scale of the problem, but also a troubling gap in users’ engagement with tools and strategies designed to assess the reliability of what they see online - with only a small number having accessed media-literacy resources available to help navigate this landscape.

This article explores some of the psychological influences that allow information disorder to take hold and persist, including cognitive shortcuts, social pressures, and behavioural dynamics. These shape how people absorb, share, defend, and resist information they come across online - from anchoring and confirmation bias to groupthink, polarisation, and the continued influence of false claims even after correction.

Understanding how these mechanisms shape thought will help explain why information disorder spreads faster than truth, why it is so hard to change people’s minds, and why exposure alone is enough to form and entrench false beliefs - a prerequisite for designing effective mitigation strategies.

Anchoring Bias

“A lie gets halfway around the world before the truth has a chance to get its pants on.”

Anchoring bias describes the tendency to rely disproportionately on the first piece of information encountered when forming a judgement. In information environments, this makes early claims artificially powerful, even when they are speculative or wrong. Initial narratives often set the frame through which all subsequent information is interpreted, forcing later corrections to compete against an established reference point.

This dynamic is especially pronounced during crises, where the volume of information typically exceeds verification capacity. Anchoring accelerates information disorder by rewarding fast movers rather than accurate ones - giving sources of information disorder a significant advantage over those trying to counter it. Mitigating it requires slowing initial claims, clearly flagging uncertainty, and deliberately re-establishing corrected information as a new reference rather than treating it as a footnote.

Flight MH17: Russia frequently exploits anchoring bias by rapidly seeding its preferred narrative after crises.

“On July 21, 2014, the Russian MoD presented a series of fabricated and misleading information about the flight path of MH17, radar data, the location of the July 18, 2014 Luhansk video, and the inclusion of misdated and heavily edited satellite imagery.” Bellingcat

This messaging anchored the debate around alternative culprits and sustained doubt even after international investigators traced the missile to a Russian-supplied Buk system.

Availability Heuristic

“When you hear hoofbeats, think horses, not zebras.”

The availability heuristic leads individuals to judge likelihood and importance based on how easily examples come to mind. In digital environments, visibility is often mistaken for frequency or significance. Content that is vivid, emotional, or repeatedly encountered is perceived as representative, regardless of underlying evidence. This distorts risk perception and amplifies extreme or atypical events, giving them disproportionate influence over public understanding.

Information disorder thrives when isolated but notable anecdotes are treated as evidence of a wider pattern. Mitigation depends on restoring proportionality through context, comparative data, and emphasis on trends rather than isolated incidents, while resisting the algorithmic incentives that reward memorability over accuracy.

Overestimation of fatality likelihood in terror attacks: “When an infrequent event can be recalled easily, people tend to overestimate its likelihood. In the context of terrorist events, they are extremely publicised and it is therefore understandable that they have a higher availability. However, common but unremarkable events, such as car accidents leading to fatality are less well reported and so have lower availability, so their likelihood tend to be underestimated.” LSE

Backfire Effect

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

The backfire effect occurs when attempts to correct false beliefs instead reinforce them. Rather than updating views, individuals interpret corrections as attacks on identity, values, or group membership. In polarised environments, factual rebuttal can harden commitment to misinformation, especially when delivered confrontationally or by distrusted sources. This makes conventional fact-checking insufficient and sometimes counterproductive.

Countering the backfire effect requires reframing corrections in non-adversarial terms, avoiding repetition of false claims, and grounding new information in shared values rather than direct contradiction.

Note: There is debate around whether the backfire effect actually exists, with some studies finding no evidence of it at all.

“More recently, researchers have concluded that ‘fact-checkers can rest assured that it is extremely unlikely that their fact-checks will lead to increased belief at the group level’.” First Draft News

Confirmation Bias

“People do not believe what they see, they see what they believe.”

Confirmation bias is the tendency to seek out, interpret, and recall information that aligns with existing beliefs while discounting conflicting evidence. In digital ecosystems, this bias is reinforced by algorithms that optimise for engagement and by user self-selection into like-minded communities. As a result, misinformation that fits prior views spreads rapidly and encounters little resistance, while corrective information is filtered out.

Confirmation bias fragments shared reality and sustains parallel narratives, particularly within echo chambers that amplify and recycle prior beliefs. Algorithmic curation and self-selection tighten these closed loops, insulating audiences from contradiction and allowing distortion to pass as consensus. Countermeasures should prioritise exposure to genuinely diverse sources, introduce friction that slows impulsive sharing, and promote institutional designs that reward credibility, challenge, and verification rather than comfort or affirmation.

Why people believe in fake news: “Previous studies largely pointed to belief in fake news as confirmation bias, which is the tendency to believe information that supports your existing worldview,” said Amrita George, co-author of a study analyzing how emotional cues shape news consumption on social media. “Fake news scratches an emotional itch.” Georgia State University

Cognitive Dissonance

“A man convinced against his will is of the same opinion still.”

Cognitive dissonance arises when individuals encounter information that conflicts with deeply held beliefs, creating psychological discomfort. Rather than revising beliefs, people often reject, reinterpret, or dismiss the new information to restore internal consistency. This response allows misinformation to persist even when contradictory evidence is widely available.

In politicised contexts, dissonance management often takes the form of conspiracy thinking or distrust of institutions. Information disorder benefits from this defensive reflex. Effective mitigation avoids forcing abrupt reversals and instead presents information incrementally, allowing belief adjustment without implying error, betrayal, or loss of identity.

Republican views on felons as president: “We can see cognitive dissonance and its effects at work when people rapidly “reason” in ways that are really attempts to mitigate their discomfort with new information about strongly held beliefs. For example, before Trump was convicted of various charges in 2024, only 17% of Republican voters believed felons should be able to be president; directly after his conviction, that number rose to 58%.” The Guardian

Commitment Effect

“It is easier to fool people than to convince them that they have been fooled.”

The commitment effect occurs when beliefs become more rigid after individuals publicly commit to them. Once a position is expressed, shared, or defended, social and reputational costs make reversal difficult. Admitting error becomes associated with weakness or loss of status, encouraging continued defence of false claims. In online environments, where statements are permanent and highly visible, entrenchment is intensified.

The perceived social or reputational costs of changing an opinion or view contribute to the persistence and proliferation of information disorder. Countering this effect requires creating socially acceptable pathways for revision, normalising updates and corrections, and reducing the stigma attached to changing one’s position in light of new information - for example, by avoiding pejorative language such as describing revisions as “backtracking” or “U-turns.”

Example: “No one enjoys being wrong. It’s an unpleasant emotional experience for all of us. The question is how do we respond when it turns out we were wrong—when there wasn’t enough milk left for coffee, when we hit traffic and missed the flight, or when we find out the man who sat in jail for five years based on our identification was innocent all along?” Psychology Today

Continued Influence Effect

“A falsehood is never falsehood enough if it has once been accepted as truth.”

The continued influence effect refers to the persistence of misinformation in reasoning even after it has been corrected. While individuals may acknowledge that a claim is false, they continue to rely on it as an explanatory anchor in their understanding of events.

Corrections that remove facts without replacing the narrative structure leave a cognitive gap that misinformation continues to fill. This makes debunking incomplete and fragile. Countering the effect requires substitution in addition to negation: providing an alternative explanation that is coherent, causal, and satisfying enough to displace the original false account.

Evidence of rationality in CIE: “The study tested whether the continued influence effect depends on source credibility. Across two experiments, participants showed lingering influence from retracted information when the original source was more credible than the correction. However, the effect also appeared when sources were equally credible, indicating memory-based error beyond rational source evaluation.” Hey et al

Exposure Effect

“A lie spoken often enough becomes the truth.”

The exposure effect describes how repeated exposure to a claim increases its perceived plausibility, regardless of accuracy. Familiarity is mistaken for truth. In high-volume media environments, false claims benefit simply from being repeated, including through attempts to debunk them. Over time, repetition erodes scepticism and lowers the threshold for belief.

Information disorder exploits this dynamic by prioritising saturation over persuasion. Repetition and sheer volume, rather than argument, drive adoption. Mitigation therefore requires limiting the unnecessary recirculation of false claims, even in corrective contexts, while consistently amplifying accurate information so familiarity accrues to reliable narratives rather than misleading ones. Detecting and disrupting mass posting by coordinated online actors - whose scale and synchronicity are designed to manufacture consensus and overwhelm organic debate - is a crucial step in doing so, as the sketch below illustrates.
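As an illustration only (not drawn from any study cited here), one crude signal of coordinated posting is many distinct accounts publishing near-identical text within a short window. The function name, field names, and thresholds in this minimal Python sketch are hypothetical assumptions, not a real platform’s detection logic:

```python
# Minimal illustrative sketch: flag clusters of accounts posting near-identical
# text within a short time window - a rough proxy for synchronised mass posting.
# All field names and thresholds are assumptions for demonstration purposes.
from collections import defaultdict
from datetime import datetime, timedelta

def find_coordinated_clusters(posts, window_minutes=10, min_accounts=5):
    """posts: list of dicts with 'account', 'text', and 'timestamp' (datetime).
    Returns clusters where at least min_accounts distinct accounts posted the
    same normalised text within window_minutes of each other."""
    by_text = defaultdict(list)
    for p in posts:
        # Normalise whitespace and case so trivial edits don't hide duplication.
        key = " ".join(p["text"].lower().split())
        by_text[key].append(p)

    clusters = []
    window = timedelta(minutes=window_minutes)
    for text, group in by_text.items():
        group.sort(key=lambda p: p["timestamp"])
        for i, start in enumerate(group):
            in_window = [q for q in group[i:]
                         if q["timestamp"] - start["timestamp"] <= window]
            accounts = {q["account"] for q in in_window}
            if len(accounts) >= min_accounts:
                clusters.append({"text": text, "accounts": sorted(accounts)})
                break  # one flag per distinct text is enough for this sketch
    return clusters

# Example usage with synthetic data: six accounts posting the same claim
# within six minutes would be flagged as a single cluster.
if __name__ == "__main__":
    now = datetime(2024, 1, 1, 12, 0)
    posts = [{"account": f"user{i}", "text": "The official story is a cover-up!",
              "timestamp": now + timedelta(minutes=i)} for i in range(6)]
    print(find_coordinated_clusters(posts))
```

Real coordinated-behaviour detection relies on far richer signals (account creation dates, follower graphs, fuzzy text similarity), but the underlying idea is the same: synchronicity and duplication at scale are unlikely to be organic.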

Prior exposure increases perceived accuracy of fake news: “Using actual fake news headlines presented as they were seen on Facebook, we show that even a single exposure increases subsequent perceptions of accuracy, both within the same session and after a week. Moreover, this “illusory truth effect” for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact checkers or are inconsistent with the reader’s political ideology.” Pennycook et al

Entrenchment Effect

“The difficulty lies, not in the new ideas, but in escaping from the old ones.”

People often persist with a view not because it is accurate or well-supported, but because it is familiar or comforting. Repeatedly holding, expressing, and defending a belief lowers the cognitive effort required to maintain it, while switching to a new view feels difficult, disruptive, or even upsetting. Over time, the belief becomes a mental default.

Information disorder benefits when a fondness for familiarity and comfort supersedes a desire for accuracy. Beliefs that are repeatedly held and expressed become cognitively effortless defaults, while revising them feels demanding and disruptive. Countering this effect requires interrupting repetition, reintroducing novelty and contrast, and ensuring accurate information is encountered consistently so familiarity accrues to reliable accounts rather than inherited assumptions.

Why humans prefer comfort over truth: “This is why beliefs often persist even when they are wrong. They do not survive because they are true. They survive because they work. They reduce anxiety. They strengthen identity. They create community. Once a belief becomes tied to safety or belonging, letting it go can feel like betrayal. Not just of an idea, but of a way of life.” Medium

Groupthink

“The pressure to conform is enormous.”

Groupthink arises when cohesion within a group suppresses dissent and critical evaluation. Members prioritise consensus and belonging over accuracy, allowing assumptions to go unchallenged and blind spots to harden. In online communities, this dynamic accelerates misinformation by marginalising corrective voices and rewarding conformity. Contradictory information is often framed as hostile or illegitimate, further insulating the group.

Information disorder thrives in closed systems with strong in-group identity. Mitigation depends on encouraging internal disagreement, protecting minority views, and structuring deliberation so that critique is treated as a contribution to collective accuracy rather than a sign of disloyalty.

The Power of Groupthink: “There’s a reason that ideas - even erroneous ones - catch fire on social media or in popular culture: groupthink. New research co-authored by Berkeley Haas Asst. Prof. Douglas Guilbeault shows that large groups of people all tend to think alike, and also illustrates how easily people’s opinions can be swayed by social media - even by artificial users known as bots.” Haas News

Mandela Effect

“Memory is deceptive because it is coloured by today’s events.”

The Mandela Effect describes cases where large numbers of people confidently remember events or details that are wrong. Because these memories feel vivid and are socially reinforced, they are hard to dislodge. In information environments, this blurs the line between mistake, myth, and invention, particularly when recollection is treated as evidence. Repeated misremembering can end up carrying more weight than records or documentation.

Countering the effect requires grounding claims in primary sources, contemporaneous records, and verifiable documentation, while reducing reliance on recollection as evidence in public discourse.

Origin of the Mandela Effect: “The term "Mandela Effect" was first coined in 2009 by Fiona Broome when she created a website to detail her observance of the phenomenon. Broome was at a conference talking with other people about how she remembered the tragedy of former South African president Nelson Mandela's death in a South African prison in the 1980s.

However, Nelson Mandela did not die in the 1980s in a prison - he passed away in 2013. As Broome began to talk to other people about her memories, she learned that she was not alone. Others remembered seeing news coverage of his death as well as a speech by his widow.” Very Well Mind

Normative Influence

“No one wants to be the odd one out.”

Normative influence occurs when individuals adopt beliefs or behaviours to align with perceived social norms rather than personal conviction. In digital spaces, visible metrics such as likes, shares, and comments distort perceptions of consensus. People may amplify misinformation not because they believe it, but because they think it is socially expected or rewarded. This accelerates spread through conformity rather than persuasion.

Information disorder exploits perceived majorities that are often illusory. Mitigation requires making accurate beliefs visibly common, reducing performative incentives, and highlighting the gap between vocal minorities and broader opinion.

Influence on misinformation: “In the misinformation realm, the normative signal associated with the endorsement of social-media messages has been found to influence false-message belief both before and after correction, and a social-norming intervention has been found to reduce belief in equivocal claims.” Nature

Online Disinhibition Effect

“Give a man a mask and he will tell you the truth.”

The online disinhibition effect describes how anonymity, distance, and lack of immediate consequences reduce social restraint. Disinhibition weakens norms of verification and accountability. Individuals are more likely to share false, extreme, or aggressive content online than offline. This lowers the cost of misinformation and increases its emotional intensity, making it more transmissible.

Information disorder flourishes where attribution is weak and consequences are minimal. Countermeasures include strengthening accountability signals, introducing friction that slows sharing, and applying moderation consistently, while preserving legitimate anonymity for users who rely on it for safety.

Coining of the term: “[The online disinhibition effect] was originally introduced by Dr. John Suler, a professor of psychology at Rider University in his 2004 paper, "The Online Disinhibition Effect," published in the journal CyberPsychology & Behavior. People had been noticing flaming and aggressive behavior on message boards for a while already, but Suler was the one who mapped out exactly why people behave so differently in cyberspace than in real life.” Wikipedia

Polarisation

“If you’re not with us, you’re against us.”

Polarisation refers to the sorting of individuals into opposing camps with increasingly rigid boundaries. In polarised environments, information is judged by who says it rather than whether it is accurate. This turns facts into identity markers and corrections into acts of hostility.

Information disorder benefits because false claims aligned with group identity are protected from scrutiny, while accurate information from outside the group is rejected. Reducing polarisation requires separating factual questions from moral or identity conflict and avoiding zero-sum framing that forces every issue into a binary choice.

US Politics: “In an ever more polarized American political climate, people are motivated to share news articles that are politically consistent with their partisanship, even when they recognize that the information comes from dubious sources and may not be true.” Brookings

Unconscious Bias

“We judge ourselves by our intentions and others by their behaviour.”

Unconscious bias consists of implicit assumptions that shape judgment without conscious awareness. These biases influence which sources are trusted, which stories seem plausible, and which claims are dismissed.

Information disorder exploits these tendencies by aligning false narratives with stereotypes or expectations, making them feel intuitively correct. Because the bias operates below awareness, it resists direct correction. Mitigation requires deliberate reflection on assumptions, exposure to diverse perspectives, and institutional audits that reduce reliance on intuition when evaluating information.

Impact of stereotypes on vulnerability to misinformation: “The study finds that age stereotypes increase vulnerability to misinformation: when information matches stereotypical expectations, it is processed more automatically, leading to poorer recognition, higher false memories, and more commission errors than when information violates age-based expectations.” Gulseren and Ikier

Zealot Effect

“The best lack all conviction, while the worst are full of passionate intensity.”

The zealot effect refers to the disproportionate influence of a small number of highly committed individuals within belief communities. These actors are resistant to evidence, dominate discussion, and set the tone for acceptable views. Their certainty and persistence crowd out moderates and create the illusion of widespread conviction.

Information disorder gains traction when debate is driven by those who are least open to or even immune from correction. Effective responses focus on disengaging from zealots and instead shaping the broader, more persuadable audience, reducing the visibility and leverage of uncompromising actors.

Vaccine hesitancy in BAME communities: “There is also a lot of misinformation, unfortunately some of it being fuelled from outside the country by some influential Christian leaders. There are medical concerns about the contents of the vaccines and then there are some religious concerns as well. We have been working on bringing correct information to our people in our community so that they can make informed decisions.” The Guardian
