THE NEUROSCIENCE OF MISINFORMATION: How the Brain Processes Fake News
- Marcela Emilia Silva do Valle Pereira
- Nov 12

🧠 The Neuroscience of Misinformation
We live in an age where truth competes with speed.
Information arrives before verification, and the brain — wired to react faster than it reflects — becomes vulnerable to what feels true, not necessarily to what is true.
Misinformation doesn’t spread because people are naive.
It spreads because the human brain is efficient.
Evolution shaped us to conserve cognitive energy, trust familiar patterns and respond to emotional stimuli — precisely the fertile ground where fake news thrives.
Recent reviews in cognitive and social psychology show that believing misinformation isn’t just a matter of ideology, but of how we think: relying on mental shortcuts, trusting “gut feeling” over analysis, and mistaking familiarity for truth.
Neuroscience is beginning to reveal what happens when the brain is exposed to false information: circuits of emotion, memory and reward take over long before reason joins the conversation.
💭 The Brain That Wants to Believe

The human mind seeks coherence, not accuracy.
When information reinforces what we already believe, the brain releases dopamine — the same molecule associated with pleasure and learning.
That small chemical burst strengthens the sense of “this makes sense”, making the belief more resistant to later revision.
Research in the psychology of misinformation shows that it’s not only political content that matters, but also thinking style: people who rely less on analytical reasoning tend to believe and share more fake news, even when they have enough knowledge to know better.
Reviews such as Ecker et al. (2022, Nature Reviews Psychology) indicate that beliefs in misinformation are shaped by three main axes:
cognitive factors (intuitive vs. analytical thinking),
social factors (group norms, identity),
and affective factors (fear, anger, hope, outrage).
👉 Misinformation works because the brain wants to feel right — and wants to feel like it belongs.
🧬 The Anatomy of Misinformation

Three brain systems sit at the centre of believing and spreading false information:
Amygdala: responds to emotionally charged content — fear, anger, outrage. The more emotional the news, the stronger the engagement.
Prefrontal cortex: responsible for logical reasoning and impulse control. Under emotional or cognitive overload, it becomes partially inhibited — reducing critical thinking.
Hippocampus: stores memories but doesn’t distinguish fact from fiction. It records content according to emotional intensity, not accuracy.
Reviews on misinformation also describe the “continued influence effect”: even after correction, false information keeps influencing reasoning. The brain retains the original narrative as a “structure” and tries to fit the correction around it.
Even when debunked, fake news leaves cognitive traces — the brain doesn’t unlearn what it has already felt.
🧩 Memory and Reconstruction: When False Becomes Familiar

Each time we recall an event, the brain rewrites it — literally.
This process, known as memory reconsolidation, makes memories malleable and easily contaminated by new information.
Studies in memory show that after exposure to false news about recent events, a significant portion of people recall details that never occurred — as if they were personal memories.
That’s because the brain doesn’t seek precision, but narrative coherence. When something feels plausible and fits into our worldview, the brain “glues” it to real memories.
🪞 In neurobiological terms, fake news can turn into a legitimate memory — and that’s why correcting is not the same as erasing.
🌐 Bubbles, Polarisation and the Misinformation Environment

The brain doesn’t exist in a vacuum — it navigates informational environments designed to capture attention.
Social media and personalised feeds reinforce what we already think, creating what the literature calls selective exposure and ideological bubbles: we see more of what confirms our beliefs and less of what challenges them.
Studies such as Dominic Spohr’s (2017) discuss how fake news and ideological polarisation feed off one another:
algorithms favour emotional and polarising content,
users begin to consume increasingly homogeneous information,
and this further reinforces group identities (“us” vs. “them”).
From the brain’s point of view, it’s a perfect cycle of social reward + cognitive confirmation.
👵🧠 Age, Trust and Vulnerability to Misinformation

Misinformation doesn’t affect all age groups equally.
Reviews such as “Ageing in an Era of Fake News” (Brashier & Schacter, 2020) show that older adults tend to share more fake news than younger people — not due to lack of intelligence, but because of several combined factors:
greater interpersonal and institutional trust built over life;
changes in episodic memory that make it harder to recall where and how information was obtained;
growing familiarity with certain topics, which the brain misreads as a signal of truth.
This reinforces an important point: vulnerability to misinformation is not stupidity — it’s a by-product of how the brain ages, learns and trusts.
⚙️ The Social Reward Circuit

Misinformation isn’t just about believing — it’s about belonging.
Each like, share or comment that validates our opinion gives the brain a dopaminergic boost.
This reward cycle activates the same circuit involved in habits and behavioural addictions: dopamine, pleasure and repetition.
Studies on the spread of online content show that false news often travels faster and farther than true news — partly because it’s more emotional, more surprising and more aligned with group identity.
💡 Misinformation spreads like emotion, not like argument.
🧩 The Deceived Brain: Four Neurocognitive Mechanisms of Misinformation

Misinformation doesn’t spread just because of technology — it spreads because it exploits shortcuts in the brain’s architecture.
Some neural circuits that evolved to guarantee survival now work against us in the digital environment.
Here’s how four mechanisms combine to make falsehoods feel true:
⚡ 1. Selective Attention: the Emotional Trigger
The limbic system — especially the amygdala and nucleus accumbens — gives absolute priority to stimuli that provoke fear, anger or outrage.
🧠 In the savannah, ignoring a lion’s roar could be fatal.
Today, a headline like “Vaccine causes infertility in 5 years” activates the very same circuit.
📊 False, emotionally charged news is shared far more readily than neutral, true information: in one large-scale Twitter study, false news was about 70% more likely to be retweeted than true news (Vosoughi et al., Science, 2018).
💬 Neural translation: The brain pays more attention to what threatens than to what informs.
🧭 2. Confirmation Bias: the Invisible Filter
The prefrontal cortex can act as a biased editor.
It amplifies information that confirms our beliefs and suppresses or distorts what contradicts them.
🧠 Brain imaging shows that when people’s political beliefs are challenged by counterevidence, emotion-related regions such as the amygdala and insula become more active, and the stronger that activity, the less people change their minds (Kaplan et al., Scientific Reports, 2016).
📍 Result: You literally “see” what you already believe.
Biology shapes perception — and reinforces cognitive bubbles.
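The “biased editor” idea can be made concrete with a toy model of asymmetric belief updating: confirming evidence counts fully, while disconfirming evidence is discounted before it moves the belief. All parameters and numbers below are invented for illustration, not empirical estimates.

```python
# Toy model of confirmation bias as asymmetric evidence weighting.
# Illustrative parameters only — not a model of real neural data.

def update_belief(belief, evidence_supports, strength=0.2, discount=0.25):
    """Nudge a 0-1 belief up or down by one piece of evidence.

    Disconfirming evidence is multiplied by `discount`, so it moves
    the belief far less than confirming evidence of equal strength.
    """
    if evidence_supports:
        return min(1.0, belief + strength * (1 - belief))
    return max(0.0, belief - discount * strength * belief)

# Feed the model a perfectly balanced stream of evidence:
# half supports the belief, half contradicts it.
belief = 0.5
for supports in [True, False] * 10:
    belief = update_belief(belief, supports)

# Despite balanced evidence, the belief drifts upward,
# because disconfirming evidence was discounted at every step.
print(round(belief, 2))
```

The point of the sketch is that no single step looks unreasonable; the bias emerges from the *asymmetry* applied consistently over many updates.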
🔁 3. The Illusory Truth Effect: the Lie That Becomes True
After only a few exposures to the same piece of information, the brain begins to tag it as familiar — and therefore trustworthy.
Even when the content is labelled “false.”
📊 Fazio et al. (2015, Journal of Experimental Psychology: General) showed that repetition increases perceived truth even for statements that contradict what people already know.
💭 Mental meme: “I’ve seen this before” = “It must be true.”
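The familiarity shortcut can be sketched as a toy scoring model, assuming perceived truth blends actual evidence with felt familiarity that grows (with diminishing returns) on each repetition. The weights and formula are illustrative assumptions, not values fitted to experimental data.

```python
# Toy illustration of the illusory truth effect: "truthiness" mixes
# evidence with familiarity, and familiarity grows with repetition.
# Purely illustrative numbers — not a fit to experimental results.

def perceived_truth(evidence, exposures, familiarity_weight=0.15):
    """Return a 0-1 'truthiness' score.

    evidence:  0-1 score for how well the claim matches what we know.
    exposures: how many times we have seen the claim before.
    Each repetition adds familiarity, with diminishing returns.
    """
    familiarity = 1 - (1 - familiarity_weight) ** exposures
    # Familiarity is blended in regardless of the evidence — this is
    # exactly the shortcut that repetition exploits.
    return min(1.0, 0.7 * evidence + 0.3 * familiarity)

weak_claim = 0.2  # a claim we have good reason to doubt
for n in [0, 1, 3, 10]:
    print(n, round(perceived_truth(weak_claim, n), 2))
```

Even with the evidence held constant, the score rises with every exposure: in this sketch, repetition alone does the work that evidence should do.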
🔄 4. Neural Echo: Self-Reinforcing Bubbles
Recommendation algorithms on social media act as amplifiers of dopamine and synaptic reinforcement.
The more you consume X, the stronger the neural connections for X become — reducing curiosity and cognitive openness.
🧠 Modern version of Hebb’s rule: “Neurons that fire together, wire together... and isolate you.”
📉 Consequence: Reduced cognitive flexibility and difficulty changing one’s mind — a brain that repeats instead of reflects.
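The Hebbian feedback loop described above can be sketched as a small simulation, assuming a feed that mostly recommends whatever the reader already responds to most strongly. The topic names, learning rate and the 90% recommendation bias are all invented for illustration, not measured values.

```python
import random

# Toy Hebbian model of a recommendation feedback loop.
# Weights represent how strongly the "reader" responds to each topic;
# the feed mostly shows the strongest topic, and attention to a topic
# strengthens it (Hebb's rule: co-activation -> stronger connection).
# Illustrative sketch only — not a model of real neural data.

def simulate_feed(steps=200, lr=0.05, seed=42):
    random.seed(seed)
    weights = {"politics": 1.0, "science": 1.0, "sports": 1.0}
    for _ in range(steps):
        # The feed usually recommends the currently strongest topic...
        if random.random() < 0.9:
            shown = max(weights, key=weights.get)
        else:  # ...with occasional random exposure to something else.
            shown = random.choice(list(weights))
        # Hebbian update: attending to the shown topic strengthens it.
        weights[shown] += lr * weights[shown]
        # Normalise so total "attention capacity" stays fixed.
        total = sum(weights.values())
        weights = {k: 3 * v / total for k, v in weights.items()}
    return weights

final = simulate_feed()
print({k: round(v, 2) for k, v in final.items()})
```

Starting from perfectly equal weights, one topic ends up absorbing nearly all of the fixed attention budget — the “hyperconnected yet narrow” outcome, produced by nothing more than reinforcement plus biased exposure.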
✨ The end result is a hyperconnected yet cognitively narrow brain.
Misinformation doesn’t just deceive — it rewires the neural pathways of attention, memory and emotion.
🛡️ Applied Neuroscience: Training the Brain to Resist Misinformation

The good news is that neural plasticity also applies to how we think.
The brain can be trained to recognise patterns of emotional manipulation and respond consciously before believing or sharing.
🧘 1. Conscious Attention (Pause Before Reaction)
Breathing before reacting reactivates the prefrontal cortex and reduces amygdala dominance.
That micro-pause increases the chance of evaluating content — not just the emotion.
🧠 2. Train Constructive Scepticism
It’s not about doubting everything, but asking: Who is saying it? Why are they saying it? What do they gain from it?
This type of questioning strengthens metacognitive networks — the brain learns to observe its own thinking.
🌐 3. Diverse Exposure
Consuming varied sources and perspectives increases connectivity between the hippocampus (memory) and prefrontal cortex (reason), reducing confirmation bias.
Studies in political psychology and communication show that people who navigate more diverse informational environments tend to be less vulnerable to extreme and simplistic narratives.
🛡️ 4. “Prebunking” and Cognitive Inoculation
Rather than correcting after the fact, promising research shows it’s possible to “vaccinate” the brain against misinformation.
Work by Lewandowsky, Van der Linden and colleagues indicates that exposing people to weakened versions of manipulation techniques — while explaining how they work — creates a kind of cognitive antibody that makes them more resistant to future fake news.
Games and simulations like Cambridge’s Bad News Game are practical examples of this inoculation: users play the role of “disinformation creators” and, by understanding the tactics, become less vulnerable to them in real life.
🏢 In the corporate environment:
Forward-thinking companies are starting to integrate cognitive training in attention, critical analysis and media literacy into their learning programmes.
This improves decision quality, reduces bias and strengthens organisational trust — because teams that think better make fewer mistakes together.
Companies like Google, Deloitte and PwC are creating internal cognitive labs — short simulations, quizzes and quick dynamics — to train employees to detect emotional manipulation and cognitive bias in real time.
The goal is simple yet powerful: teach the brain to pause before reacting.
And it can be replicated at any scale:
use headlines or internal reports as examples;
gamify everyday decisions;
encourage “micro-cognitive pauses” in emails and meetings.
Small interventions, big shifts in collective clarity.
🌱 Conclusion — The Critical and Conscious Brain

The human brain is a meaning-making machine.
It connects, interprets and fills in gaps — which is why, sometimes, it believes before it understands.
But the same mind that can be deceived can also learn to doubt.
Neuroscience shows that doubt doesn’t weaken the mind — it strengthens it.
In the end, fighting misinformation isn’t just a matter of technology but of neuroeducation: learning to pause, analyse and choose what to believe is the true exercise of cognitive freedom.
✨ The future doesn’t belong to the brain that reacts, but to the brain that reflects.
📚 References
1. Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., ... & van der Linden, S. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1, 13–29.
🔗 https://www.nature.com/articles/s44159-021-00006-y
👉 Comprehensive review on how cognitive, social and affective factors shape belief in misinformation.
2. Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402.
🔗 https://www.sciencedirect.com/science/article/pii/S1364661321000516
👉 Analyses the mental mechanisms that make people susceptible to misinformation and explains how intuitive reasoning affects belief.
3. Lewandowsky, S., & van der Linden, S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 32(2), 348–384.
🔗 https://doi.org/10.1080/10463283.2021.1876983
👉 Introduces the concept of “cognitive inoculation”: training the brain to resist fake news before exposure.
4. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.
🔗 https://www.science.org/doi/10.1126/science.aap9559
👉 Demonstrates that false news spreads farther, faster and deeper than true news, driven partly by its novelty and emotional impact.
5. Kaplan, J. T., Gimbel, S. I., & Harris, S. (2016). Neural correlates of maintaining one’s political beliefs in the face of counterevidence. Scientific Reports, 6, 39589.
🔗 https://www.nature.com/articles/srep39589
👉 Shows that challenges to political beliefs engage emotion-related regions such as the amygdala and insula, and that stronger activity predicts less belief change.
6. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.
🔗 https://doi.org/10.1037/xge0000098
👉 Identifies the “illusory truth effect”: repeated information feels true, even when we know it’s false.
7. Spohr, D. (2017). Fake news and ideological polarization: Filter bubbles and selective exposure on social media. Business Information Review, 34(3), 150–160.
🔗 https://journals.sagepub.com/doi/10.1177/0266382117722446
👉 Analyses how algorithms and information bubbles reinforce polarisation and false beliefs.
8. Brashier, N. M., & Schacter, D. L. (2020). Ageing in an era of fake news. Current Directions in Psychological Science, 29(3), 316–323.
🔗 https://journals.sagepub.com/doi/full/10.1177/0963721420915872
👉 Explains why older adults are more vulnerable to misinformation — due to greater trust and reduced source memory.
9. Van der Linden, S., Roozenbeek, J., & Compton, J. (2020). Inoculating against fake news about COVID-19. Frontiers in Psychology, 11, 566790.
🔗 https://www.frontiersin.org/articles/10.3389/fpsyg.2020.566790/full
👉 Shows how games and simulations (like the Bad News Game) strengthen cognitive resistance.
10. Cambridge Social Decision-Making Lab (2019–2024). The Bad News Game Project. University of Cambridge.
🔗 https://www.cam.ac.uk/research/news/fighting-fake-news-with-fake-news
👉 Demonstrates that understanding manipulation tactics reduces susceptibility to fake news by up to 21%.


