At the same time, the more we see something repeated, the more likely we are to believe it to be true. This “illusory truth effect” arises because we use familiarity and ease of understanding as a shorthand for truth – the more something is repeated, the more familiar and fluent it feels whether it is misinformation or fact.
“There is typically only one true version of a claim and an infinite number of ways you could falsify it, right?” says Nadia Brashier, a psychology professor at Purdue University who studies why people fall for fake news and misinformation. “So, if you hear something over and over again, probabilistically, it’s going to be the true thing.”
But these shortcuts do not work so well in our current political environment and on social media, where falsehoods can be repeated and amplified. One study found that even a single exposure to a fake headline made it seem truer. Politicians often repeat lies and seem well aware of the power of the illusory truth effect, Brashier says.
We are also more susceptible to misinformation that fits into our worldviews or social identities, and we can fall into confirmation bias, which is the tendency to look for and favour information fitting what we already believe.
False stories and emotionally driven examples are easier to understand and more immersive than statistics. “We are navigating this new world of numbers and probabilities and risk factors,” Walter says. “But the vessel that we use, our brain, is very old.”
Once we have heard misinformation, it is hard to uproot even when we want to know the truth. Multiple studies have found that misinformation can still influence our thinking even if we receive a correction and believe it to be true, a phenomenon known as the “continued influence effect.”
In a meta-analysis aggregating the results of 32 studies covering more than 6,500 people, Walter found that correcting falsehoods reduces – but does not entirely eliminate – the effect of misinformation.
One of the biggest barriers to correcting misinformation is the fact that hearing the truth doesn’t delete a falsehood from our memory.
Instead, the falsehood and its correction coexist and compete to be remembered. Brain imaging studies conducted by Lewandowsky and his colleagues found evidence that our brains store both the original piece of misinformation, as well as its correction.
“It seems to be cognitively almost impossible to listen to something, understand it and, at the same time, not believe it,” Lewandowsky says.
Dismissing misinformation requires a whole extra cognitive step of tagging it as false in our memory. “But by that time, in a sense, it’s too late because it’s already in your memory,” Lewandowsky says.
Over time, our memory of the fact-check may fade, leaving us only with the misinformation.
There is evidence that “we’re running up against basic limitations of human memory when we’re giving people corrective information,” Brashier says.
Finally, correcting misinformation is even more challenging if it is embedded into our identity or system of belief. People build mental models of the world to make sense of unfolding situations and “it’s very difficult to rip out a plank of this edifice without the whole thing collapsing,” Lewandowsky says.
“If it is an important component of your mental model, it is cognitively very difficult to just yank it out and say it’s false.”
There is so much misinformation out there that it is not feasible to react to each new falsehood that arises. “It’s like playing a game of whack-a-mole. You can be very good, but at the end, the mole always wins,” Walter says.
Debunking alone is not enough to combat misinformation – we also need to be proactive by “prebunking,” which essentially means preparing our brain to recognise misinformation before we encounter it. Much like the way a vaccine primes your immune system to battle a foreign invader, prebunking can inoculate and strengthen your psychological immune system against viral misinformation.
In one study from this year, Lewandowsky and colleagues presented almost 30,000 people across seven experiments with five short videos about common manipulation techniques – incoherence, false dichotomies, scapegoating, ad hominem attacks and emotionally manipulative language. Each video provided a warning about the impending misinformation attack and manipulation technique before presenting a “microdose” of misinformation.
The study found that watching these videos could make us more sceptical of falsehoods in the future.
Another way to protect yourself is to simply pay attention to whether what you are seeing is accurate. When people scroll through their social media feeds, they aren’t always thinking about accuracy. One recent study found that subtly nudging people to consider whether what they see is accurate made them less likely to share misinformation.
“All of us can fall for misinformation,” Brashier says. “I’ve fallen for false stories myself even though this is what I study.”