How do we defend ourselves against fake news? To neutralize misinformation, we first need to understand how misinformation does damage. I find particularly instructive a research paper by Sander van der Linden and colleagues (including Mason’s own Ed Maibach) that tested what happens when people are exposed to different messages.
One group was told about the 97% scientific consensus on human-caused global warming. A second group was shown misinformation casting doubt on the consensus (specifically the Global Warming Petition Project, which lists 31,000 dissenting science graduates). A third group was shown both messages. The following graph shows the participants' change in perceived consensus (a good measure of climate perceptions) in response to these messages:
The first group, given the facts, increased their acceptance of climate change. The second group, given the misinformation, decreased their acceptance. But the third group, given both fact and myth, showed no change in belief. Fact and myth cancelled each other out.
This highlights one of the most insidious and dangerous effects of misinformation: it cancels out accurate information. When people are exposed to conflicting messages and have no way of resolving the conflict, there's a danger that they disengage and believe neither.
But this insight also points to the solution to misinformation. To neutralize misinformation, we need to help people resolve the conflict between fact and myth. If people understand the techniques used by misinformation to distort the facts, they can make sense of conflicting information.
Enter inoculation theory. This is a branch of psychological research that takes the principles of vaccination and applies them to knowledge. When we're exposed to a weakened form of a disease, we build up immunity so that we don't get infected by the real disease. Similarly, when we're exposed to a weakened form of misinformation, we build up resilience so that we don't get misled by real misinformation.
How do you deliver misinformation in weakened form? An inoculating message contains two elements: a warning of the threat of being misled, and counterarguments explaining how the misinformation misleads. The fourth group in the figure above was shown an inoculating message, which explained several different ways that the Global Warming Petition Project was misleading. When participants were inoculated, the misinformation was mostly neutralized.
Coincidentally, while Sander van der Linden (a name that is a lot of fun to say out loud) was leading this research, I was running very similar research in Australia. I even used the same misinformation – the Global Warming Petition Project. However, there was one major difference between Sander’s inoculating message and mine. I never mentioned the Petition Project. Instead, I focused on a more general inoculation that explained the technique of fake experts: people who conveyed the impression of being a scientific expert without possessing the relevant expertise. I also used a tobacco advertisement as an example of the fake expert strategy. Here’s an excerpt:
My general form of inoculation completely neutralized the misinformation (the blue line in the figure below). This means that inoculating people against a denial technique in one topic has the potential to neutralize that form of misinformation in other topics as well. Inoculation researchers call this the "umbrella of protection." I also found that my inoculating message neutralized the misinformation across the political spectrum. Whether people are politically conservative or liberal, no one likes being misled. Inoculation neutralizes the polarizing effect of misinformation.
While there are a variety of technological and social media solutions to the problem of fake news, the more I research this issue, the more I become convinced that increasing public resilience is the most effective long-term solution to misinformation. And the key to building resilience is inoculation.
After I had stumbled upon inoculation as the answer to misinformation, my next research question was how to put this into practice. I’ll explore that in subsequent posts…
Stephen Verchinski
Good article. Thanks for sharing.
Robin
John – excellent article. I caught your online lecture last Friday which also covered this topic. Thanks so much.
I’m fascinated with this “inoculation” method. Where I live there are elected officials who are climate science deniers, as well as op-eds and letters published by deniers in our local papers. We have a nice set of scientists and citizens who reliably provide rebuttals, but I think the science bogs down the response and people are left confused. I think we need to go on the offensive and provide some inoculation articles to the papers, using some of our local scientists as trusted messengers along with your techniques. I cannot wait to read about how to do this in a future blog post and get to work on this in my region.
p.s. I LOVE Cranky Uncle vs. Climate Change! It’s so nice to look up the myths I hear and identify the fallacy.
crankyuncle
In an ideal world, we could inoculate people with science & critical thinking, but often we’re in a situation where people don’t have the bandwidth, or other barriers hinder them from paying attention to scientific information. That’s why “logic-based inoculation” – which explains the misleading techniques used in misinformation – is powerful. First, because it can inoculate people against a myth without having to go into all the scientific details. Second, because logic-based inoculation conveys an “umbrella of protection”, building resilience against the same misleading techniques used in other topics. Third, because it can be communicated in the form of parallel arguments, which as cartoons can be entertaining and engaging as well as pedagogical.