On March 20, I gave a webinar, Increasing Critical Thinking to Understand and Respond to Climate Misinformation, for the Intermountain Sustainability Summit organized by Weber State University. The webinar ended up running for 1 hour 50 minutes with 200 attendees and a vibrant Q&A (the energy level of the Q&A is a good metric for the success of an event). Weber State has just uploaded a video of the full webinar.
I divided the webinar into three parts. First, I explained the psychological research into countering misinformation, landing on inoculation theory as a powerful solution. Second, I delved into a critical thinking method for deconstructing and analyzing misinformation, a step-by-step tool for identifying reasoning fallacies. Last, I previewed some of the latest research we’ve done at George Mason University testing different types of inoculating methods (in coming weeks, I’ll be blogging a series of posts delving into our different studies and the ways in which different inoculation techniques are effective).
After each section, I paused my presentation and answered questions. There were great questions (and hopefully useful answers from me). But even with the webinar stretching out to 1 hour 50 minutes, there still wasn’t enough time to answer them all. Before the webinar closed, I copied and pasted all the questions so I could answer them at my leisure in text form (not to mention add images and videos, which I couldn’t do during the webinar). So here are my answers to the questions I didn’t get to during the webinar:
Q: How much hate do you get from the denial people? Similar to Michael Mann, Naomi Oreskes and Ben Santer? (B Wolpensinger)
One principle I write about in Cranky Uncle vs. Climate Change is the 3rd Law of Science Communication (not an actual law, I just made it up for the book): for every impact, there is an equal and opposite pushback. This comes from personal experience – the more impactful my work, the more intense and virulent the pushback from deniers. Nowhere has this been more true than with our research finding 97% scientific consensus on climate change. This work has been attacked in hundreds of articles, blog posts, and videos, as well as personal attacks, FOI requests for my emails, congressional testimony, and so on. That said, the attacks I receive pale in comparison to the attacks that Mann, Oreskes, and Santer have received – a testament to the deep, lasting impact of their work.
Q: What do you believe youth generations could be doing more of when it comes to influencing climate policy and social perception of climate consensus? (Hailey Burton)
Voting 🙂
To flesh that out a little more, I think what all generations need to be doing more of is talking about climate change – with their friends and family, on social media, and to their elected officials. We need collective action and social momentum in order to generate the political will to transform our society to clean energy (and yes, voting is a powerful way to move us in that direction). We need to break climate silence.
Q: Misinformation and propaganda are identical and are used by many to influence us all. My concerns are with the logical end-game where assumptions are turned into facts. How can that be addressed? How can assumptions be clearly defined? (HarryHo)
Working with critical thinking philosophers Peter Ellerton and David Kinkead, we developed a critical thinking methodology for deconstructing and analyzing misinformation, scrutinizing claims for logical fallacies and false premises (or assumptions). I highly recommend reading our full (IMO quite readable) paper published in Environmental Research Letters. Also useful is my blog post applying this methodology to the myth that COVID-19 is not as bad as the seasonal flu. Or if you only have three minutes to spare, check out our concise and entertaining video introduction to our critical thinking methodology:
Q: One observation about Trump, leading climate deniers, or even our crazy uncles is that they speak with a lot of bluster and extreme confidence that many members of the public find seductive and persuasive. What are your thoughts on the underlying psychology that makes these misleading statements persuasive with a portion of the public, given they are delivered with such solid confidence by the denier speaker? (Brian Ettling)
People who speak with more confidence are more persuasive, and simple arguments are more persuasive than complex arguments. So when climate deniers offer simplistic, confident arguments while scientists speak in hedged language using terms like “uncertainty” and “error” (which have specific meanings in science that differ from how the public understands them), that’s a problem. Further, scientists live at the wild frontiers of scientific knowledge and tend to talk about what they’re interested in – the phenomena where scientific understanding is low – rather than the settled areas of science. That’s why it’s important that when scientists engage with the public, they lead with what we know before delving into their current research.
Q: Misinformation on climate change is produced from nearly all fallacies you mentioned. Difficult to inoculate against them all.  Which ones do inoculation work best on? (John Callahan)
Great question. I haven’t had the time (or funding) to extensively test inoculation against a wide array of fallacies, so I can’t yet answer that question. However, what’s exciting about the Cranky Uncle game is that we’ll be collecting a lot of data about inoculation against all the fallacies in the FLICC taxonomy. So hopefully later this year, we’ll have a clearer idea of which fallacies are easier to neutralize and which are more difficult (which will point us towards which parts of the game need to be tweaked to be more effective).
While we’re still developing the game, we’ve already collected pilot data from students playing a prototype. Even with a simple prototype, we found that the game increased students’ ability to identify a number of different fallacies (perusing the graph below, it looks like cherry picking and the single cause fallacy showed the least improvement, but take this preliminary data with a grain of salt).
Q: Just a comment – the balance of real facts and misinformation also applies in voting. Misinformation/negative ads tend to reduce voter engagement, which is why negative ads work. They lead to more people staying home, and just not voting. (Jennifer Roberts)
Misinformation not only targets people’s beliefs but can also try to influence their behavior. During the 2016 campaign, the Trump campaign boasted about targeting specific demographics like young liberals and African Americans with ads designed to demotivate them from voting. Subsequent voting participation was depressed among those demographics. However, inoculation can be applied to behavior as well as beliefs. Preemptively building awareness of the attempts that will be made to demotivate people from voting is crucial to nullifying these misinformation campaigns.
Q: Why is political ideology chosen vs. education level vs. other discriminate criteria? (HarryHo)
The reason I emphasize political ideology (and even more so, political affiliation) is that they are the strongest drivers of people’s beliefs about climate change. A meta-analysis – or survey of surveys – by Matthew Hornsey and colleagues found that political affiliation was by far the biggest determinant of people’s beliefs about climate change. Political ideology was a distant second, followed by education. Note in the figure below that age has a negative effect on climate beliefs. This means that the older people are, the less likely they are to accept the reality of climate change.
Q: There is a big informal effort going on now online, especially on social media, by experts and scientists to counteract misinformation. Do you have any practical suggestions on how to use the principle of inoculation in these scenarios? (Alison Palkhivala)
My practical suggestion is to call out denial techniques when you see them. If you spot a fallacy or denial technique, respond by explaining the facts, then explain how the myth distorts those facts. This is the structure I employ whenever responding to misinformation: lead with the facts, then introduce the myth, then explain the fallacy that the myth uses to distort the facts. I discuss this fact-myth-fallacy structure more in the third section of the webinar.
Q: Could you give an example for a good general approach to inoculation that is not restricted to a particular misinformation? (Marco Smolla)
The first time I tested inoculation in my PhD research, I inoculated participants against the technique of fake experts using the example of the tobacco industry. Then I showed them misinformation about climate change that used the same technique. I found that the inoculation was effective in neutralizing the misinformation, even though the inoculation never mentioned the specific climate myth. This is a phenomenon known as the “umbrella of protection”: inoculating people against a technique of denial in one topic conveys resistance across other topics.
I also use this approach in the Cranky Uncle game – using general, non-climate examples to illustrate fallacies used in climate misinformation:
Q: How can we use inoculation theory “in real life”, when people are stuck in echo chambers with others in their “tribes”? (Melanie)
This is a question I’ve been wrestling with over the last decade. I’ve been researching and refining my understanding of how best to inoculate people against misinformation, but how do you get those inoculating messages into echo chambers and siloed communities? There are probably many answers to this question, but the one I’m particularly excited about is gamification in classrooms. As I’ve talked to scientists and educators across the country about the Cranky Uncle game (which teaches critical thinking about climate change), I’ve been struck by the enthusiasm they’ve shown for incorporating the game into their classes. Educators are hungry for interactive, creative educational resources that engage their students. I’ve realized that classrooms may be the way to crack echo chambers, and that the combination of critical thinking, humor, and gamification in the Cranky Uncle game could be a powerful solution.
Q: What do you do with people who basically don’t believe all experts? I’m referring to conspiracy theorists here. Are they a lost cause? (Alison Palkhivala)
We talk about this in the Conspiracy Theory Handbook. They’re not an entirely lost cause and there is research into how to reach them. But it’s an uphill battle, and as I say about climate denial in general, if we have limited resources for communication and public engagement, our efforts are better spent on the 90% of the population who are at least open to evidence about climate change.
That said, the Conspiracy Theory Handbook proposes the following four strategies for potentially reaching conspiracy theorists:
Q: In my experience, much of the information exchange is brief and limited in scope. Disclaimers also tend to be necessarily brief and limited in these exchanges. How best to convey the inoculation in “twitter”-type exchanges? (Chris Sessions)
The fact-myth-fallacy structure is flexible and lends itself to a variety of formats. I’ve used it in a textbook about climate change. We used it in all our Denial101x videos. We’ve tested it in experiments involving Twitter responses to misinformation. I’m particularly drawn to infographics that concisely communicate the fact and the fallacy, such as the one I developed in response to the “COVID-19 isn’t as bad as the flu” myth:
Q: Is it important to tackle the drivers of misinformation (e.g. Heritage Foundation)? If so, any ideas?
Shining disinfecting daylight on the sources of denial is crucially important. Work like Naomi Oreskes and Erik Conway’s Merchants of Doubt, or Bob Brulle’s research into corporate funding of denial organizations, is invaluable. Those narratives play an important role in inoculating people against science deniers.
Taking a step back, there are three main methods of inoculating people against misinformation: source-based (exposing denialist organizations and individuals), logic-based (exposing rhetorical techniques), and fact-based (debunking with accurate information). My work has largely focused on logic-based inoculation. But spoiler alert: I have some research coming down the pipeline, which I’ll be blogging about soon, that directly compares fact-based and logic-based inoculation. There was a clear “winner” in this comparison, so stay tuned…
Q: Society has definitely been inoculated by Trump Fake News. What is the best way to address this in conversation? (Bonnie Bourgeous)
I don’t see an easy answer to the problem of one third of the country being inoculated against mainstream news. Trump’s brand of inoculation is a shotgun approach that breeds cynicism and distrust of all information sources. In contrast, the types of inoculation we test try to avoid increasing cynicism, taking a more surgical approach directed at specific denialist techniques.
We touch on different approaches in the Conspiracy Theory Handbook (see the graphic I posted above). I think two relevant ones are showing empathy and affirming critical thinking (or more generally, connecting with people via shared values).
Andrea
Thanks a lot for sharing your knowledge. In particular, I find your Conspiracy Theory Handbook very helpful for private discussions with friends who are not as educated on climate issues. In my experience, affirming critical thinking and empathy can help. However, I am looking for advice on hard-to-reach people who have given up on facts (they cannot distinguish true from false anyway) and dislike logic (they feel outsmarted). They argue with “gut feeling”, or treat feeling in general as the most valid approach. I wonder what percentage of people are susceptible to rational arguments, and what is the best approach to reach the others? As scientists, we are for good reason trained to avoid discussing feelings.
John Cook
At least from a climate change point of view, only 10% of the U.S. population are dismissive of climate science. Communicating scientific information that is perceived as threatening to dismissives is largely unproductive, or can even be counter-productive. That said, if scientific information is framed in a way that affirms a person’s values rather than threatens them – ideally delivered by a person who shares their values – you may make some progress. But it’s a tough proposition even in those ideal conditions!
But keep in mind that if only 10% are dismissive, that means 90% of the public are at least open to evidence. The purpose of the Cranky Uncle vs. Climate Change book is not to change Cranky Uncles’ minds – instead, it’s written for the other 90% of the population. Specifically, I’m reaching out to two audiences. One audience is the 32% of the population who are cautious, disengaged, or doubtful about climate change – my goal is to inoculate them against misinformation and move them towards the “concerned about climate change” category. The other audience is the 58% of the population who are concerned or alarmed about climate change – most of these people don’t talk about climate change with their friends and family, so it’s my hope that this book will empower the “climate concerned” to start talking about climate change and break climate silence.