by John Cook

Climate change is a more divisive issue than ever. The public’s beliefs about questions such as human-caused global warming have been drifting further apart for several decades. Democrats have become more convinced that climate change is real and needs solving, while Republicans are much less convinced. How do we solve this polarization problem?
It is impossible to properly address the problem without first examining the cause. In the late 1980s, climate change was a bipartisan issue, with George H. W. Bush pledging to counter "the greenhouse effect with the White House effect." But in the early 1990s, views on climate change began to align more closely with political beliefs.
This didn’t happen by accident. At the same time, conservative think tanks began publishing misinformation about climate science with the deliberate aim of polarizing the issue. They exaggerated scientific uncertainty, minimized the severity of climate impacts, and overstated the costs of climate policy. The strategy worked.
So a key driving factor behind public polarization about climate change is misinformation. Efforts to depolarize climate change without addressing the source unfortunately amount to tinkering at the edges of the problem.
How do we counter misinformation? This is a question I’ve researched over the last decade, exploring cognitive science and critical thinking for possible solutions. It turns out the answer comes from a branch of psychological research dating back to the mid-20th century, known as inoculation theory.
Inoculation theory borrows the logic of vaccination, in which resistance to a disease is built by exposing people to a weakened form of it, and applies it to knowledge. Decades of research have found that exposing people to a weak form of misinformation builds up immunity, so that when they encounter actual misinformation, they are less vulnerable to being misled.
What exactly is a weak form of misinformation? An inoculating message has two elements—warning of the risk of being misled, and counter-arguments explaining the techniques used to mislead. I’ve tested this approach in my own research, measuring whether it was possible to inoculate people against one of the most potent forms of climate misinformation: the Global Warming Petition Project.
The Petition Project is a website that lists 31,000 Americans with science degrees who have signed a statement asserting that humans aren’t disrupting the climate. The purpose of the website is to argue that there is no scientific consensus on human-caused global warming. But multiple studies, including my own research, have found overwhelming scientific agreement that humans are causing global warming, with studies converging on 97% consensus. So how do we reconcile the 97% consensus with the 31,000 dissenting petition signers?
The Petition Project uses the technique of fake experts—people who convey the impression of scientific expertise while not possessing any actual relevant expertise. Anyone with a science degree can sign the petition, so the list is populated with computer scientists, veterinary scientists, mechanical engineers, and practitioners of other fields unrelated to climate change. Less than 0.1% of the signatories have any expertise in climate science. The Petition Project is fake experts in bulk.
In an experiment, I showed participants an inoculating message before showing them content from the Petition Project website. In my inoculation, I explained the technique of fake experts but didn’t specifically mention the Petition Project. Rather, I used tobacco misinformation as an example of the fake expert strategy—the tobacco industry used this technique in advertising campaigns designed to confuse the public about the scientific consensus that smoking caused cancer.
My experiment showed that when people had the fake expert technique explained to them, the Petition Project no longer misled them. While it is one of the most potent forms of climate misinformation, it had no influence on people who were made aware of the fake expert strategy.
Most importantly, the inoculation worked across the political spectrum. The misinformation was neutralized for political conservatives as well as for political liberals. It turns out it doesn’t matter where you sit on the political spectrum—no one likes being misled.
So the answer to the public polarization about climate change is inoculation—once people are made aware of the techniques of polarizing misinformation, they are no longer vulnerable to being misled by that misinformation. The key to making the public resilient against misinformation is raising awareness of the techniques of science denial.
A useful framework for understanding denialist techniques is FLICC: an acronym for Fake experts, Logical fallacies, Impossible expectations, Cherry picking, and Conspiracy theories. These five techniques of science denial were first proposed by Mark Hoofnagle, a science blogger focused on countering health misinformation. He observed that these five techniques appear in any denialist movement that rejects a scientific consensus, whether it is the link between smoking and cancer, vaccination safety, or human-caused global warming.
We’ve already looked at the most prominent example of fake experts. Logical fallacies cover a range of different reasoning flaws such as red herrings, oversimplification, and false dichotomy. The most common logical fallacy in climate misinformation is jumping to conclusions. This is also known as a non sequitur, Latin for "it does not follow": the conclusion does not follow from the premises. For example, the argument "climate has changed naturally in the past so modern climate change must be natural" is a non sequitur, as the conclusion doesn’t necessarily follow from the premise. Just because climate change has been natural in the past doesn’t mean it must be natural now.
Impossible expectations involve demanding unrealistic levels of scientific proof. For example, a commonly heard argument is that scientists can’t accurately predict the weather next week, so how can they predict the climate decades from now? This argument conflates weather and climate, and thus places impossible expectations on computer models. While weather is chaotic and difficult to predict, climate is weather averaged over time and hence predictable. It may be impossible to predict a single coin toss, but we know that 1,000 coin tosses will yield close to 500 heads and 500 tails. Similarly, we may not know what the weather will be like on January 23 next year, but we know that on average, January will be cooler than June.
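The coin-toss analogy can be checked with a short simulation (a sketch, not part of the original article): a single toss is unpredictable, like next week's weather, but the average over many tosses is highly predictable, like climate.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

# A single coin toss is unpredictable, like tomorrow's weather...
one_toss = random.choice(["heads", "tails"])

# ...but the aggregate of many tosses is predictable, like climate.
n = 1000
heads = sum(random.random() < 0.5 for _ in range(n))

print(f"One toss: {one_toss}")
print(f"Heads in {n} tosses: {heads} ({heads / n:.1%})")
```

Run it with any seed and the heads count lands near 500, even though no individual toss can be foretold.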
Cherry picking looks at small pieces of the puzzle that paint a picture we want to see while ignoring the larger picture that tells an unwanted story. A common form of cherry picking is focusing on cold weather in order to cast doubt on global warming. Looking at the weather at a single time and place ignores what’s happening to the entire planet, which is showing a long-term warming trend.
Lastly, conspiracy theories are inevitable when a person denies a scientific consensus. How else do you explain why all the world’s scientists agree? The problem with conspiracy theories is that they are immune to facts—conspiracy theorists respond to any disconfirming evidence by expanding the theory to include the evidence. For example, when climate scientists’ emails were stolen and quote-mined to provide “proof” of conspiracy, a number of investigations were conducted into the scientists’ conduct. After each investigation concluded that there was no evidence of wrongdoing or conspiracy, climate deniers responded by claiming the investigators were part of the conspiracy.
These five techniques of science denial are found throughout the arguments deployed to mislead the public about climate change and cast doubt on human-caused global warming. Efforts to build a resilient public against climate misinformation require more than just explaining the science—we also need to explain how the science can get distorted. Being aware of denialist techniques is an important step to inoculating against misinformation—not just about climate change but science misinformation of any sort.

John Cook is a research assistant professor at the Center for Climate Change Communication at George Mason University. His research focus is on understanding and countering misinformation about climate change. In 2007, he founded Skeptical Science, which won the 2011 Australian Museum Eureka Prize for the Advancement of Climate Change Knowledge and the 2016 Friend of the Planet Award from the National Center for Science Education. He authored the book Cranky Uncle vs. Climate Change, which combines climate science, critical thinking, and cartoons to explain and counter climate misinformation. John moved from Australia to the U.S. in 2017, and continues to eat Vegemite every day for breakfast.