Chris Mooney has been exploring the underpinnings of denialism lately, and his latest article is a good summary of the basic problems:
In a recent study of climate blog readers, Lewandowsky and his colleagues found that the strongest predictor of being a climate change denier is having a libertarian, free market world view. Or as Lewandowsky put it in our interview, “the overwhelming factor that determined whether or not people rejected climate science is their worldview or their ideology.” This naturally lends support to the “motivated reasoning” theory—a conservative view about the efficiency of markets impels rejection of climate science because if climate science were true, markets would very clearly have failed in a very important instance.
But separately, the same study also found a second factor that was a weaker, but still real, predictor of climate change denial—and also of the denial of other scientific findings such as the proven link between HIV and AIDS. And that factor was conspiracy theorizing. Thus, people who think, say, that the Moon landings were staged by Hollywood, or that Lee Harvey Oswald had help, are also more likely to be climate deniers and HIV-AIDS deniers.
This is similar to what we’ve been saying for years. Ideology is at the heart of antiscience (yes, even liberal ideology), and when it conflicts with science it renders the ideologue incapable of rationally evaluating the facts. The more extreme the ideology, the more likely and more severe the divergence from science. Then there is the separate issue of cranks, who have a generalized defect in their reasoning abilities, are generally incompetent at recognizing bad ideas, often believe conflicting theories simultaneously, and are given to supporting any other crank who they feel is showing that science is somehow fundamentally wrong. This is the “paranoid style”; it is well described and likely irreversible. More run-of-the-mill denialism, however, should be preventable.
We’ve discussed this extensively with regard to research by Dan Kahan, although I have disagreed with the jargon of “motivated reasoning.” Chris, however, knows what they’re referring to with their fancified science-speak: ideology is at the root of denial.
Recognizing that the problem of anti-science is not one of a lack of information, or of education, or of framing is of paramount importance. This is a problem with humans. This is the way we think by default. People tend to arrive at their beliefs based on things like their upbringing, their religion, their politics, and other unreliable sources. When opinions are formed based on these deeply-held beliefs or heuristics, all information subsequently encountered is either used to reinforce the belief or is ignored. This is why studies show that education doesn’t work: the more educated partisans are on a topic, the more entrenched they become. You can’t inform or argue your way out of this problem; you have to fundamentally change the way people reason before they form these fixed beliefs.
Scientific reasoning and pragmatism are fundamentally unnatural and extremely difficult. Even scientists, when engaged in a particularly nasty internal ideological conflict, have been known to deny the science. This is because when one’s ideology is challenged by the facts, the result is in essence an existential crisis. The facts become an assault on the person themselves, their deepest beliefs, and how they perceive and understand the world. What is done in this situation? Does the typical individual suck it up and change, fundamentally, who they are as a person? Of course not! They invent a conspiracy theory as to why the facts have to be wrong. They cherry-pick the evidence that supports them, believe any fake expert who espouses the same nonsense, and always demand more and more evidence, never being satisfied that their core beliefs might be wrong. This is where “motivated reasoning” comes from. It’s a defense of the self from the onslaught of uncomfortable facts. Think of the creationist confronted with the fossil record, molecular biology, geology, physics, and half a dozen other scientific fields: are they ever convinced? No, because it’s all an atheist conspiracy to make them lose their religion.
How do we solve this problem?
First we have to recognize it for what it is, as Mooney and others have done here. The problem is one of human nature. Engaging in denialism doesn’t have to mean you’re a bad person, or even that you’re being purposefully deceptive (although there are those who have that trait); the comparison to Holocaust denial, always a favorite straw man of the denialist, is not apt. Denialism in most people is a defense mechanism that protects their core values from being undermined by reality. And no matter what your ideology, at some point you will come into conflict with the facts, because no ideology perfectly describes or models all of reality; that holds no matter where you are on the ideological spectrum. The question is, what will you do when that conflict arises? Will you entrench behind a barrier of rhetoric, or will you accept that all of us are flawed, and that our beliefs at best can only provide an approximation of reality – a handy guide but never an infallible one?
Second, we have to develop strategies to prevent ideological reactions to science and to recognize when people are reacting irrationally to an ideological conflict with science. One of my commenters pointed me to this paper, which describes an effective method to inoculate people against conspiratorial thinking. Basically, if you warn people ahead of time about conspiratorial craziness, they will be more likely to evaluate the claims of conspiracists with greater skepticism. We should encourage skeptical thinking from an early age, and specifically educate against conspiratorial thinking, which is a defective mode of thinking designed to convince others to act irrationally (and often hatefully). When we do see conspiracy theorizing, we shouldn’t dismiss it as harmless: the claims need to be debunked, and the purveyors of conspiracy theories opposed and mocked. Before anyone ever reads a line of Alex Jones or Mike Adams, training in skepticism could provide protection, and with time the paranoid style will hold less and less sway. People primed to expect conspiratorial arguments will be resistant, and more skeptical in general. The Joneses, Moranos, and Adamses of the world don’t have the answers; they know nothing, and their mode of thought isn’t just wrong but actively poisonous to rational thought. As skeptical writers we should educate people in a way that protects them from their inevitable encounter with such crankery. This is why writers like Carl Sagan, with his (albeit incomplete) Baloney Detection Kit, are so important. He knew that you have to prepare people for their encounters with those who have an ideological agenda, who will bend the truth and deny the science for selfish reasons.
This is what is at the heart of true skepticism. First, understanding that you can be wrong, that in fact you will often be wrong, and that all you can do is follow the best evidence you have. It’s not about rejecting all evidence, or the inaction that follows from the constantly-moved goalposts of the fake skeptics. It’s about pragmatism, thoughtfulness, and above all humility toward the fact that none of us has all the answers. Second, it’s understanding that not all evidence is created equal. Judging evidence and arguments requires training and preparation, because recognizing high-quality evidence and rational argument is not easy. In fact, most people are woefully under-prepared by their education to do things like read and evaluate scientific papers, or even to judge scientific claims from media sources.
Thus I propose a new tactic. Let’s get Carl Sagan’s Baloney Detection Kit into every child’s hands by the time they’re ten. Hell, it should be part of the elementary school curriculum. Let’s hand out books on skepticism like the Gideons hand out Bibles. Let’s inoculate people against the bullshit they’ll invariably contract by the time they’re adults. We can even run tests to see what type of skeptical inoculation works best at protecting people from anti-science. It’s a way forward to make some progress against the paranoid style and the nonsense beliefs purveyed by all ideological extremes. There is no simple cure, but we can inoculate the young, and maybe control the spread of the existing disease.