What is denialism?
Denialism: the employment of rhetorical tactics to give the appearance of argument or legitimate debate where in actuality there is none. These false arguments are used when one has few or no facts to support one's viewpoint against a scientific consensus or against overwhelming evidence to the contrary. They are effective at distracting from useful debate with emotionally appealing but ultimately empty and illogical assertions.
Examples of common topics in which Denialists employ their tactics include: Creationism/Intelligent Design, Global Warming Denialism, Holocaust Denial, HIV/AIDS Denialism, 9/11 conspiracies, tobacco carcinogenicity denialism (the first organized corporate campaign), anti-vaccination/mercury-autism denialism, and anti-animal-testing/animal rights extremist denialism. Denialism spans the ideological spectrum, and is about tactics rather than politics or partisanship.
We believe there are five simple tactics by which denialist arguments can be identified. Most denialist arguments will incorporate more than one of the following: Conspiracy, Selectivity, False Experts, Impossible Expectations/Moving Goalposts, and Argument from Metaphor/violations of informal logic. Adapted, with permission, from Give Up Blog's post.
Conspiracy:
Suggesting that scientists have some ulterior motive for their research, or that they are part of some conspiracy. The most basic example of this lie is to say that if scientists discovered contrary findings they would lose their funding. The most severe is to suggest that scientists are engaged in some kind of elaborate "cover-up," or that they are part of the Zionist conspiracy against the Aryan race. Whatever the form, it amounts to the same thing.
Response: These criticisms reflect a total ignorance of how science, especially academic science, works from a practical standpoint. Not only do scientists love to discover things that run contrary to expectations and publish them, but it is precisely the exceptional results that generate the greatest interest (although they also require a higher degree of skepticism). The papers published in Nature and Science aren't just essays saying "everything is fine." They are often revolutionary (and sometimes incorrect) papers describing unusual findings, powerful new methods, or major coups of scientific diligence and work. Funding, while often awarded to projects that don't take huge risks, is also heavily based on novelty, not on maintaining some kind of party line. Further, the idea that scientists would ever work together in uniform to suppress some piece of information is absurd. Scientists are in competition with each other, and if something is being kept quiet by a group, it is usually only because they want to publish it first and their competitors would love to beat them to it. Science is simply incompatible with keeping secrets or maintaining conspiracies, and to any actual scientist the suggestion is laughable.
Selectivity:
Denialists will often cite a single critical paper supporting their idea, or famously discredited or flawed papers, to make a field look as though it is based on weak research. Quote mining is another example of "selective" argument: by using a statement out of context, just as with papers or data taken out of context, they are able to sow confusion.
Response: I've noticed this is common among the HIV/AIDS denialists (who have a discredited paper from 1987 they like to wave around, and who pick on Gallo for fudging the initial identification of HIV), but it is also a big thing among global warming deniers, as described in the Guardian article. Some creationists, like Jonathan Wells, particularly enjoy using examples of failed ideas once offered in support of Darwinian evolution (like Haeckel's embryos) to suggest that the tens of thousands of other papers on the subject, and the entire basis of genetics, biology, and biochemistry, are wrong. The biggest problem here is that science doesn't "purge" the literature when such things are proven false; they stay there forever. It is up to the researcher to read more than the papers that support a foregone conclusion; they have to develop a theory that incorporates all the data, not just the data they like.
The fake expert:
A bought scientist, or a scientist or expert from an unrelated field, put forward to say that their data, lack of data, proven-flawed data, or expert opinion disproves the validity of the entire field.
Response: The global warming denialists have the greatest amount of money invested in the fake-expert strategy, but they all use this tactic to some degree. Note that creationists and other anti-science types will particularly line up behind MDs to support their crap, because a lot of doctors graduate in this country, and even though they technically have a degree in science, many have never actually done research themselves, and it's never too hard to find some quack with an MD to back up your line of bullshit. I would point you, for example, to the President's Council on Bioethics, which is full of MDs chosen for their ideological slant, with no real scientific legitimacy (Krauthammer being the most glaring example). I'm not maligning MD researchers, who do exist, but this is a strategy used to give a patina of legitimacy to otherwise laughable ideas.
Impossible expectations/Moving Goalposts:
The use of the absence of complete and absolute knowledge to prevent implementation of sound policies, or acceptance of an idea or a theory. It's a little like the argumentum ad ignorantiam, but more sinister. Basically, the suggestion is made that until a subject is understood completely and totally (usually requiring a level of knowledge found only in deities), no action can reasonably be taken.
Response: This is a big one with global warming deniers. To state the problem metaphorically, it's like saying that until you've figured out the exact momentum, moment of inertia, time dilation, length contraction, and relativistic position, in several reference frames, of a car that is speeding toward you, you shouldn't jump out of the way. Since global warming is very complicated, they use this mixed appeal to ignorance and inaction to suggest that until we understand climate 100%, we should do nothing. Never mind that this is impossible; that is the expectation. A reasonable person would instead suggest that once you have enough data to indicate that a change of behavior or policy is warranted, it would be prudent to take that data under consideration and change things before we're all under water. You don't need to know the position of every molecule in the galaxy before deciding you need to jump out of the way of a speeding train. Likewise, we don't need a perfect model of the earth's climate to understand that all the current data and simulations suggest decreasing carbon output is of critical importance right now, not when humans have attained some imaginary scientific nirvana.
The logical fallacy
The fallacies usually used are metaphor/argument from analogy, appeals to consequence, straw men, or red herrings. The metaphor, as I hope I've demonstrated, is a useful tool in language for communicating ideas in common-sense terms. However, it isn't an argument in and of itself. Denialists will often use argument from metaphor or analogy to suggest that scientific data are wrong. For example, creationists will argue by metaphor that saying natural selection led to humans is like saying you could probably assemble a jumbo jet that could fly simply by shaking the constituent parts in a box for 5 billion years. Or that a mousetrap is too complex to have evolved, because if a single part were missing it wouldn't work.
Response: I'm not purposefully setting up a straw man here; this type of argument from false analogy is incredibly common, as are other classic logical fallacies. One could argue many things, but it would be a waste of time, because the situations described are silly and have nothing to do with human evolution. The analogies ignore the nature of evolution, suggest it is just totally random, ignore natural selection as the mechanism of evolution, ignore basic biology, and create a totally artificial point of reference for a biological discussion. In short, the metaphors have nothing to do with biology or evolution, but they are confusing, and on the surface their logic sounds correct to many laymen. These are a hallmark of the "irreducible complexity" arguments of the creationist denialists, but other denialists have similar appeals to metaphor. Irreducible complexity arguments are all based on metaphors, while data from siRNA, knockout mice, humans with silent genetic defects, etc., indicate that cells and biological organisms are not irreducibly complex, and often can operate and adapt with less than a full complement of their ideal genetic code. There are quite a few gene-knockout mice in which no phenotype has been observed, and as anyone who has knocked out genes in cells with siRNA can tell you, an effect is no guarantee. Cells adapt to a number of situations, and not all genes are required for healthy, viable offspring. Science is not about who has the best metaphor that makes the most sense to good ol' common folk. Data trumps metaphors every time.
Recognizing these tactics is the first step towards debunking or just outright dismissing these dismal and distracting arguments that detract from legitimate debate and sow confusion about scientific fact.