How will we ever know the truth about 4-29? I say it was a conspiracy to undermine 9-11 truth by showing that fuel from a tanker truck could actually melt steel and cause a freeway to collapse. Initial photos from the site raise lots of questions.
I’m going to be less active for a few days. Going to Montreal (for the first time) for the 17th Conference on Computers, Freedom, and Privacy. I’ll be moderating a panel on the new landscape of online advertising, featuring Microsoft’s Kim Howell, the Center for Digital Democracy’s Jeff Chester, and Mike Zaneis of the Interactive Advertising Bureau. There may be some denialism afoot, in which case I’ll project a card or two on the screen.
Anyone have any restaurant suggestions?
Yesterday, I discussed how “no problem” is a chorus in denialist rhetoric. But sometimes something bad has happened, and it’s more or less impossible to say “no problem” with a straight face. What can a denialist do?
Continue reading “Denialists’ Deck of Cards: The 2 of Hearts, “Bad Apples””
Some might wonder why I include some right-wing “family” organizations on the list of denialists. It’s simple. In their efforts to oppose all forms of contraception, they routinely lie about the science behind the efficacy of condoms for STD prevention (just like HIV/AIDS denialists) and the efficacy of contraception, as well as about the social effects of contraception, such as the falsehood that contraceptive availability leads to promiscuity and higher STD transmission.
Take, for instance, the Family Research Council on emergency contraception.
(republished from denialism.com – this was too good an example to pass up)
*Update* Calladus has a good overview of their “research” into the efficacy of abstinence education. What kind of family value is lying anyway?
Continue reading “Selectivity from the Family Research Council”
For our next installment of the big five tactics in denialism we’ll discuss the tactic of selectivity, or cherry-picking of data.
I’m very proud to be on Scienceblogs with Mark, and for my first posts, I’m going to be introducing the Denialists’ Deck of Cards, a humorous way to think about rhetorical techniques that are used in public debate. Those who pay attention to consumer protection issues, especially in product safety (especially tobacco, food, drugs), will recognize these techniques. The goal of classifying them in this way is to advance public understanding of how these techniques can be used to stifle reform in consumer protection or on other issues. So, the Denialists’ Deck is extremely cynical. But it is a reflection of and reaction to how poor the public policy debates in Washington have become.
Continue reading “Denialists’ Deck of Cards: The 2 of Clubs, “No Problem””
A crank is defined as a man who cannot be turned.
– Nature, 8 Nov 1906
Here at denialism blog, we’re very interested in what makes people cranks: not only how one defines crankish behavior, but literally how people develop unreasonable attitudes about the world in the face of evidence to the contrary. Our definition of a crank, loosely, is a person who holds unreasonable ideas about established science or facts and will not relent in defending their own, often laughable, version of the truth. Central to the crank is the “overvalued idea”: some idea they’ve incorporated into their world view that they will not relinquish for any reason. Common overvalued ideas that are a source of crankery range from bigotry, antisemitism (Holocaust deniers), biblical literalism (creationists, especially YECs), and egotism (as it relates to the complete unwillingness to ever be proven wrong) to an indiscriminate obsession with possessing “controversial” or iconoclastic ideas. Some people just love believing things that no one in their right mind does, out of some obscure sense that it makes them seem smart or different.
Continue reading “Unified theory of the crank”
Three can keep a secret if two are dead.
What are denialist conspiracy theories and why should people be instantly distrustful of them? And what do they have to do with denialism?
Continue reading “Conspiracy”
Hello and welcome to denialism blog.
Here we will discuss the problem of denialists, their standard arguing techniques, how to identify denialists and/or cranks, and discuss topics of general interest such as skepticism, medicine, law and science. I’ll be taking on denialists in the sciences, while my brother, Chris, will be geared more towards the legal and policy implications of industry groups using denialist arguments to prevent sound policies.
Continue reading “Hello Scienceblogs”
What is denialism?
Denialism: the employment of rhetorical tactics to give the appearance of argument or legitimate debate, when in actuality there is none. These false arguments are used when one has few or no facts to support one’s viewpoint against a scientific consensus or against overwhelming evidence to the contrary. They are effective at distracting from actual useful debate with emotionally appealing but ultimately empty and illogical assertions.
Examples of common topics in which denialists employ their tactics include: Creationism/Intelligent Design, global warming denialism, Holocaust denial, HIV/AIDS denialism, 9/11 conspiracies, tobacco carcinogenicity denialism (the first organized corporate campaign), anti-vaccination/mercury-autism denialism, and anti-animal-testing/animal rights extremist denialism. Denialism spans the ideological spectrum, and is about tactics rather than politics or partisanship.
We believe there are five simple guidelines for identifying denialist arguments. Most denialist arguments will incorporate more than one of the following tactics: Conspiracy, Selectivity, False Experts, Impossible Expectations/Moving Goalposts, and Argument from Metaphor/violations of informal logic. Adapted from Give Up Blog’s post with permission.
Suggesting that scientists have some ulterior motive for their research, or that they are part of some conspiracy. The most basic example of this lie is to say that if scientists discovered contrary findings they would lose their funding. The most severe example is to suggest that scientists are engaged in some kind of elaborate “cover-up,” or that they are part of the Zionist conspiracy against the Aryan race. Whatever the form, it amounts to the same thing.
Response: These criticisms reflect a total ignorance of how science, especially academic science, works from a practical standpoint. Not only do scientists love to discover things that run contrary to expectations and publish them, but it is precisely the exceptional results that generate a great deal of interest (although they also require a higher degree of skepticism). The papers published in Nature and Science aren’t just essays saying “everything is fine.” They are often revolutionary (and sometimes incorrect) papers describing unusual results, powerful new findings, or things that represent a major coup of scientific diligence and work. Funding, while often awarded to projects that don’t take huge risks, is also heavily based on novelty, not on maintaining some kind of party line. Further, the idea that scientists would ever work together in uniform to suppress some piece of information is laughable. Scientists are in competition with each other, and if something is being withheld by a group, it is usually only because they want to publish it first, and their competitors would love to beat them to it. Science is quite incompatible with keeping secrets or maintaining conspiracies, as any actual scientist can tell you.
Denialists will often cite a single critical paper supporting their idea, or famously discredited or flawed papers meant to make the field look like it’s based on weak research. Quote mining is also an example of “selective” argument: by using a statement out of context, just as with papers or data taken out of context, they are able to sow confusion.
Response: I’ve noticed this is common among the HIV/AIDS denialists (who have a discredited paper from 1987 they like to wave around, and who pick on Gallo for fudging the initial identification of HIV), but it is also a big thing among global warming deniers, as described in the Guardian article. Some creationists, like Jonathan Wells, particularly enjoy using examples of failed theories supporting Darwinian evolution (like Haeckel’s embryos) to suggest that the tens of thousands of other papers on the subject, and the entire basis of genetics, biology, and biochemistry, are wrong. The biggest problem here is that science doesn’t “purge” the literature when these things are proven false; discredited papers stay there forever. It is up to the researcher to read more than the papers that support their foregone conclusion; they have to develop a theory that incorporates all the data, not just the data they like.
The fake expert: a bought scientist, or a scientist/expert from an unrelated field, brought in to say that their data, lack of data, proven-flawed data, or expert opinion disproves the validity of the entire field.
Response: The global warming denialists have the most money invested in the fake-expert strategy, but they all use this tactic to some degree. Note that creationists and other anti-science types will particularly line up behind MDs to support their crap, because a lot of doctors graduate in this country, and even though they technically have a degree in science, many have never actually done science themselves, and it’s never too hard to find some quack with an MD to back up your line of bullshit. I would point you, for example, to the President’s Council on Bioethics, which is full of MDs gleaned for their ideological slant, with no real scientific legitimacy (Krauthammer being the most glaring example). I’m not maligning MD researchers, who certainly exist, but this is a strategy used to give a patina of legitimacy to otherwise laughable ideas.
Impossible expectations/Moving Goalposts:
The use of the absence of complete and absolute knowledge to prevent the implementation of sound policies or the acceptance of an idea or theory. It’s a little like the argument ad ignorantiam, but more sinister. Basically, the suggestion is made that until a subject is understood completely and totally (usually requiring a level of knowledge found only in deities), no action can reasonably be taken.
Response: This is a big one with global warming deniers. To state the problem metaphorically, it’s like saying that until you’ve figured out the exact momentum, moment of inertia, time dilation, length contraction, and relativistic position, in several reference frames, of a car that is speeding at you, you shouldn’t jump out of the way. Since global warming is very complicated, they use this mixed appeal to ignorance and inaction to suggest that until we understand climate 100%, we should do nothing. Never mind that this is impossible; that is the expectation. A reasonable person would instead suggest that once you have enough data to suggest a change of behavior or policy is warranted, it would be prudent to take that data under consideration and change things before we’re all under water. You don’t need to know the position of every molecule in the galaxy before deciding to jump out of the way of a speeding train. Likewise, we don’t need a perfect model of the earth’s climate to understand that all the current data and simulations suggest that decreasing carbon output is of critical importance right now, not when humans have attained some imaginary scientific nirvana.
The logical fallacy
The fallacies usually used are metaphor/argument from analogy, appeals to consequence, straw men, or red herrings. The metaphor, as hopefully I’ve demonstrated, is a useful tool in language to help communicate ideas in common-sense terms. However, it isn’t an argument in and of itself. Denialists will often use argument from metaphor or analogy to suggest that scientific data are wrong. For example, creationists will argue that natural selection leading to humans is like expecting to assemble a jumbo jet that could fly simply by shaking the constituent parts in a box for 5 billion years, or that a mousetrap is too complex to have evolved because if a single part were missing it wouldn’t work.
Response: I’m not purposefully setting up a straw man here; this type of argument from false analogy is incredibly common, as are other classic logical fallacies. One could argue many things, but it would be a waste of time, because the situations described are silly and have nothing to do with human evolution. The analogies ignore the nature of evolution, suggest it’s just totally random, ignore natural selection as the mechanism of evolution, ignore basic biology, and create a totally artificial point of reference for a biological discussion. In short, these metaphors have nothing to do with biology or evolution, but they are confusing, and on the surface their logic sounds correct to many laymen. They are a hallmark of the “irreducible complexity” arguments of the creationist denialists, but other denialists have similar appeals to metaphor. Irreducible complexity arguments are all based on metaphors, while data from siRNA, knockout mice, humans with silent genetic defects, etc., indicate that cells and biological organisms are not irreducibly complex and can often operate and adapt with less than a full complement of their ideal genetic code. There are quite a few gene knockout mice in which no phenotype has been observed, and as anyone who has knocked down genes in cells with siRNA can tell you, an effect is no guarantee. Cells adapt to a number of situations, and not all genes are required for healthy, viable offspring. Science is not about who has the best metaphor that makes the most sense to good ol’ common folk. Data trump metaphors every time.
Recognizing these tactics is the first step towards debunking or just outright dismissing these dismal and distracting arguments that detract from legitimate debate and sow confusion about scientific fact.