Our recent discussions of HIV/AIDS denial, and in particular Seth Kalichman’s book “Denying AIDS,” have got me thinking more about the psychology of those who are susceptible to pseudoscientific belief. It’s an interesting topic, and Kalichman addresses it briefly in his book when he describes “suspicious minds”:
At its very core, denialism is deeply embedded in a sense of mistrust. Most obviously, we see suspicion in denialist conspiracy theories. Most conspiracy theories grow out of suspicions about corruptions in government, industry, science, and medicine, all working together in some grand sinister plot. Psychologically, suspicion is the central feature of paranoid personality, and it is not overreaching to say that some denialists demonstrate this extreme. Suspicious thinking can be understood as a filter through which the world is interpreted, where attention is driven towards those ideas and isolated anecdotes that confirm one’s preconceived notions of wrongdoing. Suspicious thinkers are predisposed to see themselves as special or to hold some special knowledge.
Psychotherapist David Shapiro in his classic book Neurotic Styles describes the suspicious thinker. Just as we see in denialism, suspiciousness is not easily penetrated by facts or evidence that counter individuals’ preconceived worldview. Just as Shapiro describes in the suspicious personality, the denialist selectively attends to information that bolsters his or her own beliefs. Denialists exhibit suspicious thinking when they manipulate objective reality to fit within their beliefs. It is true that all people are prone to fit the world into their sense of reality, but the suspicious person distorts reality and does so with an uncommon rigidity. The parallel between the suspicious personality style and denialism is really quite compelling. As described by Shapiro:

A suspicious person is a person who has something on his mind. He looks at the world with fixed and preoccupying expectation, and he searches repetitively, and only, for confirmation of it. He will not be persuaded to abandon his suspicion or his plan of action based on it. On the contrary, he will pay no attention to rational arguments except to find in them some aspect or feature that actually confirms his original view. Anyone who tries to influence or persuade a suspicious person will not only fail, but, unless he is sensible enough to abandon his efforts early, will himself become an object of the original suspicious idea.
The rhetoric of denialism clearly reveals a deeply suspicious character. In denialism, the science of AIDS is deconstructed to examine evidence taken out of context by non-scientists. The evidence is assimilated into one’s beliefs that HIV does not cause AIDS, that HIV tests are invalid, that the science is corrupt, and aimed to profit Big Pharma.
…
The insights offered by Shapiro are that denialists are not “lying” in the way that most anti-denialists portray them. The cognitive style of the denialist represents a warped sense of reality for sure, explaining why arguing or debating with a denialist gets you nowhere. But the denialist is not the evil plotter they are often portrayed as. Rather, denialists are trapped in their denialism.
…
Psychologically, certain people seem predisposed to suspicious thinking and it seems this may be true of denialism as well. I submit that denialism stems from a conspiracy-theory-prone personality style. We see this in people who appear predisposed to suspiciousness, and these people are vulnerable to anti-establishment propaganda. We know that suspicious people view themselves as the target of wrongdoing and hold persecutory ideas.
I agree that this certainly represents a portion of denialists, but not all. I think others, for example creationists and global warming denialists, tend to have a different motivation and style, due to ideological extremism that warps their worldview. Ideological and paranoid denialism can co-exist within denialist camps, or even within an individual, but there are areas where the overlap is incomplete. Still, the issue of the suspicious personality style is important.
We all know this person. If you don’t, maybe you know Dale Gribble (AKA Rusty Shackleford).
I just know Mike Judge has met someone with the suspicious personality style and encapsulated its extreme in this character. Dale inevitably sees every event as tied to some bizarre government/alien conspiracy, and the other men in the alley either ignore his interjections or Hank simply says, “that’s asinine.” Hank is a wise man. To argue with a Dale would only make you look like a fool.
Some anti-denialist sites have recently brought to my attention a growing body of work trying to understand how people become conspiracy theorists. Two papers in particular are of interest. The first, Unanswered Questions: A Preliminary Investigation of Personality and Individual Difference Predictors of 9/11 Conspiracist Beliefs [1], is an interesting study because it provides some explanation for crank magnetism.
So, how was this study done and what did this study show?
For one, I enjoyed reading this study because, as with all well-written papers, it has a nice introduction to the literature on the psychology of belief in conspiracy theories. This is something I’m becoming more familiar with, but I feel it’s also a relatively crude approximation of what is happening. There are several hypotheses about what causes conspiratorial beliefs, and the authors cite a number of previous studies that attempt to explain the phenomenon. These explanations range from feelings of political powerlessness feeding into conspiracism, to cultural or group understandings of events, to psychological explanations such as the need to preserve self-esteem, to express feelings like anger at disliked groups, or to feel individualistic, as well as hypotheses that focus on specific deficiencies in cognition.
It’s a small study (n=257) of British men and women who were given surveys to assess their “Support for Democratic Principles”; an inventory of belief in conspiracy theories (15 items allowing them to indicate relative support for common conspiracy theories – with the exception that the item about Elvis being alive had to be dropped, since it was too far out even for conspiracy theorists); a “big five” questionnaire (which assesses the five personality factors of openness, conscientiousness, extraversion, agreeableness and neuroticism); a “9/11 conspiracist beliefs scale”; an inventory assessing attitudes toward authority; one for political cynicism; and one for exposure to 9/11 conspiracy beliefs.
The researchers then took the data from these surveys, loaded them into matrices, and tried to fit them to various models to identify significant linkages between 9/11 beliefs and the various personality factors. This is not research in which I have any expertise, so if anyone would like to offer criticisms of how they did it that might affect the results, I’d be happy to hear them. I am trusting the peer reviewers in this case to have done a good job vetting the technique for obvious flaws.
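I’m no statistician, but for a concrete picture of what fitting such a model involves: each “path” from a predictor scale to the 9/11 beliefs scale boils down to something like a multiple regression. Here is a minimal sketch in Python with invented data and placeholder variable names; it is my own illustration under those assumptions, not the authors’ actual analysis.

```python
# A minimal sketch, NOT the authors' actual analysis: invented data, placeholder
# names. Fitting one "path" of such a model amounts to regressing the outcome
# scale on the predictor scales.
import numpy as np

rng = np.random.default_rng(0)
n = 257  # sample size reported in the paper

# Fake standardized scale scores for each participant (all made up).
general_conspiracism = rng.standard_normal(n)
political_cynicism = rng.standard_normal(n)
attitudes_to_authority = rng.standard_normal(n)
agreeableness = rng.standard_normal(n)
beliefs_911 = (0.6 * general_conspiracism
               + 0.2 * political_cynicism
               + 0.5 * rng.standard_normal(n))

# Design matrix with an intercept column, then ordinary least squares.
X = np.column_stack([np.ones(n), general_conspiracism, political_cynicism,
                     attitudes_to_authority, agreeableness])
coefs, *_ = np.linalg.lstsq(X, beliefs_911, rcond=None)

# Proportion of variance explained; the "just over half" figure quoted below
# is the analogous quantity for the authors' full model.
residuals = beliefs_911 - X @ coefs
r_squared = 1 - residuals.var() / beliefs_911.var()

names = ["intercept", "general conspiracism", "political cynicism",
         "attitudes to authority", "agreeableness"]
for name, b in zip(names, coefs):
    print(f"{name:>24}: {b:+.2f}")
print(f"R^2 = {r_squared:.2f}")
```

The paper’s actual model is more elaborate, with distal predictors acting through less distal ones (their Figure 1), but the basic machinery is of this kind.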
What did they find?
Well, in general, 9/11 conspiracy beliefs had a low prevalence, but interestingly the authors found that they could be predicted from a few personality indicators. Consistent with crank magnetism, belief in 9/11 conspiracies was part of a general conspiratorial attitude, with belief in multiple conspiracy theories being common according to their general conspiracism scale. The authors explain their findings thusly:
The results of this preliminary examination suggest that 9/11 conspiracist beliefs can be predicted by a number of personality and individual difference variables, which together explained just over half of the variance in the former. As shown in Figure 1, General Conspiracist Beliefs had the strongest effect on 9/11 Conspiracist Beliefs, which not surprisingly was also affected by 9/11 Conspiracist Exposure. Of the more distal predictors, only Political Cynicism, Attitudes to Authority and Agreeableness had significant effects on 9/11 Conspiracist Beliefs when the less distal predictors were taken into account; however, there were several significant effects of the more distal on the less distal predictors, namely Attitudes to Authority and Openness on 9/11 Conspiracist Exposure, and Political Cynicism, Support for Democratic Principles and Openness on General Conspiracist Beliefs. Age and sex differences were found for Agreeableness and Support for Democratic Principles, though age also affected 9/11 Conspiracist Exposure when the more distal mediators were taken into account.
The finding that exposure to 9/11 conspiracist ideas was positively associated with holding 9/11 conspiracy beliefs is perhaps not surprising. It seems likely that coming into contact with such ideas (either directly or indirectly) increases an individual’s understanding and, consequently, acceptance of such ideas (alternatively, it is also possible that individuals who already believe in 9/11 conspiracy theories seek out such information). More interesting was the finding that General Conspiracist Beliefs was positively associated with 9/11 conspiracist ideas, a result that fits with Goertzel’s (1994) assertion that conspiracy beliefs form part of a monological belief system, in which each conspiratorial idea serves as evidence for other conspiratorial beliefs. For example, believing that John F. Kennedy was not killed by a lone gunman, or that the Apollo moon landings were staged, increases the chances that an individual will also believe in 9/11 conspiracy theories. As Goertzel (1994) highlights, monological belief systems provide accessible explanations for new phenomena that are difficult to comprehend or that threaten existing belief systems (Goertzel, 1994).
Moreover, Goertzel (1994) points out that, often, the proof offered as evidence for a conspiracy is not specific to one incident or issue, but is used to justify the general pattern. That a government is covering-up its involvement in the 9/11 attacks, for instance, goes to show that it is also covering-up the fact that extraterrestrial life has visited Earth, or that national governments are involved in political assassination. Thus, the more conspiracy theories a monological thinker agrees with, the more she or he will accept and assimilate any new conspiracy theory that is proposed.
I’m going to have to read some more by this Goertzel cat. This is an interesting study, though, because it correlates these two behaviors that we ourselves have observed so frequently. It is limited by its size, and by the fact that it was done on British subjects, but somehow I suspect it isn’t unfair to generalize from British cranks to American ones, or to cranks worldwide. I would like to see the findings replicated on a larger scale, because even though they were significant, the authors were looking at a small subset of a relatively small number of subjects.
The second paper I’d like to talk about, Paranormal Belief and Susceptibility to the Conjunction Fallacy [2], approaches a similar problem from the point of view that maybe people who believe in such obvious nonsense have difficulties with basic reasoning skills.
These authors begin with a discussion of a more developed literature describing the cognitive deficits commonly encountered in people who believe in the paranormal. Basically, what has been found, again and again, is that people who believe in paranormal events have problems with probabilistic reasoning in particular.
It is widely recognised that most people are poor at judging probability and that under conditions of uncertainty, will rely on heuristics–cognitive ‘rules of thumb’–to simplify the reasoning process so as to make quick, easy and proximate, but ultimately flawed, judgments (e.g. Gilovich, Griffin, & Kahneman, 2002; Kahneman, Slovic, & Tversky, 1982; Shaifi, 2004; Sutherland, 1992). Further research suggests a person’s pre-existing or a priori beliefs can have a significant influence on these heuristical judgements (e.g. Watt, 1990/1991). Blackmore and Troscianko (1985) were first to test whether paranormal believers were especially prone to probabilistic reasoning biases. They had paranormal believers and non-believers answer questions relating to the generation of random strings (i.e. list 20 numbers as if drawn from a hat), randomness judging (i.e. indicate whether various boy/girl mixes were biased or unbiased), coin tossing outcomes (i.e. indicate whether the number of heads scored from 20 throws was biased or unbiased) and sampling decisions (e.g. indicate which is more likely to be drawn from a given number of red and blue sweets). Whilst no group differences were found for the random string generation or randomness judging tasks, Blackmore and Troscianko found that those who believed in the possibility of extrasensory perception made more coin tossing and sampling errors than non-believers. These data suggest paranormal believers underestimate the likelihood of a chance outcome and ‘look beyond’ coincidence in search of causal–usually supernatural–explanations. According to Blackmore and Troscianko (1985), this underestimating of chance expectations–termed the ‘chance baseline shift’–may strengthen one’s belief in psi even when there is no evidence that psi actually exists.

Subsequent work examining believers’ tendency to misunderstand chance offers mixed results. Henry (1993) found most people believe intuition (71%) and psi (64%) are the best explanations for ‘everyday coincidence experiences’ (see also Henry, 2005) whilst Bressan (2002; Study 1) found paranormal believers reported having more frequent ‘meaningful coincidences’ than non-believers. Likewise, Tobacyk and Wilkinson (1991) found those with a more pronounced belief in the paranormal (specifically, in superstition, psi and precognition) had a higher preference for games of chance and were more prone to developing illusory correlations between statistically unrelated events (see also Vyse, 1997). Marks (2002) goes further by suggesting believers misperceive chance events as somehow being related because their a priori beliefs in the paranormal demand such a relationship and thus, that they are especially prone to making ‘subjective validations’.
It’s an interesting discussion, as many of the findings seem to be confounded by general cognitive ability, and there have been mixed results in identifying believers’ specific problems with understanding randomness and probability. There’s a saying in medicine: the questions stay the same, it’s just the answers that change. Well, the question remains, why do believers in paranormal events impute more significance to random events than non-believers?
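As a quick aside, it’s easy to show by simulation just how badly intuition underestimates chance. This little script is my own illustration, not something from either paper: it asks how often a “spooky” run of five heads in a row shows up somewhere in a mere 100 coin flips.

```python
# My own illustration, not from either paper: how often does a seemingly
# unlikely streak (five heads in a row) occur somewhere in 100 fair coin flips?
import random

def has_streak(flips, length=5):
    """Return True if `flips` contains a run of `length` consecutive heads."""
    run = 0
    for flip in flips:
        run = run + 1 if flip == "H" else 0
        if run >= length:
            return True
    return False

trials = 100_000
hits = sum(has_streak([random.choice("HT") for _ in range(100)])
           for _ in range(trials))
print(f"Runs of 5+ heads in 100 flips: {hits / trials:.0%} of trials")
# Comes out around 80% -- far higher than most people's gut estimate, which is
# the kind of underestimation of chance that the "chance baseline shift" refers to.
```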
These authors are interested in studying the problem from the point of view that the heuristics, or cognitive rules of thumb, that believers use to make decisions go wrong in a particular way: the conjunction fallacy. This refers to a very common tendency to judge two events occurring together as more probable than one of those events occurring alone, even though a conjunction can never be more probable than either of its constituents.
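To make the rule concrete: the probability of two events occurring together can never exceed the probability of either one alone. Here is a toy calculation, with numbers I’ve made up purely for illustration:

```python
# Made-up numbers, purely to illustrate the conjunction rule:
# P(A and B) <= min(P(A), P(B)), no matter what A and B are.
p_a = 0.30                          # P(A): some event
p_b = 0.20                          # P(B): some other event

p_both_if_independent = p_a * p_b   # 0.06 if A and B are independent
p_both_ceiling = min(p_a, p_b)      # 0.20, the highest P(A and B) can ever be

assert p_both_if_independent <= p_both_ceiling <= p_a
print(p_both_if_independent, p_both_ceiling)
# Rating "A and B" as more probable than A alone -- as participants routinely
# do in vignette studies -- is the conjunction fallacy: no assignment of
# probabilities can make it true.
```

The authors argue that paranormal believers should be especially prone to this error: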
Conjunction biases have been demonstrated in a wide variety of hypothetical contexts where, in most cases, the proportion of individuals violating the conjunction rule ranges from between 50 and 90% (Fisk, 2004; Tversky & Kahneman, 1983). Given previous claims that paranormal believers’ susceptibility to reasoning biases may be context or domain specific (e.g. Gray & Mills, 1990; Merla-Ramos, 2000; Wierzbicki, 1985; although see Lawrence & Peters, 2004; Roe, 1999), it seems reasonable to expect believers will be more prone to the conjunction fallacy, particularly when conjunctive events appear to reflect paranormal phenomena. Take the common example of when one is thinking about an old friend just at the moment he/she unexpectedly calls (e.g. Rhine-Feather & Schmicker, 2005). Here, the two constituent events–namely (a) thinking about the friend and (b) that friend unexpectedly calling–may not be unusual in their own right. One may have thought about the same friend many times before or alternatively, many other friends may have unexpectedly called in the past; neither would be particularly surprising (cf. Fisk, 2004). It is only when these two constituent events co-occur in close temporal proximity that this conjunction is deemed too unlikely to be a simple coincidence. In such cases, many experients will dismiss chance and look for a causal, often paranormal, explanation (cf. Blackmore & Troscianko, 1985; Bressan, 2002; Marks, 2002). Similar logic can be applied to other aspects of the paranormal including the apparent accuracy of psychic predictions where the co-occurrence of two constituent events–namely (a) the prediction and (b) the predicted outcome–seems too unlikely to be just a coincidence. Given previous claims that paranormal believers often misunderstand chance and randomness (e.g. Bressan, 2002), it seems reasonable to suggest believers may be especially prone to the conjunction fallacy. Evidence that believers tend to adopt an intuitive (heuristical) as opposed to an analytic thinking style (Aarnio & Lindeman, 2005; Irwin & Young, 2002; Lester, Thinschimdt, & Trautman, 1987), which in turn is associated with more conjunction errors (Fisk, 2004; Toyosawa & Karasawa, 2004), adds further support to this assertion. Moreover, given that personal experience of alleged paranormal phenomena is the single biggest predictor of paranormal belief (Blackmore, 1984), a tendency to misjudge conjunctive events as having some underlying causal relationship may help explain the maintenance, and perhaps even the development, of such beliefs.
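Before moving on to the methods, a back-of-the-envelope calculation (my own numbers, picked purely for illustration) shows why the phone-call coincidence is less impressive than it feels: even if both constituent events are rare on any single day, over enough days the conjunction becomes quite likely.

```python
# My own back-of-the-envelope numbers, not the authors': how likely is the
# "thinking of a friend just as they call" coincidence at least once in a year?
p_think = 0.05   # assumed daily chance of thinking about that particular friend
p_call = 0.01    # assumed daily chance that friend calls out of the blue

p_coincide_one_day = p_think * p_call                 # treating the events as independent
p_at_least_once = 1 - (1 - p_coincide_one_day) ** 365

print(f"P(coincidence on any given day): {p_coincide_one_day:.4f}")
print(f"P(at least once in a year):      {p_at_least_once:.2f}")
# About 0.17 for a single friend; across dozens of friends and acquaintances,
# a "too unlikely to be chance" conjunction is close to inevitable.
```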
So, what did they do? They constructed a series of vignettes that test people’s tendency to fall for the conjunction fallacy, and then simultaneously tested them for the presence or absence of paranormal beliefs. Importantly, in addition to testing for paranormal beliefs, the researchers controlled for achievement in psychology, statistics, and mathematics.
They found their hypothesis was correct: the relatively common conjunction bias was even more common in those who believed in paranormal phenomena. Problems with the study again include its small size and, worse, that it was performed on a relatively homogeneous population of college students in England.
So what do these studies mean for our understanding of cranks? Well, in addition to providing explanations for crank magnetism and for the cognitive deficits we see daily in comments from cranks, they suggest the possibility that crankery and denialism may be preventable by better explanation of statistics. Much of what we’re dealing with is likely the development of shoddy intellectual shortcuts, and teaching people to avoid these shortcuts might go a long way towards preventing the development of, and fixation on, absurd conspiracy theories or paranormal beliefs.
[1] Swami, V., Chamorro-Premuzic, T., & Furnham, A. (2009). Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs. Applied Cognitive Psychology. DOI: 10.1002/acp.1583

[2] Rogers, P., Davis, T., & Fisk, J. (2009). Paranormal belief and susceptibility to the conjunction fallacy. Applied Cognitive Psychology, 23(4), 524-542. DOI: 10.1002/acp.1472