The mighty fortress of belief

Bad Science has an excellent piece on the psychology of how we deal with evidence that challenges our cherished beliefs. Needless to say, our most common reaction is to try and undermine the evidence rather than adjust our beliefs.

The classic paper on this strategy is from Lord and colleagues in 1979: they took two groups of people, one in favour of the death penalty and one against it, and presented each with a piece of scientific evidence that supported their pre-existing view and a piece that challenged it. The evidence consisted of studies showing, for example, that murder rates went up or down after a state abolished capital punishment, or differed between neighbouring states with and without it, and the results were as you might imagine. Each group found extensive methodological holes in the evidence they disagreed with, but ignored the very same holes in the evidence that reinforced their views.

The article goes on to discuss a recent study which found that scientific information contradicting a cherished belief leads people to doubt not only the study in question but science itself.

In psychology, the motivation to resolve conflicting ideas is called cognitive dissonance, and it leads us to try and resolve the contradiction in whichever way is the most personally satisfying, rather than whichever is the most in tune with reality.

The theory has an interesting beginning: it originated when psychologist Leon Festinger decided to study a flying saucer cult, an episode he documented in his amazing book When Prophecy Fails.

Festinger was curious about what would happen when a contradiction of a cherished belief was so absolute that it seemed logically overwhelming. So he was intrigued when he saw a newspaper story about a religious cult that had prophesied that the world would end in a great flood on December 21, 1954, while the true believers would be rescued in a flying saucer.

The members sold all their possessions, several divorced because their spouses were non-believers, and they prepared for the big event. Festinger’s colleague Stanley Schachter infiltrated the cult and documented what happened on the night when the ‘end of the world’ came – and went.

In the hours following midnight the group were distraught, but at 4am a ‘message’ arrived from the aliens, channelled through the group’s leader. It said: “This little group, sitting all night long, has spread so much goodness and light that the God of the Universe spared the Earth from destruction.”

You would think that a failed prophecy backed up by a lame excuse would lead the members to give it up as a lost cause, but instead, they became more fervent in their beliefs and publicly announced they’d saved the world.

Although the group Festinger studied eventually disbanded, its leader ‘Sister Thedra’ went on to found various alien-inspired New Age movements and is still widely revered in those circles. There’s some information about her on this UFO group page and in various similar places online, none of which mentions the failed prophecy.

Cognitive dissonance is one of the most established theories in psychology and one of our most powerful motivators that drives us to fit the world into what we already believe. Science, religion and reality are simply no match.

Link to Bad Science on discounting evidence.

5 thoughts on “The mighty fortress of belief”

  1. There is an important omission in this summary. Prior to the prophecy, the group shunned publicity. After the failure of the prophecy, most of the group actively sought out publicity.
    They had to drastically change their actions to sustain their beliefs.
    This is somewhat different from other cognitive dissonance examples, in which the action remains the same and only fantastic justifications for it are given instead.

  2. You might think that this is obviously irrational behaviour, but Edwin Jaynes, in ‘Probability Theory: The Logic of Science’ (probably the best-known text on Bayesian probability), argues that something quite similar is consistent with Bayesian, and therefore presumably rational, reasoning. I don’t have the text with me right now, but basically, in formulating a question as a Bayesian problem, you first assign prior probabilities to the alternative hypotheses before applying the evidence.
    If the probability that you assign to a hypothesis – let us say, global warming – is lower than the probability you assign to the hypothesis that people are likely to lie about it, then direct evidence for the hypothesis will never persuade you, at least in the absence of evidence that the people trying to persuade you are actually being honest (a small worked example follows these comments).
    If this seems implausible, consider how hard someone would need to work to persuade you of the existence of ESP (another example from Jaynes’ book). You would – quite rationally – scrutinise the evidence presented to you much more carefully than evidence for a hypothesis about which you were less sceptical – unless you thought it so likely to be charlatanism that you didn’t bother to look.
    The irrationality, if it exists, is therefore not in this behaviour, but in whatever led people to assign their particular prior probabilities in the first place.

  3. I’m having some trouble with, “Cognitive dissonance is one of the most established theories in psychology and one of our most powerful motivators that drives us to fit the world into what we already believe.” It would seem to me that we interpret the world according to our beliefs all the time, indeed, that no other course of action makes any sense. We can’t possibly need a “motivator” to drive us to do what we do all the time anyway. I suggest that the usual expression for this is, “extraordinary claims require extraordinary evidence”. And that it’s only when we have a smug feeling about the correctness of our own beliefs that we characterize people who think differently to us as “driven by cognitive dissonance”.
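As a minimal sketch of the Bayesian point in the second comment above – the function name, the prior values and the likelihoods below are illustrative assumptions, not numbers from Jaynes – when the same positive report can be explained either by the hypothesis being true or by deception, the posterior is dominated by whichever explanation had the larger prior.

    # Assumed numbers only: the same positive report is unpersuasive when
    # deception has the larger prior, and persuasive when the hypothesis does.

    def posterior_after_report(prior_h, prior_deception,
                               p_report_if_h=1.0,
                               p_report_if_deception=1.0,
                               p_report_otherwise=0.01):
        """Posterior probability of hypothesis H after one positive report,
        when the report could also be produced by deception or by chance."""
        prior_other = 1.0 - prior_h - prior_deception
        p_report = (p_report_if_h * prior_h
                    + p_report_if_deception * prior_deception
                    + p_report_otherwise * prior_other)
        return p_report_if_h * prior_h / p_report

    # ESP-style case: deception judged far more likely a priori than the hypothesis.
    print(posterior_after_report(prior_h=1e-6, prior_deception=1e-2))  # ~0.00005
    # Same report with the priors swapped: now it is genuinely persuasive.
    print(posterior_after_report(prior_h=1e-2, prior_deception=1e-6))  # ~0.5

The likelihoods are identical in both calls, so only the priors decide whether the report persuades – which is the commenter’s point that the rationality question lives in the priors rather than in the updating.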
