Information channelling

The Frontal Cortex has a fantastic piece discussing a new study which found that people choose TV news based on which channels are more likely to agree with their pre-existing opinions, reflecting our tendency to filter for information that confirms, rather than challenges, what we believe.

Lehrer discusses various ways in which we selectively attend to information we agree with but the best bit is where he goes on to discuss a wonderful study from 1967 where people demonstrated in the starkest way that they’d rather block out information that doesn’t agree with their pre-existing beliefs.

Brock and Balloun played a group of people a tape-recorded message attacking Christianity. Half of the subjects were regular churchgoers while the other half were committed atheists. To make the experiment more interesting, Brock and Balloun added an annoying amount of static – a crackle of white noise – to the recording. However, they allowed listeners to reduce the static by pressing a button, so that the message suddenly became easier to understand. Their results were utterly predictable and rather depressing: the non-believers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun demonstrated a similar effect with smokers listening to a speech on the link between smoking and cancer. We silence the cognitive dissonance through self-imposed ignorance.

Link to Frontal Cortex piece ‘Cable news’.
Link to summary of 1967 static study.
Link to PubMed entry for same.

One thought on “Information channelling”

  1. Yes, rather depressing. Although knowledge of this process can help us avoid it in ourselves.
    I recall reading in Robert Cialdini’s excellent book ‘Influence: The Psychology of Persuasion’ that not only do we seek out and amplify information that reinforces what we already believe (and conmen are very good at spotting people’s prejudices and asserting them to build rapport), but that this process can work the other way.
    Cialdini wrote about the brainwashing of US soldiers during the Korean War: after having been made to state pro-Chinese, anti-US pronouncements, they were then much more likely to believe them. People who are encouraged to role-play a specific stance or belief are much more likely to favour that belief in the future.
    So we seek out confirmatory information but (many of us) can also come to believe what we are made to “seek out.”