Why worldview threats undermine evidence
Have you ever noticed that when you present people with facts that contradict their most deeply held beliefs, they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is that the conflicting data are perceived to threaten a cherished worldview.
Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae such as the melting point of the steel in the World Trade Center buildings that caused their collapse, because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the parts per million of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president’s long-form birth certificate in search of fraud because they believe that the nation’s first African-American president is a socialist bent on destroying the country.
In these examples, proponents’ deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slain. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance: the uncomfortable tension that comes from holding two conflicting thoughts simultaneously. Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), in their 2007 book Mistakes Were Made (But Not by Me), document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs in order to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge, ending up at opposite corners of the base as they each stake out a position to defend.