There is an old joke about a psychiatrist working with a patient who was convinced that he was dead. The psychiatrist says “So you think you’re dead. Do dead people bleed?”, to which the patient replies “No, of course not.” Then the psychiatrist takes a sharp needle and pricks the patient’s finger, drawing a drop of blood. The shocked patient says “Well, what do you know. Dead people do bleed!”
Once someone’s mind is made up, all the evidence is interpreted in the light of the existing belief. Once we interpret everything we see within the constraints of what we already believe, there is no way of falsifying those beliefs. This is an extreme case of confirmation bias.
Confirmation bias occurs when we interpret the available evidence in the light of what we already believe. Our interpretations are biased in order to confirm and support our beliefs.
It is a well-studied psychological phenomenon in which people tend to look for evidence that is consistent with their current understanding, rather than assess evidence in a balanced and fair manner. We tend to give greater weight to information supporting our existing beliefs, and we tend to either ignore or dismiss information that is inconsistent with our existing beliefs. We see what we are looking for, we persist with discredited beliefs, and we are very reluctant to overturn any conclusions we have drawn, no matter how flimsy the evidence supporting them.[1][2] The more attached we are to those beliefs, the harder it is to assess them objectively. There is a “tight-fisted fear” about letting go of treasured beliefs which we have to consciously overcome if we are to really consider the evidence.
The last Japanese soldier
During World War II, Hiroo Onoda was a lieutenant in the Japanese army. In 1944, he was sent to the island of Lubang off the coast of the Philippines in order to hamper enemy attacks on the island. Under no circumstances was he to surrender or take his own life. Unfortunately, nobody told him when the war ended.
Eating rice, coconuts and meat from stolen cattle, Hiroo Onoda hid in the jungle for 29 years, carrying out occasional guerrilla activities and waiting for further instructions.
He avoided search parties sent to find him, believing they were enemy scouts. Leaflets announcing the end of the war were assumed to be propaganda. Newspapers, letters and family pictures dropped from the air were taken to be a trick. Friends and relatives even spoke out over loudspeakers, but Onoda remained suspicious, and did not believe that the war had really ended.
Later he wrote:[3]
“We considered people dressed as islanders to be enemy troops in disguise or enemy spies. The proof that they were was that whenever we fired on one of them, a search party arrived shortly afterward.”
Everything was interpreted within the context of what he already believed.
Eventually, his former commanding officer was located, working as a bookseller in southern Japan. He flew to Lubang Island to officially relieve Onoda of duty. On 9 March 1974, Onoda finally surrendered, turning over his sword, rifle, dagger and ammunition.
He discovered a very different world from the one he believed existed.
Beliefs must be falsifiable
I can relate to Onoda. For 30 years I believed that the Bible was inspired by God, that God heard my prayers, that Jesus was going to return to the earth, that dead people would be raised to life at his return, that Jesus would set up his kingdom with the capital in Jerusalem, and so on. I interpreted all the evidence within the context of what I already believed.
Any evidence that appeared to go against what I believed was dismissed as either: (a) something that could be explained if only we had more information; or (b) a biased presentation by someone determined not to believe. In this way, I could happily continue with my beliefs regardless of any evidence that might appear to contradict them.
The problem is that beliefs that cannot be contradicted are no different from delusions. Any belief worth acting on must be falsifiable. That is, it must be possible to reject the belief as false if there is sufficient evidence against it.
Falsifiability is an important concept in science. Scientific progress occurs when a theory or hypothesis is falsified through new evidence. Then a new theory or hypothesis is postulated that is consistent with all the available evidence including the new observations.
If a theory or belief cannot be falsified, there is no point in even examining the evidence for it. This is true in science and in religion.
Some believers take pride in the fact that their faith cannot be shaken. When asked “What would convince you that your belief was mistaken?” they reply: “Nothing. That’s what it means to have faith.” There is no virtue in such dogmatic delusions. They might as well have steadfast faith in the tooth fairy.
Extraordinary claims require extraordinary evidence[4]
Of course, confirmation bias works in the other direction as well: unbelievers are more likely to dismiss evidence that supports a religious belief, and they tend to look for evidence against belief.
However, the two sides are not symmetric. It is reasonable to be skeptical about all supernatural beliefs unless there is sufficient evidence to support them. In other words, before any evidence is explored, the balance of probabilities must be on the side of unbelief rather than belief. Otherwise, we would need to take seriously all supernatural claims including voodoo dolls, fairies and Jesus appearing in slices of pizza.
For unbelievers, this is easier — their confirmation bias means that they are already skeptical. (However, there is still a danger of dogmatic and arrogant skepticism which must also be avoided if we are to treat the evidence reasonably.) For believers, it is extraordinarily difficult to put aside their natural confirmation bias, and try to take a skeptical perspective on what they believe.
I was finally convinced that my religious beliefs were wrong once I was prepared to seriously consider the evidence. I thought about what would convince me that I was wrong, and I discovered that the evidence stacked up against what I had long held to be true. But it took a major shift in the way I viewed the evidence to come to this position. I had to be prepared to allow my beliefs to be falsifiable. I had to be prepared to seriously consider the evidence, and not just explain it away.
Onoda was finally convinced that the war was over because the one piece of evidence he was prepared to accept was produced — his commanding officer told him that the war was over.
Whatever your beliefs, it is well worth asking “What piece of evidence would convince you to change your mind?”.
1. R. S. Nickerson (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology 2(2), 175–220.
2. E. Jelalian and A. G. Miller (1984). The perseverance of beliefs: Conceptual perspectives and research developments. Journal of Social and Clinical Psychology 2(1), 25–56.
3. H. Onoda (1974). No Surrender: My Thirty-Year War. Translated by C. S. Terry. New York, USA: Kodansha International Ltd, p. 94.
4. A statement made famous by the astronomer Carl Sagan in his television series Cosmos. However, the idea was expressed at least as early as the time of Laplace, who wrote “The weight of evidence for an extraordinary claim must be proportioned to its strangeness” (Laplace, 1812) – or something like that, because he wrote in French.