I've been slowly working through Michael Shermer's book, The Believing Brain. It's a fascinating read, particularly for anyone with even a passing interest in the neuropsychology of belief formation. I'll probably write several posts highlighting interesting points from the book. Today I want to share one study he describes on confirmation bias: our tendency to seek out evidence that confirms our preexisting beliefs and to ignore disconfirming evidence. We do this all the time. It's useful for detecting frequent patterns, allowing us to act quickly and decisively. Unfortunately, it sometimes leads to unjustified beliefs: just ask the mischievous kid who gets blamed for every act of shenanigans the teacher learns about. Sometimes that kid really is minding his own business and working on his assignments!
One interesting study of die-hard Republicans and Democrats revealed the neurological activity behind confirmation bias:
"[D]uring the run-up to the 2004 presidential election, while undergoing a brain scan, thirty men – half self-described 'strong' Republicans and half 'strong' Democrats – were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments of the candidates, Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own preferred candidate off the evaluative hook. Of course. But what was especially revealing were the neuroimaging results: the part of the brain most associated with reasoning – the dorsolateral prefrontal cortex – was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions, and the anterior cingulate cortex – our old friend the ACC, which is so active in patternicity processing and conflict resolution. Interestingly, once subjects had arrived at a conclusion that made them emotionally comfortable, their ventral striatum – a part of the brain associated with reward – became active.
"In other words, instead of rationally evaluating a candidate's positions on this or that issue, or analyzing the planks of each candidate's platform, we have an emotional reaction to conflicting data. We rationalize away the parts that do not fit our preconceived beliefs about a candidate, then receive a reward in the form of a neurochemical hit, probably dopamine."
This helps explain how we can look at the very same facts as someone else, or as our own past selves from a year ago, and reach vastly different conclusions.
When have you realized that your rational argument for something was actually confirmation bias at work?