
Tuesday, August 30, 2011

Confirmation Bias Exposed

I've been slowly working through Michael Shermer's book, The Believing Brain. It's a fascinating read, particularly for anyone with even a passing interest in the neuropsychology of belief formation. I'll probably write several posts highlighting interesting points from his book. Today I want to share one study he described on confirmation bias, which is our tendency to look for evidence that confirms our preexisting beliefs while ignoring disconfirming evidence. We do this all the time. It's useful for detecting frequent patterns, allowing us to act quickly and decisively. Unfortunately, it sometimes leads to unjustified beliefs: just ask the mischievous kid who gets accused of every act of shenanigans the teacher learns about. There are times when that kid is actually minding his own business and working on his assignments!

One interesting study of die-hard Republicans and Democrats showed the neurological activity behind confirmation bias:

"during the run-up to the 2004 presidential election, while undergoing a brain scan, thirty men (half self-described "strong" Republicans and half "strong" Democrats) were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments of the candidates, Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own preferred candidate off the evaluative hook. Of course. But what was especially revealing were the neuroimaging results: the part of the brain most associated with reasoning, the dorsolateral prefrontal cortex, was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions, and the anterior cingulate cortex, our old friend the ACC, which is so active in patternicity processing and conflict resolution. Interestingly, once subjects had arrived at a conclusion that made them emotionally comfortable, their ventral striatum, a part of the brain associated with reward, became active.

In other words, instead of rationally evaluating a candidate's positions on this or that issue, or analyzing the planks of each candidate's platform, we have an emotional reaction to conflicting data. We rationalize away the parts that do not fit our preconceived beliefs about a candidate, then receive a reward in the form of a neurochemical hit, probably dopamine."

This helps explain how we can look at the same facts as someone else, or as our former selves from a year ago, and reach vastly different conclusions.

When have you realized that your rational argument for something was actually confirmation bias at work?

4 comments:

  1. Great post! I'm going to have to read Shermer's book. And all of us, I think, could use a dose of humility once we understand confirmation bias.

  2. doug B,
    I'd love to hear what you think about the book. I wish I had it in me to post as frequently as you. I would have written several posts about the book by now.

  3. It's certainly not controversial to say people tend to evaluate data in accordance with what they already believe. Nevertheless, I am somewhat skeptical of Shermer's attempt to pinpoint this phenomenon in brain activity. Consider the following:

    Shermer wrote:

    (A) "Once subjects had arrived at a conclusion that made them emotionally comfortable, their ventral striatum, a part of the brain associated with reward, became active..."

    (B) "We rationalize away the parts that do not fit our preconceived beliefs..., then receive a reward in the form of a neurochemical hit, probably dopamine."

    It seems to me quite a stretch to say that statement (B) necessarily follows from statement (A). But if Shermer believes it, does this mean that a "neurochemical hit" caused him to reach a conclusion that was in keeping with his reductionism? Or will he try to say that some beliefs are more dopamine-independent than others? Which ones are those? Deconstructing thought the way he is attempting often has unforeseen consequences for the very thought used to do the deconstructing.

  4. I struggle with this constantly for my blog. However, because I try to present the data as accurately as possible, I've become very sensitive to it, enough to catch it in myself at times. That practice has spilled over into other aspects of my life too. I would not be so bold as to say I've eradicated such personal error, but I probably have a better handle on it than the average person.
