Jan 8, 2018

Exploit Explained: Confirmation Bias

{This post is part of the Archive of Human Exploits}


Sir Arthur Conan Doyle's classic stories about detective Sherlock Holmes popularized the idea of deductive reasoning, the process of using general principles to reach a specific conclusion. General principle: Dogs bark at strangers. Detail: The dog didn't bark on the night of the murder. Conclusion: The murderer was someone the dog knew.
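Holmes's inference here is just modus tollens: if the visitor had been a stranger, the dog would have barked; the dog didn't bark; therefore the visitor was no stranger. For the programmers in the audience, here is a minimal, purely illustrative Python sketch of that deduction (the function and variable names are mine, not Doyle's):

```python
# Deductive reasoning: apply a general principle to a specific observation.
# Principle:   if the visitor were a stranger, the dog would bark.
# Observation: the dog did not bark.
# Conclusion:  the visitor was not a stranger (modus tollens).

def visitor_was_known(dog_barked: bool) -> bool:
    # "stranger implies barking" plus "no barking" yields "not a stranger"
    return not dog_barked

print(visitor_was_known(dog_barked=False))  # True: someone the dog knew
```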

The inverse of deductive reasoning is inductive (or bottom-up) reasoning: you start with specific observations and work upward toward a general conclusion. Run that process backwards, starting with the conclusion you want and then collecting examples that confirm it, and you have a recipe for error. One of the most common of those errors is known as confirmation bias.

Confirmation bias, also known as "my side" bias, is "the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses," according to Wikipedia. When humans start with a conclusion, they tend to cling to it in a way that causes them to overvalue supporting evidence and ignore evidence to the contrary.

Writing for Psychology Today, Professor Shahram Heshmat elaborates:
Confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea/concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views (prejudices) one would like to be true. 
Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it. Confirmation bias suggests that we don’t perceive circumstances objectively. We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions.
Confirmation bias should not be confused with desirability bias, a similar but distinct concept: confirmation bias pulls us toward what we already believe, while desirability bias pulls us toward what we want to be true.
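To see the mechanism in miniature, here is a toy Python simulation. Every number in it (the prior, the 0.05 step size, the weights, the evenly split evidence) is an illustrative assumption of mine, not a model from the psychology literature; the point is only to show how discounting disconfirming evidence hardens a belief even when the evidence is perfectly mixed:

```python
def updated_belief(prior, evidence, confirm_weight=1.0, disconfirm_weight=1.0):
    """Crude weighted tally of evidence for a belief, clipped to [0, 1].

    An unbiased reader weighs confirming and disconfirming items equally;
    a confirmation-biased reader discounts disconfirming items by setting
    disconfirm_weight well below confirm_weight.
    """
    belief = prior
    for supports_belief in evidence:
        if supports_belief:
            belief += 0.05 * confirm_weight     # embrace confirming evidence
        else:
            belief -= 0.05 * disconfirm_weight  # discount the rest
    return min(max(belief, 0.0), 1.0)

# Perfectly mixed evidence: 20 items for the belief, 20 against.
mixed_evidence = [True, False] * 20

print(updated_belief(0.6, mixed_evidence))                         # ~0.6: belief ends where it started
print(updated_belief(0.6, mixed_evidence, disconfirm_weight=0.2))  # 1.0: belief hardens into certainty
```

Fed the same fifty-fifty evidence, the unbiased reader ends up where they started, while the biased reader walks away more certain than ever. That, in two print statements, is the prisoner of assumptions Heshmat describes.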

Why is this a human exploit? Well, when trained persuaders start by appealing to our base prejudices, when my "side" becomes my "tribe," they know they don't have much work to do on providing factual support for their arguments. Our tendency will be to seek out, or simply notice far more readily, information that confirms their argument while ignoring information that does not. In other words, we tend to let them into our minds uncritically, or at least far less critically than we would someone from outside the tribe.

Politics is a great example. Just think about a political leader you strongly dislike. Think about his or her flaws. Now ask yourself: How often have you read articles or posts that confirmed those flaws? How often have you read the opposite?

Chances are the ratio is seriously lopsided. And because the country is pretty evenly divided, that isn't because the other side is putting out less content; it's because confirmation bias is steering what you search for, notice, and remember.