Jan 8, 2018

Exploit Explained: Desirability Bias

{This post is part of the Archive of Human Exploits}

Most of us are familiar with the theory of confirmation bias. However, new research suggests we may often be mistaking it for a distinct, and possibly more potent, bias known as desirability bias.

Writing in the August 2017 issue of the Journal of Experimental Psychology, three researchers explain:
[The] theory contends that individuals assign greater weight to information that is desirable versus undesirable ... This bias is reported to underlie an asymmetry whereby people update their prior beliefs to incorporate new and desirable information more than new but undesirable information ...
The other theory, confirmation bias, contends that people preferentially search for, evaluate, and incorporate new information that confirms their prior beliefs ... This bias is reported to underlie an asymmetry whereby people update their prior beliefs to incorporate new and confirming information more than new but disconfirming information — even if they receive a balanced set of both types of information. 
Unfortunately, the predictions of desirability bias and confirmation bias are often conflated.
In other words, the desirability-bias account holds that, when the two diverge, what people want to happen matters more than what they believe will happen. To test this, the researchers used the 2016 US presidential election. They started by asking 900 people "who they (a) desired to win and (b) believed would win." It's easy to see, for example, how a Republican might have desired Donald Trump to win while at the same time believing Hillary Clinton would ultimately prevail.

Later in the experiment, half of the participants received new polling data that confirmed their belief about who would win, and half received polling data that contradicted it. Afterward, everyone was asked again who they believed would win the election. In this way, the researchers were able to measure the effect of desirability on what they call "belief updating." The result, which the authors shared in a New York Times article pre-publication, was as follows:
Those people who received desirable evidence — polls suggesting that their preferred candidate was going to win — took note and incorporated the information into their subsequent belief about which candidate was most likely to win the election. In contrast, those people who received undesirable evidence barely changed their belief about which candidate was most likely to win. 
Importantly, this bias in favor of the desirable evidence emerged irrespective of whether the polls confirmed or disconfirmed people's prior belief about which candidate would win. In other words, we observed a general bias toward the desirable evidence. 
What about confirmation bias? To our surprise, those people who received confirming evidence — polls supporting their prior belief about which candidate was most likely to win — showed no bias in favor of this information. They tended to incorporate this evidence into their subsequent belief to the same extent as those people who had their prior belief disconfirmed. In other words, we observed little to no bias toward the confirming evidence.
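The asymmetry the authors describe can be pictured as a simple updating rule in which desirable evidence moves a belief a lot and undesirable evidence barely moves it at all. Here is a toy sketch in Python; the function name, the learning rates, and all the numbers are hypothetical illustrations, not values from the study.

```python
# Toy model of desirability-biased belief updating.
# All parameters below are made up for illustration; the study
# reports a qualitative asymmetry, not these specific rates.

def update_belief(prior, evidence, desirable,
                  rate_desirable=0.6, rate_undesirable=0.1):
    """Move a prior belief (a probability) toward new evidence.

    Desirability bias predicts a larger update when the evidence
    is desirable, regardless of whether it confirms the prior.
    """
    rate = rate_desirable if desirable else rate_undesirable
    return prior + rate * (evidence - prior)

# A voter thinks their preferred candidate has a 40% chance of
# winning (prior = 0.40). A new poll puts that chance at 55%.
# Desirable news: the belief shifts substantially.
after_good_news = update_belief(0.40, 0.55, desirable=True)   # -> 0.49

# Another voter thinks their candidate has a 55% chance, and a
# new poll puts it at 40%. Undesirable news: the belief barely moves.
after_bad_news = update_belief(0.55, 0.40, desirable=False)   # -> 0.535

print(round(after_good_news, 3))
print(round(after_bad_news, 3))
```

Note that the rule ignores whether the evidence confirms the prior; only its desirability sets the update size, which is exactly the pattern the researchers report.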
Why is this important? Confirmation bias is a human exploit whose presumed cure is exposing the "victim" to more of the contradictory evidence they tend to ignore. However, this research "implies that even if we were to escape from our political echo chambers, it wouldn’t help much," the authors conclude.

Reversing the perspective and taking it beyond politics, the finding suggests trained persuaders shouldn't waste time amassing contradictory information in an attempt to change beliefs. Rather, they should bypass belief and focus on creating or tapping into desires. Here again, we see the power of emotional appeals over facts and reason.