Aug 20, 2018

Exploit Explained: Survivorship Bias


{This post is part of the Archive of Human Exploits}

According to Cicero, the atheist philosopher Diagoras was once asked about paintings of people who had survived shipwrecks. "You who think the gods have no care of human things: What do you say to so many persons preserved from death by their especial favor?"

Diagoras replied: "I say that their pictures are not here who were cast away, who are by much the greater number."

What Diagoras identified is something known as survivorship bias. According to Wikipedia:

"Survivorship bias (or survival bias) is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways. It is a form of selection bias.

"Survivorship bias can lead to overly optimistic beliefs because failures are ignored, such as when companies that no longer exist are excluded from analyses of financial performance. It can also lead to the false belief that the successes in a group have some special property, rather than just coincidence (correlation 'proves' causality)."
A famous example of this bias is the story of World War II statistician Abraham Wald. According to TruthHawk.com:

"Wald was a Hungarian mathematician who emigrated to the United States. During World War II, he was part of the military’s Statistical Research Group.
The military had a project to learn how to best protect their planes from enemy gunfire. They would look at planes returning from war, and see which areas had the most bullet holes. Clearly, these areas were in need of protection – so the recommendation was to enforce those areas of the planes with armor for future missions.
"Wald saw the flaw in this analysis, and produced an elegant fix. He reasoned: if a plane is returning from combat, it means that it was not shot down by the enemy. So the areas riddled with bullet holes were least in need of protection, because the planes could survive a strike on those points.

Instead, Wald concluded: reinforce the areas where returning planes had suffered no damage. Those were most likely to be mission-critical, because no returning plane had bullet holes in those sections."
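
You can watch the selection effect at work with a minimal simulation sketch in Python. Every name and number in it is invented for illustration (four airframe sections, uniformly distributed hits, made-up shoot-down probabilities); this is not Wald's actual data or method, just the logic of his insight:

    import random

    # Hypothetical model: hits land uniformly across four sections,
    # but hits to the engine or cockpit are far more likely to bring
    # the plane down. All probabilities are made up for illustration.
    SECTIONS = ["fuselage", "wings", "engine", "cockpit"]
    DOWN_PROBABILITY = {"fuselage": 0.05, "wings": 0.05,
                        "engine": 0.6, "cockpit": 0.6}

    def fly_mission(num_hits=8):
        """Simulate one sortie; return (survived, hits per section)."""
        hits = {s: 0 for s in SECTIONS}
        survived = True
        for _ in range(num_hits):
            section = random.choice(SECTIONS)  # hits are uniform
            hits[section] += 1
            if random.random() < DOWN_PROBABILITY[section]:
                survived = False               # this hit downed the plane
        return survived, hits

    def observed_damage(num_planes=10_000):
        """Tally bullet holes, but only on planes that made it back."""
        totals = {s: 0 for s in SECTIONS}
        returned = 0
        for _ in range(num_planes):
            survived, hits = fly_mission()
            if survived:                       # the selection step
                returned += 1
                for s in SECTIONS:
                    totals[s] += hits[s]
        return returned, totals

    returned, totals = observed_damage()
    print(f"{returned} planes returned")
    for s in SECTIONS:
        print(f"{s:>8}: {totals[s]} holes observed")

Run it and the fuselage and wings show far more holes than the engine and cockpit, even though every section was hit equally often. The missing engine and cockpit holes are on the planes that never came back, which is exactly what the military's first analysis overlooked.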

This bias is related to the concept of insensitivity to base rates, which is usually discussed in the negative case: a single plane crash colors our perception of flying even though thousands of flights land safely every day. Survivorship bias is something like the inverse: if planes routinely crashed, we still shouldn't want to fly in one just because a few had landed safely.

Survivorship bias pops up often in my everyday life. In business, I regularly hear pitches for products that cite one or two well-known successes as precedents. My job is often to remind people of all the similar products that failed in obscurity.

Then there's diet advice. My friends and family are constantly experimenting with one new diet plan or another. Their reasons for trying the diet are always the same: They read or heard that someone lost an amazing amount of weight on the plan. Of course, no one ever catalogs or reports how many other people didn't lose weight, suffered side effects, or gained all the weight back later.