Written by Carl Davidson, Director of Strategy and Insight, Research First
All of us who work with ideas for a living need to be aware that we often come to cherish our ideas simply because they are our ideas. This leads to a serious problem known as ‘confirmation bias’. To see it in operation, try a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
If you’re like most people, you will think the ball costs ten cents. Except it doesn’t. It costs five cents (5c for the ball, $1.05 for the bat, and $1.10 all together). If you thought it was ten cents, it’s because your brain looked for reasons to confirm your intuitive response rather than challenge it.
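The arithmetic is easy to check once you write the problem out as two equations rather than trusting the intuitive answer. A minimal sketch (working in cents to avoid rounding):

```python
# Bat-and-ball, worked in cents.
total = 110   # bat + ball together cost $1.10
diff = 100    # the bat costs $1.00 more than the ball

# If bat = ball + diff, then ball + (ball + diff) = total,
# which rearranges to ball = (total - diff) / 2.
ball = (total - diff) // 2
bat = ball + diff

print(ball, bat, ball + bat)  # 5 105 110
```

The intuitive answer (ball = 10c) fails the check: a 10c ball forces a $1.10 bat, and the pair would cost $1.20.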
During World War 2, the RAF became increasingly alarmed at how many aeroplanes they were losing to enemy anti-aircraft fire. To address this problem, they planned to increase the amount of bulletproof armour carried on the planes. But because armour plating is heavy, and weight is always at a premium on aeroplanes, knowing where to put that extra armour became a pressing question.
The engineering solution seemed straightforward – it was a simple matter to study where the planes were most damaged and add armour to cover those areas. By studying enough planes, clear patterns of damage could be identified.
This approach makes intuitive sense but, as the mathematician Abraham Wald pointed out, it was also seriously flawed. The problem with the engineers’ thinking was that it was based on studying the aeroplanes that made it back to base. By definition, these were planes that had been shot and still managed to fly home. What their study really showed, then, was exactly where the RAF didn’t need to add any more armour. The proper solution, Wald argued, was to add armour only to those areas where the returning aeroplanes had no damage.
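Wald’s point can be made concrete with a small simulation. This is only an illustrative sketch: the plane sections and the lethality numbers below are made up for the example, not historical data. Planes take random hits, hits to vital sections (engine, cockpit) usually down the plane, and we then tally damage two ways: as seen on the returning planes, and across the whole fleet.

```python
import random

random.seed(0)

SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
# Hypothetical probabilities that a single hit in each section downs the plane.
LETHALITY = {"engine": 0.8, "cockpit": 0.7,
             "fuselage": 0.1, "wings": 0.2, "tail": 0.2}

def fly_mission():
    """Return (hit locations, survived?) for one plane on one mission."""
    hits = [random.choice(SECTIONS) for _ in range(random.randint(1, 4))]
    survived = all(random.random() > LETHALITY[h] for h in hits)
    return hits, survived

observed = {s: 0 for s in SECTIONS}  # damage visible on returning planes
actual = {s: 0 for s in SECTIONS}    # damage across the whole fleet

for _ in range(10_000):
    hits, survived = fly_mission()
    for h in hits:
        actual[h] += 1
        if survived:
            observed[h] += 1

# Engines take plenty of hits overall, but few engine-hit planes return,
# so engine damage is rare in the surviving sample - it looks "safe".
print("observed on returners:", observed)
print("actual across fleet:  ", actual)
```

In the returning sample, fuselage and wing damage dominates while engine damage is scarce, which is exactly the pattern that misled the engineers: the missing engine hits are on the planes that never came home.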
In a similar vein, William Faulkner once said that if you aspired to be a great writer then you needed the courage to “kill your darlings”. What he meant was that writers needed the courage to discard those parts of a story they had fallen in love with but that were no longer useful to the story. We hear the same advice today whenever we are encouraged to ‘edit ruthlessly’.
Faulkner’s advice and Wald’s reasoning hold an important lesson for all of us. Whenever we make a decision we need to guard against ‘confirmation bias’. The confirmation bias trap is another of the many ways our brains fool us into thinking we’re smarter than we really are. David McRaney, author of You Are Not So Smart, describes confirmation bias as “a filter through which you see a reality that matches your expectations”. In other words, it’s the tendency to look for confirmation of our pre-existing ideas while ignoring any evidence that might disprove them.
What this means is that all of us assimilate new information in a way that confirms our existing view of the world. Or, as John Kenneth Galbraith warned us, “faced with the choice between changing one’s mind and proving that there is no need to do so, almost everyone gets busy on the proof”.
Interestingly, the better educated you are, the more likely it is that you’ll fall into the confirmation bias trap (and the more certain you are that you won’t).
Overcoming confirmation bias takes a deliberate effort but starts with three easy steps.
- The first step is to be aware of how common it is (there’s an old joke in psychology that once you start to look for confirmation bias you start to see it everywhere).
- The second step is to ask yourself, ‘What would it take to prove this idea wrong?’ This is where Faulkner’s advice to ‘kill your darlings’ is particularly useful. It reminds us not to get hung up on an idea simply because it’s our own.
- The third step is to remember that confirmation bias tends to be amplified in group settings. If you need to make a decision in a group, ask somebody in that group to play Devil’s Advocate and argue against the decision you’re leaning towards. One easy way to do this is to make explicit the assumptions the others in the group might have taken for granted. This is where Wald’s reasoning is so useful: it reminds us to think about the planes that didn’t make it home.
Image credit: iStock