Catching confirmation bias before it catches you

It's far too easy to jump to conclusions. Fortunately, there are ways of counteracting your biases.

by Anthony Fitzsimmons
Last Updated: 17 Apr 2018

Our brains are designed to jump to conclusions. It is a valuable trait when it helps us react to immediate risks, such as what might be a tiger lurking in the bushes. But it becomes dangerous when it leads us to accept wrong ideas and make bad decisions. As Nobel physicist Richard Feynman famously put it, ‘The first principle is that you must not fool yourself; and you are the easiest person to fool.’ 

René Descartes thought that understanding an idea was an automatic process, which might then be followed by an evaluation of its validity. This sounds plausible, but Baruch Spinoza, a seventeenth-century philosopher, disagreed. He argued that comprehension and a preliminary ‘acceptance’ of a concept were inseparable, with evaluation following later, if at all.

Centuries later, the psychologists Daniel Kahneman and Amos Tversky concluded that a refined version of Spinoza’s hypothesis reflects reality. Our brains behave as though they have two systems. System 1 is fast, gullible and jumps to conclusions. Laborious, energy-intensive System 2 is ‘in charge of doubting and unbelieving’ but ‘sometimes busy and often lazy’.

But it is worse than that. Faced with the challenge of unbelieving, System 2 defaults to searching our memory for confirming evidence. Confirmation bias, as this is known, undermines our critical faculties. We would do better to search for disproving evidence, but that approach, which underpins the scientific method, demands far more mental effort.

Our System 2 tires more easily than we are willing to accept. Granting parole involves weighing evidence and risks; the ‘easy’ decision is to refuse it. Professors Danziger, Levav and Avnaim-Pesso analysed more than 1,000 decisions by Israeli parole board judges. They found that the judges granted parole in up to 65% of cases heard straight after meals and snacks, but the rate fell close to zero just before breaks.

Being tired or hungry reduces our capacity to think, so we revert to easy options and defaults. The same happens when our finite brainpower is multitasking while we solve a problem: for example, when we are also checking emails, managing social anxiety or trying to impress.

Remain vigilant

Confirmation bias is one of the reasons why flawed ideas persist. They are automatically ‘accepted’ as they are assimilated, doubly so if persuasively presented. We need considerable mental effort, as well as time and expertise, to test them in depth. It then takes courage, self-confidence and sophisticated social skills to challenge a powerful proposer. Failure to recognise and overcome obstacles such as these is a fundamental risk to effective decision-making.

For any important decision, a team of skilful, knowledgeable, articulate thinkers is essential, but it must also be alert. The flight safety world, ever vigilant against human weakness, has tackled the issue with the pilot fitness mnemonic ‘I’M SAFE’. Whether we are pilots or board members, Illness, Medication, Stress, Alcohol, Fatigue and lack of Eating all degrade our ability to make good decisions. The loss of brainpower as we become tired or peckish makes boardroom breaks, biscuits and beverages a necessity, not a luxury. But there are also techniques we can use to open our minds.

The Devil’s Advocate concept is well known, though arduous and difficult to manage. But Gary Klein, another psychologist, is the brains behind a clever technique for overcoming confirmation bias and a raft of other risks to good decision-making: the Pre-mortem. When Kahneman mentioned it at Davos, a global CEO was overheard saying that this idea alone had made his trip worthwhile.

The process is simple and quick. Once the proposal has been debated, and after a refreshing break, the chair asks everyone to imagine that the project was implemented a year ago and has been a total disaster. Everyone, including all executives, takes three minutes to write down why, before taking turns to read out one reason from their list until no one has more to add. The discussion of the combined list will be on a new level.

How does it work? Klein likens it to a solar eclipse: only when the sun’s dazzling disc is hidden is the corona revealed. With the glow of hoped-for success obscured, inconvenient issues become visible. Kahneman, for his part, is drawn to an explanation based on bias itself.

Once we mentally ‘accept’ that the project has failed, System 2’s confirmation bias starts working for us: it taps our memory-bank of failures for evidence confirming why the project was bound to fail. In other words, we can manipulate our minds so that confirmation bias counteracts itself. That is pleasingly neat.

Anthony Fitzsimmons is chairman of Reputability LLP and author of ‘Rethinking Reputational Risk: How to Manage the Risks that can Ruin Your Business, Your Reputation and You’. Copyright: Reputability LLP.

