How to be wrong (less)

A comprehensive guide to the cognitive biases that could derail your decision-making.

by Bill Borrows
Last Updated: 04 Feb 2020

Ask a team of business students to build the tallest tower possible in 18 minutes using 20 sticks of uncooked spaghetti, one yard of tape, one yard of string and a marshmallow that must be placed on top, and you’re likely to be disappointed with the results.

Ask a bunch of pre-school children to do the same thing, however, and they’ll likely do a much better job. Why? Because as designer Peter Skillman, the original architect of the ‘spaghetti tower’ experiment, points out, none of the children spend any time “trying to be the CEO of Spaghetti Inc”.

In his best-selling book The Culture Code: The Secrets of Highly Successful Groups, Daniel Coyle describes the different approaches the teams took to Skillman’s team-building exercise. The business students, he notes, got right to work: “They began talking and thinking strategically. They tossed ideas back and forth and asked thoughtful, savvy questions. They generated several options, then honed the most promising ideas. It was professional, rational and intelligent.”

On the other hand, the children did not strategise: “They did not analyse or share experiences. In fact they barely talked at all. They abruptly grabbed materials from one another and started building, following no plan or strategy. Their entire technique might be described as tying a bunch of stuff together.”

When the results were averaged over dozens of trials and analysed, it turned out that tying a bunch of stuff together was a remarkably effective strategy. The structures built by the children averaged 26 inches, while those created by the business school students came in at less than 10 inches. Teams of lawyers (15 inches average) and CEOs (22 inches) were similarly embarrassed, although in the latter’s case adding an ‘executive administrator’ to the team increased tower height to an average of 30 inches.

“The business school students appear to be collaborating, but in fact they are engaged in a process psychologists call status management,” writes Coyle. “They are figuring out where they fit into the larger picture: Who is in charge? Is it OK to criticise someone’s idea? What are the rules here? Their interactions appear smooth, but their underlying behaviour is riddled with inefficiency, hesitation and subtle competition.”

Lessons from history

Skillman’s 2006 experiment is a reminder that we may think we know what we’re doing, but our best intentions can come unstuck thanks to a whole host of psychological and social mechanisms that may have served us well in our hunter-gatherer days but are less well-suited to the modern work environment, as many businesses have found to their cost.

Take Kodak, the poster child for 21st century business failure, which famously ignored the ascendancy of digital photography despite one of its own engineers inventing the requisite technology more than 30 years earlier. So spectacular was its downfall that it has generated a cottage industry of analysts and corporate coroners obsessed with revealing how such an innovative market leader, with sales of over $2bn in 1966 and a share of more than 85 per cent of the cameras and film market in 1976, was forced to declare bankruptcy in 2012.

Various autopsies have revealed the endemic structural problems at Kodak – the company, it transpired, was not agile enough either to change its business model to embrace new developments or to give its innovators a voice. As Bill Lloyd, the company’s chief technical officer, subsequently told the New York Times, “It seems [the firm] had developed antibodies against anything that might compete with film.”

Kodak’s failure has also been put down to groupthink, a term coined by social psychologist Irving Janis to describe how the desire for harmony or conformity within a group can lead to irrational or dysfunctional decision-making. But other psychological factors may also have been at play. Our brains, for instance, tend towards the status quo – change can have such an impact on stress levels that it shows up on brain scans, something psychologists have dubbed the switch cost, a reference to the fee charged by utilities when customers change supplier.

In fact, according to Nobel Prize-winning psychologist and economist Daniel Kahneman, we are a lot less rational and correct in our thinking than we’d like to believe. “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance,” he writes in Thinking, Fast and Slow. In that best-selling book, Kahneman outlines two very different mental systems that shape the way we think: System 1 handles automatic, effortless (fast) thinking – when you’re in your car navigating a complicated roundabout, for instance – while System 2 does deliberate and conscious (slow) thinking – try multiplying 14 by 17, for example. It is the latter that we identify with when we think of ourselves – wrongly, Kahneman believes, since both are equally important.

Faced with the sheer volume of decisions we have to make every day, our brains employ fast thinking to prevent overload, developing habits and shortcuts (heuristics). So if all you’re doing is taking money from an ATM, chances are your PIN will come straight to mind and you’ll do it on autopilot; similarly, you’ll have locked your front door but may not remember having done so. System 1 accounts for the vast majority of our thinking and is often good at what it does (after all, it kept our ancestors alive), but by their very nature, mental shortcuts can lead to systematic errors in thinking that affect the decisions and judgements we make – known as cognitive biases, a term coined by Kahneman and the psychologist Amos Tversky.

And there are a lot of potential cognitive biases. Be careful, for example, that you don’t fall foul of the halo effect (when an initial positive judgement about a person, company, brand or product unconsciously colours our perception of them as a whole), the sunk cost bias (our tendency to persevere with something because we have invested time, money or energy into it, rather than because it is the right course of action) or confirmation bias (giving extra weighting to information that confirms what we already know).

Complicating factors

Researchers believe cognitive biases may have been at play in the short-lived and unpopular decision to introduce New Coke in 1985. On the face of it, the decision was sound. The company had already had success with the introduction of Diet Coke and there were encouraging findings from taste tests.

In a 2016 paper, academics at Colorado State University and Minot State University in the US point to two biases the organisation should have worked hard to avoid. The first is anchoring (when individuals rely on an initial piece of information to make subsequent judgements). In other words, having learned that the majority of test-group participants reacted well to the changes, The Coca-Cola Company treated that baseline as evidence to support its decision.

The second bias stems from the availability heuristic and is known as the retrievability bias. This one is “based on memory structures” and creeps in when we sift through information and reach for past instances of what we believe to be a similar issue.

“For the organisation, the recent-past positive experience of the Diet Coke introduction may have made decision-makers believe that if they applied the same processes, they would also experience success with New Coke,” say the researchers. “However, they misapplied the bias because they failed to realise a key difference in the two situations. Diet Coke was a new product – New Coke was a replacement.” It was a costly mistake. New Coke was shelved 79 days after launch following a huge backlash and a 20 per cent drop in sales.

In his number one best-seller, Factfulness: 10 Reasons We’re Wrong About The World – And Why Things Are Better Than You Think, the late Swedish professor of public health Hans Rosling describes 10 instincts that he believes distort our thinking.

For example, Rosling explains by way of statistical analysis that, despite all the evidence of immense suffering in the world, it is actually in a pretty good place if you take it as a whole, pointing to improvements at all income levels and in most areas including infant mortality and life expectancy. 

Yet a cartload of chimpanzees making random selections (much like the pre-school kids in the spaghetti tower experiment), he says, would have more success with his multiple-choice questions about the state of the world than the inquiring minds of the established elite in well-off countries, who have fallen foul of ‘The Negativity Instinct’.

“We need to learn to control our drama intake,” he writes. “Uncontrolled, our appetite for the dramatic goes too far, prevents us from seeing the world as it is, and leads us terribly astray.”

Missed opportunities

Rosling identifies nine other ‘instincts’ that impede effective decision-making, including The Fear Instinct (paying more attention to frightening things) and The Generalisation Instinct (grouping together things that are actually very different), and then takes us through his rules of thumb for developing a fact-based worldview: expect bad news; calculate the risk; get things in proportion; question your categories; remember – slow change is still change; take small steps.

Such rules of thumb might help alleviate the consternation we experience when, for instance, a bunch of infants beat top-flight MBAs at building spaghetti towers. “The result is hard to absorb because it feels like an illusion. We see smart, experienced business school students and we find it difficult to imagine that they would combine to produce a poor performance,” writes Coyle. 

“But this illusion, like every illusion, happens because our instincts have led us to focus on the wrong details. We focus on what we can see – individual skills. But individual skills are not what matters. What matters is the interaction.”

This focusing effect (a cognitive bias that occurs when people place too much importance on one aspect of an event, causing them to inaccurately predict the utility of a future outcome) may well explain, in part at least, the decision of Motorola – which held a 22 per cent share of the mobile phone market in 2006 – to postpone joining the smartphone party until 2010 while it finessed the Razr, its best-selling but outdated flip phone. While the iPhone and BlackBerry were all about the customer experience, Motorola was worrying about aesthetics, and its hesitancy was rewarded with a 90 per cent fall in the company’s share price between October 2006 and March 2009. Not so much ‘Hello Moto’ as ‘Goodbye Motorola’, as one wag pointed out. (The Razr made a recent comeback as a foldable smartphone, five years after Motorola’s mobile business was acquired by Lenovo.)

The list of businesses that have failed to capitalise on major opportunities or deal with imminent threats would fill several books: Western Union, the telegraph company, turned down the patent on the telephone for $100,000; Decca Records signed Brian Poole and the Tremeloes in 1962 after a band called The Beatles failed their audition; 20th Century Fox handed the rights to Star Wars merchandise to George Lucas for $20,000; Atari rejected the chance to work with Steve Jobs and Steve Wozniak on their new computer project, Apple; and video rental chain Blockbuster missed out on an opportunity to acquire Netflix for $50m in 2000 and filed for bankruptcy a decade later, while earlier this year the streaming giant announced a subscriber base of more than 139 million worldwide and revenue of over $16bn. And so on, and so on.

Seeking to learn from the failures of such businesses and the people who run them can be an important step in understanding the biases that can derail our decision-making. But a note of caution. Hindsight is not only famously 20/20 but also itself another common cognitive bias. Also known as the knew-it-all-along phenomenon or creeping determinism, it refers to our tendency to see events that have already occurred as more predictable than they were before they took place. In Thinking, Fast and Slow, Kahneman writes: “It is easier to recognise other people’s mistakes than our own.” Acknowledging that simple fact may go a long way to helping us to be wrong, less.  

