What's the best way to predict the future?

Discovering who makes a superforecaster is enlightening - more so than reading about ideas for future trends, says Andrew Wileman.

by Andrew Wileman
Last Updated: 20 Aug 2020
Superforecasting is a very good book. In fact, it is essential reading - something I have never said of any book in my previous Management Today reviews.

In 2003, after the invasion of Iraq, the American Intelligence Community (IC) was shocked by how wrong it had been about Saddam having weapons of mass destruction.

Two years before, it had been blamed for failing to connect the dots ahead of 9/11; after Iraq, it was blamed for connecting dots that didn't exist. This was despite 20,000 smart intelligence analysts beavering away at a cost of several billion dollars.

So the IC decided to test, in a proper scientific way, how to improve research, judgment and prediction of events. It set up a new agency, IARPA, which in 2011 commissioned a tournament of forecasters, tasked with making daily predictions looking between one month and one year into the future, covering 500 events, over four years.

There were five teams, taking different approaches, whose results would be compared objectively. Four were composed of experts (such as top defence analysts) and academics. One team - christened the Good Judgment Project (GJP) - was made up of hundreds of ordinary volunteers, orchestrated by our author Tetlock. Its volunteers came from all walks of life; their one common characteristic was that none of them was an expert. Each was given questions like 'will oil be over $100 a barrel in six months?', or 'will John Mahama win next month's presidential election in Ghana?', and they made daily probability predictions (e.g. 65 per cent chance, 20 per cent chance). They could use whatever research methods they wanted, drawing on public material, and they could spend as many hours a day as they chose.

Tetlock had been brought in because for many years he had run a research project testing how well expert forecasts performed against random guesses - 'chimps throwing darts'. The result: the chimps did pretty much as well as the experts.

In the IARPA tournament, Tetlock's non-expert GJP also outperformed the expert teams. By the end of year two, it was outperforming by so much - almost 80 per cent better - that IARPA abandoned the other teams and focused solely on learning from the GJP.

One learning is similar to the chimp vs expert outcome - the wisdom of crowds. Aggregating lots of individual non-expert views can give very good estimates of even complex questions. In contrast, famous experts often end up being the worst forecasters, because they get big ideas and big egos, see every problem from one perspective, can't change their minds, and never admit they were wrong.
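The arithmetic behind the wisdom of crowds can be sketched briefly. Below is an illustrative toy example (not the GJP's actual method, and all the numbers are invented): averaging many individual probability forecasts and scoring them with the Brier score, the standard accuracy measure in forecasting tournaments, where lower is better.

```python
def brier(prob, outcome):
    """Squared error between a probability forecast and the 0/1 outcome."""
    return (prob - outcome) ** 2

# Hypothetical individual forecasts for one question that resolved 'yes' (1).
forecasts = [0.9, 0.4, 0.7, 0.55, 0.8, 0.3, 0.65]
outcome = 1

# The 'crowd' forecast is a simple unweighted average.
crowd = sum(forecasts) / len(forecasts)

# Compare the crowd's score with the average individual's score.
avg_individual = sum(brier(p, outcome) for p in forecasts) / len(forecasts)

print(f"crowd forecast: {crowd:.2f}")
print(f"crowd Brier score: {brier(crowd, outcome):.3f}")
print(f"average individual Brier score: {avg_individual:.3f}")
```

With these made-up numbers the crowd scores about 0.149 against the individuals' average of about 0.188; by Jensen's inequality, the averaged forecast can never score worse than the average of the individual scores.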

The second learning concerns what makes a good non-expert forecaster. Out of the GJP's hundreds of volunteers, some turn out to be 'superforecasters'. They are non-ideological, pragmatic, flexible. They are numerate - although they rarely use complex maths. They debate with themselves, deliberately trying out alternative views. They seek solid context for their judgments. They change their minds frequently. They improve when collaborating in teams. (The GJP has now gone commercial - take a look at goodjudgment.com - you can sign up and your views may influence US policy.)

This is a dense but very readable book, due I imagine to co-author Dan Gardner. It should be on every manager's and investor's reading list around the topics du jour of decision-making, prediction and behavioural economics.

Turning to The Future of Almost Everything, this isn't a bad book, but it is comparatively small beer. It is a look at futuristic mega-trends, which occur at the intersection of new technologies and changing social structures. So Dixon has themes like 'fast' - everything is changing faster - and 'urban' - humanity is migrating into mega-cities, with sub-themes like 'big data' and 'bio-digital brains'. The content is OK but thin - I don't think it gave me a single new thought.

After Superforecasting, you wonder, what is the point of a book like Dixon's? As Tetlock shows, predictions of anything looking forward more than a few years are practically useless - they are too vague on timeframe and too generalised. Mega-trend writing is non-probabilistic, so it flies in the face of good forecasting practice. I think Tetlock would put Dixon into his 'useless expert' camp. Imagining the longer-term future can be thought-provoking - but better done via science fiction.

Superforecasting: The Art & Science of Prediction by Philip Tetlock and Dan Gardner is published by Cornerstone Publishing at £14.99

The Future of Almost Everything by Patrick Dixon is published by Profile Books at £9.99

Andrew Wileman is a consultant, and the author of Driving Down Cost. This review was originally published in 2015.

