The UK economy is rich in economic surveys and forecasts. Trevor Merriden sorts the best from the rest.
You wake up and turn on the radio. John Humphrys is debating the feelgood factor with Kenneth Clarke. You wonder why the Chancellor's optimism hasn't yet touched your life. On the train into work, your newspaper reports a sterling currency crisis. You can't remember whether that makes imports more or less expensive. Once at your desk, you rediscover that economic survey you meant to read last month. It will tell you what's going on but your deadlines are tight. You throw it in the bin.
Meetings, strategies, deadlines and crises are the staple diet of a life in management - the wider economic picture is interesting but beyond our control. Harassed managers want to read more about the economy but claim they can't find a spare moment. Christine Mizon, PA to John Coombe, finance director of Glaxo Wellcome, says: 'The amount of information coming through the office is horrendous. Half the post that comes in gets filtered out because I know he either won't be interested or he won't get time.' Unsurprisingly, a Management Today straw poll of senior managers and their assistants shows that Mizon's view is endorsed in offices all over the UK.
Overstretched managers instead rely on their instincts to assess what's going on. This is a shame, because the UK economy is rich in accurate and timely official statistics and comprehensive economic surveys. And with an abundance of Mystic Megs from the City and academia predicting the future (see box), the economic ghosts of past, present and future are available for all to see. It's just a matter of knowing what to look for and whom to believe.
Micky Nunwa is personal assistant to Dominic Cadbury, chairman of Cadbury Schweppes. 'He generally follows his nose,' says Nunwa of his boss, 'but he does look at all the CBI surveys.' This view is typical - if there are surveys that most business people like, they are those produced by the Confederation of British Industry (CBI). The father of the business survey in the UK, the CBI has been churning out economic reports since 1958. Sudhir Junankar, CBI associate director for economic analysis, explains the rationale behind its methods. 'We want to access all sorts of information about what is happening in the UK economy that comes out well ahead of the official data,' he says. The CBI's technique is qualitative, covering business perceptions and opinions on output, employment, the adequacy of stocks, and the reasons for investment and the constraints upon it. The surveys are conducted at a senior level, typically with the finance director or managing director of each company. Do businesses have an incentive to talk up their own sector? Junankar thinks not: 'There are fairly direct questions to be answered and businesses have no reason not to tell it as it is.'
Competing for attention on the desk of the over-burdened executive are plenty of other worthy reports, including the Purchasing Managers' Survey (PMS) produced by the Chartered Institute of Purchasing and Supply (CIPS). This monthly report is based on data from purchasing executives in 300 companies, reflecting the production mix of the UK economy. Purchasing managers are asked if their company's situation has improved or deteriorated since last month. The published survey pre-dates official Government statistics by two to three months.
Although relatively new to this country (it only started in mid-1991), the survey has an impressive track record abroad and is seen as a reliable indicator by no less than Alan Greenspan, chairman of the US Federal Reserve.
Also up there, if not quite at the top of the league, are the British Chambers of Commerce's Quarterly Economic Survey and various specialist surveys, such as the British Retail Consortium's or those produced by the Housebuilders' Federation.
Of course, there are plenty of poor surveys around too. Managers generally gloss over these, but the few who don't suggest that surveys produced by Trade Indemnity, 3i and Lloyds Bank are more likely to line the wastepaper basket than the pockets of those who read them. Creators of such surveys might do well to heed the view of David Smith, MT's economics columnist, who says that poor surveys 'have two characteristics: the answers aren't interesting and the methodology is unconvincing. Think of the wildest answer a survey can throw up - and if that doesn't excite you, then it's not worth reading.'
FORECASTING: The full spectrum of soothsayers, including the good, the bad and the plain inconsistent
'If the outlook is boring, I say it's boring.' So claims John Stewart, the man behind the respected monthly survey published by the Housebuilders' Federation. Stewart is making the point that, as an independent economic consultant, he has no incentive to bend analysis to titillate his readers. The situation is different for some City economists. Nigel Lawson's 'teenage scribblers' are PR people for their company and have a clear interest in grabbing headlines with their forecasts.
Sensationalist soothsaying is where surveys end and forecasting begins. And there's no shortage of economic gurus eager to put their necks on the block in the search for recognition.
But which are worth following?
An annual survey of forecasts prepared by David Smith, economics editor at the Sunday Times (and MT economics columnist), is very revealing. At the end of each year, Smith looks at the predictive ability for that year of 40 or so well-known forecasters on the four key economic variables - growth, inflation, the current account and unemployment. He then awards each forecaster marks out of 10. Adding up the scores for the past three years (to eliminate freakishly good performances in any one year), MT has identified the 10 best and worst forecasters for the UK economy out of the 32 who featured in Smith's rankings (see table, right). Top of the pile comes Lombard Street Research, headed by Tim Congdon, a member of the Chancellor's independent panel of forecasters.
Yet it is a former member of the panel, Wynne Godley, who comes out as the least accurate forecaster.
Establishment forecasters fared poorly. The Treasury team came too close to the bottom 10 for comfort, and international bodies did little better: the Organisation for Economic Cooperation and Development (OECD), much criticised for its unwieldy and lengthy forecasting process, has a very poor record, as does the European Commission.
The overall MT scores mask a tale of triumph and tragedy over the three years. The National Institute is in our top 10 only by virtue of topping the league in 1995, having finished second bottom in 1994. By contrast Patrick Minford's team at Liverpool University, champions in 1994, were propping up the table a mere 12 months later.
The final column in the table shows the worst score of each forecaster and is perhaps the best guide (where forecasters achieve the same overall score, the next-worst individual score is used to separate them). Achieving consistent accuracy is much harder than it seems, with only five forecasters - Kleinwort Benson, Morgan Grenfell, Lombard Street Research, Cambridge Econometrics and Schroders - managing half marks or more in each of the three years.
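The ranking method described above - order by three-year total, then break ties on the worst annual score, then the next-worst - can be sketched in a few lines of Python. The yearly splits below are invented for illustration (the article gives only totals and worst scores, which these figures are chosen to match).

```python
# Hypothetical yearly marks (out of 10) for four of the table's forecasters.
# Only the totals and worst scores agree with the published table; the
# individual yearly figures are invented for this sketch.
scores = {
    "Lombard Street Research": [7, 8, 5],
    "Kleinwort Benson":        [6, 6, 6],
    "Cambridge Econometrics":  [5, 6, 6],
    "Morgan Grenfell":         [7, 5, 5],
}

def rank_key(yearly):
    # Higher total first; among equal totals, the higher worst score wins,
    # then the next-worst score, and so on (tuple comparison does the rest).
    return (sum(yearly), sorted(yearly))

ranking = sorted(scores, key=lambda f: rank_key(scores[f]), reverse=True)
```

On these figures, Cambridge Econometrics edges out Morgan Grenfell despite an identical total and worst score, because its next-worst year (6) beats Morgan Grenfell's (5) - exactly the differentiator the table uses.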
The 10 best forecasters

Position  Forecaster                Total score (out of 30)  Worst score
1         Lombard Street Research   20                       5
2         Kleinwort Benson          18                       6
3         Cambridge Econometrics    17                       5
4         Morgan Grenfell           17                       5
5         Schroders                 17                       5
6         DRI                       17                       4
7         National Westminster      16                       4
8         Williams De Broe          16                       4
9         National Institute        16                       2
10        Salomon Brothers          16                       2
The 10 worst forecasters

Position  Forecaster                   Total score (out of 30)  Worst score
23        Daiwa                        12                       2
24        London Business School       12                       2
25        BZW                          12                       2
26        ITEM Club                    12                       2
27        Henley Centre                11                       2
28        OECD*                        10.5                     3
29        UBS                          10                       3
30        Economist Intelligence Unit  9                        1
31        Robert Fleming               8                        1
32        Wynne Godley                 7                        1

* Two years only. Total is average multiplied by three.