UK: TALES OF THE UNEXPECTED.

by Jane Bird.

The computer industry has proved abysmal at forecasting trends over the past 30 years. Established companies have consistently backed losers and failed to back winners.

Computers were supposed to put an end to overflowing in-trays and bulging filing cabinets. But the paperless office is just one of the many myths that have been created and then destroyed by the computer industry in the past three decades. Far from eliminating paper, the computer age has actually generated a huge increase in its use. 'All the predictions that the end of paper was nigh seem stupid looking back,' says George Cox, chief executive of the systems business at Unisys Europe.

This is but one small illustration of how abysmal the computer industry has proved at forecasting over the past 30 years, despite the fact that technological developments have often been utterly predictable. Even where the industry has managed to foresee trends, the experts have failed to work through the implications.

As long ago as 1965, for example, Gordon Moore, co-founder of chip giant Intel, put forward 'Moore's Law', the observation that microprocessors would double in power and capacity every two years. Time has proved him right - indeed, the doubling has often occurred within just 18 months.
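
To see the compounding this implies, here is a minimal sketch in Python; the function name and starting figures are illustrative assumptions, not anything from the article:

```python
# Moore's Law as stated above: capacity doubles roughly every two
# years, sometimes within 18 months. The starting capacity of 1 unit
# is purely illustrative.

def projected_capacity(base: float, years: float, doubling_period: float = 2.0) -> float:
    """Capacity after `years` years, doubling every `doubling_period` years."""
    return base * 2 ** (years / doubling_period)

# Over the article's 30-year span, doubling every two years means a
# 2^15 = 32,768-fold increase; at 18-month doubling it is 2^20, i.e.
# more than a million-fold.
print(projected_capacity(1, 30))        # 32768.0
print(projected_capacity(1, 30, 1.5))   # 1048576.0
```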

The problem was that mainstream computer companies did not spot the business opportunities created by this rapid progress. The idea of putting powerful microprocessors in little boxes on executive desktops, for instance, simply did not occur to most of them. So throughout the '60s and '70s, the computer industry tried to persuade executives to have desktop terminals linked directly into corporate mainframes. 'When the PC came along we saw it as Mickey Mouse computing,' says Cox. 'Data-processing departments all ignored it.'

Other innovations such as graphical, user-friendly screens and the Internet also came not from established industry players but from small entrepreneurial companies adept at spotting what users really wanted. In the case of the Internet, as with fax machines, the technology had in fact been around for years before the computer industry began to realise it had commercial possibilities. 'Mainstream industry players have been consistently unable to think laterally and anticipate ways in which users would want to use technology,' claims Cox.

Ironically, recent history is littered with technologies foisted on a reluctant market by technology-driven suppliers: fifth-generation computers, executive information systems, sewing-machine-weight portables, pen-based electronic Filofaxes and speech-driven word processors, to name just a few. Some of these have since made a second and more successful appearance.

But at the time of their launch, they demonstrated a profound gap between what the computer industry thought people wanted and what they wanted in reality. It is a mismatch that has caused fortunes to be made and lost, and entire companies to be propelled into fame or misfortune.

Part of the problem is the industry's overweening pride, its tendency to exaggerate the power and potential of products far beyond reasonable expectation. 'Almost every innovation has been oversold or has underdelivered,' says Harvey Parr, an industry veteran of 30 years and now director of London-based OSI, an IT consultancy.

Suppliers are always out for short-term gain, Parr adds, changing their message again and again to keep renewing the attractiveness of their products to disillusioned customers. They also tend to exploit enthusiastic managers by encouraging them to think that their aims are achievable and that it is worth investing in technology, he believes.

The dangers of overselling first became apparent in the '60s when real-time systems were introduced. Before this, computers had been largely batch machines adding up columns of figures about the past performance of businesses. 'They were essentially boxes of electronics which didn't help businesses except by counting what they were doing,' says Geoff Morris, former president of X/Open, the industry-funded computer standards body.

Real-time systems tried for the first time to keep up with the current business situation. Potential users were told they would be able to act on and react to live information. But real-time systems were far more complex than anything that had gone before and were much more likely to break down. 'Every supplier sold real-time systems into the financial services market and I don't think any of them worked,' recalls Morris. The result was a nightmare for customers and suppliers alike, he says. 'People thought installing a real-time system would be the same as installing a large batch system but they found out it was much harder.' Meanwhile, users who had optimistically abandoned their old systems were left high and dry. Devastating disruption ensued for businesses; the industry had the first of many marketing disasters on its hands. 'Suppliers had to set up huge customer satisfaction and service teams throughout the industry in an effort to make real-time systems work,' says Morris.

When the industry was not overselling technology, it was prematurely writing it off. Cobol, the programming language for business applications, is a case in point. Alan Benjamin, the first director of the Computer Services Association, remembers being a lone voice back in 1966 when he argued that Cobol wasn't dead. The language had long been regarded as too big, too clumsy and too greedy for memory, he recalls. Yet it lived long after the experts thought it obsolete because so many programmers were trained in it and so many big machines had applications running in it. Applications tended to be added to and enhanced, rather than ditched so that people could start again from scratch in a new language. 'We're still arguing the toss over Cobol now,' says Benjamin. Meanwhile, it remains the mainstay of business applications.

Among the lessons the industry did learn with the introduction of real-time systems was the importance of resilience and reliability. As soon as vital applications were moved onto computers, downtime became a major problem. Huge resources were poured into devising systems that were robust and could be easily restarted. Success rewarded those who anticipated the trend, companies such as Tandem, which introduced 'non-stop' computers with built-in back-up recovery systems.

The other big change which occurred around this time was the realisation that computer systems would never again be finished, that they were continuously evolving. Cox recalls working on his first application in the '60s, a system to mechanise manufacturing operations. As soon as it accurately replicated manual processes that had been in use for decades, the project was deemed complete and the team disbanded. 'But by the early '70s, we had moved away from an environment where systems never changed into one where they never stopped changing,' says Cox. 'It took ages for managers to recognise this.'

The next breakthrough came towards the end of the '70s with the introduction of the personal computer. Here the UK was a world pioneer. Sir Clive Sinclair was one of the first to market the product with the ZX80, followed by Cambridge-based Acorn with its BBC Micro. In the US, companies such as Apple and Commodore were catering for a similar market among home-users and aficionados. Although these products gained a strong cult following, they were long viewed disdainfully by the traditional data-processing industry. At first, few businesses used the new machines which remained confined largely to boffins at home or children playing zap'em games.

There was enough interest, however, for IBM to legitimise the microcomputer with the launch of its PC in 1981. With Big Blue behind it, the PC was a huge success and the microcomputer market took off. Within a few years, all self-respecting managers had PCs on their desks.

One reason for the success of the IBM PC was that it allowed business people to bypass their data-processing departments and run their own departmental applications. It also offered the basis for much-needed standardisation in a world of proliferating technologies. The fact that IBM's PC was concocted from off-the-shelf components meant that other companies could put together their own, virtually identical PCs. The clone market was spawned, PCs became commodities, prices plunged and customers could choose from a huge range of mix 'n' match hardware and software products.

Full of early PC promise, the UK had its share of clone-makers including Amstrad, Apricot, Elonex and Viglen. In the end, the pressure on margins was too much for Apricot, however, which sold its computer business to Mitsubishi. Viglen, meanwhile, was taken over by Amstrad, which withdrew from the high street to focus, like Elonex, on mail-order markets.

Even Apple, which launched its stunningly innovative Macintosh computer in 1984, had difficulty fighting the IBM PC with its DOS operating system.

The Mac, which was the first to bring user-friendly graphical displays, on-screen icons and mouse-pointing devices to the mass market, changed the face of desktop computing and did much to promote its widespread use.

The IBM PC had nothing to match it until 1985, when Microsoft launched Windows, a graphical environment that ran on top of DOS. This was far behind the Mac operating system and even today many users think it inferior. But the gap is closing, and Apple has not made life easy for itself by taking so long to license its technology to the clone-makers. With a diminishing share of the market, a series of management changes at the top and record losses of £457 million reported for the last quarter, Apple's future looks seriously under threat.

Until the early '80s, computer companies tended to design, build, sell and service their products themselves. The arrival of industry standards and the commoditisation of products opened the way for new distribution channels because the computer firms found themselves unable to cater for a mass market. 'The establishment of a huge third-party infrastructure in the form of dealerships and resellers was a critical change,' says John Leftwich, Microsoft's general manager for Europe. By the early '90s, the computer superstore phenomenon had begun, a development that was unimagined a decade earlier.

Unimagined market trends have also led to massive layoffs as competition has increased and profit margins have been squeezed. The old vertically integrated computer suppliers have belatedly split themselves into hardware, software and services divisions, with much more emphasis on the lucrative systems-integration and consultancy businesses. The IT company that aims to be all things to all customers no longer exists. 'The idea of one company doing everything has definitely gone,' says Leftwich. Even ICL, now owned by Fujitsu, recently handed over the running of its struggling PC division to its parent and has begun selling off most of D2D, its manufacturing arm. Its networking, buildings-management and health-systems businesses have already been sold off. So while in the car, airline and aircraft-engine industries the vast proportion of the market is falling into the hands of fewer and fewer players, in the IT sector the number of companies competing for a share of the market is rising.

The massive layoffs have also been the result of the feast-and-famine cycles of semiconductor production, caused partly by hugely overoptimistic forecasts from hardware vendors and partly by subsequent overcaution from chip manufacturers keen not to get burnt twice. In times of excess supply, such as the glut of the mid-'80s, prices predictably reached rock bottom and companies such as Hewlett-Packard and Apple went through huge reorganisations. The experience was not forgotten by the Japanese, until then heavy investors in chip capacity, who cut back sharply on investment in manufacturing facilities.

Now the cycle has started up again: during the past two years, memory and processor chips have been in short supply, keeping PC prices high and prompting huge investments in greenfield chip-production sites as manufacturers have raced to reap the benefits of the shortages. But growth in demand for semiconductor manufacturing equipment is now outstripping growth in demand for the chips themselves. Meanwhile, chip prices are reported to be 40% down on mid-1995 levels.

It all goes to show that there are few certainties in the computer industry.

A pattern is beginning to emerge, however, whereby no single company dominates the market for long. In the early days it was IBM. Then Big Blue's power was usurped by Microsoft, whose DOS and Windows operating systems now control 80% of the world's PCs. But, for all his vision, Microsoft's founder, Bill Gates, did not foresee the popularity of the Internet, with the result that, by the time the company's Internet products were ready for shipment, aggressive new players such as Netscape and Sun had established a strong lead. Netscape's Navigator software is now used by 75% of cybersurfers for browsing the Web; it is the Windows of the Internet, and the Internet is the biggest growth application of the PC, Microsoft's home territory.

Who was predicting, just a couple of years ago, that the outlook for Microsoft could so swiftly come into question? Once again, the industry has surprised us, innovation has come from outside, and it is the companies which have spotted true user needs which have won the prize.
