Get rid of your mainframe, buy the latest mini or microcomputer system, and you'll slash two-thirds off your capital and operational IT expenditure. But can that be all there is to it? Jane Bird talks mainly to the converted.
Mark Spalding is a brave man - he is throwing out his company's multi-million-pound mainframe computer together with the vast library of software that has been built up on it over more than a decade. But as head of information technology (IT) at McCarthy and Stone, the UK's largest builder of retirement homes, he is not closing down the company's computer operations. Instead he is pioneering the latest trend in computing - "downsizing". This means exploiting the fact that computers are becoming more and more powerful for less and less money. Every 18 months, the number of computer instructions that can be processed by a silicon chip doubles, and the speed at which text and data can be squirted down optical-fibre telephone lines increases by 100% a year. By the year 2000, computers will be vastly more powerful than today and will, of course, have speech and image recognition. Desktop machines will double as video telephones, and will be capable of understanding a vocabulary of 20,000 words - equivalent to that of the average adult.
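Those doubling rates compound quickly. As a rough sketch of the arithmetic (the nine-year span to 2000 is an assumption supplied here, not a figure from the article):

```python
def growth_factor(years, doubling_period):
    """Capacity multiplies by 2 once per doubling period (in years)."""
    return 2 ** (years / doubling_period)

# Chip throughput doubling every 18 months (1.5 years) over an
# assumed nine-year run-up to the year 2000:
print(growth_factor(9, 1.5))  # 64.0 -- a 64-fold increase

# Line speed growing 100% a year (doubling annually) over the same span:
print(growth_factor(9, 1.0))  # 512.0
```

On those assumptions the chip improves 64-fold and the line 512-fold, which gives a sense of why forecasts for the year 2000 sounded so dramatic.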
Technological advances are creating a new generation of high-performance, low-cost machines. It began with microcomputers. A decade ago, few had heard of the PC, yet there are now approaching 60 million worldwide. The PC, along with its big brothers, the workstation and the minicomputer, is beginning to snatch power from the old dominant mainframes. The new generation of machines offers users huge cash savings. By ditching a mainframe and opting for the latest mini or microcomputer system, they can typically slash two-thirds off their capital and operational IT expenditure. Some mainframe makers dispute the cost-cutting claims, but the fact is that McCarthy and Stone paid £3 million for the mainframe at its Bournemouth headquarters in 1988. The network of PCs and file servers that is replacing it costs £1.2 million.
But there is far more to downsizing than merely installing lots of cheaper computers. It needs to be combined with a change in working patterns and business practices. Spalding spent two years analysing McCarthy and Stone's business requirements. The company has nine separate divisions spread across the UK, with responsibilities such as land acquisition, development proposals, construction services and sales. "We have autonomous teams in each division, so a mainframe and centralised IT function does not fit very comfortably into our environment," says Spalding. He decided that a distributed system was far more appropriate. The divisions can readily exchange information but they do not all need to access a centralised company database.
To the untrained eye it might seem that a distributed system would be more expensive than a centralised one because of the duplication involved. Spalding is adamant that just as his downsized system was cheaper to buy, so it will be cheaper to run. The trick is simplification - he has chosen machines that have fewer complexities to push up their price tag or increase operating costs. "We saved money on the hardware because local area networks are commodity items and hence much more competitively priced. Also, any business needs a certain number of PCs for word-processing and spreadsheet analysis, so some of this equipment would have had to be bought whatever mainframe we had. This end of the market is bursting with software supply so you get a wide choice and good value for what you pay."
Spalding also saves on running costs because he does not require a separate IT manager in each of the regions, and half the company's 24 IT staff will no longer be needed. "Mainframes are technically very complicated and there tend to be a lot of problems to look into. You need a number of staff just to keep them ticking over, whereas a local area network is much simpler and less prone to problems," says Spalding. The important thing, he reckons, is that you are putting the equipment and the tools at the sharp end with the people who are involved in the day-to-day running of the business and in making things work. "At the end of this process we won't have any operators. There might be the odd hour's work required, but it is not a full-time job."
McCarthy and Stone has taken the right approach to downsizing by considering the organisational needs before the capabilities of technology. It has used the opportunity of re-sizing to think again about all the application software. "When you are engaging in a re-sizing project it tends to mean lots of pre-conceived ideas get looked at again. And lots of things get thrown out of the window," says Spalding. "Programs are often written for good reason but rarely are they turned off. Re-sizing like this is a good opportunity to ask whether a system has reached the end of its life. Do you really want to re-invest in replicating it? A fresh look can be very revealing." None of McCarthy and Stone's systems are being converted; they are all being replaced, including applications in finance, construction, sales, marketing, estate management and customer care.
The worst strategy is to automate existing procedures. This is like replacing the tea-lady with a robot, when the appropriate move is to install vending machines throughout the building. It was an approach used all too often during the 1980s, when IT spending on white-collar workers increased by 200% while their productivity went up just 2%. Take the situation where staff previously received typed memos, distributed manually, reminding them to attend routine meetings. It would be pointless for somebody to use a word processor to print the same invitations. Instead, the computer should automatically generate electronic mail memos to all those on the list, and deliver them to individuals' desk-top screens.
Dramatic downsizing exercises which involve moving from mainframes to PCs are relatively rare, according to Hewlett-Packard, which installed the McCarthy and Stone system. By far the majority of HP's downsizing business involves replacing mainframes with powerful mid-range systems. Nor is the mainframe dead. Vast commercial number-crunchers will continue to be needed by organisations such as banks and financial institutions for processing huge numbers of transactions at high speed.
In a recent survey by P and P Corporate Systems, a PC systems house, 56% of users said that their mainframe systems are "here to stay", while 44% said they would be "phased out over the next five to 10 years". According to Inteco, the Woking-based market research company, at the end of 1990 there were 34,000 mainframes in Europe. By December 1995 this number is expected to increase to 38,000, the equivalent of 2% growth a year. The medium-range systems are forecast to expand considerably faster, at an annual 13%, from 730,000 in 1990 to 1.3 million by 1995. But substantially more growth is expected from business PCs, which Inteco forecasts will increase at around 20% a year, from 13 million in 1990 to 34 million in 1995.
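Inteco's quoted annual rates can be recovered from its start and end figures. A quick sketch of the compound-growth arithmetic, using only the numbers in the paragraph above:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# Inteco's 1990-1995 European forecasts:
print(f"Mainframes: {cagr(34_000, 38_000, 5):.1%}")          # ~2.2% a year
print(f"Mid-range:  {cagr(730_000, 1_300_000, 5):.1%}")      # ~12.2% a year
print(f"PCs:        {cagr(13_000_000, 34_000_000, 5):.1%}")  # ~21.2% a year
```

The implied rates match Inteco's rounded figures of 2%, 13% and "around 20%" closely enough.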
It is the ability to transfer individual tasks to smaller machines that strikes Harry Hoyle, Inteco's senior vice president, as the most important effect of downsizing. "People are moving applications around to different sized hardware platforms. For example, word-processing and spreadsheet analysis used to run on mainframes, but these are now mostly done on small machines which are far better suited to the tasks," says Hoyle.
This empowers the workers at the coal-face. "The most important reason for downsizing an application is that it gives users direct access to company data, which they can manipulate with better flexibility and improved functionality thanks to desktop PCs. Data is behind everything that is happening with computers at the moment," says Hoyle. In banking this might be cash management and customer analysis. In retailing, transport schedules are getting more critical. In manufacturing, Hoyle says it is inventory management and capacity planning that is needed.
But there is a serious danger in putting applications and data directly into users' hands - computer anarchy. Before the company realises what is happening, executives in each department have begun experimenting with alternative software packages, transferring and manipulating data, and creating a host of unique and incompatible databases. Users cannot believe their luck at having got rid of the old-style data-processing departments to which they had to submit computing requests in triplicate and wait for weeks for a response. In a downsized environment, people do what they like. "There is no consistency. It is chaos," as one consultant put it.
Data centres built up very regimented operations procedures with backup, security and clear definitions of applications. To control the downsized environment a new breed of "data manager" is beginning to emerge. The role involves making sure that the data is available wherever it is required, whether that be in the hands of trading partners, customers, banks or regional centres.
The mainframe has a part to play here - it can become the repository for the key corporate data, the consolidation point where data is gathered, stored, and from where it is distributed. Hoyle says: "This enables the data processing department to hold on to the data and manage it better. It has more control over essential company information where resilience and security may be crucial." The high speed of computer transmission means that nobody need know or care if the mainframe is on local premises or hundreds of miles away. When global optical fibre highways are complete, it could even be on the other side of the world. Data can still be delivered to the screen in seconds.
There is also a trend towards upsizing, reckons Hoyle. "People tend to buy computers and at once find they are too small," he observes, just as new motorways are immediately overcrowded. Often applications are developed and experimented with on PCs, but as they are perceived to be important to the corporation they get moved to the bigger machines.
Hoyle continues: "There is a natural tendency in any company for a system to grow by 30%-40% a year." However, companies are cutting the number of large-system suppliers. In the past it was quite common for large users of mainframes to have machines made by several different manufacturers. "Now they are tending to rationalise with one supplier, which also saves money on software development and maintenance," Hoyle says.
Needless to say, the mainframe computer makers are fighting back. They argue that, despite the experiences of companies such as McCarthy and Stone, downsized systems are not automatically cheaper. A false impression of cost-savings is created by focusing too strongly on the processor speed, according to David Slavid, of ICL, the British mainframe-maker that was sold to Fujitsu of Japan in 1990. "The cost of the processor hardware today is no more than one-third of the five-year system cost, and probably less than 10% of the total five-year cost including staff," says Slavid. "So offering a processor at half the price of a competitor makes less than 5% difference to your total expenditure.
"Comparing the capacity and speed of magnetic tape cartridge systems can be intensely boring. But if 10 distributed systems each take half an hour a day of someone's time to back up, and a centralised system handles it automatically, then the distributed system suddenly costs an extra man-year of effort," Slavid says.
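Slavid's two back-of-the-envelope calculations can be sketched as follows (the 250 working days a year is an assumption supplied here, not his figure):

```python
# If the processor is under 10% of the total five-year cost, offering
# it at half a competitor's price saves under 5% of the total:
processor_share = 0.10
saving = processor_share * (1 - 0.5)  # half-price processor
print(f"{saving:.0%}")  # 5%

# Backup overhead on the distributed option: ten systems, half an
# hour each per day, over an assumed 250 working days a year:
annual_hours = 10 * 0.5 * 250
print(annual_hours)  # 1250.0 hours of manual effort a year
```

At around 1,250 hours a year, the manual backup burden approaches the full-time working year Slavid describes as "an extra man-year of effort".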
A survey by Ideas International, an Abingdon-based IT consultancy, found that a 160-user proprietary system typically costs around £660,000 over five years, whereas eight 20-user Unix systems would add up to about £640,000. This does not include software or software support, where a centralised system appears to be marginally cheaper. Nor does it include the cost of networking, or of any associated internal support costs. In some cases, the cost of supporting eight distributed systems may be much higher than a single central system.
Slavid concedes that distributed systems have the advantage of local control, quick response to local needs, and lack of central overheads. But he emphasises the benefits of a centralised system as simplified support, uniform control, limited networking costs, and a single shared database. He argues that users may be disappointed if they opt for a downsized system on cost alone. Instead, the decision should be made on the basis of sound business strategy.
The continuing role for mainframes is stressed by Data General (UK), the minicomputer maker, which has produced a useful guide to downsizing issues. Peter Ferrigno, DG's marketing director, observes: "We're not saying throw out your mainframe. But for your next application, you might wish to consider keeping your mainframe as it is and moving towards the idea of a computer network where the mainframe is just one more box on the line." The essential thing, he reckons, is to select open systems which conform to industry standards and can all be connected to the same network to share data and applications.
Many boardrooms faced with a downsizing proposal believe that their company cannot run without the benefit of a mainframe. Spalding met much more resistance from company directors than from his technical staff. "Managers tend to be especially concerned if they have had mainframes for many years and repeatedly sold the idea of needing to acquire more hardware to handle burgeoning volumes," he says. "But small-scale networks can handle surprisingly large workloads."
During tests at McCarthy and Stone, the local area networks coped easily with the company's sales and marketing database which has hundreds of thousands of records. Such systems might not be sufficiently powerful for big insurance companies and banks, but they are more than adequate for the vast majority of businesses.
One big advantage of PC and local network software packages is that they are much more easily demonstrated and tested than their mainframe equivalents. Evaluation disks can be left with users to experiment with and evaluate in their own time. This would be virtually impossible on mainframes where installing each application can be a long and painful process.
Despite the advantages, there can be considerable staff resistance to downsizing. Computer boffins who have spent their lives accumulating mainframe expertise fear their skills will become as obsolete as the machines that are being thrown out. But Spalding has met with much more positive response from his staff who regard involvement in downsizing as a career opportunity. "By building up their technical knowledge of PCs and local area networks they are gaining skills in an expanding part of the market," he says.
The other advantage is that the downsized environment brings technical staff closer to the business users they are serving. "The mainframe was quite divisive. Its complex technical environment separated users from IT staff," says Spalding. "PCs bring both sides closer because the users feel they can understand the technology and they tend to discuss their business problems more with technical staff." His team is now frequently on the road providing training and solving technical problems in the regional offices. Those who suggest downsizing can cause anarchy are scaremongering, reckons Spalding. He is confident that McCarthy and Stone's regional managers are accustomed to acting responsibly and asking for advice when they need it.
If all goes according to plan, the mainframe will be hauled away in November. There have been problems, Spalding admits, but he does not lie in bed at night worrying about having taken the plunge. "It is not a bottomless pool - we carried out a full survey before we started, and nothing has happened to change our minds since."