IT on tap

Being able to buy computer capacity like electricity, as and when needed, has powerful appeal, but how feasible is it at present? Mark Vernon reports.

The deregulated utilities market can be confusing. It's quite possible these days to buy gas from a water company or electricity from a gas supplier.

One day, computer power might be included in the mix, and a CPU reading will be added to the electricity meter reading: the number of bytes you've pushed around the office will be totted up and appended to your bill, just as kilowatt hours are now.

This prospect is what IT vendors call utility computing. As computing reaches the point at which it can be commoditised, processing power can be decoupled from particular machines, and applications need no longer be tied to any single source. Just as water comes out of a tap, the computing utility is accessed simply by 'turning on' the network.

'Enabled by a virtualised, scalable, fully automated and shared IT infrastructure, utility computing promises to allow organisations not only to tap into computing, storage, networking resources and applications when they need them, but also to pay only for what they use,' explains Steve Nunn, EMEA managing partner at Accenture. 'The potential benefits are enormous. With a utility computing model, organisations will be able to capitalise on sudden and short-term business opportunities and shorten the time it takes to benefit from new applications and business processes.'
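The billing model behind that promise amounts to little more than metered usage multiplied by a unit rate, kilowatt-hour style. As a back-of-the-envelope illustration - not drawn from any vendor's actual tariff, and with all resource names and rates invented - a minimal sketch in Python:

    # Hypothetical utility-computing bill: metered usage x unit rate,
    # just as an electricity bill multiplies kilowatt-hours by a tariff.
    # All resource names and prices are invented for illustration.

    usage = {"cpu_hours": 1250, "gb_stored": 400, "gb_transferred": 90}
    rates = {"cpu_hours": 0.12, "gb_stored": 0.05, "gb_transferred": 0.08}

    def quarterly_bill(usage, rates):
        """Pay only for what you use: sum each metered item times its rate."""
        return sum(quantity * rates[item] for item, quantity in usage.items())

    print(f"Quarterly IT charge: {quarterly_bill(usage, rates):.2f}")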

That is the vision. What is actually happening? One version of utility computing already deployed is the computational grid. Grids connect multiple servers and PCs, allowing users to pool their resources and tap, as and when they need it, far more computing horsepower than they could muster alone. By decoupling the software from the hardware, grids allow processing-intensive functions - such as calculating complex algorithms - to be completed in a much shorter time.
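The underlying mechanics can be sketched in a few lines. The toy Python below - the workload and pool size are invented, and real grid middleware handles scheduling, data movement and failure far more elaborately - farms a processing-intensive calculation out across a pool of workers, which is the essence of what a grid does with spare servers and PCs:

    from multiprocessing import Pool

    def heavy_calculation(task):
        """Stand-in for one processing-intensive work unit."""
        return sum(i * i for i in range(task * 100_000))

    if __name__ == "__main__":
        tasks = range(1, 65)                 # 64 independent work units
        with Pool(processes=8) as workers:   # 8 'nodes' standing in for a grid
            results = workers.map(heavy_calculation, tasks)
        # Wall-clock time falls roughly in proportion to the number of
        # workers - the same principle that lets grids cut long batch
        # runs down to a fraction of their single-machine time.
        print(f"{len(results)} tasks completed across the pool")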

For this reason, they appeal to financial services organisations, among others. 'Grids have proven themselves at more than one Wall Street firm,' explains Damon Kovelsky, a Financial Insights analyst in the Capital Markets group. 'Many firms have seen more than 12 hours of computations decrease to an hour or less. We expect many more firms to implement computational grids in the next year.'

Grids are also gaining ground in scientific research with heavy processing demands. Particle physics researchers recently gained access to the world's largest grid, made up of more than 6,000 machines at 78 sites, called the Large Hadron Collider Computing Grid (LCG). It will process a mind-boggling 15 petabytes of data annually.

A different kind of utility computing operates at British Land, which rents datacentre functionality from hSo to support its core business applications.

Peter Earl, group head of information systems at its head office, explains why. 'Probably the most significant benefit has been the increased resilience that this route offers. Recovery time for critical systems used to be around 48 hours and required regular testing. These systems now recover instantly, with no loss of service.'

There are also advantages in terms of flexibility and the simplicity of the pricing structure. 'Underlying this is the transparency of a single regular charge, just like a quarterly electricity bill,' says Earl, 'on top of which we can build as much or as little company-specific functionality as we require.'

Or consider the 'virtual infrastructure' supplied to Esat BT (part of BT's Global Services business). The company was looking to implement a large customer relationship management project, in theory requiring 10 servers. Instead, it has bought four, loaded with software from VMware that allows multiple applications to run on each server - delivering the 'virtual machines' on the hoof as required. 'The VMware fabric gives Esat BT a pool of processing power that can be sliced and diced among applications based on demand and business need,' says Esat BT CIO Martin Wickham. 'We have evolved from a server-centric world to one that is data-centric.'
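The 'slicing and dicing' amounts to packing many workloads onto fewer boxes. A rough Python sketch - the capacities and demands are invented round numbers, not Esat BT's actual figures, and real hypervisors rebalance continuously rather than placing once:

    # First-fit placement of application workloads onto a pool of four hosts.
    # All capacities and demands are illustrative, not real measurements.

    HOST_CAPACITY = 100       # arbitrary units of processing power per server
    hosts = [0, 0, 0, 0]      # current load on each of the four servers

    def place(demand):
        """Put a workload on the first server with enough spare capacity."""
        for i, load in enumerate(hosts):
            if load + demand <= HOST_CAPACITY:
                hosts[i] = load + demand
                return i
        raise RuntimeError("pool exhausted - time to add another server")

    # Ten applications that might once have claimed ten dedicated servers:
    for demand in [35, 20, 45, 15, 30, 25, 40, 10, 30, 20]:
        place(demand)
    print(f"Load across four hosts: {hosts}")   # e.g. [100, 100, 70, 0]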

But if you thought utility computing sounded straightforward - an attempt by the IT industry to simplify its bewildering range of services - you'd be wrong. Says Colin Bannister, consulting manager at the UK and Ireland Practices Group of Computer Associates: 'Some technology limitations today restrict the ability to deliver true utility computing.' He believes the business community should put pressure on vendors to overcome them.

However, other problems may be less tractable. Although utility computing is supposed to be generic, vendors call it by different names: for IBM, it's 'on-demand computing'; for Microsoft, 'software as a service'; for Cisco, the 'intelligent information network'; for Oracle, it's a grid. And these categories break down into further subdivisions: IBM, for example, talks of 'deep computing capacity on demand' and 'capacity on/off on demand'.

This is only partly a semantic issue: nuances in nomenclature imply variety in vendor offering too. In other words, the utility computing market is at least as confusing for businesses to buy into as the one it's supposed to be simplifying. Moreover, given that market forces tend to reduce the many offerings of today to the dominant players of tomorrow, potential users are holding back for fear of getting locked into a model or vendor that fails to survive the natural selection process.

What might be called 'degrees of utility' complicate the issue further.

It's difficult to decide where utility computing starts and where established models such as outsourcing or delivery by an ASP (application service provider) end. Again, this is more than mere semantics. Given the tight reins on most IT budgets, IT vendors are tempted to wrap old services in new paper simply to drive revenues.

Research conducted by the Management Consultancies Association in conjunction with MT found that the overwhelming majority of business people are unfamiliar with the concept of utility computing (see 'Switched on?' panel). And even when organisations are alert to its possibilities, the market for it may be smaller than expected. Executives believe that utility computing would be hard for them to implement, with concerns over effective supplier management, costs of implementation and security at the top of the agenda.

Says Jack Noble, director of core services at Fujitsu Services: 'I recognise those concerns and, as a consequence, advise organisations to keep it simple. Pick a supplier - preferably, a single source - that has a proven capability. Don't rely on contract penalties or service-level agreements, as you'll spend more time arguing these than delivering the service. Look for a means of measuring that proves you're getting what you pay for.'

On the other hand, the evidence is that end users are implicitly asking for utility computing, as much as the vendors are keen to push it. IT infrastructures are a mess: years of piecemeal server implementations at many companies have created a veritable minestrone soup of a network, and it often costs too much to sort out. 'Time and resources no longer allow companies to implement solutions in a traditional way,' says David Angwin, a senior manager at Wyse Technology. 'We see many projects failing because the requirements have changed long before the pilot is complete.'

Worse still, plenty of evidence shows that server aggregation would not solve the problem. As with the national grid, servers need to handle huge variations in demand: companies must cater for occasional peaks many times higher than the average (the IT equivalent of millions of TV viewers putting the kettle on when Coronation Street finishes). Maintaining the redundant capacity to service these peaks is hugely expensive and inefficient. It is this problem that utility computing aims to address.
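The arithmetic behind that inefficiency is easy to sketch. With invented round numbers - say, average demand of 100 units and occasional peaks of five times that - kit sized for the peak spends most of its life idle:

    # Illustrative figures only: the waste of provisioning for the peak.
    average_demand = 100          # typical load, in arbitrary capacity units
    peak_multiple = 5             # occasional spikes run at 5x the average

    provisioned = average_demand * peak_multiple   # must be sized for the peak
    utilisation = average_demand / provisioned

    print(f"Average utilisation of dedicated kit: {utilisation:.0%}")  # 20%
    # The remaining 80% is redundant capacity - paid for, powered and idle.
    # Pooling many customers' uncorrelated peaks is how a utility avoids it.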

The time has come for a new financial model for IT purchasing, one that can wring more value out of its assets. This suggests a different way of looking at utility computing: as a commercial rather than an IT option.

From this perspective, the aim is not so much to buy new technology as to move from fixed pricing and buying licences to 'pay-as-you-go' purchasing.

Suki Gallagher, managing director of IT finance specialist CCL, believes that the days of paying for technology on a licence basis are gone. 'Companies don't want to fork out tens or hundreds of thousands of pounds upfront for software that needs an upgrade in six to 12 months,' she says. 'They are demanding to pay on their own terms, which could be one of many different options, including subscription, rental, pay-as-you-go, and so on.'

One in three software vendors already uses subscription as a primary pricing model, she adds, and the expectation is that this will grow to one in two within two years. This is not just a more economical way of paying for IT; it also makes costs more transparent, which in turn drives a renewed push for standardisation.
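The contrast between the two models is easy to put in numbers. A sketch with made-up figures - no real vendor's pricing is implied - comparing a large upfront licence plus the upgrades Gallagher describes against a pay-as-you-go subscription over the same period:

    # Invented figures for illustration; no actual vendor pricing implied.
    upfront_licence = 150_000       # paid on day one
    upgrade_cost = 40_000           # due again within 6-12 months
    monthly_subscription = 6_000    # pay-as-you-go alternative
    months = 24

    licence_total = upfront_licence + 2 * upgrade_cost    # two upgrade cycles
    subscription_total = monthly_subscription * months

    print(f"Licence model over two years:      {licence_total:,}")       # 230,000
    print(f"Subscription model over two years: {subscription_total:,}")  # 144,000
    # The subscription can also be scaled down or switched off;
    # the sunk licence cost cannot.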

But will the economic argument get off the ground? For one thing, it is far easier to migrate to utility computing when an IT network has been highly rationalised: it's easier to pull the plug on one system and switch on another. But the rationalisation process itself drives out costs, reducing the appeal of utility computing.

A similar economic squeeze might also be felt by service providers. Utility computing must be cheap for them too, and that means substantial economies of scale. If the utility computing market does not reach critical mass, it will not be viable in the long term for suppliers.

If the economics of utility computing are questionable on the vendor side, the demands it makes on their business models compound the difficulties.

In short, vendors themselves need to recognise that utility computing is a service and not a product. 'Licensing issues remain a major hurdle, since vendors will be unwilling to adopt licence schemes based on usage, because their margins will reduce,' explains Una Du Noyer, executive architect at Capgemini. In the worst cases, a vendor would not survive such a transition.

'Many platform-focused vendors will have to change from a product focus to a services focus once utility computing becomes the standard,' says Michael Hjalsted, director of systems and servers, EMEA, Unisys. 'Will those companies with a product focus be successful once they move to a services-led focus?'

On paper, utility computing is a good idea. It would deliver IT that is more economical and flexible. But although utility-like computing will spread, its full-blown manifestation - IT as water out of a tap - seems a long way off. 'Few organisations are committing their most mission-critical applications to utility computing environments,' concludes John Starling, director of technology integration at Deloitte. 'As with the electricity grid, computer brownouts are a real possibility if demand exceeds supply, and the capacity just isn't there.'

The future depends on universal standards, painless migration paths and the relinquishing of vested interests. All of which makes the day when organisations will simply pay the bill for IT seem way over the horizon.

- Copies of the report Utility Computing: Not switched on yet? are available, at £100 in print or £150 in pdf format. To order, e-mail davina.page@haynet.com

SWITCHED ON?

The Management Consultancies Association and MT asked executives what utility computing meant to them.

- 69% had never heard of utility computing; 24% said they were familiar with the term, but were confused about its exact meaning.

- Only 7% said they or their organisations were interested in or actively investigating the applicability of utility computing in their immediate environment.

- 25% said they were sceptical about the likely benefits.

- Budgetary pressures (32%) and keeping pace with IT change (27%) were the top two issues driving an interest in utility computing.

- 81% believed that adopting a utility approach would be either 'difficult' or 'very difficult'.

FIVE TOP TIPS FOR UTILITY COMPUTING

1. Treat it as a business decision rather than an IT decision.

2. Focus on specific systems rather than seeing utility computing as a comprehensive solution.

3. Prepare the ground by putting existing systems in order and winning the commitment of users.

4. Many of the disadvantages of the outsourcing approach still apply: concerns over who controls the data, the quality of a supplier's staff, quality control, security and disaster recovery will not disappear.

5. Agree the right charging structure; without that mechanism in place, utility computing loses its raison d'être.
