Plug and Pay

Utility computing promises processing power when you need it, where you need it. But the technology isn’t making sparks fly yet

Reader ROI

  • Today's reality versus the utopian dream
  • The cultural and technical issues associated with computing-on-demand
  • How companies are using utility computing now

Utility computing: Who would have thought that a technology with such a pedestrian label would become a top IT story?

During the past two years, most of the leading IT services companies have announced initiatives with that unprepossessing name. All the products and services sold under that banner appeal to a common vision: computing tasks buying what they need, and only what they need, automatically, from a huge pool of interoperable resources (potentially as large as the whole Internet). Each task or transaction would have an account and a budget and would run up payables; every resource would record and collect receivables. Computing power would be as easy to access as water or electricity. The products and services currently being introduced under the utility computing banner do not go the entire distance, but they move a long way in that direction.
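
The accounting model in that vision - each task an account with a budget that runs up payables, each resource a meter that accrues receivables - can be sketched in a few lines of code. The Python below is purely illustrative: the class names, prices and budget figures are assumptions made for this example, not any vendor's actual product or API.

# Illustrative sketch only: the article's vision of metered, budgeted computing.
# All names, prices and budgets here are hypothetical, not a real vendor offering.

class Resource:
    """A pooled resource that meters usage and accrues receivables."""
    def __init__(self, name, price_per_unit):
        self.name = name
        self.price_per_unit = price_per_unit
        self.receivables = 0.0  # what this resource is owed

    def charge(self, units):
        cost = units * self.price_per_unit
        self.receivables += cost  # the resource records and collects receivables
        return cost


class Task:
    """A task or transaction with its own account and budget."""
    def __init__(self, name, budget):
        self.name = name
        self.budget = budget
        self.payables = 0.0  # what this task has run up so far

    def consume(self, resource, units):
        cost = units * resource.price_per_unit
        if self.payables + cost > self.budget:
            raise RuntimeError(f"{self.name}: budget exhausted")
        self.payables += resource.charge(units)  # buy only what it needs, when it needs it


# One online transaction buys two units of storage and one of processing,
# debited against its own account rather than against a pre-provisioned block.
storage = Resource("storage", price_per_unit=0.05)
processing = Resource("processing", price_per_unit=0.10)
txn = Task("online-transaction", budget=1.00)
txn.consume(storage, units=2)
txn.consume(processing, units=1)
print(txn.payables, storage.receivables, processing.receivables)  # 0.2 0.1 0.1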

Consider the situation of American Express executive vice president and CIO Glen Salow. As at many companies, when AmEx introduces a new product, the launch typically triggers traffic surges on the enterprise network. Some of that traffic supports marketing efforts, some technical support and some the service itself, such as executing an online transaction. It is critical that adequate resources be in place to support that service, particularly in the early days of an introduction. Yet calculating ahead of time how large that demand surge will be is almost impossible.

To date, all a CIO could do was overprovision, but as Salow points out, that imposed a double penalty: paying more than was technically necessary and waiting for the new equipment to be installed and tested. "I don't want to tell marketing that I need six months to have the infrastructure in place," he says. So Salow took a different approach and structured a deal with IBM Global Services to buy storage and processing, delivered over a network, per increment of traffic demand. That is not utility computing in the purest sense, since resource procurement is not calculated automatically or per transaction. But the term still applies because of the much tighter fit it allows between the provisioning and demand curves.

The advantages of utility computing are self-evident: Resource use becomes more efficient, and because resource changes are automatic or at least highly automated, the approach also conserves management time. By contrast, the current system - in which IT hooks up and exhausts large blocks of resources in a general free-for-all, at which point another large block is trucked in and wired into place - looks antediluvian. On paper, at least, the case for the transition to utility computing seems compelling.
