
Utility computing promising, but proceed with caution


Don't ask Rennie DiLoreto which operating systems power his company's servers. He doesn't know, and he doesn't care.

What he does know is that centralizing server resources through utility computing has paid huge dividends for ACF Environmental, his Richmond, Va.-based sales and distribution company. Two years ago, ACF turned over management of its data center functions, including hardware provisioning, software licensing, network maintenance, and storage, to Super-Server Inc., also of Richmond. It pays Super-Server a monthly fee to provide these discrete computing services on demand.

ACF's products include heavy fabric used in highway construction. Since the company adopted a utility architecture, its sales have jumped. Salespeople in the field connect to Super-Server's highly secure data bunker to retrieve digital images, PowerPoint presentations and other files used to illustrate previous ACF projects. That access has enabled salespeople to close sales more quickly, DiLoreto says.

"We wanted a system where anything that was of interest to all our sales people was always available. A lot of the tasks we're doing now we couldn't do before, because we had no way to centralize folders, files, and presentations," says DiLoreto. "We don't have an IT staff (so) this system has become the heart of the company."

Buying computing power on an as-needed basis can reduce the risks and headaches of managing IT resources. Network administrators don't have to worry about keeping networks tuned, justifying huge capital-equipment purchases, or managing software licenses and upgrades. Those mundane but important tasks instead fall to the service provider, freeing technical staff to focus on business processes.

"Data center managers and IT people spend most of their time under the hood working on the engine, instead of driving the application. Utility computing gets them out of networking and into applying computers to solving business issues, like servicing customers better or boosting productivity," says Frank Butler, Super-Server's chief executive officer.

Companies like ACF are still a minority, though. Only about 15 percent of enterprises use some form of utility computing now, says Carl Claunch, a vice president of research with Gartner Inc. By 2006, adoption should double to about 30 percent -- a significant three-year increase, but still just a fraction of companies. "And the 15 percent who are doing it now are not using it for all of their data center operations, just for selected bits," says Claunch.

On the other hand, cash-strapped companies want to turn IT from a drain on expenses into a profit center. Many companies wind up paying for servers, data storage and other resources that they seldom use, just to be ready for spikes in demand. Rather than requiring companies to own those assets outright, utility architectures let them virtualize servers, software, storage, networks, and services, analysts say.

"Companies are under cost pressures to be better managers of IT. If you tell them they can have an infrastructure that's more efficient, they're going to be receptive, but it's going to be an evolution, not a revolution," says John Madden, senior analyst with Summit Strategies of Boston, Mass.

Hoping to speed the evolutionary cycle, larger computing vendors are getting in on the action. Hewlett-Packard last year began offering its HP Utility Data Center product to accelerate commercial use of grid computing. IBM's eLiza project focuses on building autonomic computers that can recover from disruptions without human intervention. Sun continues to tout its N1 portfolio, a heterogeneous architecture that provisions computing, storage, and network resources based on varying demand for services. And Microsoft Corp. tied its dynamic systems initiative to the unveiling of its Windows Server 2003 product.

Using a utility or dynamic architecture may help lower costs and alleviate IT headaches, but no enterprise should expect it to be a panacea, analysts say. Companies need to weigh the kind of data they want to turn over to a third-party vendor, which entails asking some hard questions about security, backup and facility access.

"Security is top of mind when you talk about these kinds of architectures. A lot of vendors are working to crack the code on that. Utility computing is about saving money, but it's also about making sure you have a secure IT environment," says Madden.

Some enterprises will want to protect high-value tools that differentiate them from competitors. "If you found something unique that IT enables you to do, and your competitors are going to take two extra days to ship things or their cost will be higher, then you would want to hang on to that highly commercially valuable information," says Claunch.

Some studies have placed resource utilization in enterprises as low as 5%. Chasing that opportunity, a host of software companies are scrimmaging for market share, among them VMware of Palo Alto, Calif., Egenera of Marlboro, Mass., Opsware of Sunnyvale, Calif., and ThinkDynamics of Toronto. Two other companies, Terraspring of Fremont, Calif., and Connectix Corp. of San Mateo, Calif., were purchased by Sun Microsystems and Microsoft, respectively.
