Utility Computing revealed
Before you jump into utility computing, here's a hard look at the realities of this service.
By Joy Tang
The basic concept of utility computing has been around
for years: get computing power the same way you get water or electricity, in a consistent,
'always-on' supply, and pay only for what is used.
The expanded concept of utility computing couples business
and IT, so that IT tasks can be prioritized according to business objectives
through SLAs. This in turn allows the cost of uptime or downtime to be
quantified more accurately. An IT manager could then tell
which applications are affected by a server failure, right down to which business
transactions would have to be cancelled, all in dollar terms. And if resources
are pooled, the server failure may even go unnoticed, as relatively idle servers
are automatically re-tasked to pick up the slack.
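To see how pooled re-tasking might work, consider this minimal sketch; the server names, loads, and logic are all hypothetical, not any vendor's actual failover engine:

```python
# Hypothetical sketch: re-tasking idle servers in a shared pool after a failure.
# Server names, loads, and the 0.30 per-application load are all invented.

servers = {
    "web-01": {"healthy": True, "load": 0.85, "apps": ["billing"]},
    "web-02": {"healthy": True, "load": 0.10, "apps": []},  # relatively idle
    "web-03": {"healthy": True, "load": 0.20, "apps": ["reports"]},
}

def handle_failure(failed):
    """Move the failed server's applications onto the least-loaded healthy one."""
    apps = servers.pop(failed)["apps"]
    for app in apps:
        target = min((s for s in servers.values() if s["healthy"]),
                     key=lambda s: s["load"])
        target["apps"].append(app)
        target["load"] += 0.30  # assumed load per application, for illustration

servers["web-01"]["healthy"] = False
handle_failure("web-01")
print(servers)  # 'billing' now runs on the idle web-02; users may never notice
```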
Utility computing breakdown
While utility-based computing has been available for
some time, new solutions are emerging.
To IBM, utility computing is a component of its on-demand
strategy and refers to delivering infrastructure and business processes as a
service, either on a pay-per-use basis, or through outsourcing. In addition,
the company said that e-business-on-demand users should have integrated, open,
virtualized, and autonomic infrastructures, to better respond to surges in traffic,
and have 'virtual sharing, management and access to devices across an enterprise,
industry or workgroup.'
Two of IBM's on-demand initiatives are grid computing
and autonomic computing. Grid computing spreads
a single task across networked resources, while autonomic computing produces software
and servers that can optimize, configure, protect and heal themselves.
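The grid idea, splitting one big job into chunks that run on whatever machines are free, can be sketched in a few lines of Python. This is a toy local simulation under invented parameters, not IBM's grid middleware:

```python
# Toy simulation of grid-style work splitting: one task, many compute nodes.
# A local stand-in for the concept, not IBM's actual grid middleware.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk: range) -> int:
    """The unit of work each grid node would execute."""
    return sum(i * i for i in chunk)

def run_on_grid(n: int, nodes: int = 4) -> int:
    """Split one big job into per-node chunks, run them in parallel, combine."""
    step = n // nodes
    chunks = [range(k * step, n if k == nodes - 1 else (k + 1) * step)
              for k in range(nodes)]
    with ProcessPoolExecutor(max_workers=nodes) as pool:
        return sum(pool.map(process_chunk, chunks))

if __name__ == "__main__":
    # Same answer as sum(i * i for i in range(1_000_000)), computed in parallel.
    print(run_on_grid(1_000_000))
```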
The company already offers eServers, which feature
autonomic computing, and its existing WebSphere architecture and Tivoli management
tools, coupled with the DB2 database and Linux, are expected to back the company's
on-demand strategy.
HP's utility computing vision includes Capacity on
Demand (iCOD) in both 'instant' and 'temporary' modes. In the instant mode, users can
turn on extra capacity as required and pay for it; in the temporary mode, users
pay for the extra capacity only for as long as they use it.
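As a rough sketch of how the two modes differ financially (the prices and functions below are invented for illustration, not HP's actual iCOD terms):

```python
# Hypothetical cost comparison of 'instant' vs 'temporary' capacity on demand.
# All prices are invented for illustration; HP's actual iCOD terms differ.

CPU_PURCHASE_PRICE = 20_000  # instant mode: buy the activated CPU outright
CPU_DAY_RATE = 90            # temporary mode: pay per CPU per day of actual use

def instant_cost(extra_cpus):
    return extra_cpus * CPU_PURCHASE_PRICE

def temporary_cost(extra_cpus, days_used):
    return extra_cpus * CPU_DAY_RATE * days_used

# For a 30-day year-end peak, temporary capacity beats buying outright.
print(instant_cost(4))        # 80000
print(temporary_cost(4, 30))  # 10800
```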
There is also Pay Per Use (PPU), a usage-based lease
program where the equipment may reside with HP or at the customer site.
K Sudershan, Director, Infrastructure Solutions, Enterprise
Systems Group, HP Asia Pacific, said that businesses like to have large equipment
expenses off the balance sheet so that they do not have to carry the depreciation.
"PPU is an excellent option for a customer's capitalization
strategy as well as their technology strategy. 40 percent of all Superdome servers
were acquired through the PPU model," he revealed.
Utility computing is a subset of HP's Adaptive Enterprise
strategy, where business is driven by an IT infrastructure that responds to
changes in real time, which is what HP's Utility Data Center (UDC) was
designed to do.
The UDC includes software, multivendor hardware, and
services that optimize IT asset usage and allow administrators to architect
new systems and activate them with a 'drag and drop' approach, said Sudershan.
"A pool of servers, storage and network devices
are wired once to support virtual allocation of resources for the entire system,"
explained Sudershan. "As the resources are virtualized, they can be dynamically
allocated without having to rewire any physical components."
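A minimal sketch of that 'wire once, allocate virtually' idea might look like the following; the resource names and provisioning logic are invented, not the UDC's actual controller:

```python
# Illustrative sketch of virtual allocation from a wired-once resource pool.
# Resource names and logic are invented, not the UDC's actual software.

pool = {"servers": 16, "storage_tb": 40, "switch_ports": 64}
systems = {}  # logical systems carved out of the shared pool

def provision(name, **needs):
    """'Drag and drop' a new system: reserve resources logically, no rewiring."""
    if any(pool[r] < qty for r, qty in needs.items()):
        raise RuntimeError(f"pool cannot satisfy {needs}")
    for r, qty in needs.items():
        pool[r] -= qty
    systems[name] = dict(needs)

def decommission(name):
    """Return a retired system's resources to the pool for immediate reuse."""
    for r, qty in systems.pop(name).items():
        pool[r] += qty

provision("payroll", servers=4, storage_tb=10, switch_ports=8)
decommission("payroll")  # capacity flows back; nothing was physically re-cabled
print(pool)              # back to the full pool
```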
In the quest for an adaptive enterprise, users can
also stabilize their infrastructure with the help of HP OpenView modules like
Operations, Network Node Manager, and Performance Insight.
K.P. Naidu, Director of Infrastructure Solutions, Asia-Pacific,
Sun Microsystems, said that all of Sun's solutions, from hardware and OS to grid
solutions, are enabled for utility computing.
The company offers its N1 framework, which features
virtualization, provisioning, and policy automation, as well as accounting on
a per service basis, but does not consider N1 to be utility computing per se.
"Customers want to match compute resources to
business demand, align costs to activity, and improve asset utilisation,"
CA rolled out its on-demand strategy in April with
self-healing, provisioning, asset management, and helpdesk tools.
Unicenter Network and Systems Management (NSM) 3.1,
which gives an overview of infrastructure based on the services it supports,
was introduced together with Unicenter NSM Dynamic Reconfiguration Option, which
monitors business service levels, plans for additional capacity, and allocates
it across multiple platforms.
The NSM option will be available within six months,
as will CA's new Sonar technology, which provides detailed resource allocation
based on business priorities, said CA's consulting director, Sherwin Wong.
"Whenever an accountant uses a package for transactions,
we can analyze the network and draw a map of its consumptiona business
topology," added Wong. "Before, when there were five applications
on a server, no one could differentiate which applications were consuming which
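The 'business topology' Wong describes can be pictured as a mapping from applications to the resources they consume. This sketch uses invented meter samples and is not CA's Sonar technology:

```python
# Hypothetical sketch of a 'business topology': attributing resource
# consumption on a shared server to individual applications.
# Invented meter samples; this is not CA's Sonar technology.
from collections import defaultdict

# Metered samples: (application, resource, units consumed)
samples = [
    ("accounts_payable", "cpu_seconds", 120),
    ("accounts_payable", "network_mb", 45),
    ("payroll", "cpu_seconds", 300),
    ("payroll", "network_mb", 10),
    ("accounts_payable", "cpu_seconds", 80),
]

topology = defaultdict(lambda: defaultdict(int))
for app, resource, units in samples:
    topology[app][resource] += units

for app, usage in topology.items():
    print(app, dict(usage))
# accounts_payable {'cpu_seconds': 200, 'network_mb': 45}
# payroll {'cpu_seconds': 300, 'network_mb': 10}
```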
Storage and Services
The commoditization of storage, together with dynamic
storage provisioning and storage resource management, has turned storage into
a no-brainer utility.
Major storage players already have utility-based solutions,
like Veritas' SANPoint Control, which discovers all storage devices in the network
and manages them in a policy-driven manner, and Storage Central, a storage resource
management tool.
Veritas OpForce offers a similar solution for servers,
enabling hardware to be shared among different applications, and features automated
re-imaging and provisioning, while Veritas i3 analyzes total system performance,
identifies the root causes of performance problems, and helps correct them.
Better tracking of user-defined IT services will be
available through Veritas Service Manager, which is expected to launch in Q4'03.
"It gives visibility to IT asset usage by business
unit and paves the way to 'charge back' expenses," explained Alvin Ow,
Technical Consulting Manager, Asia South, Veritas.
EMC uses the concept of Automated Networked Storage
to show how its software, platforms and networked solutions can simplify storage
management. The company's OpenScale storage asset and financial management programme
features automated billing for the networked storage infrastructure, including
storage capacity, SAN switch ports, NAS servers and storage software, said Ajit
Nair, director, Technology Solutions Group, EMC South Asia.
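In outline, automated billing of this kind meters each component and prices it per period. The component names and rates below are invented for illustration; they are not OpenScale's actual pricing:

```python
# Sketch of automated monthly billing for a networked storage infrastructure.
# Component names and rates are invented; EMC OpenScale's real model differs.

RATES = {                     # price per unit per month (hypothetical)
    "storage_gb": 0.50,
    "san_switch_port": 25.00,
    "nas_server": 400.00,
    "storage_software": 150.00,
}

def monthly_bill(usage):
    """Multiply each metered component by its rate and total the invoice."""
    return sum(RATES[item] * qty for item, qty in usage.items())

metered = {"storage_gb": 2_000, "san_switch_port": 8,
           "nas_server": 2, "storage_software": 5}
print(f"${monthly_bill(metered):,.2f}")  # $2,750.00
```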
In the service provider realm, Atos Origin offers utility-based
services. "Its SLAs have dropped the percentage utilization of capacity
in favor of the computing capacity productively used by the application,"
said Atos Origin's Nilesh Nerurkar, General Manager, Managed Services (MS).
Srikanth Seshadri, Senior Consultant, MS, agreed.
Service fees are based on a combination of transaction
volumes, number of devices, and data volumes. "Servers, networks, storage
and the related financial models from technology vendors still have gaps that
restrict the options for offering a complete seamless
utility-based service in a cost-effective manner,"
Nerurkar said. "In the next nine to 12 months, technological developments
will bridge the gap."
Anything that can save money is an attractive proposition,
but utility computing solutions may not come cheap. However, Sun's Naidu said
that users do not have to invest very much to gain the pay-per-use benefits.
"It's as simple as looking at the environment and improving capacity utilization,"
Even so, anecdotal ROI claims can be impressive. Atos
Origin, for instance, estimated that TCO for IT services delivered on a utility
model could fall by 20-30 percent. "The quantum of savings achieved depends
on the amount of services that are converted to a utility model," said
Nerurkar and Seshadri.
Simon Ho, Country Manager, Veritas Hong Kong, said
utility computing can provide high availability without recourse to expensive
dedicated hardware.
"We can provide high availability on demand to
users at a 50 percent saving on traditional high-availability platforms,"
he said. He also described how Veritas i3 helped an Asian retail customer
improve its IT performance.
"Consultants were suggesting that they buy more
hardware. Instead, they deployed i3 and found that it was not a hardware problem,
but an implementation problem. They delayed their relatively expensive hardware
purchase," he said.
And according to HP's Sudershan, HP Labs' use
of the HP Utility Data Center to consolidate its worldwide IT infrastructure
will allow it to scale to more than eight times its current capacity in the
next few years without adding staff.
"Millions of dollars could be saved as 1,000 servers
could be operated by fewer than 20 peoplefive times the ratio of servers
to people used in most data centers today," he said.
Trouble in paradise
Clearly, utility computing means different things to
different people, and some have chosen to keep a distance for now.
Damian Crotty, Dell's director of Advanced Systems
Group in the Asia-Pacific, said, "Utility computing means either outsourcing,
which is a concept that has been around for over three decades, or overpaying
for proprietary systems with excess capacity that ultimately limit flexibility."
Plus, it's early days yet, with few customer testimonials
out there. "According to a Merrill Lynch survey of CIOs in the US and Europe,
respondents doubt utility computing will materialize, certainly not before 2006
or 2007. What IBM and others are proposing is based on old proprietary technologies
with a new marketing effort," Crotty added.
Rakesh Kumar, an analyst with the Meta Group, concurred
that utility computing will not be real till 2007 or so, and said, "Vendors
will create sophisticated marketing programs to repackage existing offerings
targeted at non-IT executives, often packaged with essential and expensive consultancy
services."
Meta Group argued that fundamental weaknesses will
remain in vendor offerings, as well as in users' organizational processes for
handling such models, at least until 2005.
"Without critical evaluation of both, many users
will suffer inappropriate contracts, paying too much for the wrong technology
and not being able to account prudently for all costs," warned Kumar.
Further, Atos Origin cautioned that only server OSs
launched in the past year are offered as a utility-based service, and many applications
are not geared for this due to their licensing schemes. Vendor lock-in could also
occur if solutions do not support third-party products, or work only
with other products from the same vendor to deliver utility computing.
Security is another problem with utility computing,
said Dell and Symantec. Glenn Gunara-Chen, Consulting Manager of Symantec Asia
Pacific, recommended that utility computing should not be adopted where critical
or sensitive data are concerned.
"If one client wants strong security and another
client has weak security, then the security of the shared infrastructure is
weak," he contended.
Moreover, business needs should be considered before
rushing to implement utility computing. Sun's Naidu said users often make the
mistake of not understanding the actual business environment. "Simple business
models do not work," he pointed out.
And as EMC's Nair observed, outsourcing storage may
not save money, a warning that could easily apply to utility computing as a
whole.
"Companies may face additional cost because they
didn't take into account certain variables like increased application loads
and rapid information growth. Storage-on-demand means exactly that. You will
get charged for every bit of storage demand growth. So plan carefully," he said.
In addition, getting management buy-in can be tough
for utility computing. "Getting the businesses to buy into a differentiated
delivery of IT services based on what they are willing to pay is a new concept
that needs to be accepted," observed Nair.
CA's Wong agreed. "Users need to manage service
levels, not availability or response times. They need to explore how to do this."
In fact, Meta Group predicted that over 80 percent
of global organizations, even with their sophisticated accounting tools, will find
a utility-driven environment, with its irregular cash flow, granular payment
schedules, and complex internal cross-charging, hard to swallow.
"Organisations evaluating such offerings must
combine their IT, procurement, and accounting groups to evaluate financial return
on new contracts and how the cost of internal processes will need to change
to manage them," Kumar advised.
Asian Culture Clash
Another obstacle could be Asian culture, which usually
flows top-down. "The hardest thing will be to change the culture. It's
easy to talk about change in the classroom, but it is hard to change in the
real world," observed Captain Payoongsak Silagul, Vice-President, Information
Technology (Operations), Ayudhya Allianz CP Life PCL. AACP is an HP user hoping
to improve quality of service through best practices.
One success story is JBWere, an Australian investment
house that began providing IT services on a pay-per-use basis
to its end-users in 1999. Thomas Higgins, Chief Technology Officer, JBWere,
has found ITIL (IT Infrastructure Library), a set of best practices for IT service
management, crucial to utility services. "Before ITIL, we didn't know the
true cost of running IT products and services. We couldn't make rational decisions," he said.
ITIL provides a way to charge for IT services that
is understood by everyone in the organization, he explained. "Without a
common language, you have human miscommunication issues, and then service delivery
suffers."
Higgins said ITIL allowed the IT department to negotiate
SLAs more precisely, for instance. When IT cost departments nothing, one department
asked for no downtime at all, he said. They changed their minds when told the
IT costs would need to triple, and that the cost would come from their departmental
budget.
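Higgins' negotiation becomes straightforward once each availability tier carries a price. The tiers and figures below are invented to mirror his anecdote; they are not JBWere's actual service catalogue:

```python
# Invented SLA tiers illustrating the charge-back conversation Higgins
# describes: once downtime has a price, 'no downtime' is no longer free.

SLA_TIERS = {  # availability level -> monthly charge (hypothetical figures)
    "99.0%": 10_000,
    "99.9%": 18_000,
    "99.99% (near-zero downtime)": 30_000,  # roughly triple the base tier
}

def quote_upgrade(current, requested):
    """Price the jump between tiers so the requesting department sees the cost."""
    delta = SLA_TIERS[requested] - SLA_TIERS[current]
    return (f"Moving from {current} to {requested} adds "
            f"${delta:,}/month to your departmental budget.")

print(quote_upgrade("99.0%", "99.99% (near-zero downtime)"))
```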
However, management has to demonstrate a strong commitment
to enforcing ITIL, Higgins said. One of JBWere's biggest challenges was to make
its IT service catalogue real. "Until we start getting people to bill and
charge their time to it, it's just another taxonomy," he recounted.
Utility computing may be more challenging than
vendors have painted it, and hype abounds, but this won't stop the wave
of new techniques that could enable enterprise IT to work better and align
more closely with business goals. "This is the year more solid roadmaps
have become available," said CA's Wong. "The awareness is there."
Taking the time to understand what different solutions
can achieve, being aware of potential problems, and starting small can place
users on the leading, rather than the bleeding, edge.
This article first appeared in Network Computing
Utility computing checklist
- Understand the different solution approaches.
- Identify business needs and see if a utility
computing model can meet these needs.
- Identify all internal IT processes and
document service level definitions.
- Visualize the utility-based architecture
you want for your servers, storage, and network.
- Optimize resource utilization.
- Define the unit of computing and financial
models for metering and charging based on it (changes needed at the
IT, procurement, and accounting department level).
- Ensure software licensing matches the
hardware utility model and can scale as needed.
- Set policies and create tiered SLAs covering
performance monitoring, disaster recovery, security and backup/restore,
and clearly state the obligation of the service provider with regards
to data confidentiality, integrity and availability. In the case of
storage, there should be policies on information availability, quick
provisioning, configuration management, and change management.
- Focus on automating provisioning and management.
- Link business processes to IT events such
that business managers can understand how computing resources are allocated.
- Ensure data is protected and encrypted
at all times.
Source: Atos Origin, CA, Dell, EMC, HP, Symantec,