Computing power: capacity on demand

Computing power will become an easier commodity to manage when it becomes a utility like gas, water or electricity

Introduction

The idea of computing power, such as applications, storage and processing power, becoming available like electrical current or the flow of water is not new. Often called utility computing, it comes in two main forms. The most common has the various components (such as servers, hard disks and RAM) located on your premises, but dormant. These components are installed by a hardware supplier to be activated, and paid for, on demand. The second uses a data connection to a provider, such as an ASP, that supplies processing power, data storage and applications in a "metered" fashion.

The idea of having "capacity on demand" stems from the mainframe world. Companies such as IBM have long offered extra capacity within large enterprise systems which is not used but available as both a fault tolerance measure and to increase response times to problems. Considering that the cost of an installation may run into the millions, this extra capacity is likely to be used and doesn't eat into the profit margins by a great deal. These types of "capacity on demand" deals have normally been worked out on a case-by-case basis and are only available to the Fortune 500 types who can command a great deal of attention directly from larger computer manufacturers.

The ASP model of "capacity on demand" is centred on applications. Instead of each customer paying for a fixed number of licences, or providing enough server capacity to support a wide range of applications, each application can be served on a per-user or even per-use basis. The number of service providers in this space is growing quickly, as is the variety of applications. These range from simple programs, such as word processing, spreadsheets and contact management, to complex enterprise systems such as SAP and human resource management. The ASP model is often compared with the outsourced capacity model; when the infrastructure is also managed as a service, it becomes easy to quote a simple cost per user, or to deliver applications complete with tight service level agreements.
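The per-use billing described above amounts to little more than counting application launches per user and pricing each one. A minimal sketch of such a meter, with entirely hypothetical class, application names and rates (no real ASP's billing system is implied), might look like this:

```python
# Illustrative per-use application metering, as an ASP might bill.
# All names (UsageMeter, "wordproc", "erp") and rates are made up.

class UsageMeter:
    """Counts application uses per user and bills each use at a flat rate."""

    def __init__(self, rate_per_use):
        self.rate_per_use = rate_per_use  # app name -> price per use
        self.usage = {}                   # (user, app) -> use count

    def record_use(self, user, app):
        key = (user, app)
        self.usage[key] = self.usage.get(key, 0) + 1

    def invoice(self, user):
        """Total owed by one user across all metered applications."""
        return sum(count * self.rate_per_use[app]
                   for (u, app), count in self.usage.items() if u == user)

meter = UsageMeter({"wordproc": 0.10, "erp": 2.50})
meter.record_use("alice", "wordproc")
meter.record_use("alice", "erp")
meter.record_use("alice", "erp")
print(meter.invoice("alice"))  # 0.10 + 2 * 2.50 = 5.10
```

In practice a provider would meter connection time or concurrent seats rather than raw launches, but the shape of the accounting is the same.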

Data storage is the last element moving towards a utility model. As the cost of bandwidth tumbles, the prospect of huge, secure facilities that simply provide data storage to customers via broadband links becomes a reality. At the moment, companies like Backitup and Safeguard already offer this service for low-capacity, single users. However, ISPs are starting to look at extending these products to the corporate space.

The industry

One of the leaders in the space of "capacity on demand" is Hewlett-Packard (HP), which has aggressively targeted the SME market with services normally associated with their high-end 9000 series. Terry Walden, Enterprise Server marketing manager for HP, recently outlined the direction of HP's recently announced "capacity on demand" initiative for the enterprise.

"We have seen massive growth within the start-up sector, and the challenge has been reacting quickly enough to provide for these energetic companies. In our high-end server market, providing additional storage and processing power within the customer premises as both a failsafe and to deal with high demand is common. This same principle is now available on entry-level servers such as the new A-Class."

Through HP's channel partners, a company can specify that a server is shipped with extra processors, memory and storage capacity that lies dormant until demand rises. For this extra capacity, a nominal monthly charge is made. This extra capacity is provided based on the usage forecast calculated between the customer and a consultant from the reseller. Some increase in demand may result from application development work, seasonal conditions or a successful marketing campaign. As demand increases, capacity can be increased via software keys that activate extra hardware inside the servers and attached storage.
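The activation mechanism described above is essentially a licence check: the processors are physically present but stay idle until a key unlocks them. A minimal sketch of the idea, with hypothetical key strings and class names (not HP's actual mechanism), could be:

```python
# Hypothetical "instant capacity" model: a server ships with dormant
# processors that a software key activates. All names are illustrative.

class Server:
    def __init__(self, installed_cpus, active_cpus):
        self.installed = installed_cpus   # processors physically present
        self.active = active_cpus         # processors licensed and usable
        # Assumed mapping of activation keys to extra processor counts
        self.valid_keys = {"KEY-2CPU": 2, "KEY-4CPU": 4}

    def activate(self, key):
        """Unlock extra processors, capped at what is physically installed."""
        extra = self.valid_keys.get(key)
        if extra is None:
            raise ValueError("unrecognised activation key")
        self.active = min(self.installed, self.active + extra)
        return self.active

server = Server(installed_cpus=8, active_cpus=4)
print(server.activate("KEY-2CPU"))  # 6 of 8 processors now active
```

The appeal to the customer is that "installation" of the new capacity is a key entry rather than an engineer visit, so demand spikes can be met in minutes.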

"The instant capacity service fits into a overall utility computing blueprint that we have been working on for sometime" continued Walden. "Complete IT infrastructures can also be provided on a pay on forecast basis, which helps start-ups get off the ground without committing themselves to an expensive and inflexible IT infrastructure." Walden admits that just having extra capacity built into hardware is not the complete solution, as adequate service provisioning and infrastructure need to be designed with increased capacity in mind.

Fujitsu Siemens Computers is also moving towards a more utility-focused market, especially with its high-end PRIMEPOWER servers. Sitting firmly within the supercomputer realm, the system scales from 16 processors up to a massive 128-processor array, with each RISC-based CPU able to support 2Gb of memory. For businesses expecting massive growth, adding additional freestanding servers can be a complex process, as Ian Stewart, the company's Unix marketing manager, explains. "Adding extra standalone servers is not suitable for the larger ASPs, both in terms of management and the support infrastructure required to power, back-up and monitor these systems."

"We have seen the demand for Unix-based enterprise server systems steadily increase from companies offering ASP services. These companies need to be able to rationalise their hardware and scale up services quickly without increasing the complexity of system management. With our PRIMEPOWER series, you can effectively add extra processors and RAM via a single slot-in card which provide extra virtual machines for each customer, but with a single management interface and reduced installation overheads."

Data carriers

Even the carrier markets are becoming more utility-based. Noel Dunn, one of the founders of Unica, a company specialising in providing data and voice services, points out that the "...amount of fibre being put into the ground is phenomenal. The cost of a 1Mb leased line a year ago now gets you 2Mb, and data charges are literally falling by the month. As a business, Unica purchases bandwidth based on what the best deal is at the moment from a portfolio of carriers that provide our required level of service. From this, we can then offer a single-price solution to our customers."

"So, if a customer comes to us and requests more bandwidth or different services, we can look across our carrier suppliers and find a service to suit them. From their perspective, this change is transparent. They deal with us and we provide to them increased capacity without having to deal with multiple suppliers or technical issues." Unica is to launch a business-to-business service later this year where customers can purchase data and voice services directly via the Web.

Summary

The prospect of IT as a utility like gas, electricity and water is coming closer. The need to simplify computing helps both customer and vendor. However, Richard Wendland, a senior researcher at Durlacher, offers a word of caution. "The idea of utility computing is not new, but the biggest issue facing the industry and customers is provisioning. Managing the distribution and implementation of computer resources needs to change if utility computing is to become a benefit to business."

Whether you call it "utility computing", "instant capacity" or "scalability", IT infrastructure can be deployed quickly under the right circumstances. The dotcoms seem to be the focus of much of the activity in this area, but any business can gain extra capacity at the flick of a switch.

Will Garside

This was last published in May 2000
