Sun launches per-hour, per-unit services in first move into field of grid computing

Industry giants begin offering utility pricing for commercial grids

Sun has begun offering a utility computing service with processing charged at $1 per CPU per hour, and storage priced at $1 per gigabyte per month.

Facilities have been established in London, Texas, Virginia and New Jersey. Sun said that a number of global companies are running pilot programmes.

Separately, Sun is also in discussions with electronic stock exchange group Archipelago Holdings to create the world's first market for selling blocks of computing power to corporate users in a similar way to how stocks and shares are sold.

The demand is likely to come from organisations that want to carry out, for example, large-scale financial analyses or geophysical modelling.

Sun said it is being "extremely transparent" about the $1 per CPU per hour pricing for its Compute Utility processing service, and that it offers a complete software stack with no licensing fees. This includes the infrastructure, power, Solaris 10 operating system, staff support, network management and storage.

Robert Youngjohns, executive vice-president, strategic development and Sun financing, said, "Our customers think they spend anywhere from $6 to $16 an hour, which is as much as one tenth of standard computing costs. This is going to change the way IT is bought and managed."

The concept of grid computing is attractive as it allows users to pay for only as much computational power as they need to complete a given task.
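To make the pay-per-use arithmetic concrete, the following sketch prices a job against Sun's published rates ($1 per CPU per hour, $1 per gigabyte per month). The function name and default rates are illustrative, not part of any Sun API:

```python
# Sketch of utility-computing cost arithmetic at Sun's published rates.
# utility_cost() is a hypothetical helper, not a real Sun interface.

def utility_cost(cpu_hours, gb_months, cpu_rate=1.0, storage_rate=1.0):
    """Total charge in dollars for a metered compute-and-storage job.

    cpu_hours  -- total CPU-hours consumed (e.g. 100 CPUs x 8 hours = 800)
    gb_months  -- gigabyte-months of storage used
    """
    return cpu_hours * cpu_rate + gb_months * storage_rate

# A 100-CPU job running for 8 hours, plus 50 GB held for one month:
print(utility_cost(100 * 8, 50))  # 850.0 dollars
```

The appeal of the model is visible in the numbers: the customer pays for 800 CPU-hours actually used rather than owning 100 machines that sit idle the rest of the month.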

Analysts question whether software licences will be valid on commercial grids, because they tend to require the user to state the number of processors the software is running on. There are also practical difficulties with shifting large amounts of data between the user organisation and the grid facilities.

Gary Barnett, research director at analyst firm Ovum, said licensing complications will pose a problem for firms interested in adopting utility computing.

"You can only run software on a grid if it is capable of running on a grid. Very few big commercial software suppliers are able or willing to license software on a per use basis. There are no commercial applications that can be run across so many processors," said Barnett.

He suggested that grid suppliers such as IBM, Hewlett-Packard and Sun might be able to arrange special deals with software suppliers, for example, buying a million licences and promising to allocate them on an on-demand basis.

Barnett did not consider Sun's pricing to be particularly cheap: "One dollar per CPU per hour is not cheap. In reality, users are not going to pay a great deal less for their software, and suppliers certainly will not settle for less money."

IBM is also offering a grid computing service, Deep Capacity Computing On Demand, which starts at about 47 to 50 cents per CPU per hour, although Sun has queried what is actually included in that service.

Barnett pointed out further hurdles with the computing-on-tap model. "How do you make your organisation's infrastructure ready to plug into such a network?" he said.

"Also, I am sure that for financial organisations, the Financial Services Authority would be very interested to learn if a firm was sending customer data over the network. This will have data protection and security implications."



What is the grid?

Grid computing (or the use of a computational grid) is applying the resources of many computers in a network to a single problem at the same time.

Grid computing requires the use of software that can divide and farm out pieces of a program to as many as several thousand computers. Grid computing can be thought of as distributed and large-scale cluster computing and as a form of network-distributed parallel processing. One likely area for the use of grid computing will be pervasive computing applications.

A well-known example of grid computing is the ongoing Search for Extraterrestrial Intelligence (Seti) project, in which thousands of people share unused processor cycles of their PCs in the search for signs from outer space.
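The divide-and-farm-out pattern described above can be sketched in miniature. The example below splits a prime-counting problem into chunks and distributes them to a local worker pool; in a real grid the workers would be remote machines rather than local threads, and all names here are illustrative:

```python
# Minimal sketch of the grid pattern: divide a problem into independent
# pieces and farm them out to workers. A local thread pool stands in for
# remote grid nodes; the structure (chunk, dispatch, combine) is the point.
from concurrent.futures import ThreadPoolExecutor

def is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def count_primes(bounds):
    """One unit of work: count primes in the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(1 for n in range(lo, hi) if is_prime(n))

def grid_count_primes(n, workers=4):
    """Split [0, n) into chunks, dispatch them, and combine the results."""
    step = -(-n // workers)  # ceiling division
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

print(grid_count_primes(1000))  # 168 primes below 1,000
```

Seti follows the same shape at planetary scale: each PC receives an independent chunk of radio-telescope data, processes it, and returns a small result to be combined.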



Firms unite to push open source grid

IBM, Hewlett-Packard, Intel and Nortel Networks recently joined together to promote open source-based systems for grid computing.

The four have joined the Globus Consortium, which is promoting the use of the Globus Toolkit, an open source platform commercialised by fellow consortium member Univa, a firm founded by the toolkit's original developers. Univa is particularly active in the education market, but is also pushing its toolkit in the commercial sector.

With the help of the four companies in the consortium, open source grid activity in the commercial sector should be given a major boost.

Oracle, a keen advocate of grid computing, co-founded the Enterprise Grid Alliance. But the Globus Consortium, unlike its Oracle-led alternative, focuses on open source grid solutions.

Gary Barnett, research director at analyst firm Ovum, said, "For the next two years at least, most potential utility computing users are wrestling with the task of becoming utility ready, and it will be a long time before the notion of compute exchanges becomes anything more than a niche.

"Sun is wrong to suggest that the notion of a market for computing resources is a new way of thinking about utility computing. Every supporter of utility computing has talked about a free market in computer resources as the logical ultimate destination for utility computing."

This was last published in February 2005
