Grid computing predicted to eat up 'fibre glut'

The "fibre glut" widely blamed for a weak long-haul telecoms equipment and services market still exists in most areas, but there are signs that demand might soar in the next few years.

Carriers and industry analysts are confident there is plenty of fibre in the ground after the building boom of the late 1990s. However, the oversupply is bigger in some areas than others, and carriers might have to light up more fibre pairs already in the ground to meet growing demand.

Unfortunately, most companies are unable to tap into that bargain long-haul capacity without going through the local loop. For them, the cost of WAN services is usually dominated by the price of the last-mile connection, said Dave Passmore, an analyst at Burton Group. Only a tenth of business locations have fibre going to them, he added.

Meanwhile, there are indications that the recent deep slump in spending on equipment to light up that fibre might be ending. Within a few years, new ways of organising and using IT resources might change the picture dramatically.

The market for wavelength-division multiplexing (WDM) equipment to light up long-haul fibre plummeted from $5.1bn in 2000 to $376m last year and an estimated $269m this year, according to RHK analyst Ron Kline.

RHK expected revenue to pick up next year and increase slowly to $307m by 2007. Growth from spending by carriers responding to demand will drive that recovery, Kline said.

Verizon Communications plans to light up long-haul fibre across the US as it develops its long-distance voice and data business, said Verizon spokesman Ellis Edwards. Most of that will be fibre it leases from existing long-haul providers.

AT&T spokesman Dave Johnson said his company is expanding its network outside the US to meet demand for data services by multinational companies. That network, which mostly has capacity leased from other carriers, serves 120 cities.

But long-haul fibre is abundant in most places. AT&T is probably far from exhausting the capacity of networks on its international routes, Johnson said. The same is true of AT&T's US network, where no major expansions are in the works.

As an example of overcapacity, only 3.9% of the long-haul fibre passing through Chicago was lit last year, according to US telecoms analyst group TeleGeography.

Of the lit capacity, only 2.7% was dedicated to IP bandwidth and probably less than 1% to voice and other types of networking, said TeleGeography analyst Alan Mauldin. The rest of the lit capacity was not in use.

Mauldin added that it was unlikely all of that is being held back by carriers as inventory or by corporations as back-up capacity.

Not all routes are as wide-open. US consultancy TeleChoice said in July 2001 that it had found the supply of lit fibre near a critical point on 14 of the 22 intercity routes it studied in the US. On those routes, 70% or more of the lit fibre was in use, a threshold at which carriers usually expand capacity.

Based on one growth scenario TeleChoice studied that year, on half of the 22 routes, all the lit and dark fibre would have been used up by 2006. Growth did not happen at anywhere near that rate, said Russ McGuire, who was lead analyst on that report and is now an independent consultant at Seek First Networks.

As more organisations start to use grid and on-demand computing and to virtualise their processing power and data storage, demand for long-haul capacity could grow dramatically, said Frank Dzubeck, president of consultancy Communications Network Architects.

Grid computing brings together the resources of one or multiple organisations to solve computing problems. On-demand computing lets a company turn on capacity on the spot based on how much power is needed.
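The divide-dispatch-aggregate pattern behind grid computing can be illustrated in miniature. This is a hypothetical sketch, not tied to any real grid middleware: it splits a computation across local worker threads and combines the partial results, the same way a grid scheduler would farm work out to remote nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for the real work a grid node would perform
    return sum(x * x for x in chunk)

def grid_sum_of_squares(data, workers=4):
    # Split the problem, farm the pieces out, then combine partial
    # results -- the divide/dispatch/aggregate pattern grid schedulers
    # apply at much larger scale across organisations.
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

print(grid_sum_of_squares(list(range(1000))))  # 332833500
```

In a real grid, each chunk would travel over the network to a remote machine, which is why wide adoption would translate directly into demand for long-haul capacity.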

By 2005 or 2006, these could become "virtual" resources that might come from just about anywhere and use long-haul capacity along the way, Dzubeck said. To be useful, grid and on-demand computing eventually will require at least 100Mbit/sec of capacity.
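To put that 100Mbit/sec figure in perspective, a back-of-the-envelope calculation (assuming decimal units, 1GB = 8 billion bits) shows how long a given dataset would take to move over such a link:

```python
def transfer_seconds(gigabytes, mbit_per_sec):
    # Bits to move divided by the link rate (decimal units assumed)
    bits = gigabytes * 8e9
    return bits / (mbit_per_sec * 1e6)

# Moving 1GB over a 100Mbit/sec link takes about 80 seconds
print(transfer_seconds(1, 100))  # 80.0
```

At that rate, shifting a multi-gigabyte working set between sites takes minutes rather than hours, which is roughly the threshold at which remote, virtualised resources become practical.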

By that time, more consumers will use broadband and will take advantage of more bandwidth-hungry applications, he added. Networked remote sensors also will start to consume bandwidth.

Stephen Lawson writes for IDG News Service
