The IT department at Cern, birthplace of the World Wide Web, is creating the world's largest computing grid to help scientists study the origins of the universe. A grid of this type could eventually have applications in the finance sector, a development that could see financial firms dealing with gigabytes of data per second within the next five years, analysts said.
The grid, which will go live in June, will pool the processing power of approximately 100,000 CPUs worldwide. It will process information at a rate of 1Gbps, said Francois Grey, head of Cern's IT communications team.
"About 20% [of CPUs] will be here at Cern, another 30% will be in 11 major datacentres, and the rest of the computing power will come from 250 other centres worldwide," said Grey.
Scientists working on the Large Hadron Collider project need vast computing power to process the mass of data generated by the experiment, which recreates conditions moments after the "big bang". The experiment will produce roughly 15 petabytes (15 million gigabytes) of data a year - enough to fill more than three million standard DVDs.
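The scale of that data volume can be checked with a quick back-of-the-envelope calculation. The 4.7GB single-layer DVD capacity used here is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope check of the LHC annual data volume.
# DVD capacity of 4.7GB (single-layer disc) is an assumption
# for illustration; the article quotes only the petabyte figure.

PETABYTE_IN_GB = 1_000_000   # 1 petabyte = 1 million gigabytes
DVD_CAPACITY_GB = 4.7        # single-layer DVD (assumption)

annual_data_gb = 15 * PETABYTE_IN_GB            # 15 PB a year
dvds_per_year = annual_data_gb / DVD_CAPACITY_GB

print(f"Annual data: {annual_data_gb:,} GB")
print(f"Roughly {dvds_per_year:,.0f} single-layer DVDs a year")
```

At 4.7GB a disc, 15 petabytes works out to roughly 3.2 million DVDs a year.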
Grid computing is already being used by Google and Amazon, and it will have applications in the financial sector as processing demands grow, said analyst firm Gartner.
A key challenge in grid computing is ensuring reliable and secure access to a widespread IT infrastructure.
Cern co-developed next-generation middleware, called gLite, to harness the processing power of distributed computers reliably in round-the-clock operation.
"A lot of the academic projects work with existing middleware products, but when you want to use it 24x7, you discover they have some bugs," said Grey.
The gLite software authenticates users to the grid and ensures that contributors of computing resources are not exposed to security threats.
"What is special about this middleware is it handles all the issues that arise when you have many organisations sharing resources. We are talking about legally independent organisations in many different countries with different laws on how you process data," said Grey.