
Cern calls on industry to collaborate on its datacentre challenges

Particle accelerator lab has set out a plan, inviting the industry to help it develop next-generation IT to support science

Cern, home of the Large Hadron Collider (LHC), has identified 16 IT challenges that it wants to work with the IT industry to overcome.

In a whitepaper describing its challenges, Cern Openlab, the facility’s public-private partnership, has categorised the 16 issues into four main areas.

Cern’s datacentre houses about 10,000 dual-processor servers with a total of 300,000 processor cores. It also runs a datacentre site in Budapest, connected to Cern over a 100Gbps link, which hosts 3,500 dual-processor servers.

In the whitepaper, Cern notes: “A weakness in the architecture of many of today’s datacentres is that they cannot easily accommodate rapid variations of cycles and workloads.”

It said that if it decided to build a new datacentre, it would aim for a power usage effectiveness (PUE) of 1.1 with 15-20kW per server rack. The electrical consumption of this centre would need to grow from 4MW to 12-16MW, and it would need to support a heterogeneous IT environment. The whitepaper describes how rack disaggregation could help it to allocate the correct amount of computing and storage resources.
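To put those targets in context, PUE is the ratio of total facility power to IT equipment power, so a PUE of 1.1 leaves only 10% of overhead for cooling and power distribution. A minimal sketch of the arithmetic, treating the 12-16MW figures above as the IT load (an assumption, since the whitepaper's wording could also mean total consumption):

```python
def facility_power_mw(it_load_mw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a target PUE.

    PUE = total facility power / IT equipment power.
    """
    return it_load_mw * pue

# Illustrative figures: a 12-16MW IT load at the stated PUE target of 1.1.
for it_load in (12.0, 16.0):
    total = facility_power_mw(it_load, pue=1.1)
    overhead = total - it_load
    print(f"IT load {it_load:.0f}MW -> facility {total:.1f}MW "
          f"({overhead:.1f}MW for cooling and power distribution)")
```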

It identified software-defined infrastructure as one of the technologies new datacentres could be based on.

Networking is another aspect of datacentre technology on which Cern wants industry collaboration. The whitepaper states: “Globalisation of science means that research centres are becoming community hubs for worldwide collaboration. Modern big science requires continuing developments to be made in nearly all aspects of networking. Rapid developments in networking speeds will enable datacentres to become more interconnected.”

The third datacentre technology for which Cern is seeking industry collaboration is its database requirements. It said: “Databases for the LHC, the experiments and the associated workloads require real-time and batch ingestion at high rates of throughput.”

It identified stream processing, cloud resources, machine learning and scale-out databases as among the areas that could be investigated. It has also looked at NVRAM (non-volatile RAM), which could be used to run in-memory databases and analytics workloads.
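The whitepaper does not prescribe a particular technology, but the combination of real-time and batch ingestion it describes is commonly handled by buffering incoming events and flushing them to storage in batches, which amortises per-write overhead at high rates. A minimal, self-contained sketch of that pattern (the event source and the in-memory SQLite sink are hypothetical stand-ins, not Cern's actual pipeline):

```python
import sqlite3
import time
from typing import Iterator, Tuple

def event_stream(n: int) -> Iterator[Tuple[float, int, float]]:
    """Hypothetical stand-in for a high-rate experiment event source."""
    for i in range(n):
        yield (time.time(), i, i * 0.5)

def ingest(events: Iterator[Tuple[float, int, float]],
           batch_size: int = 1000) -> None:
    """Buffer streaming events and write them out in batches."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE events (ts REAL, id INTEGER, value REAL)")
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) >= batch_size:
            db.executemany("INSERT INTO events VALUES (?, ?, ?)", batch)
            db.commit()
            batch.clear()
    if batch:  # flush the final partial batch
        db.executemany("INSERT INTO events VALUES (?, ?, ?)", batch)
        db.commit()
    rows = db.execute("SELECT COUNT(*) FROM events").fetchone()[0]
    print(f"{rows} rows ingested")

ingest(event_stream(10_000))
```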

The final piece of Cern’s datacentre requirements concerns cloud infrastructure and the use of OpenStack. In the whitepaper, it said: “There are ongoing investigations into simplifying the user experience by providing virtual machines, containers and bare metal nodes via the same user interface and administrative systems.”
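OpenStack already exposes virtual machines and, where its Ironic service is deployed, bare-metal nodes through a common API, which is the direction the quote points in. A minimal sketch using the openstacksdk Python library, assuming credentials are supplied via clouds.yaml or OS_* environment variables; this illustrates the unified interface rather than Cern's actual setup:

```python
import openstack

# Credentials come from clouds.yaml or OS_* environment variables.
conn = openstack.connect(cloud="envvars")

# Virtual machines (Nova) are listed through the compute proxy.
for server in conn.compute.servers():
    print("server:", server.name, server.status)

# On clouds with the Ironic service enabled, bare-metal nodes are
# reachable through the same connection object.
for node in conn.baremetal.nodes():
    print("bare metal:", node.name, node.provision_state)
```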
