Preparing for the next wave of scientific discovery, in which it will inevitably be front and centre, CERN (the European Organisation for Nuclear Research) has embarked on a programme to increase the capacity of its datacentres and technical network, adapting to growing research requirements and creating a network built for extreme computing.
One of the world’s largest and most respected centres for scientific research, and home to the world-renowned Large Hadron Collider (LHC), CERN aims to learn more about how the universe works, advancing the boundaries of human knowledge through breakthrough research in fundamental physics.
The LHC experiments are designed to observe up to 1.7 billion proton-proton collisions per second, producing a data volume of more than 7.5 terabytes per second. Only some of these events can lead to new discoveries, so the data flow is filtered and reduced to a manageable level. The LHC experiments’ Trigger and Data Acquisition Systems handle data filtering, collection and infrastructure monitoring.
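The real trigger systems are specialised hardware and software, but the underlying principle — keep only the events that pass a selection, discard the rest — can be illustrated with a simplified, hypothetical sketch (the event fields and the energy threshold below are invented for illustration, not CERN's actual selection criteria):

```python
import random

def trigger_filter(events, energy_threshold):
    """Keep only events whose energy exceeds the threshold.

    Loosely mimics how a trigger reduces an unmanageable event
    rate to one that can actually be recorded and analysed.
    """
    return [e for e in events if e["energy"] > energy_threshold]

# Simulated stream of collision events (fields are illustrative only)
random.seed(42)
events = [{"id": i, "energy": random.uniform(0.0, 14.0)}
          for i in range(1_000_000)]

selected = trigger_filter(events, energy_threshold=13.0)
reduction = len(events) / len(selected)
print(f"kept {len(selected)} of {len(events)} events "
      f"(reduction factor ~{reduction:.0f}x)")
```

In practice the reduction happens in several successive stages, each applying tighter criteria than the last, but the shape of the operation is the same.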
CERN’s Geneva datacentre supports all of its scientific projects, experiments and administrative systems, from spotting camera-shy pentaquarks and charm mesons to everyday video conferencing and payroll tasks.
The datacentre network supports more than 15,000 servers and 260,000 processor cores. It enables researchers worldwide to receive data from the LHC experiments for analysis. In the past 12 months, 370 petabytes of data have moved across the network.
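As a back-of-the-envelope check, 370 petabytes over 12 months corresponds to an average sustained rate of roughly 94Gbps (assuming decimal units, 1PB = 10^15 bytes):

```python
petabytes = 370
seconds_per_year = 365 * 24 * 3600      # ~31.5 million seconds
bits = petabytes * 1e15 * 8             # decimal petabytes to bits
avg_gbps = bits / seconds_per_year / 1e9
print(f"average rate: {avg_gbps:.0f} Gbps")  # roughly 94 Gbps
```

Peak rates would of course be far higher than this average, which is why the headline port speeds matter.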
To build a network that can support the potential of this research, CERN has turned to Juniper Networks for high-density switching capable of delivering the high-throughput connectivity its data collection and infrastructure monitoring require.
QFX Series switches have been deployed to provide that connectivity for the LHC experiments. Their 100Gbps port density is expected to absorb the inevitable traffic growth, and the core network can be scaled from 40Gbps to 100Gbps.
The EX9200 Ethernet Switch will connect 11,000 devices, supporting the LHC operations and experiments, as well as monitoring and safety systems.
Automation has simplified the configuration and management of 400 routers and switches across CERN’s campus.
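CERN's automation tooling is not described in detail, but the general pattern — generating per-device configuration from a template rather than hand-editing 400 routers and switches — can be sketched as follows (the device inventory and the Junos-style `set` commands below are simplified, hypothetical examples):

```python
from string import Template

# Hypothetical per-device inventory; real tooling would pull this
# from a source-of-truth database rather than a hard-coded list.
devices = [
    {"hostname": "sw-lhc-01", "mgmt_ip": "10.0.0.1"},
    {"hostname": "sw-lhc-02", "mgmt_ip": "10.0.0.2"},
]

# Simplified Junos-style configuration template.
CONFIG_TEMPLATE = Template(
    "set system host-name $hostname\n"
    "set interfaces em0 unit 0 family inet address $mgmt_ip/24\n"
)

# Render one configuration per device from the shared template.
configs = {d["hostname"]: CONFIG_TEMPLATE.substitute(d) for d in devices}
for name, cfg in configs.items():
    print(f"--- {name} ---\n{cfg}")
```

The point is that a change made once to the template propagates consistently to every device, which is what makes managing 400 devices tractable.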
“After electricity, networking is the most important element for us at CERN,” said IT infrastructure group lead Tony Cass.
“The physicists need the experiment data to be moved to the datacentre where it’s processed and sent to our partners around the world. We also have thousands of employees who need access to the research database and regularly need email and web access. So, if the network doesn’t work, CERN doesn’t work.”