Computing demand pressures the datacentre at Southampton University

"In common with many people in academia, the datacentre was built in 1975/76 - trying to put 21st century equipment into an old facility meant that cooling was one of our key issues."

So says Dr Parchment, infrastructure services manager, iSolutions at the University of Southampton.

"Our existing datacentre was bursting around the edges, so buying new kit and investing in water cooling bought us some breathing space."

That breathing space won't last long, however, and Southampton University is designing a completely new datacentre, hastened by ever-growing demand for high performance computing (HPC).

Dr Parchment and his team are at the basic requirements-capture stage. More pressing matters involve transitioning the older pizza-box rack setup to an IBM iDataPlex system. Like all transitions, it is at the mercy of events. The day before we spoke, the datacentre had suffered a power outage, which, of course, took immediate precedence.

The datacentre has around 400-500 pizza-box servers. Dr Parchment says, "The team is doing a swing, having to do a partial decommissioning of existing kit and a partial commissioning of the new system."

"The iDataPlex is a funny set-up," says Dr Parchment. "It has half-depth but double-wide racks. The theory is that the air has less distance to travel and power is reduced because the fans don't have as far to push the air. The publicity from IBM says we can reduce the heat footprint from HPC in the datacentre by 100%."

The system draws around 300kW. Notwithstanding the supplier's claims, the University opted for a couple of 200kW water chillers which "pump the cool water into the back of the racks."

The existing CRAC units alone could not handle the cooling.

"The challenge was not to get condensation," says Dr Parchment. "This meant deploying leak detectors and setting the inlet temperatures correctly. The balance is making it cold enough so we can extract the heat but not too cold to cause condensation."
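The balance Dr Parchment describes is about keeping the chilled-water inlet above the room's dew point. As a rough illustration (the figures below are assumed for the example, not taken from the article), the dew point can be estimated with the Magnus approximation:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in deg C using the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water vapour
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Hypothetical room conditions: 22 C air at 50% relative humidity.
td = dew_point_c(22.0, 50.0)
print(f"Dew point: {td:.1f} C")
```

For those conditions the dew point comes out at roughly 11C, so any rack-door surface fed with water colder than that would start to sweat - hence the leak detectors and careful inlet-temperature settings.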

Overcoming hydrophobia

There was, says Dr Parchment, a psychological issue in putting water into the datacentre. A couple of past incidents of water getting into the datacentre meant there were fears to overcome.

The reality is, he says, that users are demanding high performance computing and this requires water cooling.

"Hundreds of them can't wait to get a crack at it," he says (Southampton University is a major engineering research centre).

Water cooling is a mature technology, but Dr Parchment has his doubts whether even water will be sufficient to cool HPC equipment as it reaches 70kW or even 100kW per rack.

"Water might turn out to be a short term solution," he says.

The system should be fully commissioned by mid to late September.

The computer, which was custom-designed, built and configured for the University by UK high-performance computing and storage integrator OCF plc, has a capability equivalent to around 4,000 standard office computers running simultaneously. The University and OCF signed their contract in July 2009. IBM will receive £1.8m from its sale through OCF.

IBM System x iDataPlex servers:
- The half-depth form factor reduces the airflow required across the components, lowering the power needed for cooling, whilst providing twice the number of servers in the same space as a standard 42U rack.
- The rear door of the iDataPlex has a built-in heat exchanger which uses water to cool the expelled heat before it enters the datacentre.
- Equivalent of 26 normal racks; 1,000 nodes; 2,000 Intel quad-core processors = 8,000 cores; 100TB of storage using IBM DS4700.
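The fact-box figures hang together arithmetically, as does the cooling headroom mentioned earlier; a quick sketch (using only numbers from the article) confirms both:

```python
# Core count: 2,000 quad-core processors across 1,000 nodes.
processors = 2000
cores_per_processor = 4
cores = processors * cores_per_processor  # 8,000 cores, as the fact box states

# Cooling headroom: two 200kW chillers against a ~300kW system load.
chiller_capacity_kw = 2 * 200
system_load_kw = 300
headroom_kw = chiller_capacity_kw - system_load_kw

print(cores, headroom_kw)
```

That roughly 100kW of spare chiller capacity is modest next to the 70-100kW-per-rack densities Dr Parchment expects, which is why he suspects water may be only a short-term answer.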

Cluster Resources' Adaptive HPC Suite provides, on demand, a mixed Linux and Windows workload.

First published in Focus Magazine





