As datacentres across the globe continue to grow, the IT industry is under increasingly urgent pressure to devise and promote more sustainable ways of satisfying organisations’ ever-increasing demands for processing power.
Much of the energy waste in datacentres stems from traditional cooling techniques that generally rely on an army of noisy, motorised fans and compressors.
Yet while innovative new cooling technologies are emerging, most commercial organisations lack the time, space and resources to experiment with them.
That makes testbed projects such as Leeds University’s successful implementation of liquid-cooled servers from specialist manufacturer Iceotope all the more vital.
They help prove the benefits of such innovations in a working environment. In turn, this helps to kick-start new markets by reducing the risks of deployment for commercial organisations. The project won “best technology innovation” at this year’s Computer Weekly European Datacentre Awards.
Research project with HPC benefits
Senior lecturer Jon Summers from the university’s School of Mechanical Engineering says the purpose of the Iceotope project was twofold: to investigate the environmental benefits of applying the technology; and to use the system as part of a high-performance computing (HPC) setup in its thermodynamics laboratory.
“Primarily, it’s a research project, but we’re also rolling it out for use with students,” he says.
All the electronic components inside the Iceotope servers are immersed in 3M Novec Engineered Fluid, a non-toxic, non-conductive liquid coolant. The excess heat is then carried away from the cabinets via standard water pipes and central heating pumps. From there, it passes through domestic radiators to warm a large, open-plan laboratory. On hotter days, it is simply vented outside.
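The amount of heat a water loop like this can move is governed by the standard sensible-heat relation Q = ṁ·c·ΔT. The sketch below illustrates the arithmetic; the flow rate and temperature rise are assumed figures for illustration, not measurements from the Leeds installation.

```python
# Illustrative estimate of how much heat a water loop carries away.
# The flow rate and temperature rise are assumed values, not figures
# from the Leeds/Iceotope system.

WATER_SPECIFIC_HEAT = 4186  # J/(kg*K), specific heat capacity of water

def heat_carried_kw(flow_lps: float, delta_t_k: float) -> float:
    """Heat transported (kW) by water flowing at flow_lps litres/second
    with a temperature rise of delta_t_k kelvin across the servers."""
    mass_flow_kgps = flow_lps * 1.0  # one litre of water is roughly 1 kg
    return mass_flow_kgps * WATER_SPECIFIC_HEAT * delta_t_k / 1000

# Example: 0.2 L/s with a 12 K rise moves about 10 kW of server heat,
# enough to feed a bank of domestic radiators.
print(f"{heat_carried_kw(0.2, 12):.1f} kW")
```

Even modest flow rates shift kilowatts of heat, which is why low-power central heating pumps suffice where air cooling would need large fans.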
The reduction in energy use is astounding. “We are saving about 85-90% on cooling costs alone,” says Summers. “The system has no fans or compressors and only requires three pumps using a combined total of around 100W.”
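As a rough sanity check on the quoted figure, the saving can be computed by comparing the 100W combined pump load against what a conventional fan-and-compressor plant might draw for the same IT load. The 1kW baseline below is an assumption chosen for illustration, not a number from the article.

```python
# Rough sanity check on the quoted cooling savings. The baseline cooling
# power is an assumed figure for a conventional air-cooled setup serving
# the same load; only the 100W pump figure comes from the article.

def cooling_saving_pct(baseline_w: float, new_w: float) -> float:
    """Percentage reduction in cooling power versus the baseline."""
    return 100 * (baseline_w - new_w) / baseline_w

# Assume fans and compressors drew ~1 kW for this load; the Iceotope
# system's three pumps draw ~100 W combined.
print(f"{cooling_saving_pct(1000, 100):.0f}% saving")  # 90% saving
```

A 90% figure sits within the 85-90% range Summers reports, though the true saving depends on how inefficient the displaced air-cooling plant was.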
As far as system maintenance goes, Summers says the Iceotope is little more bother than a standard blade server.
“The cassettes are hot-swappable – you just pull one out and pop another in. There are double-blind sockets for the water that are completely drip-free, and the system doesn’t leak at all. Inside the cassette is a water jacket housing the electronics, which are immersed in Novec fluid. As long as you have a spare server handy, it’s totally straightforward to swap them out,” he says.
“If you want to change the electronics in the cassette itself then the process is a bit more involved than usual, because you have to empty the cartridge first, but it’s not hard and the Novec fluid can be reused,” he says.
Suited to high-density, low-noise and hostile environments
Beyond the green benefits, the university is also seeing other advantages, particularly the ability to deploy the servers in relatively high density where there is limited space or ventilation.
“In the HPC arena we’re always stuck for space. We also want all our computers very close together because when you’re using distributed computing to do very big calculations, you need low latency and high bandwidth between nodes,” says Summers.
While most enterprise environments won’t need to compute fluid dynamics equations, many have similar concerns over space.
“The units would be ideal for an office. Although we’re using them for HPC, the servers use commodity motherboards with standard Intel and AMD processors. They are insulated and don’t give off much heat. And there are no fans so they’re perfectly suited to locations where you want a low level of noise,” he says.
Because the electronics in the system are insulated from heat, light, humidity, particulates, dust and (to a certain degree) shock, the Iceotope servers also operate well in hostile environments, he adds.
From his own knowledge and experience, Summers thinks the “writing’s beginning to appear on the wall” for traditional air cooling.
He says: “From what I know of the datacentre industry, HPC users are less risk-averse, so we’re happy with ‘water to the rack’, but I know lots of people are still nervous about it.
“With energy costs now so high, however, air could only have another four or five years as the leading means of datacentre cooling. We need a much more efficient way of moving the heat around, and there’s nothing better than liquid.”