In the deep, dark past of computing, water was the default means of keeping a computer cool. Now, in the 21st century, it is making its presence felt again. When comparing water with air as a cooling medium in the data centre, most IT professionals focus on how to achieve energy efficiency through air cooling. But don’t write off using water for direct cooling in the data centre.
As electronics improved and computers became smaller, air cooling became the norm. Chilling vast volumes of air and blowing it across the electronics works where equipment densities are still relatively low and energy prices are reasonable. Much engineering work has improved heat exchanger design at the CPU level, but design problems still arise as equipment densities are pushed higher.
Increases in density, along with sharp increases in energy prices, have forced many IT professionals to confront how inefficient existing cooling practices are. The majority of data centre owners are investigating ways to save energy while maintaining adequate cooling through more targeted air circulation, guided by computational fluid dynamics (CFD) and thermal imaging. But direct cooling through the use of water in the data centre is raising its head again.
Air cooling: Popular but flawed
Air cooling’s faults are becoming more apparent. Air is simply not a very good heat transfer medium. As data centre equipment has increased in density, space for large fans has shrunk, and with it the capability to move the large volumes of air required through the equipment. Being a gas, air has poor thermal conductivity, so extra means are required to transfer heat from its source into the air itself: fins must be attached to hot spots to maximise the surface area available for heat transfer away from the equipment.
The amount of energy required to chill air to the required temperatures and to move it around and through the data centre is becoming increasingly expensive. Although water may be the most cost-effective way forward, today it still predominantly plays a secondary role, chilling the air rather than the equipment directly.
Another means of efficiently using water to chill air is the water-side economiser, which is finding increasing use either to replace or to supplement standard computer room air conditioning (CRAC) units in order to lower energy costs.
Other data centres are taking things a little further. For instance, Google is using sea-water cooling for its new data centre in Finland, while PlusServer, a German organisation, is building a new data centre in Strasbourg that will use ground water at a constant 12 to 14 degrees Celsius as feed water for cooling air in the data centre.
Other similar approaches include falling curtain evaporative cooling (a method where air is forced directly through a falling curtain of water and cooled due to evaporative energy exchange) in hot climates as well as direct river-water cooling in cold climates.
More water-cooling vs. air-cooling facts
Water has between 50 and 1,000 times the capacity to remove heat that air does, and can therefore be far more effective at cooling hotspots if it is engineered and implemented in the correct manner.
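A rough sense of why water is so much more capable comes from comparing volumetric heat capacities. The sketch below uses textbook material properties; the figures are illustrative, and the article’s 50–1,000x range reflects practical flow rates and system losses rather than raw material properties alone.

```python
# Back-of-the-envelope comparison of water vs. air as a heat
# transfer medium, using approximate room-temperature properties.

CP_WATER = 4186.0   # specific heat of water, J/(kg*K)
CP_AIR = 1005.0     # specific heat of air, J/(kg*K)
RHO_WATER = 1000.0  # density of water, kg/m^3
RHO_AIR = 1.2       # density of air, kg/m^3

def volumetric_heat_capacity(rho, cp):
    """Heat absorbed per cubic metre per kelvin of temperature rise."""
    return rho * cp  # J/(m^3*K)

water = volumetric_heat_capacity(RHO_WATER, CP_WATER)
air = volumetric_heat_capacity(RHO_AIR, CP_AIR)

print(f"Water: {water:.3g} J/(m3*K)")  # ~4.19e6
print(f"Air:   {air:.3g} J/(m3*K)")    # ~1.21e3
print(f"Ratio: {water / air:.0f}x")    # roughly 3,500x per unit volume
```

In other words, a litre of water absorbs thousands of times more heat than a litre of air for the same temperature rise, which is why far smaller volumes of coolant need to be moved.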
With the mainframe and certain midsized computers that used water cooling in the 1970s, ’80s and ’90s, water was run through copper pipes at positive pressure to cool hotspots as needed, particularly at the CPU. Most of the other electronics within the computer ran at low enough temperatures, and in open enough space, to be cooled by low-pressure air flows fed by chiller systems. But if a water leak occurred, the positive pressure would force water out into the heart of the computer, and water and the insides of a computer do not mix well.
However, water cooling has matured to the point where leakage should not be the problem it once was. For example, data centres can use rear door heat exchangers as self-contained systems: if a leak occurs, the water is contained within the system, with no risk of it reaching any electrical equipment. Here, the rear door of a 19-inch rack is replaced with a heat exchanger through which chilled water is pumped. Even as a passive system (i.e. with no forced air), IBM and others claim that such a system can remove 60% of the heat from a 33 kW high-density rack. Used within a self-contained sealed rack, rear door heat exchangers can provide considerable savings compared with installing new CRAC units alongside targeted cooling.
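The 33 kW rack figure cited above can be turned into a rough water-flow requirement. The 10 K water temperature rise below is an assumption chosen for illustration, not a vendor specification.

```python
# Sketch: chilled-water flow needed for a passive rear door heat
# exchanger on a 33 kW rack, assuming 60% heat removal (the claim
# cited above) and an assumed 10 K water temperature rise.

CP_WATER = 4186.0        # specific heat of water, J/(kg*K)
rack_load_w = 33_000.0   # total rack heat load, W
fraction_removed = 0.60  # claimed passive heat removal
delta_t = 10.0           # assumed water temperature rise, K

heat_removed = rack_load_w * fraction_removed    # 19,800 W
flow_kg_s = heat_removed / (CP_WATER * delta_t)  # ~0.47 kg/s
flow_l_min = flow_kg_s * 60                      # ~28 L/min (1 kg of water ~ 1 L)

print(f"Heat removed: {heat_removed / 1000:.1f} kW")
print(f"Required flow: {flow_l_min:.0f} L/min")
```

Around 28 litres per minute of chilled water to shift nearly 20 kW of heat is a very modest flow, which is part of the appeal of rack-level water cooling.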
Targeted water cooling
Another approach is highly targeted water cooling, in which metal pads (generally copper, or gold-plated copper for additional thermal efficiency) with micro-channels replace the standard cooling fins on CPUs. Pure water (with no dissolved solids or gases) is passed through the micro-channels, removing heat directly from the CPUs; the output water is often at temperatures high enough to be reused elsewhere in the building, for example for heating.
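At the level of a single chip, the flows involved are tiny. The sketch below assumes an illustrative 150 W CPU and a 15 K coolant temperature rise; both are assumed values, not figures from any particular product.

```python
# Illustrative flow calculation for direct-to-chip (micro-channel)
# water cooling: how little water is needed to carry away one CPU's
# heat for a given temperature rise. Power and temperature rise are
# assumed values for the sketch.

CP_WATER = 4186.0    # specific heat of water, J/(kg*K)
cpu_power_w = 150.0  # assumed CPU heat output, W
delta_t = 15.0       # assumed coolant temperature rise, K

flow_kg_s = cpu_power_w / (CP_WATER * delta_t)  # kg/s
flow_l_min = flow_kg_s * 60                     # ~0.14 L/min per CPU

print(f"Flow per CPU: {flow_l_min:.2f} L/min")
```

A fraction of a litre per minute per CPU also shows why the outlet water can run warm enough for reuse: the smaller the flow for a given heat load, the larger the temperature rise.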
Although CPU-based water cooling is not particularly new in itself, what is new is that systems now run at negative pressure, so the water is drawn around the system rather than pumped. If the system develops a leak, air is sucked in rather than water leaking out. Sensors monitor the system continuously, so if this does happen, administrators are informed immediately and can take remedial action.
For those wanting to move to maximum equipment densities, cooling is one of the biggest issues to deal with. Relying on forced air alone may prove short-sighted as energy prices continue to fluctuate but trend upwards. Using water to cool hotspots more effectively, combined with higher overall data centre temperatures, provides the means to optimise energy usage and to create and operate a high-availability, high-density data centre for the future.
If you are wondering whether to use air cooling or water cooling in the data centre, you will find that using water is very cost effective in most scenarios.
Clive Longbottom is a service director at UK analyst Quocirca Ltd. and a contributor to SearchVirtualDataCentre.co.uk.