Column

Gain data centre energy efficiency by cutting cooling

Clive Longbottom
A commonly used measure of the effectiveness of a data centre's energy use is PUE -- the Power Usage Effectiveness score. It is calculated by taking the total energy used by a data centre and dividing it by the energy used by the IT equipment alone (servers, network equipment, storage and so on). The total figure has to include everything else the facility consumes -- areas such as lighting and, more substantively, cooling.

Overall, a "reasonable" data centre will draw around 2 W of power overall for every 1 W used by the IT equipment, giving a PUE score of 2. Many data centres are far worse than this, with some well over a level of 5, meaning that essentially 4 W of power is being used (or wasted) to run every 1 W of IT equipment.
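As a quick illustration, this is all the PUE formula amounts to -- a minimal Python sketch using only the figures quoted above (the function name is mine, not part of any standard):

# Minimal sketch of the PUE calculation described above:
# PUE = total facility power / power drawn by the IT equipment.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# A "reasonable" data centre: 2 units drawn overall per 1 unit of IT load.
print(pue(2.0, 1.0))   # 2.0

# A poor one: PUE over 5, i.e. 4 units of overhead per 1 unit of IT load.
print(pue(5.0, 1.0))   # 5.0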

As soon as virtualisation is brought into the mix, things can get far worse. Taking an existing data centre and virtualising from, say, 1,000 down to 300 servers (and then rationalising network and storage equipment accordingly) will undoubtedly save a lot of energy at the IT equipment level. However, unless some heavy re-engineering of the data centre itself is carried out, it is unlikely that much will change in the existing cooling arrangements.


A data centre for 1,000 rack mount servers will probably be around 2,500 square feet, with a roof height of 10 feet (including a raised floor and dropped ceilings), giving 25,000 cubic feet of air volume. Existing volume-based approaches to cooling the air mean that, even when space requirements drop, the systems must still cool the full 25,000 cubic feet. However, a highly dense, virtualised environment has different heat characteristics to a standard rack mount or tower-based data centre, and its hotspots will tend to run at higher temperatures than before.
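The arithmetic behind that figure -- and the volume that legacy cooling carries on handling regardless of how much kit remains -- is trivial; the dimensions below are simply those assumed above:

# Air volume that legacy volume cooling still has to handle,
# using the room dimensions assumed above.
floor_area_sq_ft = 2_500     # roughly 2,500 sq ft for 1,000 rack mount servers
ceiling_height_ft = 10       # including raised floor and dropped ceilings

air_volume_cu_ft = floor_area_sq_ft * ceiling_height_ft
print(air_volume_cu_ft)      # 25000 -- cooled in full even after consolidation to 300 servers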

Less equipment doesn't necessarily mean a cooler environment
Surprisingly, the smaller amount of equipment actually runs hotter, despite cooling capacity that could easily manage three or four times as much technology.

There are a few things that should be done here. First, it is necessary to reset perceptions. In the olden days (up to about two years ago), running IT equipment at high temperatures led to premature failure of items such as CPUs and disk drives. On a server with a dedicated disk running one application, this was an obvious problem, so the aim was to run the equipment well within the thermal envelope defined by the vendor. For example, when a disk drive's rated life or mean time between failure (MTBF) was around 7,000 hours (under a year of continuous running), everything had to be done to make sure the drive lasted as long as possible.

It's time to revisit this and look at what the vendors themselves are saying. The MTBF for a standard consumer disk drive is now greater than 600,000 hours -- nearly 70 years of continuous running. Newer enterprise drives have MTBFs quoted in millions of hours. Research by Google also found that, surprisingly, disk drives failed more often when run at the colder end of the scale (80°F) than at a warmer level (97-120°F).
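To put those MTBF numbers into perspective, here is a rough conversion into years of continuous running and an approximate annualised failure rate (AFR) -- a hedged sketch only, using the common rule of thumb AFR ≈ hours-per-year / MTBF rather than any vendor's published method:

# Rough conversion of vendor MTBF figures into continuous-running years
# and an approximate annualised failure rate (AFR).
HOURS_PER_YEAR = 8_760

def mtbf_years(mtbf_hours: float) -> float:
    return mtbf_hours / HOURS_PER_YEAR

def approx_afr(mtbf_hours: float) -> float:
    # Rule-of-thumb approximation, reasonable when MTBF is much longer than a year.
    return HOURS_PER_YEAR / mtbf_hours

print(round(mtbf_years(7_000), 1))    # 0.8 -- the old-style drive, well under a year
print(round(mtbf_years(600_000)))     # 68 -- today's consumer drives
print(f"{approx_afr(600_000):.1%}")   # 1.5% -- roughly the share of such drives failing per year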

At the CPU level, it has long been thought that running at higher-than-recommended temperatures will lead to catastrophic failure. However, by using computational fluid dynamics (CFD), the areas where cooling is needed most can be identified and targeted, and ducted cooling applied to remove heat from the specific hotspots.

Ambient temperatures
This then leaves the rest of the data centre. Time and time again, we at IT research and analysis firm Quocirca enter a data centre and think "Wow! Cool!" Whereas the data centre manager tends to take this as a positive comment on how their data centre has been set up, what we really mean is that the ambient temperature in the data centre has been set too low.

Average data centre ambient temperatures tend to be around 65°F. This enables mass volume cooling across all the equipment in the data centre, but it also sets a comfortable temperature for any humans. However, the move should be toward a lights-out environment -- humans should only be in the data centre when there is a problem to deal with. New guidelines from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) say that the ambient temperature in a data centre can be around 80°F, and vendors rate much of their equipment to 90°F or above.

At 80°F, mass volumetric cooling becomes far cheaper, as inlet air temperatures into the data centre can be much higher than when trying to hold the whole volume at 65°F.

Further gains can be made if contained cooling is used. Here, cold aisles or in-rack cooling systems minimise the volume of air being cooled to only what is needed to bring the various hotspots back within limits. Outlet air can be vented directly to the outside, vented into the data centre outside of the controlled zones or, preferably, reused for space or water heating elsewhere in the building as required.

In many cases, such an approach will allow some of the existing cooling units to be switched off, saving energy and improving PUE. Where investment allows, older units can be replaced with variable flow and variable load units, which are far more energy efficient and can adapt rapidly to changes in cooling volume requirements.
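To see why switching cooling plant off feeds straight back into the PUE score, here is a hypothetical before-and-after -- the wattages are invented purely to show the mechanics, not measured figures:

# Hypothetical illustration of how shedding cooling load improves PUE.
# All figures are invented for the example, not measured data.
it_load_kw = 300.0           # IT equipment after virtualisation
cooling_before_kw = 280.0    # legacy cooling plant running flat out
cooling_after_kw = 160.0     # some units switched off or swapped for variable-flow units
other_overhead_kw = 20.0     # lighting, UPS losses and so on

pue_before = (it_load_kw + cooling_before_kw + other_overhead_kw) / it_load_kw
pue_after = (it_load_kw + cooling_after_kw + other_overhead_kw) / it_load_kw
print(round(pue_before, 2), round(pue_after, 2))   # 2.0 1.6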

The most important step, though, is to stop looking at the data centre in human terms; any person who enters the data centre should only be there for a short period of time, and should therefore be able to deal with a raised temperature. Anything that has to be in the data centre on a continuous basis is not a human, and a temperature of 80°F (or higher) will not adversely affect it.

Clive Longbottom is a service director at UK analyst Quocirca Ltd. and a contributor to SearchVirtualDataCentre.co.uk.

