Energy saving tips: Reduce energy consumption with hot and cold aisles


With cooling being a major energy drain in any data centre, it is important that an IT manager ensures the overall approach to cooling is optimised in order to reduce energy consumption. Many existing data centres use computer room air conditioning (CRAC) units to chill and condition air, ensuring it is at the right moisture level, and then channel this air to the bottom of equipment racks via the plenum (the gap between the solid floor and the raised floor). The cold air makes its way up through the rack and is eventually expelled through the top (and, in many cases, along the sides) into the general volume of the data centre, from where it is extracted via fans.

This approach can be wasteful for two reasons. First, it relies on the temperature of the whole data centre remaining within whatever limits are decreed. If a data centre covers a general area of 10,000 square feet and has a raised floor to dropped ceiling height of 10 feet, then 100,000 cubic feet of air must be kept at a specific temperature and moisture level. Second, raised floors become somewhat patchy over time: as equipment is moved, perforated tiles may not be moved with it, non-perforated tiles become dislodged and under-floor cabling can block the effective flow of cold air.
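To put that figure in perspective, here is a minimal calculation of the conditioned volume using the example numbers above. It is a sketch only; real rooms have obstructions, columns and varying heights.

```python
# Rough estimate of the air volume a CRAC plant must condition in an
# uncontained data centre (figures taken from the example above).
floor_area_sq_ft = 10_000       # general floor area
floor_to_ceiling_ft = 10        # raised floor to dropped ceiling

conditioned_volume_cu_ft = floor_area_sq_ft * floor_to_ceiling_ft
print(f"Air to keep within temperature/humidity limits: "
      f"{conditioned_volume_cu_ft:,} cubic feet")
# -> Air to keep within temperature/humidity limits: 100,000 cubic feet
```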

Energy saving tip: Use containment to minimise the volume of cold air being used. That can reduce the wasting of cold air and therefore reduce the data centre’s energy consumption.

A starting point is to create a “cold aisle”, which can be done in many ways, from relatively simple to highly engineered. Taking the relatively simple as an example, the problem is how to create a contained environment between two equipment rows. You can do that by covering over the aisle between the rows with some form of enclosure and placing simple overlapping plastic hanging doors at each end. Blanking plates in the front of the racks can then be replaced with open, semi-open or perforated plates to allow air to enter, with the same at the rear of the racks to allow air to escape. Cold air is pumped into the covered aisle (generally through the cover, so that the cold air can fall naturally and use positive pressure to push through the equipment racks), makes its way through the equipment and is expelled into the open area of the data centre. Figure 1 shows how each pair of rows creates a single cold aisle, and how each group of two equipment rows plus the cold aisle forms a self-contained environment within the data centre itself.

Figure 1

This is a good start to reducing energy consumption: the cold air is applied across a greater area of equipment and should be more effective than a simple bottom-of-rack to top-of-rack approach.

Energy saving tip: When the cold air route is via the aisle, the chances of any significant blockage, such as could be found in a raised floor full of cabling and other “out of sight” items, are minimised. This should result in a decreased need for cold air to be produced. However, the exit air is still heating up the general data centre, and this can be dealt with by introducing a “hot aisle”, the converse of the cold aisle.

Figure 2 shows the spaces between the cold aisle units now covered over as well, with doors at the ends of the aisles. Cold air is pumped into the cold aisle spaces as before, but the hot air is now contained within the smaller volumes of the hot aisles rather than being vented into the open data centre. The gain is that the total volumes of cold and hot air involved are both reduced, and the rest of the data centre can be run at whatever temperature you want; indeed, no cooling may be required at all for the volumes outside of the cold and hot aisles. Instead of the 100,000 cubic feet of air requiring cooling in an uncontained data centre, the cooled volume is now more likely to be around 20,000 cubic feet, a saving of up to 80% in the volume that needs cooling.
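As a back-of-the-envelope check on that claim, the following sketch compares the two volumes quoted above. The 20,000 cubic feet contained figure is the estimate from this article, not a measured value for any particular site.

```python
# Compare the conditioned volume before and after hot/cold aisle containment,
# using the example figures from the article.
uncontained_cu_ft = 100_000   # whole-room approach
contained_cu_ft = 20_000      # cold aisles + hot aisles only (estimate)

saving = 1 - contained_cu_ft / uncontained_cu_ft
print(f"Volume that still needs active cooling: {contained_cu_ft:,} cubic feet")
print(f"Reduction in conditioned volume: {saving:.0%}")
# -> Reduction in conditioned volume: 80%
```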

At the highly engineered end, vendors provide completely self-contained systems that include the racks, specialised cooling systems, highly targeted cooling air ducting, pressure monitoring to ensure continuous cooling and so on. For complex environments, this will be required to ensure high availability of systems.

 

Figure 2

Energy saving tip: Bearing in mind that modern equipment can run far warmer than equipment in the past, it is possible to run the inlet cooling air at a higher temperature as well, reducing the need for energy even further. Indeed, by taking the idea of hot and cold aisles one step further and making the equipment rack its own self-contained system, using in-row cooling, air flow routing and direct ducting of hot air out of the data centre, energy use can be cut to a level where savings of more than 90% can be made.

You’ll need to address two issues when using containment approaches such as those above. The first is that you must fully understand the cooling flows required; here it is recommended that you use computational fluid dynamics (CFD) modelling to ensure that flows reach the areas that need cooling, and that you monitor temperatures to ensure that hot spots do not build up.
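Temperature monitoring need not be elaborate to be useful. The sketch below polls rack inlet sensors and flags anything above a chosen limit; the read_inlet_temps() helper, the rack names and the 27°C threshold are placeholders for illustration, not references to any particular DCIM or BMS product.

```python
# Minimal hot-spot check: poll rack inlet temperature sensors and flag
# any reading above a chosen alert threshold.
# read_inlet_temps() is a placeholder for whatever your monitoring exposes.

ALERT_THRESHOLD_C = 27.0  # example inlet limit; pick one to suit your kit


def read_inlet_temps() -> dict[str, float]:
    """Placeholder: return {rack_id: inlet temperature in degrees C}."""
    return {"rack-A01": 22.5, "rack-A02": 28.1, "rack-B01": 24.0}


def find_hot_spots(temps: dict[str, float], limit: float = ALERT_THRESHOLD_C):
    """Return only the racks whose inlet temperature exceeds the limit."""
    return {rack: t for rack, t in temps.items() if t > limit}


if __name__ == "__main__":
    hot = find_hot_spots(read_inlet_temps())
    for rack, temp in sorted(hot.items()):
        print(f"WARNING: {rack} inlet at {temp:.1f} C exceeds {ALERT_THRESHOLD_C} C")
```

Run periodically (for example from a scheduler), a check like this gives early warning that containment airflow is not reaching a given rack.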

The other issue is that, should the cooling fail, there is a smaller sink of cold air to keep the equipment cool whilst the problem is being fixed than there would be if the racks were open within a large, uncontained data centre. Therefore, be sure that cooling is fully redundant where downtime has to be avoided, or that equipment can be switched off gracefully within a suitable time frame to avoid overheating should cooling become unavailable.
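One way to size that time frame is to estimate how long the contained air can absorb the IT load's heat before an inlet limit is reached. The sketch below uses purely illustrative figures (a 100 kW load, 20,000 cubic feet of contained air and a 10°C allowable rise) and ignores the thermal mass of the racks and building fabric, so real ride-through will usually be longer.

```python
# Rough ride-through estimate: how long before contained air overheats if
# cooling fails. Ignores the thermal mass of racks and building fabric,
# so it is a deliberately pessimistic figure. All inputs are illustrative.

AIR_DENSITY_KG_M3 = 1.2           # approximate density of air
AIR_SPECIFIC_HEAT_J_KG_K = 1005   # specific heat capacity of air

contained_volume_m3 = 20_000 * 0.0283    # ~20,000 cubic feet in cubic metres
it_load_w = 100_000                      # example: 100 kW of IT load
start_temp_c, limit_temp_c = 22.0, 32.0  # example inlet and shutdown limits

air_mass_kg = contained_volume_m3 * AIR_DENSITY_KG_M3
absorbable_heat_j = (air_mass_kg * AIR_SPECIFIC_HEAT_J_KG_K
                     * (limit_temp_c - start_temp_c))
ride_through_s = absorbable_heat_j / it_load_w

print(f"Estimated ride-through before {limit_temp_c} C is reached: "
      f"{ride_through_s:.0f} seconds")
# With these figures the answer is only around a minute, which is why
# redundant cooling or an automated, orderly shutdown is needed.
```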

Containment can be an easy means of reducing energy consumption in a data centre, and it is something that can be retrofitted to an existing facility relatively easily.

Clive Longbottom is a service director at UK analyst Quocirca Ltd. and a contributor to SearchVirtualDataCentre.co.uk.

 


This was first published in June 2011

 

