11 best practices save energy in the datacentre

Antony Savvas

Firms can save one million kilowatt hours a year by implementing 11 best practices in their datacentres, according to analyst Gartner.

Gartner says that in a conventional datacentre, 35% to 50% of the electricity consumed is for cooling. The figure is 15% in best-practice "green" datacentres.

"Virtually all datacentres waste enormous amounts of electricity using inefficient cooling designs and systems," said Paul McGuckin, an analyst at Gartner. "Even in a small datacentre, this wasted electricity amounts to more than one million kilowatt hours annually, that could be saved with the implementation of some best practices."

The overriding reason for the waste in conventional datacentre cooling is the unconstrained mixing of cold supply air with hot exhaust air, said Gartner.

"This mixing increases the load on the cooling system and energy used to provide that cooling, and reduces the efficiency of the cooling system by reducing the delta-T (the difference between the hot return temperatures and the cold supply temperature). A high delta-T is a principle in cooling," said Paul McGuckin.

Gartner's 11 best practices:

  1. Plug holes in the raised floor
    Most raised-floor environments have cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10% of the energy used for datacentre cooling.
  2. Install blanking panels
    Any unused position in a rack needs to be covered with a blanking panel to manage airflow, preventing hot exhaust air from entering the cold-air intake of other equipment in the same rack. When the panels are used effectively, supply air temperatures can be lowered by as much as 22°F (12.2°C), greatly reducing the electricity consumed by fans in the IT equipment and alleviating hot spots in the datacentre.
  3. Co-ordinate CRAC units
    Older computer room air-conditioning units (CRACs) operate independently in cooling and dehumidifying the air. These units should be tied together with newer technologies so that their efforts are co-ordinated. Alternatively, humidification duties can be removed from them altogether and placed with a newer, dedicated unit.
  4. Improve underfloor airflow
    Older datacentres typically have constrained space underneath the raised floor, which is used not only to distribute cold air but also to route data and power cables. Many old datacentres have accumulated such a tangle of cables that airflow is restricted, so the underfloor space should be cleaned out to improve airflow.
  5. Implement hot aisles and cold aisles
    In traditional datacentres, racks were set up in what is sometimes referred to as a "classroom style," where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past 10 years demonstrate that organising rows into hot aisles and cold aisles is better at controlling the flow of air in the datacentre.
  6. Install sensors
    A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be transferred into a spreadsheet for further analysis. This provides insight into possible datacentre temperature problems, as well as a way to measure the results of cooling improvements (a minimal analysis sketch follows this list).
  7. Implement cold-aisle or hot-aisle containment
    Once a datacentre has been organised around hot aisles and cold aisles, dramatically improved separation of cold air supply and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will have the single largest payback of any of these energy efficiency best practices.
  8. Raise the temperature in the datacentre
    Many datacentres are run colder than they need to be. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has increased the top end of allowable supply-side air temperatures from 77°F (25°C) to 80°F (26.7°C). Not all datacentres should be run at the top end of this range, but even a step-by-step increase to the 75°F (23.9°C) to 76°F (24.4°C) range would have a beneficial effect on datacentre electricity use.
  9. Install variable speed fans and pumps
    Traditional CRAC units contain fans that run at a single speed. Emerging best practice suggests using variable-speed fans wherever possible. Because fan power varies with the cube of fan speed, a 10% reduction in fan speed cuts the fan's electricity use by around 27%, and a 20% reduction yields savings of around 49% (see the fan-law sketch after this list).
  10. Exploit "free cooling"
    "Free cooling" is the general name given to any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economisation and water-side economisation. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.
  11. Design datacentres using modular cooling
    Traditional raised-floor-perimeter air distribution systems have long been the method used to cool datacentres. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as more energy-efficient.
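
Where the sensors of best practice 6 are in place, even a spreadsheet-level analysis can surface hot spots. The Python sketch below shows one minimal way to do it; the file name rack_temps.csv, its column layout and the 27°C alert threshold are illustrative assumptions, not part of Gartner's guidance.

    # A minimal sketch of analysing logged sensor temperatures (best practice 6).
    # Assumes a hypothetical CSV export "rack_temps.csv" with columns
    # timestamp,sensor,temp_c; the 27 C hot-spot threshold is also an assumption.
    import csv
    from collections import defaultdict

    readings = defaultdict(list)
    with open("rack_temps.csv", newline="") as f:
        for row in csv.DictReader(f):
            readings[row["sensor"]].append(float(row["temp_c"]))

    for sensor, temps in sorted(readings.items()):
        avg, peak = sum(temps) / len(temps), max(temps)
        flag = "  <- possible hot spot" if peak > 27.0 else ""
        print(f"{sensor}: avg {avg:.1f}C, peak {peak:.1f}C{flag}")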
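
The figures McGuckin gives for variable-speed fans (best practice 9) follow from the fan affinity laws, under which fan power varies with the cube of fan speed. A minimal sketch of the arithmetic:

    # Fan affinity (cube) law behind best practice 9: power scales with speed^3.
    def fan_power_saving(speed_reduction: float) -> float:
        """Fractional power saving for a given fractional speed reduction."""
        return 1 - (1 - speed_reduction) ** 3

    for cut in (0.10, 0.20):
        print(f"{cut:.0%} slower fan -> {fan_power_saving(cut):.0%} less power")
    # 10% slower fan -> 27% less power
    # 20% slower fan -> 49% less power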
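
For free cooling (best practice 10), the hours available in a given location can be estimated from hourly climate data by counting the hours when outside air is cool enough to use directly. The sketch below covers the simple air-side case; the 20°C threshold stands in for a supply setpoint minus an approach margin and is an illustrative assumption.

    # A rough estimate of annual air-side free-cooling hours (best practice 10).
    # hourly_temps_c: one outdoor dry-bulb reading per hour of the year.
    # The 20 C threshold is an illustrative assumption, not a Gartner figure.
    def free_cooling_hours(hourly_temps_c, threshold_c=20.0):
        """Count hours when outside air alone can cool the datacentre."""
        return sum(1 for t in hourly_temps_c if t <= threshold_c)

    # Example with made-up readings: mild hours count, warm hours do not.
    sample = [12.0, 18.5, 21.0, 25.5, 9.0, 16.0]
    print(free_cooling_hours(sample), "of", len(sample), "hours usable")  # 4 of 6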

"Although most users will not be able to immediately implement all 11 best practices, all users will find at least three or four that can be immediately implemented in their current datacentres," said McGuckin.

"Savings in electrical costs of 10 to 30% are achievable through these most-available techniques. Users committed to aggressively implementing all 11 best practices can achieve an annual savings of 1 million kilowatt hours in all but the smallest tier of datacentres," he said.

Gartner will be discussing power and cooling strategies at the Gartner Data Center Conference on 2-5 December in Las Vegas.

