Data centre cooling management on a budget

Kayleigh Bateman, Site Editor

U.K. IT managers are finding inventive, low-cost ways to build physical barriers that stop hot and cold air mixing in their data centres, including methods involving vinyl sheeting.

According to Julian Kudtritzki, vice president of the Uptime Institute, hot and cold aisle containment is an inexpensive way to optimise data centre cooling.

"Some IT managers are sealing holes in the floor, using pads and pillows to block gaps. Many are recycling sheets of plastic and sealing off whole aisles so hot and cold air is not mixed," he said.

Aisle airflow containment involves lining up server racks in alternating rows so that all the cold air intakes face one way and all the hot air exhausts face the other.

Getting started with data centre cooling

Lex Coors, vice president for data centre technology and engineering at data centre operator Interxion, said he started looking into hot and cold air containment back in 2004.

"Some of our customers were asking for high densities per cabinet, so we looked into designing a solution that would contain cold air," he said, "and we came up with a door and a roof design."

Coors said the European data centre operator now visits its local blacksmith, who supplies Interxion with sliding doors and a plastic ceiling. With 26 data centres across Europe, Interxion now includes this feature for all of its customers, rather than just those that need high densities. Interxion has a 12kW per rack limitation.

"These types of solutions are available from vendors, but they're costly compared to finding alternatives yourself," said Coors.

How Interxion discovered the issue of data centre cooling

He said Interxion noticed that mixed hot and cold air was reaching its computer room air conditioning (CRAC) units. This caused problems with servers that were drawing the hot air back in and failing as a result. To create hot aisle containment, cooling units, normally only 30cm wide, were positioned between the racks. Interxion then closed off the hot aisle and cooled the hot air before blowing it back out into the cold aisle.

Six years ago, the company decided to measure each of its customers' energy usage, which it called an energy overhead ratio (the same concept as power usage effectiveness, or PUE). Once Interxion knew how much energy it was losing and where the losses were coming from, it started to plug holes and add blanking plates within server racks.
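As a rough illustration of how such a ratio works, the sketch below divides total facility power by the power drawn by the IT equipment, which is how PUE is conventionally defined. The figures are hypothetical, not Interxion's measurements.

```python
# Minimal sketch of a PUE-style "energy overhead ratio" calculation.
# The figures below are hypothetical and are not Interxion's measurements.

def energy_overhead_ratio(total_facility_kw: float, it_load_kw: float) -> float:
    """Total facility power divided by IT equipment power (the PUE definition)."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Example: the facility draws 1,800kW in total, of which 1,000kW is IT load.
ratio = energy_overhead_ratio(total_facility_kw=1800, it_load_kw=1000)
print(f"Energy overhead ratio (PUE): {ratio:.2f}")  # 1.80 -> 800kW of overhead
```

The closer the ratio gets to 1.0, the less energy is being spent on cooling and other overheads, which is why plugging holes and adding blanking plates shows up directly in the number.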

In addition, the main change to aid its data centre cooling design was adding doors at the end of each cabinet row.

"It's great to plug any holes, but if there are no doors on the ends of the rows, then all the cold air is going straight into your CRAC unit," said Coors, "and that can short circuit the unit." If cold air enters a CRAC unit it will signal the unit to think the air in the data centre doesn't need cooling, causing its valves to close.

"The middle of the data centre will get hotter and hotter as the CRAC unit can only sense that the edges are cold," said Coors.

Cool off with Uptime Institute's tier 3 and 4 infrastructure

More UK businesses are investing in their power and data centre cooling infrastructure, according to the Uptime Institute.

Kudtritzki said the Uptime Institute has seen an increase in infrastructure consulting projects over the last three to four years. This is expected to continue as more businesses push high-density gear into smaller spaces. As companies squeeze more equipment onto fewer racks, cooling that equipment becomes both practically and economically challenging.

Data centre consolidation amongst UK businesses has also contributed to rising energy bills. Geographically dispersed data centres are being consolidated into one or two central locations, and while this can improve data centre management, it can also increase power usage and create cooling problems.

Kudtritzki said the Uptime Institute has recently participated in site assessments for Cable and Wireless in Birmingham and Fujitsu in London, both of which have a tier 3 infrastructure. In addition, the Uptime Institute worked with Infinity on a tier 4 certification of the company's infrastructure.

According to Kudtritzki, tier 3 and tier 4 infrastructures are growing in popularity because they can support higher densities while running an estimated 4kW of power.

"Tier 3 is great for concurrent maintenance," he said. "Components can be taken out without the infrastructure going down. Maintenance and replacements can be done without affecting the IT load."

Kudtritzki said tier 4 is also helpful when undergoing maintenance, as it has full fault tolerance and can withstand unplanned events without disruption.

"If there is a UPS disaster, your infrastructure won't be affected with tier 4," said Kudtritzki.

 

Kayleigh Bateman is the Site Editor for SearchVirtualDataCentre.co.uk.

