Data centre management adapts to green computing needs

Rising power and energy costs have caused a gradual shift in data centre manager priorities, putting green computing at the forefront. Industry experts reveal how they're dealing with data centre management issues.

There has been a dramatic shift in priorities for the IT manager and data centre management, according to industry players, as more and more companies find themselves in a race to meet new green computing requirements.

This change comes from growing business computing needs, a new awareness of global warming and a rise in power and energy costs.

Dan Lowe, managing director of data centre and network services operator UKSolutions, said the needs of the data centre manager have changed significantly over the last few years.

"IT managers have different needs now," he said. "Our IT managers put in projects, not business systems. They want to try out something new, but they want it on-site. It costs too much on-site and brings too many extra green costs with it."

"Now many of our customers have gone global, and they need 24/7 access," Lowe added. "That means a lot of costs in staff and training around managing systems 24/7 and cooling them efficiently."

UKSolutions has a team of 25 data centre managers and boasts customers such as Aston Martin and the Co-Op.

According to Lowe, new regulations such as the CRC Energy Efficiency Scheme (formerly known as the Carbon Reduction Commitment), which is set to be introduced next month, reflect how the industry is taking responsibility for carbon reduction and using power more efficiently.

The scheme is designed to raise CO2 awareness amongst enterprises and encourage them to make changes in behaviour and infrastructure, such as implementing improved heat and cooling strategies.

UKSolutions data centre cooling techniques
UKSolutions has two data centres: UKSolutions North and UKSolutions South.

Lowe said: "We use fresh air from outside, which cools the water system. This goes down to the data floor and cools the data centre. In the summer, we don't use this system -- we cool the water ourselves."

In total, the company has five data floors spanning two buildings on its Studley-based campus. The North site is designed to deliver N+N environments and power services, balanced with techniques to manage energy usage, whereas the South site offers hosting.

UKSolutions North is the newer of the two sites and was designed with data centre cooling techniques in mind. With the premise that "heat is an enemy to reliable data storage," each suite in the North site is kept cool by "dry coolers" that use ambi-cool technology. These monitor the external temperature and, when conditions allow, use outside air to deliver cooling, reverting to standard refrigeration technology on warmer days.
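The switch-over between free cooling and refrigeration described above amounts to a simple control decision. The sketch below is illustrative only -- the 18°C threshold and the function name are assumptions for the example, not UKSolutions' actual control logic:

```python
def cooling_mode(outside_temp_c: float, free_cooling_max_c: float = 18.0) -> str:
    """Pick a cooling mode from the ambient temperature.

    Dry coolers of this kind test the external temperature: on cool
    days they deliver cooling with outside air (free cooling), and on
    warmer days they fall back to standard refrigeration. The 18 C
    threshold here is a hypothetical figure for illustration.
    """
    if outside_temp_c <= free_cooling_max_c:
        return "free cooling"
    return "refrigeration"
```

On a mild UK day of, say, 12°C the dry coolers would run in free-cooling mode; a summer afternoon above the threshold would trip the refrigeration plant instead.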

The suites are arranged in enclosed pods so that the maximum amount of cold air can be forced into the hosting racks, while hot air is expelled into the cooling equipment.

The site supports 2.5kW/m2 per system and a 3.8MVA 11kV power supply delivered via a dual cable arrangement from the Studley and Redditch substations. Suites are designed to average 4kW per footprint.

Each suite is cooled by twelve 42kW air handling units and two 300kW dry coolers.
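Taken at face value, those figures give a rough per-suite cooling budget. This is a back-of-envelope check on the numbers quoted, not a vendor specification:

```python
# Nominal cooling capacity per suite, from the figures quoted above
air_handling_kw = 12 * 42    # twelve 42 kW air handling units
dry_cooler_kw = 2 * 300      # two 300 kW dry coolers

print(f"Air handling: {air_handling_kw} kW")  # 504 kW
print(f"Dry coolers:  {dry_cooler_kw} kW")    # 600 kW
```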

UKSolutions South was built in 1998 to support 2kW per rack, delivering 250 footprints over two data floors. The site is connected directly to the national grid via two 11kV connections, fed from different substations -- Studley and Redditch. Each system is also backed up by batteries and generators. The site uses an N+1 Denco close control air conditioning system for cooling.

The challenge of calculating PUE
Lex Coors, Vice President of Engineering and Technology Group at Interxion, stressed another changing green computing challenge: the importance of how power usage effectiveness (PUE) is calculated.

PUE is determined by dividing the total power entering a data centre by the power used to run the IT infrastructure within it. The result is a ratio: overall efficiency improves as the figure decreases towards 1.

Interxion has 55,000m2 of colocation space, 7,500m2 of which reside in East London. With 26 data centres spanning Europe, the company offers basic colocation services in addition to fully outsourced solutions for disaster recovery.

According to Coors, dividing total data centre energy usage by IT equipment energy usage gives you the PUE. To get started, measure your energy use at or near your facility's utility meter. If your data centre does not have a separate utility meter, estimate the power consumed by the non-data centre portion of the building and subtract it from the total.

Next, the IT equipment load should be measured after power conversion, switching and conditioning are completed.

The Green Grid believes the most useful measuring point is at the output of the room's power distribution units (PDUs), as this would represent the total power delivered to the server racks.
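Put together, the measurement steps above reduce to a single division. A minimal sketch, where the meter readings are hypothetical example values rather than figures from any real facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total data centre power / power delivered to IT equipment."""
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,500 kW at the utility meter,
# 1,000 kW measured at the PDU outputs feeding the racks.
print(pue(1500, 1000))  # 1.5
```

A reading of 1.5 would mean half a watt of overhead (cooling, conversion losses, lighting) for every watt reaching the IT equipment.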

Coors said the dream data centre would have a PUE of 1, meaning every watt of power is delivered directly to the IT equipment. He added that this is not physically possible, however, as other services such as cooling still have to be provided.

"High PUEs of 2 or more are becoming more and more unacceptable these days -- some go up as far as 7 or 8," he said. "This means that more energy is spent on powering infrastructure for IT than on powering the equipment itself."

He added that these types of data centres are typically in-house and use legacy based equipment.

The community cloud -- a new approach to reducing CO2?
Kevin Collins, solutions development manager at Magirus, said one way to reduce CO2 and improve data centre efficiency is the community cloud.

The National Institute of Standards and Technology (NIST) describes the community cloud as a cloud infrastructure that is shared by several organisations and supports a specific community with shared concerns -- for example, mission, policy and compliance considerations.

The cloud may be handled by internal data centre management or a third party, and may exist on-premise or off-premise.

However, Collins highlighted that several green questions are yet to be answered around cloud computing, and the community cloud in particular.

For example, if four local, non-competing government bodies decide to share a community cloud, who is paying for the green tax? Does this responsibility fall on the hosting provider if the service is outsourced or on one of the companies that has decided to host the cloud on-premise?

"This is where service-level agreements become very important and stops any finger pointing that may happen when green issues arise," he said.

"Trust is important in a community cloud, because if one of the businesses is hosting the servers for the other companies on premise, then that business may find themselves having to pick up the green tax on their own."

Kayleigh Bateman is the Site Editor for
