Hosting company employs CFD models to improve power and cooling

Manek Dubash, Contributor
The pressure is on data centre managers to deliver more computing power in less space. This means higher densities, which in practice usually involves packing racks with blade servers, each running multiple virtual servers.

High-density facilities drive up power consumption -- often to more than 5 kW per rack -- and extracting the resulting heat can double the total power bill, a combined demand that can exceed what the local utility is able to supply.
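
Doubling the bill implies that cooling draws roughly as much power as the IT kit itself, a power usage effectiveness (PUE) of about 2. A back-of-envelope illustration in Python, where the 5 kW figure is from the article but the rack count is an assumption:

it_load_per_rack_kw = 5.0            # kW per rack, as cited above
racks = 200                          # hypothetical rack count for illustration
it_load_kw = it_load_per_rack_kw * racks
pue = 2.0                            # "can double the total power bill"
total_draw_kw = it_load_kw * pue     # what the utility must actually deliver
print(f"IT load {it_load_kw:.0f} kW, total draw {total_draw_kw:.0f} kW at PUE {pue}")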

Planning a new data centre facility
Hosting company Star in Bristol, U.K., reckons it is tackling these constraints with its new £3 million facility, Star Data Centre 3, designed to support the company's focus on the hosted applications market. It opened in April 2009.

According to James Griffin, the head of hosting strategy at Star, the facility accommodates up to 7 kW per rack, runs hotter than normal, uses ambient air cooling, and incorporates a flywheel in its backup power system.

Griffin recounts how and why the facility came to be built. Building the 10,000-square-foot data centre next to a military base solved the power problem: the local utility also supplies the base and was able to deliver the required 3 megawatts. "Power was key," said Griffin. "We looked at lots of places from 45,000 square feet down, but the economic equation is about density. The higher the density, the quicker the payback."

Philosophy of the design
That's a key part of the philosophy behind the project. "This was a commercially -- not a technically -- driven project," said Griffin. "We didn't set out to build a data centre to show how technically advanced we were."

While green issues are important, containing energy consumption was a key element of Star's design strategy. The building had previously been a data centre, mothballed by its prior owner, and the company used computational fluid dynamics (CFD) to model the facility before making any physical changes. One result was that the floor was raised to accommodate cooling pipes and cabling.

"We re-floored it to get more air pressure, as we had to fit chiller pipes under the floor," said Griffin. "Our CFD modelling showed that with larger pipes and more cables, air wouldn't get through to the top of the racks. So increasing floor height made a big difference -- small details like that make unbelievable differences. We matched our CFD models with the reality and it works."

Modelling also revealed that the facility can be run at 24 degrees Celsius. "Most other data centres are too cold; they blast air around without thought for configuration," said Griffin. He added that servers can run at even higher temperatures but that colocation customers might be reluctant to run their servers any hotter.

Greener power and cooling systems
The backup power system includes a 60-tonne diesel generator allied to a permanently spinning flywheel system for bridging the gap between a power outage and the generator spinning up to full power.
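
The flywheel only needs to carry the load for the seconds it takes the diesel to reach full power. A rough sketch of the arithmetic, where rotor inertia, speed and load are all assumptions rather than Star's specification:

import math

inertia = 100.0                        # kg*m^2, rotor moment of inertia (assumed)
speed_rpm = 6000.0                     # full rotational speed (assumed)
omega = speed_rpm * 2 * math.pi / 60   # rad/s
energy_j = 0.5 * inertia * omega ** 2  # stored energy E = 1/2 * I * w^2, ~20 MJ here
usable_j = energy_j * 0.75             # slowing to half speed releases 3/4 of E
load_w = 900e3                         # W of critical load to bridge (assumed)
print(f"ride-through of roughly {usable_j / load_w:.0f} s before the diesel must take over")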

The choice here was between batteries and flywheels. Flywheels are the greener option because they contain no noxious substances, but the deciding factor against standby batteries was that they cost far more in power and maintenance.

"Batteries would need a chiller full time to keep them cool, which means 250 kW off the top of our power usage, compromising our density proposition," said Griffin. "Also, batteries aren't particularly green -- although green is important, most customers won't pay extra for it."

The cooling system also includes an ambient air cooler. Griffin described it as "a big radiator" that's deployed when external temperatures drop below 10 degrees Celsius.
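
The changeover logic is conceptually simple. In the minimal sketch below, the 10-degree threshold comes from Griffin; the hysteresis band is an assumption added to stop the system flapping between modes:

FREE_COOLING_ON_C = 10.0   # engage the ambient cooler below this (quoted)
HYSTERESIS_C = 1.5         # assumed dead band around the switchover point

def select_cooling(outside_temp_c, currently_free):
    """Return True to run on ambient (free) cooling, False for the chillers."""
    if currently_free:
        # Stay on free cooling until the air warms past the dead band.
        return outside_temp_c < FREE_COOLING_ON_C + HYSTERESIS_C
    return outside_temp_c < FREE_COOLING_ON_C

print(select_cooling(8.0, currently_free=False))    # True: cold enough to switch over
print(select_cooling(10.5, currently_free=True))    # True: still within the dead band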

While customers won't pay for greenwash, being greener can translate into business opportunities, said Griffin. He noted that the 100-tonne annual carbon saving from the flywheel decision has opened doors with organisations such as the Forestry Commission. "This makes it an easier conversation to have," said Griffin. "We want to be aligned to our customers' business strategies, not their IT strategies."

Unique data centre features
Other unique features include Star's custom-designed 42U racks. The facility has to be flexible enough to host applications, act as a colocation facility and serve the finance industry, but racks that could accommodate all these requirements (banks' servers need particular physical and other security measures) didn't exist. So the company designed its own racks and commissioned a rack builder to produce them.

The system also includes movable partition walls that allow Star flexibility with its security zones. Other physical security measures include a perimeter fence, 360-degree IR CCTV cameras and manned security to prevent unauthorised access, while access to the building is by smart key-card only.

Futures
Star plans to spend another £10 million over the next three to four years developing the facility. Its philosophy will remain business-driven, rather than technology-driven, said Griffin.

Manek Dubash is a contributor to SearchVirtualDataCentre.co.uk.

