High-density facilities boost power consumption -- often to over 5 kW per rack -- and extracting the resulting heat can double the total power bill, potentially maxing out the local utility's electricity supply.
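That "doubling" claim can be read as a power usage effectiveness (PUE) of roughly 2.0. The sketch below illustrates the arithmetic; the 5 kW rack figure is from the article, while the PUE and rack count are illustrative assumptions.

```python
# Back-of-envelope: if cooling roughly doubles the power bill,
# total facility power / IT power (PUE) is about 2.0.
it_load_per_rack_kw = 5.0   # high-density rack load (from the article)
pue = 2.0                   # "heat extraction can double the bill" (illustrative)

utility_draw_per_rack_kw = it_load_per_rack_kw * pue
print(utility_draw_per_rack_kw)   # kW drawn from the grid per rack

# 100 such racks would need 1 MW of utility capacity:
racks = 100
print(racks * utility_draw_per_rack_kw / 1000)   # MW required
```

This is why a high-density hall can exhaust a local utility's headroom long before it runs out of floor space.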
Planning a new data centre facility
Hosting company Star in Bristol, U.K., reckons that it's starting to change that with its new £3 million data centre, Star Data Centre 3, designed to aid Star's focus on the hosted applications market. It opened in April 2009.
According to James Griffin, the head of hosting strategy at Star, the facility accommodates up to 7 kW per rack, runs hotter than normal, uses ambient air cooling, and uses a flywheel as part of its backup power system.
Griffin recounts how and why the facility came to be built. Building the 10,000-square-foot data centre next to a military base solved the power problem: the local utility also powers the military and was able to deliver the required 3 megawatts. "Power was key," said Griffin. "We looked at lots of places from 45,000 square feet down, but the economic equation is about density. The higher the density, the quicker the payback."
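Griffin's density-to-payback equation can be sketched as a simple amortisation: a fixed build cost spread over billable IT capacity, so more kW per rack in the same shell means a shorter payback. The £3 million build cost is from the article; the rack count and revenue-per-kW figures are assumptions for illustration only.

```python
# Illustrative only: why higher rack density shortens payback.
# The fixed build cost is amortised over billable IT capacity.
build_cost_gbp = 3_000_000     # Star Data Centre 3 build cost (from the article)
racks = 300                    # assumed rack count
revenue_per_kw_year = 1_000    # assumed hosting revenue per kW per year

for density_kw in (2, 5, 7):   # low, typical, and Star's 7 kW per rack
    annual_revenue = racks * density_kw * revenue_per_kw_year
    payback_years = build_cost_gbp / annual_revenue
    print(f"{density_kw} kW/rack -> payback ~{payback_years:.1f} years")
```

Under these assumed numbers, moving from 2 kW to 7 kW per rack cuts the payback from five years to under a year and a half, which is the commercial logic Griffin describes.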
Philosophy of the design
That's a key part of the philosophy behind the project. "This was a commercially -- not a technically -- driven project," said Griffin. "We didn't set out to build a data centre to show how technically advanced we were."
And while green issues are important, containing energy consumption was a key element of Star's design strategy. The building was previously a data centre mothballed by its prior owner, and the company used computational fluid dynamics (CFD) to model the facility before making any physical changes. As a result, the floor was raised to accommodate cooling pipes and cabling.
"We re-floored it to get more air pressure, as we had to fit chiller pipes under the floor," said Griffin. "Our CFD modelling showed that with larger pipes and more cables, air wouldn't get through to the top of the racks. So increasing floor height made a big difference -- small details like that make unbelievable differences. We matched our CFD models with the reality and it works."
Modelling also revealed that the facility can be run at 24 degrees Celsius. "Most other data centres are too cold; they blast air around without thought for configuration," said Griffin. He added that servers can run at even higher temperatures but that colocation customers might be reluctant to run their servers any hotter.
Greener power and cooling systems
The backup power system includes a 60-tonne diesel generator allied to a permanently spinning flywheel system for bridging the gap between a power outage and the generator spinning up to full power.
The choice here was between batteries and flywheels. Flywheels are greener because they contain no noxious substances, but the deciding factor in rejecting standby batteries was that they would have cost far more in power and maintenance.
"Batteries would need a chiller full time to keep them cool, which means 250 kW off the top of our power usage, compromising our density proposition," said Griffin. "Also, batteries aren't particularly green -- although green is important, most customers won't pay extra for it."
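The cost of that 250 kW full-time chiller load can be quantified against the article's own figures: a 3 MW utility feed and racks drawing up to 7 kW each.

```python
# Trade-off arithmetic using figures quoted in the article:
# a 3 MW utility feed, 7 kW racks, and the 250 kW full-time
# chiller load a battery UPS would have required.
site_capacity_kw = 3_000
rack_density_kw = 7
battery_chiller_kw = 250

racks_without_chiller = site_capacity_kw // rack_density_kw
racks_with_chiller = (site_capacity_kw - battery_chiller_kw) // rack_density_kw
print(racks_without_chiller - racks_with_chiller)  # -> 36 racks of capacity forfeited
```

In other words, choosing batteries would have surrendered roughly 36 full-density racks' worth of sellable capacity, which is what Griffin means by "compromising our density proposition".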
The cooling system also includes an ambient air cooler. Griffin described it as "a big radiator" that's deployed when external temperatures drop below 10 degrees Celsius.
While customers won't pay for greenwash, being greener can translate into business opportunities, said Griffin. He noted that the 100-tonne annual carbon saving from the flywheel decision has opened doors with organisations such as the Forestry Commission. "This makes it an easier conversation to have," said Griffin. "We want to be aligned to our customers' business strategies, not their IT strategies."
Unique data centre features
Other unique features include Star's custom-designed 42U racks. The facility has to be flexible enough to host applications, act as a colocation facility and serve the finance industry. But racks that could accommodate all those requirements (banks' servers need particular physical and other security measures) didn't exist, so the company designed its own and commissioned a rack builder to produce them.
The system also includes movable partition walls that allow Star flexibility with its security zones. Other physical security measures include a perimeter fence, 360-degree IR CCTV cameras and manned security to prevent unauthorised access, while access to the building is by smart key-card only.
Star plans to spend another £10 million over the next three to four years developing the facility. Its philosophy will remain business-driven, rather than technology-driven, said Griffin.
Manek Dubash is a contributor to SearchVirtualDataCentre.co.uk.