Milan facility pumps local wells for data centre cooling

Tracey Caldwell, Contributor

The power needed to cool data centres in warm climates has become increasingly difficult to justify. Some companies have moved their operations north to cooler climes, but Milan-based data centre operator I.Net Business Factory looked underground instead, where a naturally replenished aquifer 40 metres below ground offered a sustainable way to cool a data centre.

Andrea Marini, who heads management at I.Net Business Factory, has had his work cut out getting permission for the data centre. Cutting a deal to share the water with local farmers helped push the application through, but the process has been long and complicated and is still ongoing.

"It is important to verify the impact of pumping about 200 litres per second from 40 metres depth, to the wells in the area around [the data centre] to avoid problems with their water availability," Marini said. Fortunately the area around the buildings is rich with water and there are also farmers' fields nearby that need to be irrigated. "Farmers are happy to have very clean water at 18 degrees Celsius using irrigation canals in which we put water coming out of our data centre," Marini said.

Going underground for data centre cooling solutions
Work on the data centre facilities began in November 2005. I.Net drilled four wells and installed pumps to bring the underground water to a reservoir beneath the data centre floor.

The first customer installation came in mid-2006. "In the past three years, we have been setting up about 3,000 square metres in dedicated rooms for about 100 customers. We are at 50% of use," said Marini.

In the face of Milan's hot summers, the data centre has been kept within standard tolerances of 21 degrees Celsius, plus or minus 2 degrees, with humidity held below 50%. "In our data centre, for high-power density (up to 10 kW per rack), it is not important to control the temperature of the room, but it is important to supply cold air in the right quantity in the cold aisle and to extract heat from the hot aisle using direct air," Marini said.
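
Marini's point about air quantity can be made concrete with a rough airflow calculation for one of those 10 kW racks. The supply-to-return temperature rise used below is an assumed figure, not one from the article:

```python
# Rough sizing of cold-aisle airflow for a 10 kW rack, illustrating
# why delivering the right air *quantity* matters more than room
# temperature. The supply/return delta-T is an assumption.
RACK_POWER_W = 10_000     # per-rack load quoted in the article
AIR_DENSITY = 1.2         # kg/m^3 at roughly room conditions
CP_AIR = 1005             # specific heat of air, J/(kg*K)
DELTA_T_K = 12.0          # assumed cold-aisle-to-hot-aisle rise

airflow_m3_s = RACK_POWER_W / (AIR_DENSITY * CP_AIR * DELTA_T_K)
print(f"Required airflow: {airflow_m3_s:.2f} m^3/s "
      f"(~{airflow_m3_s * 2118.9:.0f} CFM)")
# -> ~0.69 m^3/s (~1,460 CFM); halve the delta-T and the fans must
#    move twice the air, which is why hot/cold aisle separation pays off
```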

Under normal operating conditions, power is now used only for the pumps that raise and circulate the water, and for the fans. This has cut power demand by nearly one megawatt compared with a conventional air conditioning system, saving €800,000 a year. The lower power consumption has also reduced CO2 emissions by 4,200 tonnes a year. In the cooler winter months, fresh-air cooling supplements the water-based cooling process.
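
Those headline figures hang together arithmetically. A minimal sanity check, assuming an illustrative electricity price and grid carbon intensity (neither appears in the article):

```python
# Sanity check on the quoted savings, under assumed Italian commercial
# electricity prices and grid carbon intensity (both are assumptions,
# not figures from the article).
POWER_SAVED_MW = 1.0        # "nearly one megawatt", continuous
HOURS_PER_YEAR = 8760
PRICE_EUR_PER_KWH = 0.09    # assumed electricity price
CO2_KG_PER_KWH = 0.48       # assumed grid carbon intensity

kwh_saved = POWER_SAVED_MW * 1000 * HOURS_PER_YEAR
print(f"Energy saved : {kwh_saved:,.0f} kWh/year")
print(f"Cost saved   : EUR {kwh_saved * PRICE_EUR_PER_KWH:,.0f}/year")
print(f"CO2 avoided  : {kwh_saved * CO2_KG_PER_KWH / 1000:,.0f} t/year")
# -> ~8.76 GWh, ~EUR 788,000 and ~4,200 tonnes a year: consistent
#    with the article's EUR 800,000 and 4,200-tonne figures
```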

I.Net is part of BT Italia. Harkeeret Singh, head of data centre strategy at BT Global Services, says location has become critical to the total cost of ownership equation for data centres. "Data centre energy efficiency has become more prominent over the past three years. As energy prices have increased, it has become more of a factor in total cost of ownership of a data centre. Historically we have had sites located next to rivers and water where it can be used," said Singh.

Proximity to water that can be used to offset data centre cooling costs has come under the scrutiny of regulatory authorities: "BT and I have personally contributed to the development of the EU Code of Conduct. The part of the Code of Conduct talking about local water, lakes and rivers was there directly from our input," Singh claimed.

He noted that much of the energy expended on data centre cooling may be unnecessary, and BT is looking at pushing up standard tolerances: "We are looking at increasing temperatures where we can," he said. But I.Net has clients with contracts more than three or four years old, written when standard terms specified that the facility would be run at 21 degrees, plus or minus two degrees. "We are constrained by the contract," he said.

"Going forward, working within standards, we are looking at the ASHRAE (American Society of Heating, Refrigerating and Air conditioning Engineers) guidelines for temperature and humidity in data centres," Singh said.

Singh believes the standard boundary of 22-25 degrees could be extended even beyond the recent ASHRAE recommendations of 18-27 degrees. "We feel that that has not gone far enough and are trying to push further, and we are also pushing suppliers. There are suppliers out there for IT equipment that will support their equipment in higher temperatures than 27 degrees."

He added, "Cooling is the big overhead in the PUE [power usage effectiveness] figures; we know we need to work with the industry to enable that."
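
PUE is total facility power divided by the power delivered to IT equipment, so a cooling plant reduced to pumps and fans feeds straight into the ratio. The figures in this sketch are illustrative assumptions, not I.Net's measurements:

```python
# PUE = total facility power / IT equipment power. The loads below
# are assumed for illustration only.
def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# A conventional chiller plant vs. a well-water loop (pumps and fans only)
print(f"Chiller-cooled : PUE {pue(1000, 700, 150):.2f}")  # -> 1.85
print(f"Well-cooled    : PUE {pue(1000, 120, 150):.2f}")  # -> 1.27
```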

Tracey Caldwell is a contributor to SearchVirtualDataCentre.co.uk.

