Facebook's Arctic datacentre in Luleå, Sweden goes live


Archana Venkatraman

Facebook’s newest datacentre in Luleå, Sweden has been turned on to handle live traffic and is storing users’ photos, videos, comments, and Likes. The Arctic datacentre is likely to be one of the most efficient and sustainable datacentres in the world, Facebook claimed.

The facility, which is located on the edge of the Arctic Circle, where the River Lule meets the Gulf of Bothnia, is Facebook’s first datacentre outside the US.


All the equipment inside the new Swedish datacentre is powered by locally generated hydro-electric energy.

“Not only is it 100% renewable, but the supply is also so reliable that we have been able to reduce the number of backup generators required at the site by more than 70%,” Facebook announced on its website.

In addition to harnessing hydro-electric energy, the social network is using the icy Arctic air to cool the datacentre hardware, including the thousands of servers that store users’ data.

The excess heat produced by the datacentre room is used to keep the office warm, it said.

“Our commitment to energy efficiency is also evident inside Luleå's giant data halls,” it added.

The social media site had more than one billion monthly active users, and an average of 618 million daily active users, as of December 2012.

One of the biggest challenges for Facebook is the amount of data it has to store, secure and deliver on demand to its users, said Jay Parikh, Facebook’s vice-president of infrastructure, when the company launched its first engineering centre in London in October 2012.

Vast amounts of content

At the time, Parikh added that 220 billion photos were stored in Facebook’s datacentres, with 300 million new pictures added every day. In addition, 2.5 billion content objects are shared on Facebook every day.

Almost all of the technology equipment in the Luleå datacentre facility, ranging from the servers to the power distribution systems, is based on Open Compute Project designs.

The Open Compute Project, initiated by Facebook, aims to build energy-efficient computing infrastructure at the lowest possible cost.

The initiative encourages the development of “vanity-free hardware designs that are highly efficient and leave out unnecessary bits of metal and plastic. These designs are then shared with the broader community, so anyone can use or improve them,” Facebook said.

The project defines designs and standards for dual-processor server motherboards, server power supplies, a simple screw-less server chassis, 42U server racks, 48-volt DC battery cabinets, an integrated DC/AC power distribution scheme, a 100% air-side economiser and an evaporative cooling system to support the servers.

High level of efficiency

The average power usage effectiveness (PUE) of Facebook’s newest datacentre is around 1.07, the company’s initial tests have shown. The PUE metric measures how energy-efficient an organisation’s datacentre is.

PUE is the ratio of the total energy used across the whole datacentre to the energy used to power the IT equipment alone; the closer the figure is to 1.0, the less energy is lost to overheads such as cooling and power distribution.

According to the Uptime Institute, the typical datacentre has an average PUE of 2.5. This means that for every 2.5 watts in at the utility meter, only one watt is delivered out to the IT load. It estimates most facilities could achieve 1.6 PUE using the most efficient equipment and best practices.
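The arithmetic above can be sketched in a few lines of Python. This is an illustrative helper, not anything Facebook or the Uptime Institute publishes; the function name and figures used below simply restate the numbers from the article.

```python
def pue(total_facility_watts: float, it_load_watts: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    if it_load_watts <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_watts / it_load_watts

# Uptime Institute's typical datacentre: 2.5 W at the meter per 1 W of IT load
print(pue(2.5, 1.0))            # 2.5

# Facebook's reported Lulea figure: roughly 1.07, i.e. about 7% overhead
overhead = pue(1.07, 1.0) - 1.0
print(round(overhead * 100))    # 7
```

At a PUE of 2.5, more energy goes to overheads than to the computing itself; at 1.07, almost all of the power drawn from the utility reaches the servers.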

Facebook has been using a real-time PUE monitoring tool in its US datacentres and will introduce it at the Luleå facility to share transparently, on a minute-by-minute basis, how the datacentre is performing in energy terms.

