
Icelandic datacentres may lead the way to green IT

Iceland may soon become even more attractive to companies wanting to minimise their carbon footprint while using high-performance computing services

This article can also be found in the Premium Editorial Download: CW EMEA: CW Nordics: Icelandic datacentres point way to greener IT

Icelandic datacentres operate on sustainable energy – a mix of geothermal and hydroelectric power generation. Furthermore, cooling is free thanks to the naturally cool climate, and there are three submarine cable systems linking Iceland to other regions – with the next one on the way.

The fourth submarine cable system is expected to be ready by the end of 2022. The new system, called Iris, will provide a direct connection from the southwest of Iceland to the west coast of Ireland. Installation of the cable began on 23 May. 

The need for Iris is just one more indicator of a trend towards using Nordic datacentres to take advantage of sustainable energy. Nordic datacentres are particularly well-suited to high-performance computing applications, such as machine learning, scientific computing, protein folding and modelling of financial markets.

Iris will give Iceland an added advantage with respect to its neighbours. The route will run directly to the west coast of Ireland, which has direct links to Nova Scotia and the east coast of the US, including New York. This will give Iceland the lowest latency of any Nordic country to the east coast of the US, where much of the demand for high-performance computing originates.
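The latency argument comes down to fibre-path physics: light in optical fibre travels at roughly the speed of light divided by the fibre's refractive index, so a shorter route means a lower floor on delay. The sketch below is a back-of-envelope estimate only; the route distances are illustrative assumptions, not published figures for Iris.

```python
# Back-of-envelope one-way latency over optical fibre.
# Light in fibre travels at about c / n, with refractive index n ~ 1.47.
# Route distances below are illustrative assumptions, not published figures.

C_VACUUM_KM_PER_MS = 299.792  # speed of light in vacuum, km per millisecond
FIBRE_INDEX = 1.47

def one_way_latency_ms(route_km: float) -> float:
    """Propagation delay only; ignores equipment and routing overhead."""
    return route_km * FIBRE_INDEX / C_VACUUM_KM_PER_MS

# Hypothetical route: Iceland -> Ireland (~1,700 km) -> New York (~5,500 km)
total_km = 1_700 + 5_500
print(f"{one_way_latency_ms(total_km):.1f} ms one way")
```

Real round-trip times would be higher once switching, routing and landing-station equipment are included, but the propagation floor is what a shorter cable route improves.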

Verne Global operates the largest enterprise datacentre in Iceland, providing high-performance computing services for a variety of applications. The company was founded in 2007 and was able to acquire a campus that had served as a NATO Allied Command base until just a year earlier.

With very little effort, the company was able to create a secure site, taking advantage of the physical security that was already in place. NATO had not picked the location at random: it chose a site on a very strong foundation, with excellent access to hydroelectric and geothermal energy.

A company building a regionally based cloud might not find Iceland to be the most appropriate location. Because of its geographic separation and sparse population, it is not a natural hub.


But when it comes to application-specific services, the equation looks quite different. “Iceland should be your first destination if you’re building an application-specific cloud,” said Tate Cantrell, chief technology officer of Verne Global.

“If you think about Kubernetes, which is a container management system, you can start deploying applications based on the metadata that developers provide and let Kubernetes act as a traffic cop,” he told Computer Weekly. “You send a container with an application, and it says, for example, criticality is level 1, sustainability is level 2, latency is level 3. Then it decides that the container is a perfect fit for our highly sustainable, mid-level latency computing platform.”
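Cantrell's "traffic cop" idea can be sketched as a simple matching problem: each workload carries metadata levels, and a scheduler picks the site whose profile fits best. The toy Python below is illustrative only; real Kubernetes placement uses labels, taints and affinity rules, and the site names and levels here are hypothetical.

```python
# Toy sketch of the "traffic cop" placement idea: each container carries
# metadata levels, and a scheduler matches it to the best-fitting site.
# Illustrative only; not how Kubernetes is actually configured.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    sustainability: int  # 1 = most sustainable
    latency: int         # 1 = lowest latency to end users

SITES = [
    Site("london-metro", sustainability=3, latency=1),
    Site("iceland-hpc", sustainability=1, latency=3),
]

def place(container_meta: dict) -> str:
    """Pick the site minimising the gap between requested and offered levels."""
    def gap(site: Site) -> int:
        return (abs(site.sustainability - container_meta["sustainability"])
                + abs(site.latency - container_meta["latency"]))
    return min(SITES, key=gap).name

# A batch training job: prioritises sustainability, tolerates latency.
print(place({"sustainability": 1, "latency": 3}))  # -> iceland-hpc
```

In real Kubernetes deployments the equivalent effect is achieved with node labels and affinity rules rather than custom scheduling code.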

One of the main application areas where Icelandic datacentres make a lot of sense is artificial intelligence (AI). With the advance of AI methodologies such as unsupervised machine learning, training and inference now need to be colocated for many applications, to facilitate iteration between the two processes.

Training a foundational AI model can run for weeks or months, so running a full training data set is very energy intensive. Businesses that depend on AI models train continuously to produce different versions of their models. For example, they might retrain a model against a specific customer's data set.

“There will be an increasing need for these energy intensive applications, but they are going to cause sustainability and resource problems in the future,” said Cantrell. “Supercomputers are used to generate models that will provide insight to scientists, artists and businesspeople, giving these people starting points for their thinking. Because the stakes are so high, it places a tremendous responsibility on anybody involved in training AI models.”

A second type of application where Icelandic datacentres make sense is financial services. Although trading applications require very low latency and are usually placed close to exchanges in edge or metro locations, they depend on the output of larger, more compute-intensive applications. These applications use thousands of computers 24 hours a day to run Monte Carlo simulations and Markov chain analysis to make predictions about market trends. This kind of processing requires high-performance computing, and because latency is not an issue, it can be run in Nordic datacentres.
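To make the workload concrete, a Monte Carlo risk calculation repeats a simple random simulation millions of times, which is why it parallelises well across thousands of machines and tolerates high latency. The minimal sketch below simulates terminal prices under geometric Brownian motion; all parameters are illustrative, not drawn from the article.

```python
# Minimal Monte Carlo sketch of a batch risk computation: simulate many
# terminal prices under geometric Brownian motion and estimate the
# probability the asset ends below a barrier. Parameters are illustrative.

import math
import random

def simulate_terminal_price(s0, mu, sigma, t, rng):
    """One GBM terminal price: S_T = S_0 * exp((mu - sigma^2/2)t + sigma*sqrt(t)*Z)."""
    z = rng.gauss(0.0, 1.0)
    return s0 * math.exp((mu - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)

def prob_below(s0, barrier, mu, sigma, t, n_paths=100_000, seed=42):
    """Estimate P(S_T < barrier) by averaging over many simulated paths."""
    rng = random.Random(seed)
    hits = sum(simulate_terminal_price(s0, mu, sigma, t, rng) < barrier
               for _ in range(n_paths))
    return hits / n_paths

print(prob_below(s0=100.0, barrier=80.0, mu=0.05, sigma=0.2, t=1.0))
```

Because each path is independent, the paths can be split across as many servers as are available, which is exactly the property that lets this work move to a remote, sustainable site.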

“Some of Verne Global’s biggest customers are in financial services, many of them now consuming multiple megawatts of power for their infrastructure,” said Cantrell. “The demand for power is rising fast. The datacentre industry is already a big consumer of energy, and the pressure is on for the industry to grow in a sustainable way, without causing substantial growth in overall emissions.”

Greenhouse gas reporting 

One of the stories Verne Global is trying to communicate this year is that as the datacentre industry grows, it can set an example for other industries. One way it can do that is through better reporting on emissions.

While the Greenhouse Gas (GHG) Protocol encourages companies to report, there are no specific requirements to do so. There are standards for how to report, but different industries choose different approaches.

“Verne Global believes that one way the datacentre industry can lead is through truly open reporting,” said Cantrell. “And we believe Verne Global can be the example for the datacentre industry. It benefits us, because we’ll be able to show that we have one of the lowest total footprints on the planet. Of course, we’ll offset any emissions we have. But that’s not the point. Even before you do offsets, it’s important to consider the impact that you’re making.”

Scope 1, scope 2 and scope 3

The GHG Protocol stems from a 1998 initiative by the World Resources Institute in collaboration with the World Business Council for Sustainable Development. Companies report emissions in three categories: scope 1, scope 2 and scope 3.

Scope 1 reporting is on direct emissions from owned or controlled sources. Typically for datacentres this is very small, because they get their power from the grid. In some exceptional cases, facilities do have their own power plant, and they would be concerned with scope 1 emissions. Another case is when datacentres use backup generators that they test from time to time. They would report the emissions from the tests.

Scope 2 reporting covers indirect emissions from the generation of the electricity the datacentre purchases. This will be a very large number for any facility that draws power from a carbon-intensive grid.

Scope 3 reporting is on all other indirect emissions that occur across a company’s supply chain. This is where the challenge comes in for datacentres, and most other organisations. Here they have to report on everything their suppliers do. If a supplier drives to the site to provide a service, or if a building has embodied carbon, this has to be reported. If a company generates a product, they have to record the lifecycle carbon usage for that product.
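The three scopes described above amount to a simple accounting structure: categorised line items rolled up per scope and then into a grand total. The sketch below shows that structure in miniature; every category name and figure is hypothetical, included only to illustrate how a datacentre on a low-carbon grid can still carry a large scope 3 number.

```python
# Simple illustration of rolling up a GHG inventory across the three
# scopes. All category names and figures are hypothetical.

inventory_tco2e = {
    "scope1": {"backup_generator_tests": 12.0},   # direct, owned sources
    "scope2": {"purchased_electricity": 0.0},     # near zero on a renewable grid
    "scope3": {"embodied_carbon_building": 850.0, # everything in the supply chain
               "supplier_travel": 40.0,
               "purchased_hardware": 1_200.0},
}

def total_by_scope(inventory):
    """Sum each scope's line items into a per-scope total (tonnes CO2e)."""
    return {scope: sum(items.values()) for scope, items in inventory.items()}

totals = total_by_scope(inventory_tco2e)
print(totals)
print("grand total:", sum(totals.values()), "tCO2e")
```

Even with near-zero scope 2 emissions, the hypothetical scope 3 items dominate the total, which is why Cantrell singles out scope 3 reporting as the hard part.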

“Scope 3 is very broad, and a lot of companies are not fully reporting on that,” said Cantrell. “And although it’s certainly a challenge to report on emissions from multiple sources that aren’t under your control, it’s essential to identifying and addressing a company’s true environmental impact. We believe there’s a growing expectation that organisations report these emissions, and we are attempting to lead that movement by reporting our scope 3 emissions for 2021 and beyond.”

New synergies: Iceland, London and Finland 

In September 2021, Verne Global was acquired by Digital 9 Infrastructure, a company named after the United Nations’ 9th Sustainable Development Goal, which is to “build resilient infrastructure, promote sustainable industrialisation and foster innovation”.

Digital 9 also recently acquired two other datacentre businesses: Volta, a 6MW datacentre in London, and Ficolo, a datacentre operator in Finland.

“One of the ways that we’re going to be able to create synergies with the London facility and the financial services world is that some of the applications will need to be closer to the trading centres,” said Cantrell.

“A datacentre literally in the centre of London, on Great Sutton Street, is well suited to interact with the networks that are going to be there locally.

“If you create good, solid network connections between a central point in a business district and a remote location like Iceland, that makes it easy to start creating this traffic cop approach. That allows you to operate locally on the applications that need to operate locally, but to make a decision to push the work to places that are optimised to the needs of a specific application. You can get the optimal combination of cost, sustainability, and latency.

“Finland is also a developing datacentre market,” he said. “There are some sustainability benefits to Finland over Central Europe and Digital 9 will take advantage of that. From cost and efficiency standpoints, there are some nice cooling options that you can use in Finland.”

Two trends to watch 

Something to think about regarding the future of datacentres is the concept of embodied carbon: how much energy was used, and how much greenhouse gas was emitted, to bring steel on site, erect it, and put concrete in place. Builders can consider whether to use conventional concrete or graphene-embedded concrete to reduce that part of the carbon cycle.

“The datacentre industry is young and there’s a lot of investment coming in,” said Cantrell. “It also involves a lot of high-tech, so we should invest in ways to reduce our embodied carbon and show other industries that there are unique ways to reduce the amount of embodied carbon going out as we continue to build buildings.

“The datacentre industry can play a part in helping to influence the technologies that people will think of as second nature by 2030 or 2040,” he said. “Hopefully, what we do now will help reduce the environmental impact expected over the next 30 to 40 years, as growing urban populations require the construction of new buildings with floorspace equivalent to New York City every month for the foreseeable future.

“It’s also very important for datacentres to be more efficient,” said Cantrell. “Only a small fraction of the world’s datacentres are in the Nordics. Most of the others are still powered by grids that rely on coal or natural gas. We do need to improve energy efficiency.”

The second trend to watch is the use of liquid cooling – directly cooling servers by running liquids through them. The precipitous rise in demand for computing power is driving innovation not only in the construction of datacentres, but also in the silicon chips that are the brains behind the power-hungry servers inside them. Chip manufacturers are rapidly increasing the power required to drive each central processing unit and graphics processing unit.

Chips that used to be 100 or 200 watts are now coming in at 350, 500 – and soon even 700 watts per socket – with no end in sight to the year-over-year increases that datacentre operators will see. This intensity of required cooling will rule out conventional air-cooled solutions in the datacentres of the future.
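The per-socket figures translate directly into rack-level heat loads, which is where air cooling runs out of headroom. The arithmetic below is a rough sketch; the server count per rack and the overhead factor for memory, fans and power-supply losses are illustrative assumptions, not figures from the article.

```python
# Rough arithmetic behind the cooling pressure: total heat load for a
# rack of dual-socket servers at rising per-socket power. Server counts
# and the overhead factor are illustrative assumptions.

def rack_power_kw(watts_per_socket, sockets_per_server=2,
                  servers_per_rack=20, overhead=1.3):
    """Total rack draw in kW; `overhead` covers memory, fans, PSU losses."""
    return watts_per_socket * sockets_per_server * servers_per_rack * overhead / 1000

for w in (200, 350, 700):
    print(f"{w} W/socket -> {rack_power_kw(w):.1f} kW per rack")
```

Under these assumptions, the jump from 200W to 700W sockets more than triples the heat each rack must shed, which is the pressure pushing operators towards liquid cooling.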

Liquid cooling technologies such as direct immersion and liquid cooled cold plate are already available for deployment, and forward-thinking datacentre providers are already incorporating these technologies into their designs. “Our customers will be some of the first to widely deploy liquid-cooled servers for their HPC and AI applications,” said Cantrell. “But since this transition will not happen all at once, our datacentres are designed to flexibly accommodate both traditional air-cooled and liquid-cooled equipment in the same environment.”
