In this guest post, Danny Quinn, managing director of datacentre services provider DataVita, assesses the environmental impact of artificial intelligence from a server farm perspective
The potential impact of artificial intelligence (AI) on society and the economy has been one of the most discussed topics of 2023. ChatGPT, in particular, caught the public imagination, with more than 10% of people in companies worldwide saying they had tried using it in the workplace, according to figures from Statista. In fact, a Goldman Sachs study from earlier this year suggested as many as 300 million jobs could be automated by the AI platform and others like it.
Whatever the consequences of widespread adoption might be, it’s clear the use of AI will only increase – nine in ten businesses surveyed last year by NewVantage said they were investing in the technology. The level of demand will have a major knock-on effect on the infrastructure that underpins AI, particularly datacentres.
The impact of AI in the datacentre
AI operates in two main stages: training and inference. Both require a significant amount of computational power, and that’s where Graphics Processing Units (GPUs) come into play.
During the ‘training’ phase, AI learns from vast amounts of data. It’s similar to how we study for an exam. With a system as advanced as ChatGPT, this training is extensive. The system processes vast amounts of information. And it’s not just one GPU doing the work – imagine thousands of them working in harmony, non-stop, sometimes for days or even months, to ensure the AI is well-trained.
After the training, we arrive at the ‘inference’ stage. This is when AI puts its knowledge into action, answering our questions or assisting with tasks. While this stage is typically less power-intensive than the training phase, it’s crucial for giving us real-time responses. GPUs also play a role here, ensuring the AI can quickly and efficiently use what it has learned.
GPUs demand a lot of electricity and specialised datacentre rack space. As more and more AI systems are developed and used, these datacentres need to grow and consume even more power.
A standard datacentre rack consumes somewhere in the order of four kilowatts (kW) of electricity. For the high-performance computers required for AI applications, it can be as much as 15 times that number. In rough terms, the average rack to support AI requires around the same amount of power as 25 houses. Deploying hundreds of these in a datacentre will consume the same amount of energy as a small town.
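The arithmetic behind those figures can be sketched in a few lines. This is a back-of-the-envelope illustration using the round numbers above; the per-household draw and the rack count are assumptions for the sake of the example, not measured values.

```python
# Back-of-the-envelope rack power comparison, using the article's
# illustrative figures (actual loads vary by hardware and site).
STANDARD_RACK_KW = 4.0   # typical rack draw cited above
AI_MULTIPLIER = 15       # high-performance AI racks: up to ~15x
HOUSEHOLD_KW = 2.4       # assumed average household draw (hypothetical)

ai_rack_kw = STANDARD_RACK_KW * AI_MULTIPLIER   # 60 kW per AI rack
homes_per_rack = ai_rack_kw / HOUSEHOLD_KW      # roughly 25 homes

racks = 200                                     # hypothetical deployment
total_mw = racks * ai_rack_kw / 1000            # megawatts for the whole hall

print(f"One AI rack ≈ {ai_rack_kw:.0f} kW, roughly {homes_per_rack:.0f} homes")
print(f"{racks} racks ≈ {total_mw:.1f} MW")
```

At 200 racks the hall draws on the order of 12 MW of IT load alone – before cooling and other overheads – which is indeed comparable to a small town.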
All of this has serious implications for businesses’ sustainability objectives – and even national net-zero targets. The Office for National Statistics’ latest Business and Insights Conditions Survey (BICS) found that around one in six businesses (16%) are implementing at least one AI application – a figure that is only likely to grow in the years ahead and, with it, their carbon footprint.
What emissions are in-scope?
To mitigate the potential for AI to undermine their drive towards net zero, organisations in all sectors will need to thoroughly understand what is known as their ‘scope 3’ emissions. Scopes 1 and 2 refer to the emissions caused by an organisation directly and the way in which the energy it uses is produced, but scope 3 also covers the emissions in its value chain – in other words, through the use of suppliers and third-party infrastructure such as datacentres.
When deploying large-scale AI systems, it’s vital to consider both the direct and indirect environmental impacts, particularly in terms of carbon emissions. Scope 2 emissions, which pertain to the electricity consumption of a business or operation, become especially significant for AI due to the high energy demands of GPUs in datacentres. This means that even if an AI operation itself is energy-efficient, the source of its electricity can have a substantial environmental footprint.
That’s where the carbon intensity of the region comes into play. Different areas produce electricity in varied ways: from fossil fuels, like coal and natural gas, to renewable sources, such as wind and solar.
The carbon intensity, which measures the amount of carbon dioxide emissions produced per unit of electricity, can greatly differ based on these energy sources. Last year, for example, the energy in the south of Scotland – taking in the area from the Borders up to Stirling – was 400% cleaner than London’s and 600% less carbon-intense than electricity in Ireland.
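The relationship is a simple multiplication: scope 2 emissions are the energy a workload consumes times the carbon intensity of the grid it runs on. The sketch below makes that concrete; the regional intensity figures are hypothetical round numbers chosen for illustration, not measurements of any real grid.

```python
# Illustrative scope 2 estimate: the same workload's emissions depend
# heavily on where it runs.
def scope2_emissions_kg(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """CO2 (kg) = energy used (kWh) x grid carbon intensity (gCO2/kWh)."""
    return energy_kwh * intensity_g_per_kwh / 1000

rack_kwh_per_day = 60 * 24  # one 60 kW AI rack running for a day

# Assumed regional intensities in gCO2 per kWh -- for illustration only
regions = {"low-carbon grid": 50, "average grid": 200, "fossil-heavy grid": 350}

for name, intensity in regions.items():
    kg = scope2_emissions_kg(rack_kwh_per_day, intensity)
    print(f"{name}: {kg:.0f} kg CO2 per rack per day")
```

Even with identical hardware and workloads, the fossil-heavy grid in this sketch produces seven times the emissions of the low-carbon one – which is exactly why the region a datacentre sits in matters so much.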
Businesses and individuals will only make more use of AI in the years ahead. But they need to understand the carbon footprint associated with their supply chains so that their IT decisions support their drive towards sustainability, rather than hinder it.