How datacentres are preparing for AI and greater tech responsibility

In this guest post, Wesley Anastase-Brookes, commercial sales director at colocation company Ark Data Centres, talks about the importance of responding to the surging demand for artificial intelligence workloads – at a datacentre level – in a sustainable way

All industries want to grasp the opportunities of artificial intelligence (AI) – from chatbots to the automation of entire production facilities. It’s a rapidly expanding market, expected to grow twentyfold by 2030 to a value of $1.9tn.

This ramp-up of AI use will be a turning point for many enterprises, compelling them to re-examine their infrastructure use in an age where energy efficiency and sustainability are corporate imperatives.

There’s never been a time when technological advances and environmental sustainability have been more inextricably linked, and datacentres are already at the heart of helping organisations to navigate their course.

The AI surge and its implications

Interest in, and the development of, AI was already substantial even before the advent of ChatGPT at the end of 2022, which triggered a surge of enthusiasm.

The growing levels of AI adoption we are seeing, and are set to see, are more than a trend – they are a fundamental change in the operational fabric of various sectors, including healthcare, finance, education, and manufacturing.

Cloud providers have been at the forefront of this revolution. Anticipating market demand, they are developing AI capabilities offered either as standalone services or integrated into existing product suites, alongside their traditional cloud services.

The impact of AI on datacentre operations

The datacentres that host these workloads are at the heart of this shift and crucial for realising AI’s potential. Although some headlines rightly ask hard environmental questions of the industry itself, the more advanced operators have long focused on using energy more efficiently and operating as sustainably as possible.

This matters because, without adequate preparation, the substantial processing power required for these evolving AI services would pose significant risks in terms of reliability, sustainability and cost. Unlike traditional computing tasks, AI operations, especially deep learning models, require extensive computational resources for tasks like data processing, pattern recognition, and neural network training.

The increase in workloads inevitably leads to higher energy consumption. For example, the graphics processing units (GPUs) employed in AI processing are more power-intensive than their central processing unit (CPU) counterparts.
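
The scale of that difference is easy to illustrate with a back-of-envelope calculation. The wattage figures below are assumptions chosen for illustration, not measurements of any specific hardware:

```python
# Back-of-envelope comparison of daily energy draw for two server types.
# The power figures are illustrative assumptions, not vendor specifications.

def daily_energy_kwh(power_watts: float, utilisation: float = 1.0) -> float:
    """Energy drawn over 24 hours at a given average utilisation, in kWh."""
    return power_watts * utilisation * 24 / 1000

cpu_server = daily_energy_kwh(400)    # assumed dual-socket CPU server
gpu_server = daily_energy_kwh(6500)   # assumed multi-GPU training server

print(f"CPU server: {cpu_server:.1f} kWh/day")
print(f"GPU server: {gpu_server:.1f} kWh/day ({gpu_server / cpu_server:.1f}x)")
```

Even under these rough assumptions, a single GPU training server draws an order of magnitude more energy per day than a conventional CPU server – and AI clusters deploy such servers by the rack.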

A path towards sustainable power management

The ability of datacentres to increase energy efficiency and reduce their carbon footprints depends on the operator’s investment and maturity – both in technology and sustainability practices.

Escalated power requirements have necessitated a re-evaluation of how energy is sourced, used, and optimised. This is not just about meeting the immediate energy needs but doing so in a way that aligns with broader environmental goals. A key strategy is the integration of renewable energy sources.

Datacentres are increasingly looking towards solar, wind, and hydroelectric power. This transition is more than an environmental gesture; it is a strategic move to ensure long-term sustainability and cost-effectiveness.

Energy efficiency is a vital component of sustainable power management. Advanced AI algorithms are being utilised to optimise energy usage in datacentres, from cooling systems to server operations. These AI-driven optimisations not only reduce the overall energy footprint but also enhance the operational efficiency of the datacentres.
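
The standard yardstick for this kind of efficiency work is power usage effectiveness (PUE): total facility energy divided by IT equipment energy, where 1.0 is the theoretical ideal. A minimal sketch, with illustrative before-and-after figures that are assumptions rather than data from any real facility:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (every kWh goes to compute); lower is better.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Assumed figures: AI-driven cooling optimisation trims facility overhead
# while the IT load stays constant.
before = pue(total_facility_kwh=1800, it_equipment_kwh=1000)
after = pue(total_facility_kwh=1300, it_equipment_kwh=1000)
print(f"PUE before optimisation: {before:.2f}, after: {after:.2f}")
```

Under these assumed numbers, trimming cooling and other overheads moves the facility from a PUE of 1.80 to 1.30 – the same compute delivered for substantially less total energy.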

Innovations like using waste heat from datacentres for community heating projects or other industrial uses also represent a shift towards a more circular approach to energy use.

Redefining datacentre locations in the AI age

An interesting aspect of AI is how it is redefining traditional datacentre hubs. These hubs have historically clustered around major cities, but AI is reducing the priority attached to low-latency datacentres.

Unlike previous set-ups where latency was critical, AI model training does not demand real-time responses, allowing significant flexibility in the choice of datacentre location.

An increase in AI computing processed away from city hubs will alleviate the power constraints experienced by primary European markets and enable enterprises to use datacentres in regions with direct access to renewable power sources. This substantially reduces their environmental impact.

Organisations are already deciding to use datacentres outside the traditional metropolitan zones for AI training and modelling. They are opting for areas where grid capacity is less constrained, while reducing the burden on datacentre capacity ‘near’ their customers, where low latency is mission-critical.

This not only allows organisations to explore more advantageous locations, but also encourages investment in smaller communities, which in turn could see upgrades to their power grids and high-value job creation.

Such datacentres are already online. Take Ark’s Spring Park campus for example, located in rural Wiltshire but close to rail links and the M4, where power availability is far less constrained than in London. The ‘AI-ready’ site is easily accessible, has directly connected PPA (power purchase agreement) options, and provides a convenient, scalable and sustainable location for organisations looking to develop outside the city.

AI: a catalyst for change

AI’s transformative influence extends beyond technological boundaries, bringing with it the promise of societal benefits across a huge range of sectors. From early detection of diseases to predictive analytics enhancing various industries, the potential benefits are extensive.

The AI lifecycle essentially divides into two phases: training (the learning phase) and inference (the generative phase). Training a model demands substantial power, but can be done further afield.

Once these models are developed, they are delivered to end users, facilitating real-time responses and automation – which again involves substantial energy consumption, but this time in a location closer to the end user.
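
The split described above – power-hungry model building that can run far afield, and user-facing responses that must stay close – amounts to a simple placement rule. The site names and latency threshold below are invented purely for illustration:

```python
# Hypothetical placement rule illustrating the two-phase split described
# above. Site labels and the 100 ms threshold are illustrative assumptions.

def choose_site(phase: str, max_latency_ms: float) -> str:
    """Pick a datacentre region for a workload.

    Training tolerates high latency, so it can run wherever renewable
    power is plentiful; latency-sensitive inference stays near end users.
    """
    if phase == "training" or max_latency_ms >= 100:
        return "out-of-town-renewable"   # e.g. a rural, PPA-backed campus
    return "metro-edge"                  # low-latency, near end users

print(choose_site("training", max_latency_ms=500))   # out-of-town-renewable
print(choose_site("inference", max_latency_ms=20))   # metro-edge
```

In practice the decision involves many more factors – data residency, grid capacity, connectivity – but the core trade-off is the one this sketch captures: latency tolerance unlocks location flexibility.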

The future of AI and datacentres

As we move further into the AI era, our collective responsibility for sustainable and socially responsible innovation will only become more pronounced. Datacentre operators, along with their customers, must continue to navigate these changes together, carefully balancing the scales of technological advancement with environmental stewardship.

The journey ahead is complex, but the industry’s commitment to aligning the march of technological progress with the imperative of ecological responsibility is unwavering, and significant steps forward have already been made.

The future of AI and datacentres is not just about technological excellence but about challenging the status quo and pioneering a path that harmonises innovation with the greater good.
