
AI’s hidden sting: a threat to millions of bees

The energy demands of artificial intelligence could have a devastating impact on Australia's honeybee population, a new study warns

Artificial intelligence (AI) is notoriously power-hungry, and the consumption of electricity from non-renewable sources means more emissions. But understanding the environmental effects of using AI has been difficult.

A new study from the Centre for AI, Trust and Governance at the University of Sydney provides a perspective that many people will be able to comprehend: the use of AI could lead to the death of millions of bees, putting at risk the A$4.6bn of Australian agricultural production that relies on honeybee pollination.

Until about 2020, the growing demand for digital services was largely offset by improvements in efficiency. Then generative AI (GenAI) shattered that equilibrium, with a GenAI query using about 10 times more electricity than a conventional web search. Consequently, datacentre electricity demand is expected to more than double by 2030, with AI responsible for about half of the increase.

To put that in a local context, electricity consumption by new datacentres in western Sydney is expected to be the equivalent of two large aluminium smelters.

So where do bees come into the story? According to the study, the incremental warming from datacentre growth means a fractional increase in the number of hot days per year. This leads not only to increased death rates for eggs, larvae, pupae and adult bees, but also a reduction in sperm quality, which can cause the eventual collapse of a colony.

AI-related emissions are expected to contribute 4.8 to 15.4 gigatonnes of cumulative global carbon dioxide emissions, increasing temperatures by between 0.0026°C and 0.0084°C. This could result in the loss of an estimated 3.5 to 14.1 million bees a year from heat stress, bushfires, the spread of Varroa mites (which amplify the effect of higher temperatures), and habitat loss.
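A quick back-of-envelope check, using only the figures quoted above, shows the study's low and high scenarios imply a roughly constant warming per gigatonne of CO2, consistent with a linear transient-response assumption (the scenario labels and code below are illustrative, not from the report):

```python
# Sanity-check the implied warming per gigatonne of CO2 across the
# article's low and high scenarios. Figures come straight from the text.
scenarios = {
    "low":  {"emissions_gt": 4.8,  "warming_c": 0.0026},
    "high": {"emissions_gt": 15.4, "warming_c": 0.0084},
}

for name, s in scenarios.items():
    per_gt = s["warming_c"] / s["emissions_gt"]
    print(f"{name}: ~{per_gt:.2e} degC per GtCO2")
```

Both scenarios work out to roughly 5.4 × 10⁻⁴ °C per gigatonne, which is why the bee-loss range scales almost proportionally with the emissions range.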

“A relatively small change in temperature has a big impact on pollinators,” said Rob Nicholls, senior research associate at the centre and author of the report. He suggested that the bee example is one that resonates with people: “If I’m a keen user of GenAI, I might want to know that there’s been a little bit of a thought about sustainability from the supplier of my GenAI service.”

For example, Google has shown it can build a datacentre in Arizona that uses solar and wind power backed by a very large battery. “Why has it done this? This is not necessarily because it thinks that sustainability is a sensible approach. Its approach is, ‘it’s cheaper that way, much cheaper.’”

Another part of the problem is that the graphics processing units (GPUs) used for AI are optimised for processing, not energy use. “It’s time to start thinking ‘can we do this sustainably?’” Nicholls said. “And I think the answer is ‘yes, we can.’” Instead of measuring deals in megawatts, he suggested they should be measured in terms of sunshine hours, the amount of wind, and the battery size required.

The cheapest source of electricity is renewables, he said, and they can be battery-backed to ensure supply continuity. “Batteries used to be really expensive and very difficult. They’re not anymore.”

On the consumer side, Nicholls argued that people should have a choice. If two search engines gave the same results, but one was certified by independent authorities such as the Australian Energy Market Operator (AEMO) and the Australian Competition and Consumer Commission (ACCC) as sustainable, many consumers would choose it.

This does not mean we need to discourage AI adoption. “What we need to do is, rather than trying to tamp down demand, to actually deliver that demand in a way that’s a bit more sustainable,” Nicholls said.

That might be achieved by running lots of smaller language models locally. He pointed to IBM’s recent development of nano models, which can power AI agents that run on a phone. While this makes the energy impact visible as the phone’s battery drains, the report noted that the energy used in compute outside of datacentres – that is, by phones, tablets, desktops and laptops – is comparable to that of datacentres.

Over time, centralised AI is likely to become more efficient. Nvidia, AMD and other chip manufacturers are starting to focus on reducing GPU power consumption, just as they previously did with datacentre CPUs. Nicholls said that while this won’t happen immediately due to network and platform effects, the signs are there.

While “AI everywhere” doesn’t make sense, AI can be used where it adds real value. For example, a council’s core responsibilities of rates, roads and rubbish can be addressed simultaneously: AI-equipped cameras on GPS-tracked rubbish trucks can detect potholes, allowing for more efficient and cheaper repairs than simply responding to resident complaints.

Conversely, some applications create new problems. Law firms might think they can automate discovery and due diligence, but then find in a few years they have no mid-level lawyers. “AI is here, people are going to use it,” Nicholls said. “The human value-add is from the analytical approach that takes in different inputs.”

At the University of Sydney, some courses now instruct students to use AI for a first draft of an essay, then critique the result from various perspectives and identify hallucinations. This shows the “type of value that graduates can offer in a world where you’re going to start to increasingly use AI, where those graduates in a couple of years’ time will be really valuable and probably more valuable mid-level employees than they would have been not doing it.”

AI can be part of the solution

Ravi Kumar Mandalika, executive partner and Asia-Pacific lead for energy, sustainability and utilities at IBM Consulting Australia, said the key question is whether AI is part of the problem or the solution. IBM is committed to the latter, he said.

He pointed to technology for predicting floods, managing vegetation to reduce bushfire risk, and IBM’s own NorthPole chip architecture, which offers high-performance AI inferencing with much greater energy efficiency than traditional GPUs, largely due to the way it packs memory onto the same chip as the processor cores.

Furthermore, IBM’s Blue Vela AI supercomputer, based on the Nvidia SuperPod reference architecture and used to train IBM’s enterprise-focused Granite foundation models, is housed in a datacentre that uses 100% renewable energy.

More generally, locating new datacentres in renewable energy zones has a dual benefit, Mandalika said. It reduces their emissions and, by virtue of their high electricity consumption, helps ensure the financial viability of those zones.

The report, which was supported by IBM as a silent partner, offered several recommendations for action. In the year ahead, datacentre operators should conduct comprehensive energy audits and shift to carbon-aware workload scheduling, so that intensive tasks like AI training occur when renewable energy is available.
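Carbon-aware scheduling of the kind the report recommends can be sketched very simply: given an hourly forecast of grid carbon intensity, pick the window with the lowest average intensity for an energy-hungry job such as a training run. The forecast values and the `pick_training_window` helper below are purely illustrative; a real scheduler would pull live intensity data from a grid operator such as AEMO:

```python
# Hypothetical grid carbon-intensity forecast (gCO2/kWh for each hour of
# the day). The midday dip models a solar surplus; numbers are made up.
FORECAST = {h: 700 for h in range(24)}
FORECAST.update({h: 250 for h in range(10, 16)})

def pick_training_window(hours_needed: int) -> int:
    """Return the start hour of the lowest-carbon contiguous window."""
    return min(
        range(24 - hours_needed + 1),
        key=lambda s: sum(FORECAST[h] for h in range(s, s + hours_needed)),
    )

start = pick_training_window(4)
print(f"Schedule the 4-hour training run to start at {start:02d}:00")
```

With this toy forecast the scheduler lands the run in the midday solar window; the same logic extends to deferring batch inference or model fine-tuning until renewable output peaks.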

After that come capital-intensive projects, including adopting advanced cooling technologies and installing on-site renewable generation and storage. By then, organisations should aim to migrate at least half their workloads to verified sustainable providers.

In the longer term, operators should aim for at least 80% renewable energy with transparent and verified environmental reporting, while their customers should achieve 100% sustainable datacentre sourcing and preferably reduce their carbon footprints despite business growth.

“The bee barometer points not toward inevitable decline, but toward the urgent need for integrated action,” the report concluded. “The technology that created the challenge can also provide the solutions, and the frameworks for implementing those solutions now exist. The question is whether we have the wisdom and resolve to implement them in time.”
