In October 2019, the Department for Environment, Food and Rural Affairs’ (Defra) Sustainable technology annual report 2018 to 2019 reported that cloud-first and digital agendas, policies and strategies have led to the closure of inefficient on-premise datacentres.
But as departments adopt more efficient cloud, private cloud or colocated datacentres, the operators must become more transparent in terms of sustainability, says Susanne Baker, who is responsible for TechUK's climate change programmes. Baker believes there is now greater emphasis on cloud providers to demonstrate how efficient they really are.
According to Greenpeace, since 2010, when it first started reporting on datacentre energy use, more than 20 of the largest internet companies – including Facebook, Google and Apple – have established public commitments to power their digital infrastructure with 100% renewable energy.
Last year, the environmental group claimed that unlike other leading IT firms that have adopted 100% renewable energy commitments, Amazon has “remained notoriously opaque when it comes to publicly reporting” information about its current energy use.
The Defra sustainability report found that while government departments are indeed lowering the carbon footprint of their IT estates, the increasing use of cloud providers by public sector organisations makes it far harder to calculate their overall carbon footprint.
“There are high-level protocols to assess the carbon footprint of outsourced cloud services,” says Baker.
But as every organisation becomes digitised and consumes more cloud services, she says it is now extremely hard for businesses to account for the carbon emissions these cloud-powered services create.
In a datacentre, servers tend to use power continuously because they operate 24/7. According to Baker, server lifecycle carbon impact is heavily dominated by the applications being run. As such, it is environmentally good practice to replace older servers with new ones on a regular basis. This runs contrary to received wisdom that assets should be sweated for as long as possible.
In Baker’s experience, servers over five years old are unlikely to contribute to an efficient IT operation. She says third-party providers are generally incentivised to optimise refresh rates, compared with in-house operations, where the datacentre is not run as a business unit.
“The large, hyperscale operators may replace the central processing units as frequently as every 12 months, but two to three years is more common practice,” she adds.
Services that run in the cloud consume processor cycles, storage and network bandwidth. These parameters can be measured, as they are used for billing. But, according to Baker, accurately assessing how green a particular service is proves very complex due to the highly distributed nature of cloud computing. This is a problem for the government and large enterprises as they attempt to meet sustainability targets.
For instance, in the past, a traditional media company may have been able to measure its carbon footprint by following its supply chain to assess the impact of printing newspapers and distribution. But, as media firms have moved to the cloud, it is now far harder to understand their carbon footprint.
Baker says an executive at a traditional media company recently told her that now the product is in the cloud, the company does not have “the faintest idea” of its carbon footprint.
Unlike when newspaper and magazine printing was outsourced to a printing company that needed to have its own sustainability reporting, it is difficult to gather evidence in the cloud. Not only do the media companies that wish to calculate their carbon footprint now need to understand how much energy is being used to host their digital media services, they also need to count the carbon footprint impact of the readership accessing those services.
And here lies another issue organisations face as they try to drive down their carbon footprint in the drive for zero emissions. Since Moore’s Law was coined in 1965, the industry mantra has been to offer more for less: more processing power, which doubles roughly every 18 months to two years, according to Moore’s Law; more bandwidth; and more storage for the same cost.
This has driven up consumption of digital products and services – people are enticed to use more, because the industry claims this is better and does not cost anything more. For instance, standard-definition video is now defunct. Its successor, high-definition (HD), is quickly being replaced online by Ultra HD video. And such content consumes more and more bandwidth. According to the recommended bandwidth for Google’s Stadia game streaming service, 10Mbps is recommended for 720p, 20Mbps for 1080p HD and 35Mbps for the best experience in 4K Ultra HD.
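To put those bitrates in perspective, a rough back-of-the-envelope calculation (not from the article) shows how much data an hour of streaming transfers at each tier, assuming the stream runs at the full recommended rate throughout:

```python
def gb_per_hour(mbps: float) -> float:
    """Convert a sustained bitrate in megabits/s to gigabytes transferred per hour."""
    bits_per_hour = mbps * 1_000_000 * 3600   # megabits/s -> bits over one hour
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

# Stadia's recommended bitrates per resolution tier
for label, mbps in [("720p", 10), ("1080p HD", 20), ("4K Ultra HD", 35)]:
    print(f"{label}: {gb_per_hour(mbps):.2f} GB/hour")
```

On these assumptions, an hour of 4K Ultra HD streaming moves roughly three and a half times the data of the 720p tier – each gigabyte of which must be served, switched and routed somewhere on the network.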
While Stadia and other premium internet services such as Spotify and Netflix charge for higher quality streaming, requiring more bandwidth, much of the internet is free. Baker says consumers need to appreciate the environmental costs of these supposedly free services. While the younger generation prefers on-demand video services and YouTube, “broadcast has a way lower carbon footprint”, she says.
But there is no going back to the era when families would all gather in front of the TV to watch a broadcast. Instead, consumers are being offered higher and higher definition video streaming.
“How do you communicate to the consumer that high-definition video is very byte-intensive? There is an impact at every stage,” says Baker.
For example, if someone creates an ultra-high-definition video, uploading the file will consume some network bandwidth, and it will use significantly more storage than a standard-definition video, but, says Baker, “what happens if the video goes viral and is downloaded by 800,000 people?”.
Just as with high-definition premium internet services, each download will need to be processed in datacentres running servers, network switches and gateway servers, somewhere on the global internet.
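The scale of that multiplication effect can be sketched with a simple calculation. The file size here is an assumption for illustration; only the 800,000-download figure comes from Baker’s example:

```python
# Hypothetical sketch of the viral-video scenario Baker describes.
video_size_gb = 1.0   # assumed size of one ultra-high-definition video file
downloads = 800_000   # the viral-spread figure Baker cites

total_gb = video_size_gb * downloads
print(f"Total network transfer: {total_gb:,.0f} GB (~{total_gb / 1_000_000:.1f} PB)")
```

Even at a conservative 1GB per file, a single viral upload would generate on the order of 800 terabytes of network traffic, all of it passing through powered infrastructure.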
With some applications, consumers make a conscious effort to avoid the less efficient ones. For instance, Baker says the most energy-intensive smartphone apps are the ones that get weeded out.
But she says the advertising-based business model, where producers of content and consumers are not charged, does nothing to promote or encourage energy efficiency. “If you had to pay to upload a photo to Facebook, there would be outrage,” she says. “There is no information on the energy impact of the photo.”
Sustainability rising up the agenda
However, Baker believes the Defra report, which was produced in conjunction with the HMG Sustainable Technology Advice and Reporting team, will help to drive greater transparency in the datacentre operators’ market. “I am seeing change influenced by the government, which has net zero emissions goals,” she says. “There is a greater level of scrutiny.”
The growing pressure for datacentre and cloud operators to provide transparent reporting on sustainability could benefit both businesses and consumers. For instance, in government departments, Baker says green ICT is becoming a tender requirement, with the sustainability credentials of providers forming part of the decision-making process.
As organisations assess what to outsource, and whether to use cloud infrastructure or a cloud platform, green ICT is becoming a tender requirement for them too.
“The companies that ask questions on sustainability are very big customers, and this will apply pressure,” says Baker. “I would hope that in five years there will be much higher levels of sustainability reporting.”
Read more about green IT
- The energy usage habits of datacentres often see the sector labelled as a contributor to climate change, but the situation is far more nuanced than that, it seems.
- Printing reams of paper and travelling to meetings are among the areas targeted in the green ICT makeover.