The role of APIs in controlling energy consumption

In this guest blog, Chris Darvill, solutions engineering vice president for Europe, Middle East and Africa (EMEA) at cloud-native API platform provider Kong, sets out why the humble API should not be overlooked when organisations are looking to make their IT setups more sustainable

Within the next 10 years, it’s predicted that IT will account for 21% of all the energy consumed in the world. Our mandates to digitally transform mean we’re patting ourselves on the back, celebrating new ways we delight our customers, fuelled by electricity guzzled from things our planet can’t afford to give.

Addressing this isn’t about the steps we take at home to be good citizens, such as recycling and turning off appliances when not in use. This is about the way we architect our systems.

Consider that Cisco estimates that global web traffic in 2021 exceeded 2.8 zettabytes. That equates to 21 trillion MP3 songs, or 2,658 songs for every single person on the planet. It’s almost three times the number of stars in the observable universe.

Now consider that 83% of this traffic is through APIs. While better APIs can’t alone improve energy consumption (no one thing can), they do have the potential to make a big difference, which is why we need to be making technical and architectural decisions with this in mind.

Building better APIs isn’t just good for the planet and our consciences; it’s good for our business too. The more we can architect to reduce energy consumption, the more we can reduce our costs as well as our impact.

To reduce the energy consumption of our APIs, we must ensure they are as efficient as possible.

This means eliminating unnecessary processing, minimising their infrastructure footprint, and monitoring and governing their consumption so we aren’t left with API sprawl leaking energy usage all over the place.

Switching up API design

APIs must be well-designed in the first place, not only to ensure they are consumable and therefore reused but also to ensure each API does what it needs to rather than what someone thinks it needs to.

If you’re building a customer API, do consumers need all the data rather than a subset? Sending 100 fields when most of the time consumers only use the top 10 means you’re wasting resources: you’re sending 90 unused and unhelpful fields of data every time that API is called.

How to build and deploy a sustainable API

Where do your APIs live? What are they written in? What do they do? There are many architectural, design and deployment decisions we make that have an impact on the resources they use.

We need the code itself to be efficient; fortunately, that is already a priority, since a slow API makes for a bad experience. There are nuances, though, when we optimise for energy consumption as well as performance. For example, an efficient service that polls for updates every 10 seconds will consume more energy than an equally efficient service that simply pushes updates when there are some.

And when there is an update, we want only the new data to be sent, not the full record. Consider the amount of traffic APIs create: for anything that isn’t acted upon, is that traffic necessary at all?
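Sending only the new data amounts to a delta update: diff the current record against the last one sent and push just the changed fields. This is a minimal sketch under that assumption; the records are hypothetical.

```python
def delta(old: dict, new: dict) -> dict:
    """Fields that were added or changed since the last push."""
    return {k: v for k, v in new.items() if old.get(k) != v}

# Hypothetical order record before and after an update.
previous = {"id": 42, "status": "pending", "total": 99.0}
current = {"id": 42, "status": "shipped", "total": 99.0}

print(delta(previous, current))  # {'status': 'shipped'}
```

One changed field travels over the network instead of the whole record.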

Deployment targets matter. Cloud providers have significant research and development (R&D) budgets to make their energy consumption as low as possible; budgets that no other company would be prepared to invest in their own datacentres.

However, with the annual electricity usage of the big five tech companies — Amazon, Google, Microsoft, Facebook and Apple — more or less the same as the entirety of New Zealand’s, it’s not as simple as moving to the cloud and the job being finished. How renewable are their energy sources? How much of their power comes from fossil fuels? The more cloud vendors see this being a factor in our evaluation of their services, the more we will compel them to prioritise sustainability as well as efficiency.

We must also consider the network traffic of our deployment topology. The more data we send, and the more data we send across networks, the more energy we use. We need to reduce any unnecessary network hops, even if the overall performance is good enough.

We must deploy our APIs near the systems they interact with, and we must deploy our gateways close to our APIs. Think how much traffic you’re generating if every single API request and response has to be routed through a gateway running somewhere entirely different.

Manage API traffic

To understand, and therefore minimise, our API traffic, we need to manage it in a gateway. Policies like rate limiting control how many requests a client can make in a given time period: why let someone make 100 requests in one minute when one would do? Why let everyone make as many requests as they like, generating an uncontrolled amount of network traffic, rather than reserving higher limits for your top-tier consumers?

Caching API responses prevents the API implementation code from executing whenever there’s a cache hit – an immediate reduction in processing power.

Policies give us visibility and control over every API request, so we know at all times how and whether each API is used, where requests are coming from, and its performance and response times, and we can use this insight to optimise our API architecture.

For example, are there lots of requests for an API coming from a different continent to the one where it’s hosted? If so, consider redeploying the API local to the demand to reduce network traffic.

Are there unused APIs, sitting there idle? If so, consider decommissioning them to reduce your footprint. Is there a performance bottleneck? Investigate the cause and, if appropriate, consider refactoring the API implementation to be more efficient.

Having visibility and control over APIs and how they are consumed will greatly impact overall energy consumption.

Time to think again

We all happily switch between Google Drive, iCloud, Software-as-a-Service apps and the umpteen different applications we use day-to-day without thinking about their impact on the planet.

Thanks to privacy concerns, we have a growing awareness of how and where our data is transferred, stored and shared, but most of us do not have the same instinctive thought process when we think about carbon emissions rather than trust.

It’s time to make this a default behaviour. It’s time to accept that, as technologists, there are better ways to build applications and connect systems than we’ve previously considered – and to brainstorm and challenge each other to find them.
