Green coding - KX: Going green in preparation for the year of the AI agent

This is a guest post for the Computer Weekly Developer Network written by Conor Twomey in his position as head of customer success at KX.

KX describes itself as a global leader in vector and time-series data management; its KDB.AI Server is a highly performant, scalable vector database for time-orientated generative AI and contextual search.

Twomey writes in full as follows…

We’re approaching our 54th Earth Day in a few weeks (the first event was held in 1970) and each year, support and programmes aimed at environmental protection continue to gain importance.

The Paris Agreement calls for global carbon emissions to be reduced by 45% by 2030 and to reach net zero by 2050, placing pressure on enterprises to take a closer look at what is contributing to higher carbon emissions and how they can do their due diligence to reduce these numbers. Sustainability initiatives are most likely already on companies’ agendas, but there are areas of focus that can reduce carbon footprint while also delivering benefits elsewhere. Spoiler alert: it’s your computing demands.

Green IT has surged in importance recently, driven by the growing dissonance between computing demands and organisational sustainability goals. With the expansion of cloud services and the rapid adoption of AI technologies, like generative AI (gen-AI), businesses are amassing and processing unprecedented amounts of data. These factors drive up computing demands and ultimately lead to heightened carbon emissions. This prompts a crucial question: are businesses willing to prioritise sustainability alongside efficiency? Luckily, there are strategies to harmonise both objectives.

Year of the agent

I recently had the opportunity to attend the latest Nvidia GTC conference, which shed some light on a specific topic driving an increase in consumption. We can say that 2023 was the year of the prompt, but 2024 is shaping up to be the year of the agent. This prediction is already proving true, as we’re seeing consumption skyrocket with people moving away from manually entering prompts to cascading automated AI workflows. But how, exactly, are these agents contributing to increased consumption?

AI agents are revolutionising workflows with their ability to chain multiple tasks together, each of which demands substantial data and energy resources. While this leads to enhanced performance and efficiency in AI applications, it also escalates carbon emissions and energy consumption. Consequently, green coding initiatives become paramount in ensuring that the efficiency gains from increased AI adoption are not overshadowed by environmental costs.

The foundational components of green coding include more efficient compute hardware, sustainable cooling mechanisms, streamlined data management frameworks and efficient application development. On paper, this may seem like an easy task, but these components often prove to be challenging for developers. I encourage development teams to ensure that they have tools and partners accessible that can help them compress data, utilise less compute, process data in vectors and eliminate data duplication. By adopting these practices, developers can mitigate the environmental footprint of AI agents while advancing sustainability goals within their organisations.
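To make the “process data in vectors, utilise less compute” point concrete, here is a minimal sketch (in Python with NumPy, purely illustrative and not a KX implementation) of the same aggregation written as a scalar loop versus a single vectorised operation. The vectorised form does the whole pass in optimised native code, so it typically burns far fewer CPU cycles – and therefore less energy – per result.

```python
import numpy as np

# Illustrative dataset: one million price points.
prices = np.arange(1_000_000, dtype=np.float64)

def loop_mean(values):
    # Scalar approach: one interpreted Python iteration per element.
    total = 0.0
    for v in values:
        total += v
    return total / len(values)

def vector_mean(values):
    # Vectorised approach: the whole pass runs in optimised native code.
    return values.mean()

# Both produce the same answer; the vectorised version simply
# does the work with far less interpreter overhead.
assert abs(loop_mean(prices) - vector_mean(prices)) < 1e-6
```

The same principle applies in columnar, vector-native databases: operating on whole columns at once amortises per-element overhead, which is one reason vectorised data processing tends to be both faster and greener.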

I would also be remiss not to mention that green coding efforts can have a positive impact on other areas of your organisation.


Many technologies in the AI ecosystem aren’t built for efficiency, so when conscious efforts are made to remedy inefficient code – whether for environmental or productivity reasons – we often see fewer barriers to the adoption of gen-AI across the enterprise.

Plus, of course, more efficient applications and less compute can reduce some of the high costs associated with AI development and production.

Data duplication eradication

Simply eliminating data duplication, especially data sitting within storage, can have a profound impact on costs that organisations can quickly rack up.
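One common way to eliminate storage-level duplication is content-addressed storage: hash each payload and keep only one copy per unique hash. The sketch below (a hypothetical illustration, not a KX feature) shows the idea with Python’s standard `hashlib`.

```python
import hashlib

def dedup_store(blobs):
    """Store each unique payload once; return the store and per-blob references."""
    store = {}  # sha256 digest -> bytes, stored exactly once
    refs = []   # one digest per input blob, in order
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        store.setdefault(digest, blob)  # keep only the first copy
        refs.append(digest)
    return store, refs

# Three payloads arrive, but two are byte-identical duplicates.
blobs = [b"trade ticks 2024-03", b"quote book", b"trade ticks 2024-03"]
store, refs = dedup_store(blobs)

assert len(store) == 2    # only two unique payloads are stored
assert refs[0] == refs[2] # the duplicates share a single stored copy
```

Every duplicate that is referenced rather than re-stored is storage that no longer has to be powered, cooled and backed up – which is exactly where the cost and carbon savings come from.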

It’s really easy to get caught up in the hype cycle of gen-AI and move too quickly to put AI applications into production. Failure to construct efficient AI applications will significantly increase your computing requirements and negatively impact execution, carbon footprint and costs.

I always recommend the classic approach of ‘go slow, to go fast’ – just like Aesop’s “The Tortoise and the Hare”, success is most often found by those who take their time, rather than being quick and careless.

Building strong, efficient code and picking vendors and technologies that can scale with your ambition, align with your sustainability goals and increase efficiency will be a game changer. 
