
IBM: AI cost pressures fuelling cloud repatriation

The drive to fund AI projects is forcing IT leaders to re-examine cloud spend, prompting a rethink of cloud-first strategies and a new focus on IT financial management

Much has been written about the challenges of artificial intelligence (AI) adoption, from establishing a solid data foundation and setting appropriate guardrails to deciding whether the goal is to improve existing operations or create new business models.

But where does AI fit within IT management?

According to Ajay Patel, general manager for Apptio and IT automation at IBM, the pressure point is cost. The demand for technology is out of sync with the available budget, so the general theme is, “How do I squeeze the non-performing aspects of my technology spend to fund the growth in these AI initiatives?”

A secondary question is, “How do I think about where to invest my precious AI dollars, which of my AI projects and experiments should be put into production, and how do I do that safely and cost-effectively?” This is especially pertinent as an IBM study found that only one in five AI projects makes it into production.

IBM thinks AI will present a bigger challenge than the cloud because it will be more pervasive, with a larger share of new applications being built on it. Consequently, IT leaders are already nervous about the cost and value implications and are looking for ways to get ahead of the curve.

Repeating the experience of cloud adoption, AI is being driven by business teams, not by back-office IT. AI is also becoming a significant driver for shifting workloads back to private, on-premises systems, because data is the most critical asset in AI and Patel believes few enterprises are ready to hand their data to a third party at this stage.

IBM’s systems integration partners are repatriating workloads from the cloud, and its consulting partners are setting up hybrid practices again. Drivers include geopolitical considerations, data sovereignty and a general desire to find the best place to run each workload.

Pete Wilson, vice-president for the IBM Apptio business and Asia-Pacific general manager at Apptio, observes that one in five of its customers is now discussing workload repatriation from the cloud.

Another reason for this is that certain workloads were not designed to run in the cloud, and a “cloud-first” policy is starting to drive costs up. While storage and compute costs have fallen, an application that is a heavy application programming interface (API) consumer can cause ingress and egress costs to soar. These workloads need to be brought back from the cloud until they are either retired or re-architected.

The cloud market did a good job of focusing on compute and storage, but the impact on networks was often overlooked. For example, a link connecting two key datacentres might once have carried most of an organisation’s traffic, but that work is now being done in a public cloud. Getting out of the datacentres does not immediately reduce network costs, because commitments to the telco lines can run for up to 12 months. And very “chatty” applications that were not a problem on internal networks incur a charge every time they chat once they are lifted and shifted into the cloud.
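As a rough, hedged illustration of the chatty-application point, the sketch below shows the back-of-envelope arithmetic: traffic that carried no per-gigabyte charge on internal links starts to be metered once it crosses a cloud boundary. The call volume, payload size and per-gigabyte egress price are invented assumptions for illustration, not figures from IBM, Apptio or any cloud provider.

```python
# Back-of-envelope sketch of why a "chatty" application gets expensive after a
# lift and shift. All figures are illustrative assumptions, not real pricing
# or measured traffic.

CALLS_PER_DAY = 50_000_000      # hypothetical API calls between two services
PAYLOAD_KB = 200                # hypothetical average payload per call
EGRESS_PRICE_PER_GB = 0.09      # assumed per-GB egress rate in USD

daily_egress_gb = CALLS_PER_DAY * PAYLOAD_KB / 1_000_000   # KB -> GB (decimal)
monthly_cost = daily_egress_gb * 30 * EGRESS_PRICE_PER_GB

print(f"Daily egress: {daily_egress_gb:,.0f} GB")
print(f"Estimated monthly egress charge: ${monthly_cost:,.0f}")
# On an internal network the same traffic incurred only the fixed cost of the
# existing datacentre links, with no per-gigabyte metering.
```

On these assumed numbers the charge runs to tens of thousands of dollars a month, which is why such workloads become candidates for repatriation until they are retired or re-architected.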

The cloud is an excellent platform for many workloads, just as there are certain workloads that run extremely well on a mainframe. The key is to understand workload placement: is my application best placed on a mainframe, on a private cloud or on a public cloud?

As they start their AI journey, some of Apptio’s customers are not ready for their models, learning and intelligence – their strategic intellectual property – to sit in a public cloud. There are consequences when things go wrong with data, and those consequences can be severe for the executives concerned.

So, when a third party suggests putting all of the customer, operational and financial data in one place to gain wonderful insights, some organisations are unwilling to do this if the data is outside their direct control. Thus, the move into AI is driving some of that data repatriation. Time will tell whether it stays that way.

That said, there is a clear divide based on the age and legacy of the organisation. For organisations founded in the past five to seven years, everything has been in the cloud and will stay in the cloud. But for established organisations with legacy systems, the cloud accounts for less than 20% of their technology budget.

Moving beyond spreadsheets

How can an organisation track and manage its IT expenditure, whether for AI or more generally?

One of Wilson’s roles is product ownership of Apptio, IBM’s IT financial management (ITFM) product, and he is also currently acting as chief customer officer. He says a current trend in Asia-Pacific, and indeed the rest of the world, is a shift away from using spreadsheets for ITFM.

Small to mid-size technology shops need something more capable, he says, in part because technologists are no longer seen just as operations people, but as strategic partners at the C-suite table. Technology is often the second-largest area of spend in a company’s budget, so business leaders want to know if they are spending the right amount and getting value from every dollar.

“That value conversation is difficult to have when relying on spreadsheets,” says Wilson. “So, technology business management [TBM], IT financial management, strategic portfolio management and FinOps practices are moving downmarket into smaller organisations.”

Australia and New Zealand are relatively mature in TBM. The majority of the ASX 20 use one or more of Apptio’s products. Australia, in particular, has embraced the TBM discipline, whereas Southeast Asia is still in the early days of adoption.

Five years ago, only large organisations had any degree of TBM maturity. But that has changed, and Apptio is now working with organisations that have IT budgets down to $25m.

A two-dimensional spreadsheet does not provide the ability to dig in and find real insight, and it makes on-the-spot conversations with a technology consumer difficult. While analysts could create endless linked spreadsheets, the desire for real-time analysis has significantly expanded this market into smaller organisations over the past five years.
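To illustrate why a flat sheet falls short, the minimal sketch below shows the kind of multi-dimensional slice, such as cost by business unit and infrastructure tower at the same time, that ITFM tooling makes routine. The dataset, column names and figures are hypothetical, not drawn from Apptio.

```python
# Minimal sketch of a multi-dimensional cost view that a flat, two-dimensional
# spreadsheet struggles to answer without layers of linked sheets.
# All rows and figures below are invented for illustration.
import pandas as pd

costs = pd.DataFrame(
    [
        ("payments-api",  "Retail",    "Public cloud", 42_000),
        ("payments-api",  "Retail",    "Network",       6_500),
        ("core-ledger",   "Finance",   "Mainframe",    88_000),
        ("data-platform", "Analytics", "Public cloud", 61_000),
        ("data-platform", "Analytics", "Storage",      12_000),
    ],
    columns=["application", "business_unit", "tower", "monthly_cost"],
)

# "Which business unit is spending what, on which kind of infrastructure?"
# answered in one step rather than by maintaining a sheet per dimension.
by_unit_and_tower = costs.pivot_table(
    index="business_unit", columns="tower",
    values="monthly_cost", aggfunc="sum", fill_value=0,
)
print(by_unit_and_tower)
```

A real system would feed such a view from billing, CMDB and HR data rather than hand-keyed rows, which is the gap ITFM products aim to close.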

Southeast Asia is well supplied with low-cost staff, which has kept manual, spreadsheet-based approaches viable for longer, so over the past two years Apptio has built products specifically for this less mature segment of the market. The latest Gartner and Forrester analyses show that this market is opening up. As those companies grow and become more advanced, they can move into some of Apptio’s more traditional products.

The need for financial intelligence

Patel believes organisations realise they need to be more data-driven. “‘What we can’t measure, we can’t improve’ may be a cliché, but it is also the truth,” he says. “They need to understand their current technology costs, they need to do real-time planning and forecasting, and they need to collaborate on analytics.”

Apptio’s position is that while there is a clear need for systems of record – financial, customer relationship management, HR and workflow systems – there is also a need for systems of financial intelligence for organising and aggregating financial, user, performance and risk data to provide a multi-dimensional view.

This means financial awareness is being decentralised rather than remaining a centralised function. Everybody wants to do the right thing, but they do not currently have access to the information needed to make correct decisions. In a world of AI and agentic infrastructure, they will not only have the information, but will also receive help with decisions, such as choosing between two projects or reallocating resources.

The introduction of AI provides an opportunity to move away from a reactive mindset, according to Wilson. If an organisation has this data, AI can surface insights early enough for proactive changes to be made before costs are incurred. Apptio is working on providing persona-based insights. For example, a DevOps leader could benefit from insights that will allow cost-saving changes to the way software is built.
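As a rough illustration of surfacing an insight before the cost lands, the sketch below projects month-end spend from the month-to-date run rate and raises a flag when the forecast exceeds budget. The linear forecast, figures and alert are assumptions for illustration only, not Apptio functionality.

```python
# Hedged sketch of a proactive budget check: project month-end spend from the
# month-to-date run rate and flag a likely overspend before it happens.
# The forecast method and all figures are illustrative assumptions.
import calendar
from datetime import date

def projected_month_end_spend(spend_to_date: float, today: date) -> float:
    """Naive linear run-rate forecast of month-end spend."""
    days_in_month = calendar.monthrange(today.year, today.month)[1]
    return spend_to_date / today.day * days_in_month

def check_budget(spend_to_date: float, budget: float, today: date) -> None:
    forecast = projected_month_end_spend(spend_to_date, today)
    if forecast > budget:
        # A real tool would route this to the relevant persona, for example a
        # DevOps lead who can change how builds or test environments are run.
        print(f"Warning: forecast ${forecast:,.0f} exceeds budget ${budget:,.0f}")
    else:
        print(f"On track: forecast ${forecast:,.0f} within budget ${budget:,.0f}")

check_budget(spend_to_date=18_400, budget=30_000, today=date(2025, 3, 14))
```

A production system would use a more sophisticated forecast and live billing feeds, but the principle is the same: the warning arrives while there is still time to act.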

But “garbage in, garbage out” still applies, warns Patel. If the underlying data is poor, applying AI will not magically produce a great answer. Patel says IBM brings its domain-specific expertise to help customers curate and organise their data, creating a set of AI tools that address the needs of people in various roles. That is where the company sees the future.

IBM’s vision is that every enterprise will have to build an organisation-wide strategy for financial intelligence, bringing operational, financial, performance and reference data into a single place so AI can operate at scale. This, Patel predicts, will separate the successful companies from the rest.

Read more about AI in APAC

  • Dell Technologies has opened an AI innovation hub to speed AI adoption for enterprises across Asia-Pacific and upskill 10,000 students and mid-career professionals in Singapore.
  • Snowflake’s chief data analytics officer, Anahita Tafvizi, explains why a data strategy focused on governance, consistency and accuracy is the only way to build AI that users will trust.
  • Zendesk once pushed its AI vision, but now customers are leading the charge. Its CTO explains how this reversal is creating roles like the ‘bot manager’ and shaping the future of customer experience.
  • SK Telecom is building the Haein Cluster AI infrastructure to support its Petasus AI Cloud service in a bid to meet the demand for AI training and inference within its borders.
