
Agentic AI speeds up mainframe modernisation, but human experts remain key

Agentic AI tools are helping organisations overcome Cobol skills shortages and untangle legacy infrastructure, but successful modernisation still requires an expert in the loop to manage complexity

Skills, cost and agility are the three main drivers for organisations considering agentic artificial intelligence (AI)-supported code modernisation, according to Michael Vincetic, Kyndryl’s practice leader for cloud, core enterprise and zCloud in Australia and New Zealand.

The skills issue stems from a persistent shortage of people with mainframe – especially Cobol – expertise. Addressing this gap reduces the risks associated with supporting legacy applications.

“That’s attracted a lot of media attention lately with things like the Anthropic announcement,” said Vincetic, referring to claims that Claude Code can automate much of the work needed to translate systems written in Cobol into modern languages.

However, as Gartner distinguished vice-president analyst Manjunath Bhat pointed out: “There is very little value merely in porting code from one language to another without modernising the architecture and infrastructure. It defeats the purpose, because we don’t benefit from the scalability and reliability benefits of modern architecture patterns.

“The other reason is that modernising applications using composable and modular architectures helps adopt proven software engineering practices such as independent testability and independent deployment. These practices reduce the blast radius of changes and therefore minimise associated risks.”

According to Anthropic, Claude Code does much more than simply translate languages. The company claims the AI understands dependencies, preserves business logic while modernising to current frameworks, and generates both unit tests and modern documentation from legacy code.
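To illustrate the kind of output such tooling aims for, consider a hypothetical sketch (not Anthropic's actual output): a Cobol paragraph computing an interest charge is translated into an equivalent modern function, with a unit test generated alongside it to verify that the business logic survived the conversion.

```python
# Hypothetical example. A Cobol paragraph such as:
#   COMPUTE WS-INTEREST = WS-BALANCE * WS-RATE / 100
#   IF WS-BALANCE > 10000 ADD WS-SURCHARGE TO WS-INTEREST
# translated into an equivalent Python function, preserving the business logic.

def monthly_interest(balance: float, rate_pct: float, surcharge: float) -> float:
    """Interest charge: balance * rate%, plus a surcharge on large balances."""
    interest = balance * rate_pct / 100
    if balance > 10_000:
        interest += surcharge
    return interest

# A generated unit test verifying the preserved rule at both branch outcomes.
def test_monthly_interest():
    assert monthly_interest(1_000, 5, 25) == 50.0      # below the threshold
    assert monthly_interest(20_000, 5, 25) == 1_025.0  # surcharge applied

test_monthly_interest()
```

The value of the generated test is that it pins down the original behaviour, so later architectural changes can be verified against it.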

“Agentic AI is already playing a major role in code modernisation projects from the most mundane to the most complex,” said Bhat. “When it comes to mundane activities, think of auto-implementing the system using modern design patterns, creating cloud landing zones, auto-generating new code as well as creating the tests and deployments needed to verify if the functionality works.”

A related benefit of agentic AI is that it can provide a way to operate this mainframe-centric infrastructure using natural language instructions, Vincetic pointed out.

As for cost, “the unit cost and commercial models for running on traditional infrastructure have shifted dramatically with the advent of new technologies,” he said.

Previously, trying to understand the logic within and around legacy systems was often more expensive than rewriting them entirely. But, Vincetic added: “AI has flipped that script now, and provided the capability to understand some of that detailed logic and interdependency that’s built up over, in some cases, tens of years.”

Bhat agreed: “AI becomes more useful when we use it for semantic conversion rather than just syntactic conversion, by mapping out the underlying data flows and therefore ensuring that the new system works as expected.


“Think of AI-assisted complex activities as the ability to explain what the code does, which parts of a complex system should be modernised first, which parts are most risky, and what the interdependencies are – both architectural dependencies and inter-team dependencies. These activities are part of what we might consider ‘discovery’ in the planning process. The more insights we glean using AI at this stage, the more prepared and seamless the downstream aspects of modernisation will be.”

If an organisation’s goal is to increase agility, that probably means moving at least parts of the wider system onto a hyperscaler, Vincetic suggested, allowing them to take advantage of evolving capabilities in areas such as data management and analytics.

When an organisation runs a mix of legacy and contemporary infrastructure, there is a risk of “two-speed IT”, where the former operates in a rigid, waterfall manner and the latter in a dynamic, agile way. Running multiple operations capabilities usually incurs more costs and hampers speed to market, especially during digital transformation initiatives, he warned.

Banking and government are two of the primary sectors focusing on modernisation. Banks are being driven by ongoing digitalisation projects, while governments want to improve the quality of citizen services.

Kyndryl’s 2025 State of mainframe modernisation survey found that among the 80% of organisations that shifted their mainframe modernisation strategies over the past year, 43% are modernising more on the mainframe, 34% are integrating more with the cloud, and just 16% are moving more applications off the mainframe. Notably, only one of the 500 respondents planned to move off the mainframe entirely.

However, the predominant driver for modernising code is to eventually move it off the mainframe, Vincetic noted. Kyndryl’s survey found that 98% of respondents are moving some applications off the mainframe, migrating an average of 28% of their workloads to other platforms. Conversely, 56% are increasing their overall use of mainframes, in part by positioning them as the centrepiece of a hybrid environment.

Whichever strategy is in play, Vincetic suggested there are three key elements to modernisation: modernising the infrastructure (e.g. moving from a mainframe to a public cloud), modernising the operations capabilities (to cope with the characteristics of the new environment), and modernising the overarching operating model.

Extracting value from modernisation requires capabilities such as code conversion, but these only address specific aspects of the project. The real trick is understanding the context and interdependencies within a complex ecosystem like a mainframe.

Agentic AI can perform much of the code conversion, but more importantly, it can untangle legacy systems to determine business rules and other characteristics. What’s more, “it can do it in probably about half as much time… at a high degree of quality,” said Vincetic.

A mainframe application typically involves a set of data flows and integrations built over many years, alongside strict controls around data and policy management. Consequently, “the expert in the loop is still very critical,” he added.

These controls cover regulatory compliance, availability, and the disaster recovery requirements set by the organisation.

“If you really want to extract full value, you need to fundamentally re-architect by and large, which then brings AI to the table,” Vincetic said.

Mainframes have survived for many years because they are highly secure, incredibly good at processing high-volume transactions, and highly available.

“There are certain workloads that absolutely are best placed to reside on a mainframe for those intrinsic capabilities,” he explained. “But that’s always countered by pressures from the market to become more agile, to present more data, and to leverage digital channels.”

Therefore, it is a balance between preserving the intrinsic characteristics that make the mainframe so effective, and providing citizens or customers with the modern digital capabilities they expect.

Modernising a system carries the implied goal of achieving equal or better capabilities than before, and that requires human expertise.

Modernisation isn’t an all-or-nothing endeavour. Addressing one workload at a time is a practical approach, but understanding the context of that workload and the overarching drivers for modernisation is absolutely critical for success, Vincetic concluded. While AI can assist, expert guidance remains essential.
