
Applications: Combining the old with the new

We look at how organisations can integrate legacy applications with newer digital platforms

Legacy applications are still extremely valuable to organisations, as they tend to offer the kind of security, reliability and resilience required. Innovation-fuelling data is also often stored in legacy databases. These tend to be systems of record and are used to support critical business functions, such as data processing.

But, says Ian Fairclough, vice-president of services for Europe, the Middle East and Africa (EMEA) at MuleSoft, when it comes to modern business requirements, legacy applications often fall short because they are less well suited to new consumer demands and less able to adapt to rapid change.

“Since abandoning legacy applications isn’t an option, organisations must find ways of modernising them to allow newer digital platforms and technologies to integrate with them more seamlessly,” says Fairclough. “They must unlock the data they contain and decentralise access, so that data can be used to enhance customer experiences more effectively.”

Tackling IT integration

In the past, organisations adopted point-to-point custom code integration, where developers effectively built a bridge between two systems to enable data to flow freely across the organisation to drive digital services.

But with today’s more complex IT systems, Fairclough says: “Often this results in tight coupling between applications, databases and devices, which creates close dependencies that are difficult to untangle, making change harder to implement and slowing the progress of digital initiatives and application modernisation.”
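As a rough illustration of that tight coupling, the Python sketch below shows a hypothetical point-to-point integration in which one application queries another system's database directly; the driver, file name and schema are all invented for illustration.

```python
# Hypothetical point-to-point integration: a web application reads straight
# from the billing system's database, so any schema change in billing breaks it.
import sqlite3  # stand-in for whatever database driver the billing system uses


def get_outstanding_balance(customer_id: int) -> float:
    """Sum a customer's open invoices directly from another system's tables."""
    conn = sqlite3.connect("billing_system.db")  # direct dependency on the other system
    try:
        row = conn.execute(
            "SELECT COALESCE(SUM(amount), 0) FROM invoices "
            "WHERE customer_id = ? AND status = 'OPEN'",
            (customer_id,),
        ).fetchone()
        return float(row[0])
    finally:
        conn.close()
```

Every application that embeds a query like this has to change in lockstep with the billing schema, which is exactly the dependency problem a shared API or integration layer is meant to remove.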

Many organisations see the cloud as a way to drive through application modernisation. The cloud enables them to adopt a cloud-native strategy, which limits the amount of software that needs to be on-premise. Software hosted on public infrastructure as a service (IaaS) can take advantage of the latest hardware, which helps organisations reduce their dependence on older hardware systems that may be reaching end of life.

But as Mark Cresswell, CEO of LzLabs, which specialises in modernising mainframe systems, points out: “It is conventional wisdom that mainframe legacy applications cannot participate in the new world of cloud-native computing; so much so that it is rare for commentators to question why this is.”

There are a few reasons why mainframe applications cannot easily be migrated to public cloud infrastructure. Cresswell says mainframe applications will not run on the underlying cloud hardware without significant refactoring and recompilation. “They are typically compiled into mainframe-specific machine code and the mainframe instruction-set architecture is substantially different from the x86 platforms that underpin almost all cloud services,” he says. “Legacy mainframe applications rely on infrastructure software to manage batch and online activity, data access and many other legacy mainframe features. Like the applications themselves, this infrastructure software is also tied to the physical mainframe hardware and will not run in a conventional x86 cloud environment.”


Another barrier to migrating mainframe systems is that the mainframe software development pipeline cannot support many of the rapid deployment features that cloud-native applications rely on, says Cresswell, and it is virtually impossible to spin up testing environments on mainframes without extensive planning. “There is just no support for large-scale, container-driven integration testing of traditional legacy applications after each merge of a code branch.”

Cresswell says much of the tech emerging in recent years to support cloud-native applications is simply not available within the legacy environment of mainframes. Without the ability to run application components using a containerised deployment model, many of the other cloud-native requirements are not achievable.
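For contrast, the sketch below shows the kind of container-driven integration test a cloud-native pipeline can run after every merge, here using the Python testcontainers library to start a disposable PostgreSQL instance; the library choice, image tag and schema are illustrative assumptions rather than a prescribed toolchain.

```python
# Minimal sketch: spin up a throwaway PostgreSQL container for one test run.
# Requires Docker plus the 'testcontainers' and 'sqlalchemy' packages.
from testcontainers.postgres import PostgresContainer
import sqlalchemy


def test_customer_table_roundtrip():
    with PostgresContainer("postgres:16-alpine") as postgres:
        engine = sqlalchemy.create_engine(postgres.get_connection_url())
        with engine.begin() as conn:
            conn.execute(sqlalchemy.text(
                "CREATE TABLE customers (id SERIAL PRIMARY KEY, name TEXT)"))
            conn.execute(sqlalchemy.text(
                "INSERT INTO customers (name) VALUES ('Ada')"))
            name = conn.execute(sqlalchemy.text(
                "SELECT name FROM customers")).scalar_one()
        assert name == "Ada"
    # The container is destroyed when the 'with' block exits.
```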

However, he says recent innovations have allowed for migration of applications, unchanged, to containerised open source systems where the fundamental mainframe infrastructure is faithfully reproduced in a software-defined manner in the cloud. “The application can then function exactly as it would in situ and the rest of the journey toward a cloud-native implementation unfolds,” he says. 

But simply moving a monolithic legacy mainframe application to a cloud environment does not make it cloud-native. For Cresswell, a key part of the rehosting process involves the replacement of legacy application programming interfaces (APIs) – which will not run on x86 servers or the cloud – with API implementations based on standard Linux facilities and cloud-native components. This implementation model allows for the exploitation of cloud-native components, he says.

MuleSoft’s Fairclough adds: “With an API strategy, it’s possible to modernise legacy applications while supporting immediate digital transformation needs – allowing enterprises to be open and ready for a whole new world of possibilities.” 
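As a minimal sketch of what such an API layer could look like, the Flask route below presents a legacy record through a clean REST interface so newer digital channels never touch the back-end system directly; the `legacy_gateway` module, its `fetch_account` function and the field names are hypothetical stand-ins for whatever adapter actually reaches the legacy application.

```python
# Minimal API facade over a legacy system of record (illustrative only).
from flask import Flask, jsonify, abort

import legacy_gateway  # hypothetical adapter that talks to the legacy application

app = Flask(__name__)


@app.get("/api/v1/accounts/<account_id>")
def get_account(account_id: str):
    """Translate a legacy record into a consumer-friendly JSON shape."""
    record = legacy_gateway.fetch_account(account_id)  # assumed to return a dict or None
    if record is None:
        abort(404)
    return jsonify({
        "accountId": record["ACCT_NO"],   # legacy field names are invented examples
        "balance": record["CURR_BAL"],
        "status": record["STATUS_CD"],
    })
```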

But creating modern APIs to access legacy systems is only one part of the application modernisation story. Rebecca Fitzhugh, director of developer relations at Rubrik, recommends treating APIs like software products. “To minimise the toil experienced by your API consumer, treat the API as a product – even if you are not billing the consumer for usage,” she says. “Consider the entire API lifecycle. This means having a clear progression into a semantically versioned API with a defined deprecation period and process. This will result in fewer headaches experienced by both your engineering teams and API consumers.”

API version control can help organisations reduce complexity and enable teams to make changes quickly. For any API change that may break applications, Fitzhugh urges developers to create a new version of the API. Changes that warrant a new version include anything that alters an existing part of the API, changes the response type or changes the response data format. Major version changes are probably unnecessary for enhancements that carry no risk of breaking existing applications, such as adding new endpoints or response parameters, she says.
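A minimal sketch of that versioning discipline, assuming a Flask-based API, might look like the following: the old response format stays available at /v1 with deprecation metadata, while the breaking change to the response data format ships as /v2. The routes, fields and sunset date are illustrative.

```python
# Illustrative semantic versioning of an API: /v1 is deprecated but still served,
# /v2 carries the breaking change (a different response data format).
from flask import Flask, jsonify

app = Flask(__name__)


@app.get("/v1/orders/<order_id>")
def get_order_v1(order_id: str):
    # Old shape: a flat amount in pence, kept alive through the deprecation window.
    response = jsonify({"id": order_id, "amount_pence": 1999})
    response.headers["Deprecation"] = "true"
    response.headers["Sunset"] = "Wed, 31 Dec 2025 23:59:59 GMT"  # example date only
    return response


@app.get("/v2/orders/<order_id>")
def get_order_v2(order_id: str):
    # Breaking change: the amount becomes a structured object, so it needs a new major version.
    return jsonify({"id": order_id, "amount": {"currency": "GBP", "value": "19.99"}})
```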

When customers interact with companies today, they connect to a host of back-end systems spanning mobile, desktop, call centres and stores. This connectivity needs to be seamless to enable a good customer experience. Ben Stopford, lead technologist, office of the CTO at Confluent, says: “Customers expect a single, joined-up experience, whether it is viewing payments and browsing catalogues, being guided by machine learning routines or interfacing with sensors they interact with in the world.” To achieve this, says Stopford, data needs to flow from application to application, microservice to microservice or datacentre to datacentre. Any firm that blends software, data and people together in some non-trivial way needs to tackle this problem head-on. 

He suggests IT decision-makers consider an event streaming system, which offers data infrastructure that connects and processes data in flight. Such IT infrastructure provides a real-time conduit to connect microservices, clouds and on-premise datacentres. “These architectures, whether at internet giants like Apple and Netflix or at the stalwarts of the FTSE 500, all have the same aim – to make the many disparate, but critical, parts of a software estate appear, from the outside, to be one,” he says.
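A minimal sketch of that pattern, assuming the confluent-kafka Python client and a broker at localhost:9092, might look like the following: one service publishes a customer event to a topic and a downstream service consumes it independently. The topic name and payload are invented for illustration.

```python
# Minimal event streaming sketch with the confluent-kafka client.
# Assumes a broker at localhost:9092 and a 'customer-events' topic.
import json
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce(
    "customer-events",
    key="customer-42",
    value=json.dumps({"event": "address_changed", "customerId": 42}),
)
producer.flush()

# Any downstream service (billing, CRM, analytics) can consume the same stream
# independently, which is what keeps the applications decoupled.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "billing-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customer-events"])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print("billing-service saw:", msg.value().decode("utf-8"))
consumer.close()
```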

Central nervous system

According to Stopford, event streams provide physical separation by acting as a kind of central nervous system that connects applications at the data layer while keeping them decoupled at the software layer. “Whether you’re building something greenfield, evolving a monolith or going cloud native, the benefits are the same,” he adds. 

Application modernisation initiatives tend to focus on application development, promoting modern ways of working and managing, such as agile, DevOps and open leadership. But John Kendrick, agile and digital transformation delivery specialist at DMW Group, and Akshay Anand, a product ambassador for IT service management at Axelos, say many organisations are now finding their change management processes are slow-moving, cumbersome and bureaucratic. “The agile teams’ immediate response is either to ignore or, in some cases, to reject change management processes entirely.”

Kendrick and Anand warn that such projects may lack details of how they will impact people and operations. “The general belief held by many change management professionals is that as teams get faster at doing things, the chances of slipping up and introducing a bug – or missing a key compliance requirement, say – also get higher,” they say. 

Wider cultural impact 

Beyond the software teams, application modernisation has a wider cultural impact across the organisation, and that impact is often one of the biggest factors in whether a modernisation project succeeds or fails. “Any work done around modernising IT has to either fit into the way the company works today, or a thorough culture change will be needed,” says Iain Chidgey, vice-president for EMEA at Sumo Logic.

Achieving that change can be much harder than any technology implementation unless people understand the value of the changes and can see how they will benefit them, he says. Organisations need to think about application modernisation beyond the realm of connecting information silos using techniques like API connectivity or event streaming. Instead, successful application modernisation initiatives should aim to improve business processes in a way that everyone can recognise as an improvement to their working practices. This, says Chidgey, requires a pervasive approach to data, covering people, processes and technology together.
