Unlike digital-first organisations, traditional businesses have a wealth of enterprise applications built up over decades, many of which continue to run core business processes.
In this series of articles we investigate how organisations are approaching the modernisation, replatforming and migration of legacy applications and related data services.
We look at the tools and technologies available, encompassing change management and the use of APIs and containerisation (and more), to make legacy functionality and data available to cloud-native applications.
Shanks writes as follows…
Platform, process, people
When a business wants to make a major change to its infrastructure, for example by migrating its systems to the cloud, the technology choice is as important as the processes and people put in place to set it up and manage it afterwards.
The technology becomes ever more complicated and abstract, but fortunately there are solutions that take that complexity away and minimise the amount of knowledge required to operate it.
With that said, there’s no substitute for meticulous planning and setting realistic time frames; the consequences of getting it wrong are high.
But there can often be a knowledge gap at the top of large organisations.
The C-suite may be unaware of, or may underestimate, the amount of work and time needed to make modernisation happen. There is a lot to consider: security, existing applications, current processes, costs, tool selection, application dependencies [and their related connections inside databases] and much more. The result can be delays, cost overruns or, in the worst cases, serious security breaches costing the business millions, not to mention the lost trust of its customers.
There are a number of ways to mitigate this risk.
First, start small with a pilot to prove the value of the migration. Choose the right project for the pilot, then decide what you want to prove and how you will measure it, i.e. measures of scalability, reliability, speed of feature development and so on. Next, select next-generation technology that matches those objectives. Once the pilot has succeeded, consider what else needs to be improved; for example, how secure is the solution you have tested?
Companies also need to keep portability in mind – specifically, avoiding vendor lock-in and retaining the ability to change the type of database deployed. So if you are looking at a database-as-a-service offering, you need to make sure the data application itself is designed for portability and flexibility.
It’s important to choose the right type of libraries, such as Object Relational Mapping (ORM) libraries (gorm in Go, for example) that support multiple SQL backends and allow for much greater portability. Businesses also need to consider the security around the service, i.e. how is data managed by the cloud provider, how do you get audit visibility on access, can you encrypt the data appropriately and where do the encryption keys come from?
Also ask yourself, what country is the data going to need to reside in if there are compliance or legal constraints?
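The portability point above comes down to keeping business logic decoupled from any one database backend – the same idea that makes an ORM such as gorm able to sit on top of several SQL dialects. A minimal sketch of that principle in plain Go (standard library only, with an in-memory store standing in for a real database; all type and function names here are illustrative, not from any particular library):

```go
package main

import "fmt"

// UserStore abstracts persistence so the backing database can change
// without touching business logic. An ORM that supports multiple SQL
// backends gives you the same separation behind a single API.
type UserStore interface {
	Save(name string) int
	Get(id int) (string, bool)
}

// memoryStore is a stand-in backend; in production this could be a
// Postgres- or MySQL-backed implementation selected by configuration.
type memoryStore struct {
	next  int
	users map[int]string
}

func newMemoryStore() *memoryStore {
	return &memoryStore{next: 1, users: map[int]string{}}
}

func (m *memoryStore) Save(name string) int {
	id := m.next
	m.next++
	m.users[id] = name
	return id
}

func (m *memoryStore) Get(id int) (string, bool) {
	name, ok := m.users[id]
	return name, ok
}

func main() {
	var store UserStore = newMemoryStore() // swap the backend here
	id := store.Save("ada")
	name, _ := store.Get(id)
	fmt.Println(id, name)
}
```

Because callers only ever see the interface, swapping the deployed database type later means writing one new implementation rather than rewriting application code.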
Another point to bear in mind is how patches, updates and test cycles are managed, and whether the cost benefits are in line with the operational benefits. A best-of-breed hybrid model can help manage costs – for example, using containers in your local development environment and only using Database-as-a-Service (DaaS) as you move closer to production – and can be a key route to getting the extra operational features the organisation needs.
Appvia describes itself as a delivery platform for DevOps teams to develop and release at scale. Its products are designed to reduce infrastructure costs and give developers the freedom to develop. The company also has specialist knowledge of cloud and DevOps strategies, with a concerted focus on security, testing and monitoring.