Unlike digital-first organisations, traditional businesses have a wealth of enterprise applications built up over decades, many of which continue to run core business processes.
In this series of articles we investigate how organisations are approaching the modernisation, replatforming and migration of legacy applications and related data services.
We look at the tools and technologies available, encompassing change management, the use of APIs, containerisation and more, to make legacy functionality and data available to cloud-native applications.
Simon Spring is senior director at WhereScape, a company known for its data warehouse automation technologies, which help customers design, develop, deploy and operate data infrastructure faster.
Spring and his team reckon that, as part of the general drive for IT modernisation, data warehouse automation can help organisations lessen the risk of moving core business applications and analytical infrastructure into the cloud, while retaining flexibility.
In terms of how cloud migration works in the real world, Spring says he sees companies either moving straight to Platform-as-a-Service (PaaS), or, alternatively, moving to Infrastructure-as-a-Service (IaaS) as an interim step.
“You have your core business applications and then your analytical infrastructure, so moving those to the cloud is an obvious way of modernising your set up,” said Spring.
1) Companies can migrate their on-premises analytical databases into the cloud, running on the same infrastructure but hosted as Infrastructure-as-a-Service, e.g. Microsoft SQL Server running on an Azure VM. The company then gets to take full advantage of cloud pricing and flexibility.
2) Companies can then migrate databases onto a cloud-native Platform-as-a-Service, e.g. Azure SQL Database. He says his team is seeing companies do this with both application and analytics databases.
Spring writes as follows from this point forward…
Using automation software allows companies to regenerate their integration and ETL code to run on (and take advantage of) the new cloud-native platform. For example, one of our customers, a major insurance company, had a lot of old IBM infrastructure and was waiting for Snowflake. Building the data warehouse on DB2 meant the insurance company was safe in the knowledge that, when the cloud environment became available, it could regenerate all of the code, or logic, as Snowflake-native, optimised code.
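To illustrate the regeneration idea, here is a toy sketch of metadata-driven code generation: one logical table definition rendered as platform-specific SQL, so retargeting from DB2 to Snowflake becomes a regeneration step rather than a rewrite. This is not WhereScape's actual tooling; the dialect templates, table names and columns are all invented for the example.

```python
# Toy sketch of metadata-driven ELT code generation. The same logical
# definition is rendered as dialect-specific SQL, so moving from DB2 to
# Snowflake means regenerating, not hand-rewriting. Illustrative only.

DIALECTS = {
    "db2": {
        "current_ts": "CURRENT TIMESTAMP",                      # DB2 special register
        "create": "CREATE TABLE {table} AS ({select}) WITH DATA",
    },
    "snowflake": {
        "current_ts": "CURRENT_TIMESTAMP()",                    # Snowflake function
        "create": "CREATE OR REPLACE TABLE {table} AS {select}",
    },
}

def generate_load_sql(table: str, source: str, columns: list[str], dialect: str) -> str:
    """Render one logical load definition as SQL for the chosen platform."""
    d = DIALECTS[dialect]
    select = "SELECT {cols}, {ts} AS load_ts FROM {src}".format(
        cols=", ".join(columns), ts=d["current_ts"], src=source)
    return d["create"].format(table=table, select=select)

# Same logical definition, two target platforms:
print(generate_load_sql("dw.customer", "staging.customer", ["id", "name"], "db2"))
print(generate_load_sql("dw.customer", "staging.customer", ["id", "name"], "snowflake"))
```

Real automation tools hold far richer metadata (lineage, dependencies, scheduling), but the principle is the same: the logic lives in metadata, and the platform-specific code is a generated artefact.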
Beyond a 10-year view
We are still educating folk around platform portability. Perhaps 20 years ago, people would have invested in Oracle with a 10-year view. For cloud, do we need a 10-year view? No, because of data warehouse automation. The flexibility means you could be on Azure this year but Google Cloud next year, so vendor lock-in is less of an issue.
The mechanics here borrow from CI/CD: with automation you can spin up virtual machine test environments, develop and test code, then spin them down again. And because cloud infrastructure can be provisioned entirely from the command line, this is easy and cheap to do.
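As a sketch of that spin-up/spin-down lifecycle, the snippet below assembles the kind of Azure CLI invocations a pipeline step might run. The resource-group and VM names are invented, and the commands are only built as strings here (not executed), so the shape of the pattern is visible without touching a real subscription.

```python
# Sketch of an ephemeral test-environment lifecycle driven from the
# command line, as a CI/CD pipeline step might do it. The commands are
# standard Azure CLI ('az') invocations; resource names are invented.
# We only assemble the command lines here rather than executing them.

def ephemeral_env_commands(resource_group: str, vm_name: str) -> list[str]:
    return [
        # Create an isolated resource group so teardown is a single command.
        f"az group create --name {resource_group} --location westeurope",
        # Spin up a throwaway VM to build and test the generated code on.
        f"az vm create --resource-group {resource_group} --name {vm_name} "
        f"--image Ubuntu2204 --size Standard_B2s",
        # ... run the test suite against the environment here ...
        # Tear the whole environment down again: you pay only for the test run.
        f"az group delete --name {resource_group} --yes --no-wait",
    ]

for cmd in ephemeral_env_commands("rg-dw-test", "vm-dw-test"):
    print(cmd)
```

Deleting the resource group removes everything created inside it, which is what makes the environment genuinely disposable.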
As adoption of the public cloud continues to soar, it’s only logical for organisations to look at their data warehousing requirements and wonder if the cloud will help them deliver data infrastructure projects to the business faster.
That’s why companies are turning to the likes of Snowflake, a data warehouse built for the cloud that lets organisations store and analyse all of their data in one location.
But, savvy organisations aren’t stopping there. They are automating their way to data warehouses in the cloud to speed up time to value and limit cost and risk.