Unlike digital-first organisations, traditional businesses have a wealth of enterprise applications built up over decades, many of which continue to run core business processes.
In this series of articles we investigate how organisations are approaching the modernisation, replatforming and migration of legacy applications and related data services.
We look at the tools and technologies available encompassing aspects of change management and the use of APIs and containerisation (and more) to make legacy functionality and data available to cloud-native applications.
This post is written by Mathias Golombek in his capacity as CTO at Exasol, a company known for its in-memory analytics database that promises to help customers become ‘truly data-driven’ and transform the way business is carried out on-premises, in the cloud, or both.
Golombek writes as follows…
A shift in the data landscape
The platforms, processes and (application) programs that form the total IT landscape are, of course, constantly shifting. But within all that change, data itself hasn't fundamentally changed i.e. the base-level binary 1s and 0s are still there.
What has changed is the database.
The nature of the database itself has had to adapt to the frenetic workflows of contemporary business.
In-memory, parallelized relational database systems, architected for cloud-native deployment across strategically aligned compute clusters, have shown us that there are more efficient ways to process terabytes of data as we scale upwards to petabyte-level business.
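The scatter/gather shape of that parallelized architecture can be sketched in a few lines of plain Python. This is an illustration only, not how any real MPP engine is implemented: partition the in-memory data, aggregate each partition in a separate worker, then merge the partial results.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    # Each worker aggregates only its own in-memory partition.
    return sum(partition)

def parallel_sum(rows, workers=4):
    # Split rows into roughly equal partitions, aggregate each in
    # parallel, then merge the partial results -- the same
    # scatter/gather shape an MPP database uses across a cluster.
    size = max(1, len(rows) // workers)
    partitions = [rows[i:i + size] for i in range(0, len(rows), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, partitions))
```

The merge step works here because summation is associative; the same pattern extends to any aggregate that can be combined from partial results.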
But next-generation enterprise data warehousing and advanced analytics through Business Intelligence (BI) acceleration isn't always a plug-and-play affair for the companies looking to implement these technologies. As efficient as the new breed of data platforms is in itself, it is the customer's willingness to open up and democratise data across their business operations that sometimes poses the biggest challenge.
Drive to data-driven democracy
All too often we find that organisations take a disjointed approach to data strategy at an internal level. Enterprise organisations large and small understand that cloud computing can be a facilitating route to new efficiencies (and by that I mean faster query response times, more agility and a wider reach for BI in general), but they are held back by archaic workplace cultures that limit their potential to migrate IT infrastructure to the cloud.
We often say that you can't be a data-driven business if your teams simply can't work with data… and by that I mean get access to data, interact with data, apply data at the coalface of the business and extrapolate new business insights from data in live, real-time working environments.
In-memory, pure & true
What we have built is a next-generation database architecture that is extremely high-powered for analytics. We believe we are the only vendor with a pure and true in-memory offering in the database arena (other vendors' offerings in this space are altogether more 'stitched together')… and so this offers new ways of working with new data streams because of the immediacy that in-memory provides.
That immediacy is key to enabling modernised data democracy i.e. if you can't get the information to the right people at the right time for the right use cases, then data democracy falters or even breaks down.
With this kind of power, organisations can ask more questions of their data. They can also ask more complex questions, go back further and question more datasets… and all with drastically reduced latency. This advancement drives business towards the creation of what we would call 'operational BI' i.e. a foundation that you can really run a company on day-to-day, based on automated decision processes.
From analytics to cloud analytics
People are now wondering how they should consider the shift from analytics to cloud analytics.
I believe that cloud architecture is not really a game changer in some senses. In so many ways it’s really just a different way of providing infrastructure… and to start with, the cloud was slower than many on-premises installations anyway.
What cloud gives us is the ability not to have to pre-define what infrastructure you will need at the start. Cloud provides a key pillar of agility… [it lets you play like a child in a sandbox to start with and then change what you're doing]… and now we see some companies pulling back from cloud for certain elements of their deployment and using more hybrid systems. Public cloud can be very expensive (even with reserved instances), so hybrid gives more cost-conscious options for core systems, and also for data security and governance.
Departmental purchasing means that different departments can buy different parts of cloud as and when needed. The trend towards fully managed services facilitates this and allows business stakeholders to circumvent central IT. But this can relinquish some level of management control and create shadow IT, so there is a trade-off.
Unifying BI and AI
With Exasol, you can install any programming language and run it, in addition to the standard SQL interface of the database. We also provide the ability to run analytics in-memory and use the data where it is, without having to ship any element of data out of the data warehouse. This way, we can consolidate and operationalise data science. This, if you will, is our expression of IT modernisation.
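Exasol's actual UDF framework is far richer than this, but the 'bring the computation to the data' idea can be sketched in plain Python. Everything here is invented for illustration (the `InMemoryStore` class, `register_udf`, the sample rows): the analytics function is registered with the store and executed where the rows live, so no data is shipped out first.

```python
class InMemoryStore:
    """Toy stand-in for an in-memory data platform (illustrative only)."""

    def __init__(self, rows):
        self.rows = rows          # data held in memory, never exported
        self.udfs = {}

    def register_udf(self, name, fn):
        # The analytics code moves to the data, not the other way round.
        self.udfs[name] = fn

    def select(self, udf_name):
        # Apply the registered function row by row, inside the store.
        fn = self.udfs[udf_name]
        return [fn(row) for row in self.rows]

store = InMemoryStore([{"revenue": 120, "cost": 80},
                       {"revenue": 200, "cost": 150}])
store.register_udf("margin", lambda r: r["revenue"] - r["cost"])
print(store.select("margin"))   # -> [40, 50]
```

The design point is the direction of movement: a small function travels to where terabytes of data sit, instead of terabytes of data travelling to where the function runs.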
Pulling data in and out of the data warehouse typically means that most organisations find they have failed to update the data warehouse often enough. Doing it our way means you can work on raw data (such as log files and other information that comes from the data lake completely unparsed and uncategorized) and get the job done faster and more accurately. Some 80% of a data scientist's job is spent on data preparation, or so it is said.
The people factor
We’ve spoken a lot about technology, but when it comes to people… I think that you need three key things:
- Backing from the CEO at the company level.
- Super-technical subject matter experts and visionaries at the lower level… they need to be able to judge which is the best data technology for a specific problem, instead of just promoting the stuff they’ve already worked with.
- Between the two, you also need a role that can bridge between tech and business; this might be the head of BI, and perhaps also business analysts.
I think that the whole data management market right now is so heterogeneous that the days of the big vendors offering one major platform for these tasks are behind us (and they themselves have bought so many smaller firms that they have not stitched the internal parts together)… so there are no best-of-breed solutions available in this sense. If you want to apply these kinds of technologies now, you have to be extremely agile from the start and accept that you will be changing your stack in the near future after you go live.
Where do we start?
So, by now you’re thinking (I hope), okay… so where should customers start on the road to data democracy?
Community Edition software options are a key route to running free test use cases and delivering a Proof of Concept to show the C-suite and the rest of the organisation's stakeholders that data-driven business is possible if we take a thoroughly democratic route to delivering information to everybody at exactly the right point for them to use it.
Driving towards data-driven democracy is a smart choice. After all, who wouldn't vote for that?