Quest argues for adaptive data controls in holistic heaven

The Computer Weekly Developer Network talks to John Pocknell in his role as senior product manager at Quest, as part of a mini-series of posts related to the rise of what we have called the ‘holistic’ application.

Not perhaps a formally defined technology industry term, the holistic application is an app that has arisen in the age of cloud with containerisation, microservices and compartmentalisation of discrete components at its core.

Quest’s product set spans data protection, database management, endpoint systems management, migration and consolidation tools plus performance monitoring. Given this DBA and sysadmin-friendly spread, what is the firm’s position on the holistic application proposition, where a continuous stream of interdependent application parts must now be gracefully engineered into one single stream of fluid operations?

Pocknell says that in the realm of databases, more businesses are realising the importance of implementing processes and tooling to support continuous integration and deployment.

“But as the 7 Cs of DevOps show, there will be increasing demand for continuous monitoring, testing and other practices in order to be truly holistic. Continuous data replication will also be necessary to meet the demands of businesses running multiple database platforms in hybrid environments. Supporting a holistic application requires the different parts of the DevOps infrastructure, including the database, to become interdependent,” said Quest’s Pocknell.

Pocknell details some examples of database solutions designed to be adaptive as follows:

  • Continuous Monitoring – the ability to quickly detect changes in the performance behaviour of databases inside the DevOps pipeline. By raising automatic alarms when performance deviates from known baselines, operations staff can diagnose the root cause of a change and trigger a change request (see the first sketch after this list).
  • Continuous Testing – the ability to automatically run a performance test, based on a known production transactional workload, against a test or staging database to establish whether planned changes will scale in production. This can be provided via an open API so that testing can be called programmatically as part of an automated deployment process (see the second sketch below).
  • Continuous Data Replication – the ability to automatically replicate changes to databases which need to be kept in sync. Target databases may be on-premises or in the cloud and may even run on a different database platform (see the third sketch below).
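To make the first of those ideas concrete, here is a minimal sketch of the baseline-comparison approach described above; the metric name, baseline value, tolerance and alerting action are illustrative assumptions rather than any particular vendor’s implementation.

```python
# Minimal sketch of baseline-comparison monitoring; the metric, baseline value
# and tolerance below are illustrative assumptions, not any product's defaults.
from dataclasses import dataclass


@dataclass
class Baseline:
    metric: str           # e.g. average query latency in milliseconds
    expected: float       # known-good value recorded during a baseline period
    tolerance_pct: float  # allowed deviation before an alarm is raised


def check_against_baseline(baseline: Baseline, observed: float) -> bool:
    """Raise an alarm (here, just a print) if the observed value deviates
    from the baseline by more than the allowed tolerance; return True if it did."""
    deviation_pct = abs(observed - baseline.expected) / baseline.expected * 100
    if deviation_pct > baseline.tolerance_pct:
        # In a real pipeline this would page operations staff or open a change request.
        print(f"ALARM: {baseline.metric} deviated {deviation_pct:.1f}% from baseline")
        return True
    return False


# Example: average query latency baselined at 120 ms with a 25% tolerance.
check_against_baseline(Baseline("avg_query_latency_ms", 120.0, 25.0), observed=180.0)
```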
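For the continuous testing idea, the second sketch shows how a deployment pipeline step might call an open testing API programmatically; the endpoint, payload and response fields are hypothetical stand-ins, not Quest’s actual interface.

```python
# Hypothetical call to an open performance-testing API from a deployment step;
# the URL, JSON fields and 'passed' flag are assumptions for illustration only.
import requests


def run_performance_test(api_base: str, workload_id: str, target_db: str) -> bool:
    """Replay a known production workload against a test or staging database
    and report whether the planned changes met their scaling criteria."""
    response = requests.post(
        f"{api_base}/performance-tests",
        json={"workload": workload_id, "target": target_db},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("passed", False)


# Called as a gate in an automated deployment, for example:
# if not run_performance_test("https://testing.example.internal", "prod-replay", "staging-db"):
#     raise SystemExit("Planned changes did not scale - blocking deployment")
```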
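And for continuous data replication, the third sketch is a deliberately simplified change-replication loop using in-memory SQLite databases as stand-ins for the source and target; a real replicator would capture changes from the database’s own log and translate them for a different target platform, which is glossed over here.

```python
# Deliberately simplified replication loop; in-memory SQLite databases stand in
# for a real source and target, which could be different platforms entirely.
import sqlite3


def replicate_changes(source, target, last_seen=0):
    """Copy change_log rows newer than last_seen from source to target and
    return the new high-water mark so the next poll resumes from there."""
    rows = source.execute(
        "SELECT id, table_name, payload FROM change_log WHERE id > ? ORDER BY id",
        (last_seen,),
    ).fetchall()
    for change_id, table_name, payload in rows:
        # A real replicator would translate each change for the target platform here.
        target.execute(
            "INSERT INTO change_log (id, table_name, payload) VALUES (?, ?, ?)",
            (change_id, table_name, payload),
        )
        last_seen = change_id
    target.commit()
    return last_seen


# Usage with throwaway databases standing in for source and target.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE change_log (id INTEGER PRIMARY KEY, table_name TEXT, payload TEXT)")
src.execute("INSERT INTO change_log VALUES (1, 'orders', 'status=shipped')")
src.commit()
print(replicate_changes(src, dst))  # prints 1: one change replicated
```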

“Some of the above processes are already used in traditional monolithic applications to reduce the administrative effort of managing multiple environments and enable IT to respond to unplanned changes faster,” added Pocknell.

From the Quest perspective then, there’s clearly no holistic without adaptive data management if we are going to achieve all-the-time continuous uptime. That’s definitely too long for a T-shirt; we need to work on this.
