Veritas aims at autonomous backup discovery and provisioning
Veritas bets on a backup platform that can discover all enterprise data – on-premise and in the cloud – and make its own decisions about the optimal place to store copies
Data protection heavyweight Veritas is aiming to rework its core software offering, NetBackup, around autonomous functionality.
Its Autonomous Data Management project will see the development of capabilities such as autonomous provisioning, in which the backup software seeks out data to be protected across on-premise and cloud capacity and stores it in the optimal tier.
In practice, Autonomous Data Management would assign backup data to different tiers of storage, from those close to production for rapid restore and use, to those archived off for long-term retention. The artificial intelligence (AI) would rate the importance of data according to its access requirements, disinfect corrupted files where necessary, and restore files to the appropriate place for future use.
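The tiering logic described above can be pictured with a minimal sketch. To be clear, this is not Veritas's implementation, and all names and thresholds here are hypothetical; it simply illustrates the idea of deriving a storage tier from observed access metrics rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass
class BackupItem:
    name: str
    accesses_per_day: float   # observed restore/read frequency
    days_since_access: int    # time since the copy was last needed

def choose_tier(item: BackupItem) -> str:
    """Assign a storage tier from simple access metrics.

    Hypothetical policy: frequently used copies stay on fast storage
    close to production for rapid restore; cold data is archived off
    for long-term retention.
    """
    if item.accesses_per_day >= 1:
        return "performance"   # close to production, rapid restore
    if item.days_since_access <= 90:
        return "standard"      # warm, mid-cost tier
    return "archive"           # long-term retention

print(choose_tier(BackupItem("erp-db", 5.0, 1)))      # performance
print(choose_tier(BackupItem("old-logs", 0.0, 400)))  # archive
```

An autonomous system in the sense Veritas describes would go a step further, continuously re-deriving thresholds like these from production metrics instead of having an administrator fix them.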
“The problem for CIOs is the struggle to keep up with all the applications deployed in the enterprise, and that is aggravated by the proliferation of data in a multitude of cloud services,” said Doug Matthews, senior VP of products at Veritas.
“If you don’t understand the data you manage, you won’t be able to define the rules to protect it. Our aim is to eliminate the effort needed to manage data,” added Matthews during an interview with Computer Weekly’s sister website in France, LeMagIT, during a recent IT Press Tour event.
NetBackup 10 was launched in March 2022 with numerous enhancements that included some policy-based automation for provisioning, notably in AWS and Azure environments. However, full autonomisation of this kind of functionality will not be complete until the next version, due in 2024.
Matthews clarified the distinction. “With automation, you define the rules and they apply themselves,” he said. “In autonomic mode, the rules are deduced according to metrics derived from your production workloads and which are constantly reassessed. All you have to do is label your resources via the interface and the software does the rest.”
Currently, Veritas has only presentations and small-scale prototypes to demonstrate its progress. But it said the challenges are so important to its customers that it plans to prepare them for an early switch to the era of autonomic processes.
Veritas is a long-established giant among backup product suppliers, with very large accounts well represented among its client list. These include the biggest global banks, telcos and pharmaceutical companies, all of which are heavily regulated sectors where letting a machine work its way through enterprise data is not necessarily seen as best practice.
“The reality of increasingly complex cloud deployments means that enterprises are putting themselves in danger if they rely on managing backup manually,” said Matthews. “Now, only AI can guarantee that you won’t restore corrupted data. In two years, none of our customers have lost data after a cyber attack, thanks to NetBackup. We are moving towards autonomisation to maintain that reputation.”
The company believes it can convince its customers to look beyond any regulatory concerns. Protecting against ransomware is a key argument it deploys, but eco-responsibility is another. Veritas cited a US study which calculated that storing a petabyte of data in the cloud for a year emits 3.5 tonnes of CO2. Addressing this concern, it claimed its algorithms can significantly reduce stored data volumes, cutting CO2 emissions beyond what an IT team could achieve manually and without excessively complex management.
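The arithmetic behind that claim is straightforward. The sketch below uses the 3.5 tonnes-per-petabyte-per-year figure cited from the US study; the 40% volume reduction in the usage example is purely an illustrative assumption (the article only says volumes can be reduced "significantly"), not a figure from Veritas.

```python
# Cited figure: ~3.5 tonnes of CO2 per petabyte stored in the cloud per year.
CO2_TONNES_PER_PB_YEAR = 3.5

def annual_co2_tonnes(petabytes: float, reduction_ratio: float = 0.0) -> float:
    """Estimated yearly CO2 for stored backup data.

    reduction_ratio is the hypothetical fraction of volume removed by
    deduplication/compression (e.g. 0.4 means 40% less data stored).
    """
    stored = petabytes * (1 - reduction_ratio)
    return stored * CO2_TONNES_PER_PB_YEAR

print(annual_co2_tonnes(10))       # 35.0 tonnes with no reduction
print(annual_co2_tonnes(10, 0.4))  # 21.0 tonnes at an assumed 40% reduction
```

On these assumptions, trimming 40% of a 10 PB backup estate would save roughly 14 tonnes of CO2 a year.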
Autonomous Data Management is likely to allow savings to be made, it said, not only in terms of the ability to purchase capacity in smaller volumes, but also by being able to select from better cloud tariffs.
According to Matthews, AI is the key to data protection that can adapt to changing circumstances in real time and instantly call up the correct response.
Having said that, AI is efficient only if it is trained sufficiently. Veritas said it plans to build a data lake of metadata that references the ways its customers protect their data, and that will serve as training data for its machine learning engine. Matthews stressed that no customer data will leave its site to feed the data lake.
“We have worked for two years on this AI engine and as far as we know, we are the only one to have gone in the direction of an autonomous system,” said Matthews. “That means we are the only one that will offer such a solution to enterprise customers.”
He predicted that Veritas’s turnover would grow by 8% to 10% a year when Autonomous Data Management has been fully productised, probably as part of NetBackup 11.