Cloud-centric, software-defined information management company Veritas Technologies reminds us that some 2.5 quintillion bytes of data are being produced every day in 2017.
The company is now strongly focused on digital compliance and data visibility as it works to refine and develop its tools.
“Developers don’t want to think about storage,” asserts Mike Palmer, executive VP and chief product officer at Veritas. The job, he suggests, is to give the coding community a means of knowing that the storage power is there and a means of indexing the information held within it.
Storage not snorage
So yes, this is data storage, but Veritas goes rather further than crusty old tapes and disks… this is cloud-controlled, software-defined data management that embraces a notion of not just information technology (IT), but also information (on its own, as an entity) and technology (as the platforms, tools and functions that look after our information).
Veritas CEO Bill Coleman spoke at the firm’s 2017 ‘Vision’ conference and exhibition to explain how his firm has grown since becoming a ‘de-merged’ entity, distinct from Symantec.
NOTE: Veritas spent roughly a decade, from 2005 to 2016, as part of Symantec.
Coleman spoke of cloud-native intelligent analytics and explained that his firm is providing a Software Development Kit (SDK) so that developers can code to what is an information management platform serving cloud-centric applications.
Putting secondary data first
Veritas also welcomed product chief Palmer to the stage for a keynote session. Speaking of why backup technologies are so important, Palmer noted that Uber, Lyft and others have created the so-called sharing economy. This, in turn, has led to the creation of the ‘backup estate’: essentially the bulk of data that could be used for competitive advantage (inside the new cloud-centric applications that developers create today)… but often isn’t, as it simply sits wasting time and money as so-called ‘secondary data’.
A new notion of secondary data is needed here…
What is secondary data?
Secondary data is often defined as research data that has previously been gathered (and can be accessed by researchers), whereas primary data is collected directly from its source. In the sense of information technology, we can extend the definition: secondary data does come from its source, but it is unused user data, extraneous data lake data, unstructured data, peripheral IoT and edge computing data. Essentially, it is every form of additional data that is not being driven into live production systems for competitive advantage inside the business.
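To make that extended definition concrete, here is a minimal sketch (the dataset names, the `feeds_production` flag and the categories are purely illustrative assumptions, not any Veritas API) of how an estate might be partitioned into primary and secondary data along the lines described above:

```python
# Illustrative only: classify datasets as 'primary' or 'secondary' using the
# article's extended definition -- secondary data is anything that is not
# being driven into live production systems.

def classify(dataset):
    """Return 'primary' if the dataset drives production, else 'secondary'."""
    if dataset.get("feeds_production"):
        return "primary"
    return "secondary"

# A hypothetical data estate: one live production database, plus data lake
# and IoT/edge material that sits unused -- i.e. the 'backup estate'.
estate = [
    {"name": "orders_db", "feeds_production": True},
    {"name": "clickstream_archive", "feeds_production": False, "kind": "data_lake"},
    {"name": "sensor_dump", "feeds_production": False, "kind": "iot_edge"},
]

secondary = [d["name"] for d in estate if classify(d) == "secondary"]
print(secondary)  # ['clickstream_archive', 'sensor_dump']
```

In this toy view, two of the three datasets are secondary: they came from real sources but are not feeding any live system, which is exactly the under-utilised asset the article is describing.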
“Secondary data is the most under-utilised asset in your business,” asserts Veritas’ Palmer.
As this secondary data sits in legacy databases, proprietary data storage systems and data stores from the previous (less cloud-centric) world of IT, Veritas starts to build the argument for its own product set.
“Backup used to be a consolidated platform back in the day, but architectures have changed and data workloads have diversified. While specialisation became a goal as different users wanted to run different workloads, what we actually got was diversification, and this was not good, as it leads to non-compliance and fragmentation,” said Palmer.
Has storage become sexy yet? Well, it may never quite get there… but the way this new approach to storage pushes all data (including unused data) further up the order of importance in contemporary IT systems and the modern software application development stack is real.
Veritas used this year’s show to announce new developments to the Veritas 360 data management portfolio, spanning Veritas NetBackup (the company’s flagship offering), Veritas Information Map and Veritas Appliances.
“We are living in the age of multi-cloud, where organisations require a policy-driven data management strategy to protect valuable data assets across multiple datacentres, public and hybrid clouds,” said Palmer. “Today, with more than 20 new connectors in Information Map and advancements to NetBackup 8.1, organisations can now protect more cloud-based workloads, reduce storage costs in multi-cloud environments and gain increased visibility of data that historically has been hard to identify—all critical components of a successful multi-cloud strategy.”
With 2.5 quintillion bytes of data produced a day and that figure set to rise, we had better look to more intelligent storage and to ways of managing the challenge with analytics and Machine Learning (ML)… and this is precisely where Veritas now seeks to develop its technology stack and so validate its customer-facing proposition.