
Q&A: From backup to data management

Beyond data backup, extracting more metadata to improve business processes and data management will become more important, says a senior Veritas executive

This article can also be found in the Premium Editorial Download: CW Asia-Pacific: CW ANZ: Flash arrays make a splash.

In a world where data is the new gold, data management has become more critical than ever. After all, if companies do not know what lies within and even outside their walls, there is no point talking about rolling out newfangled tools to draw insights from vast volumes of data.

The data protection laws passed by governments around the world do not make data management any easier, often limiting the retention of data that might not be useful today, but could well be useful decades later.

Computer Weekly sat down with Scott Anderson, global senior vice-president for information protection solutions at Veritas, to discuss these challenges, and what Veritas is doing to help organisations better manage their data.

Q: Veritas is often seen as a backup company. How is it evolving to address the needs of enterprises today?

A: We’ve always been known as a backup and recovery company with products like NetBackup and Backup Exec as well as our appliances. Historically, these are storage products, founded on volume management, file system availability and clustering. What we’re trying to do is to transform into an enterprise data management company with a broader remit around managing data and information for enterprises in a hybrid world where everything sits on-premise and in the cloud.

In September 2016, we launched this concept called 360 data management. One of the things we started working on was this idea that NetBackup, in particular, has visibility into a tremendous amount of data.

We protect so much of the world’s data with our backup and recovery products. And we saw a lot of information about the systems that we back up, and we thought it would be a great way to leverage that data by providing global visibility to a customer. We also store a lot of copies of data as point-in-time images, so we thought we would extend the value of those images so that they are not just used for recovery.

360 data management has four elements. The first is unified data protection, which comprises backup and recovery products that are integrated with our appliances. We launched that with NetBackup 8.0, and we have had physical appliances for nearly seven years now.

It has been a very successful business for us – we’ve shipped well over 20,000 appliances to 5,000 customers. In the purpose-built backup appliances market, we’re the market leader in integrated systems, overtaking Dell-EMC, according to IDC. We’ve also focused on making ourselves relevant in the cloud, by connecting to the cloud for long-term retention. We’ve recently announced partnerships with Amazon Web Services (AWS) and Microsoft Azure to protect the workloads in those clouds.


The next thing is what we call global visibility. This is where we bring in the Information Map: we put an agent on the NetBackup server that connects to Veritas’ cloud. The agent takes the metadata that NetBackup is collecting and puts it literally on a map that shows you where your unstructured data sits physically.

It also shows a bunch of attributes, such as the owner of that data, when it was created and modified, as well as the file type. This data allows people to do better data management. We launched that at the end of September 2016 and have about 100 customers, and we have analysed over 40 petabytes of primary data across over 50 billion files and objects.

That solution allows customers to do a number of things. One is to look at data age: in our data genomics report, we found that 41% of file-based data on primary storage has not been modified in three years or more. So, you can imagine from a backup point of view, you’re doing deduplication – you’re still looking at a file, trying to back it up, doing segment matching and deciding not to put it on storage. But in some cases, you’re still sending unmodified data to tape every week.

Wouldn’t it make more sense to archive that data, which could include orphan data of employees who have left the company? We’re able to identify such data, such as Outlook PST files that some companies don’t allow for compliance reasons. If we find those files, they either get deleted or archived for legal discovery. These are some of the things that we do to help companies free up storage, delay future storage purchases, better govern data and optimise backup policies. I believe these are unique capabilities that Veeam, Dell-EMC and Commvault do not have.
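The stale-data analysis described above boils down to scanning a file tree and flagging files whose metadata shows they have not been modified in years. This is only an illustrative sketch of the idea, not how Veritas’ Information Map is implemented; the directory path and three-year threshold are assumptions for the example:

```python
import os
import time

# Roughly three years in seconds (an assumed threshold for "stale")
THREE_YEARS = 3 * 365 * 24 * 3600

def find_stale_files(root, max_age=THREE_YEARS):
    """Walk a directory tree and collect basic metadata for files
    that have not been modified within `max_age` seconds."""
    now = time.time()
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files we cannot read
            if now - st.st_mtime > max_age:
                stale.append({
                    "path": path,
                    "size": st.st_size,
                    "modified": st.st_mtime,
                    "owner_uid": st.st_uid,  # numeric owner; resolve via pwd on Unix
                })
    return stale
```

A report built from `find_stale_files("/data/shared")` would then drive the sort of policy decisions mentioned above: archive, delete, or exclude from weekly backups.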

Another part of 360 data management is integrated resiliency, enabled through a disaster recovery solution called the Veritas Resiliency Platform. It allows customers to do disaster recovery (DR) through the cloud as well as on-premise. What we found from our customers was that one of three use cases always exists around DR: they are consolidating their data centres and want the ability to move applications in bulk.

They also want to move workloads to the cloud and with our latest release, we can migrate on-premise workloads to AWS. It also allows customers to achieve DR compliance in business continuity planning, which requires them to test and rehearse DR scenarios. The integration with NetBackup also enables them to meet different recovery time objectives (RTO) for different applications. For applications that have RTO requirements that fit the frequency of backups, organisations can now use the backup copy of not just the data, but also the business service.

The last thing is Veritas Velocity, which is a copy data management solution that integrates with NetBackup. If you have a copy of an Oracle database protected by NetBackup, you can use copy data management to provision that to other business users, and create virtual copies of the data so you’re not creating more storage sprawl. This is useful in big data analytics as well as testing and development.

Q: How will 360 data management enable Veritas to stand out from the competition, especially from the likes of Veeam, as well as hyper-converged suppliers?

A: My belief is that in backup and recovery, we’re the king of scale. We’ve created a dedicated business unit that includes sales and product teams to target the SMB and commercial space, and go aggressively after Veeam. We’re also very successful in the cloud with our support for Microsoft Azure and AWS.

We’re also supporting new requirements for backup and recovery – you have to use data more effectively beyond backup, such as orchestrating higher-level requirements like bringing up business services. The hyper-converged suppliers are doing data compression on their side, and that’s effective. We’re able to make a backup copy and do deduplication (which is a commodity technology) on the client side or in the media server on the storage.
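The deduplication described as a commodity technology here amounts to splitting data into segments and storing each unique segment only once, keyed by a content hash. A minimal sketch of the idea follows; it uses fixed-size chunks for simplicity, whereas production systems typically use variable-length, content-defined chunking:

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size segments, chosen for illustration only

def dedupe(data, store=None):
    """Split `data` into fixed-size chunks, store each unique chunk
    once (keyed by SHA-256), and return the 'recipe' of hashes
    needed to reconstruct the original bytes."""
    store = {} if store is None else store
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # only new segments consume storage
        recipe.append(digest)
    return recipe, store

def restore(recipe, store):
    """Rebuild the original data from the chunk store."""
    return b"".join(store[digest] for digest in recipe)
```

Because repeated segments are stored once, a backup in which most blocks are unchanged from the previous run consumes almost no new space – which is why segment matching pays off for weekly full backups.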

In the hyper-converged space, we have a product called Hyperscale for OpenStack, which offers a software-defined view of what some of those hyper-converged solutions do, which is essentially bringing together compute and SSDs, and using an orchestration layer like OpenStack to move workloads across efficiently. What we’ve added are data nodes below the compute nodes. We can move data down every 15 minutes – almost creating a point-in-time copy of data – thus flushing data out to enable better utilisation of expensive tier-1 storage.


Q: With more enterprises deploying internet of things devices such as sensors, how can Veritas help to manage the vast volume and high velocity of data that’s being created today?

A: We have a product called Veritas Access, a NAS-based device that helps to manage a large number of small files. It supports multiple protocols, as well as the ingestion and export of files via Amazon’s S3 service. It gives you the ability to do information management. But I think what you’re getting at, from a strategic standpoint, is that whether data is at rest or transient, extracting more metadata to enable better business processes, as well as intelligence on how long data should be stored and which tiers of storage to use, will become more important.

Q: What are some of the biggest challenges that customers in APAC are facing from a data management perspective?

A: There are some industries like financial services and healthcare that are very global in nature. They have similar challenges such as being heavily regulated and the need to manage large-scale data. But I do see differences in the maturity scale.

When I was in Bangkok, people were talking about Thailand 4.0, which aims to modernise the country into a digital economy. Countries like Thailand will skip a few steps during the digitisation process, but they need to think about whether they are going to classify data and develop information management and retention strategies upfront.

It’s important because as you’re creating data, you need to have a good understanding of that data to manage it. In the US and other more developed economies, there are already massive amounts of data collected, but classifying all that data takes a large amount of effort, which means it’s never going to get done.

Organisations in countries that are spearheading digitisation efforts will also have to take into account data protection laws not only at home, but also in countries where they operate. Organisations are custodians of customer and employee data that has to be managed from both compliance and cost standpoints.

Q: How should organisations decide whether or not to keep a piece of information or data? Something that might not be useful today might well be useful in future.

A: It’s an interesting question. I have a young daughter, and the health information about her today might be pertinent to her when she’s 35 years old. But I also think there’s a lot of data that isn’t medical, or is trivial and irrelevant. At Veritas, we only retain data for two years, and we have employees who have had data for 15 years and were concerned about that policy.

There’s some resistance, because people fear needing to go back to emails from 14 years ago – something nobody does, or does only very infrequently. To overcome that, organisations need to understand what data they have, manage risk, use that data to generate revenue for the business, and make decisions about whether to keep a piece of data. At Veritas, we help customers do a dark data assessment, so they get a view of their unstructured data to help them make policy decisions.

This was last published in March 2017
