Is data deduplication a good way to cut backup times from remote offices?

Greg Gawthorpe, technical operations manager at CMC Markets, says data deduplication can be expensive, so try real-time replication as an alternative.

My business wants to cut backup times from remote offices. Is data deduplication a good way to achieve that?
Data deduplication is one way of achieving this, but it can be prohibitively expensive, depending on how much data there is and the vendor involved. Another method to consider is real-time replication, which can be performed on deduplicated data to reduce bandwidth, but can equally be done with tools that replicate only block-level changes.
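To illustrate the block-level idea: a replication tool can hash each fixed-size block of a file and ship only the blocks whose hashes have changed since the last sync, rather than the whole file. The sketch below is a minimal, hypothetical illustration of that principle (the block size and hashing scheme are assumptions for the example, not how any particular vendor's product works):

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration; real tools vary


def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block so changed blocks can be detected."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]


def changed_blocks(old: bytes, new: bytes) -> list[int]:
    """Return indices of blocks that differ and so must be replicated."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [
        i for i, h in enumerate(new_h)
        if i >= len(old_h) or h != old_h[i]
    ]


# Example: a five-block file with a single byte modified in block 2.
old = bytes(BLOCK_SIZE * 5)
new = bytearray(old)
new[BLOCK_SIZE * 2] = 0xFF
print(changed_blocks(old, bytes(new)))  # only block 2 needs to cross the WAN
```

Because only the changed block indices (and their data) need to traverse the link, daily backup traffic from a remote office shrinks to roughly the size of the day's changes rather than the full data set.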

I am currently running a proof of concept on NetVault:Real-Time Data Protector from BakBone Software, and this has proved extremely successful. It also has the added benefit of restoring 'wrapper' files first, so that even if a restore is incomplete, the file is usable by the end user. The technology can be used many-to-one, so you don't need a deduplicator at every site; the brains of the operation remain in your data centre.

For more information on deduplication methods, read this Ask the Expert question on inline vs. post-processing data deduplication.

This was last published in July 2009
