What causes slow data backups and how can you make your backup environment more efficient?

Ask the Expert



Slow data backups are usually the result of bottlenecks, and relieving them means increasing the bandwidth between the data and the backup devices. So let's look at why bottlenecks occur and how to alleviate them.

Bottlenecks

Bottlenecks occur when the backup throughput required to complete the process is larger than the network bandwidth between the data and backup devices. To overcome bottlenecks, you can either increase network connections to the existing backup data servers or increase the number of data mover servers.
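A quick back-of-envelope check makes the definition concrete: divide the data set by the backup window to get the throughput you need, and compare it to what the link can carry. The figures below are illustrative assumptions, not numbers from the article.

```python
# Does the backup window fit the available network bandwidth?
# All input numbers are assumed for illustration.

data_gb = 5000            # data to back up, in GB (assumed)
window_hours = 8          # allowed backup window, in hours (assumed)
link_gbps = 1.0           # network link speed, in Gbit/s (assumed)

required_mb_s = data_gb * 1024 / (window_hours * 3600)   # MB/s needed
available_mb_s = link_gbps * 1000 / 8                    # MB/s the link can carry

print(f"required: {required_mb_s:.0f} MB/s, available: {available_mb_s:.0f} MB/s")
if required_mb_s > available_mb_s:
    print("bottleneck: add network connections or data movers")
```

With these assumed numbers the backup needs roughly 178 MB/s against a link that delivers about 125 MB/s, so the window cannot be met without more bandwidth or more data movers.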

The first option is faster and less expensive, provided you have enough network port density and can use a multi-network configuration. The second option is more expensive because of hardware, licensing, and power and cooling costs.

Data type - small files

The cycle of opening and closing each individual small file for backup is time-consuming, and backup environments generally cannot sustain the data throughput required to complete the backup within the allotted time frame.

The simplest and most cost-effective way to improve throughput is to run more than one backup stream on a server at a time. If that is not possible, you can move away from file-level backup, although you may then find that certain individual files cannot be recovered. Also, where possible, you can use NDMP.
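As a rough sketch of the multi-stream idea, the following Python snippet archives several data sets concurrently instead of serially, so the small-file open/close overhead of one stream no longer gates the whole job. The directory names are hypothetical, and the snippet creates throwaway source data so it is runnable as-is.

```python
import concurrent.futures
import os
import tarfile
import tempfile

def backup_stream(src_dir):
    """Archive one directory tree as a single backup stream."""
    dest_archive = src_dir + ".tar.gz"
    with tarfile.open(dest_archive, "w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))
    return dest_archive

# Create three throwaway source directories (hypothetical data sets)
# purely so the sketch runs end to end.
root = tempfile.mkdtemp()
sources = []
for name in ("finance", "mail", "home"):
    d = os.path.join(root, name)
    os.makedirs(d)
    with open(os.path.join(d, "sample.txt"), "w") as fh:
        fh.write("example data\n")
    sources.append(d)

# Run one stream per data set in parallel rather than one after another.
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(backup_stream, s) for s in sources]
    archives = [f.result() for f in concurrent.futures.as_completed(futures)]

print(len(archives), "streams completed")
```

In a real environment the "streams" would be the backup product's own jobs or agents; the point is only that parallel streams keep the link busy while any one stream is stalled opening small files.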

If neither the number of streams nor the throughput can be increased, then you may be able to reduce the number of files needing backup. This can be achieved through two methods:

  • Incremental forever, where an initial full backup is followed by incrementals thereafter, minimising the data and time needed for backup.
  • Eliminate duplicate files with data deduplication technology.
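The deduplication idea can be sketched at file level: hash each file's content, and only send one copy per unique digest over the wire. This is an illustrative simplification with made-up file names; commercial dedupe products typically work at the block or segment level, but the principle is the same.

```python
import hashlib
import os
import tempfile

def dedupe_candidates(paths):
    """Group files by SHA-256 content digest; only the first file
    with each digest needs to be transferred and stored."""
    seen = {}
    unique, duplicates = [], []
    for p in paths:
        with open(p, "rb") as fh:
            digest = hashlib.sha256(fh.read()).hexdigest()
        if digest in seen:
            duplicates.append(p)   # same content already queued
        else:
            seen[digest] = p
            unique.append(p)
    return unique, duplicates

# Demo with throwaway files: two of the three share identical content.
tmp = tempfile.mkdtemp()
contents = {"a.txt": b"hello", "b.txt": b"hello", "c.txt": b"world"}
paths = []
for name, data in contents.items():
    p = os.path.join(tmp, name)
    with open(p, "wb") as fh:
        fh.write(data)
    paths.append(p)

unique, duplicates = dedupe_candidates(paths)
print(f"{len(unique)} unique, {len(duplicates)} duplicate")
```

Here only two of the three files would need to cross the network, which is exactly how dedupe cuts the load on a constrained link.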

With increasing pressure to manage costs, you would ideally improve backup processes and operations at no capital cost. However, if processes and operations can be improved no further, there is a compelling case for purchasing more data movers, backup agents or dedupe products.

This was first published in October 2009
