As more UK data centres implement data deduplication and disaster recovery (DR) practices, data storage pros increasingly need to know the ins and outs of these technologies. What are some of the cost, performance and scalability issues associated with data deduplication? What makes most disaster recovery tests fail? What are the three essential elements when implementing a data archiving strategy?
We'll also help you decide which data storage technology -- direct-attached storage (DAS), network-attached storage (NAS) or storage-area networks (SAN) -- works best with your virtualised environment. With data storage budgets at a critical level, you'll want to get the most out of your technology spend.
To do this, SearchStorage.co.UK has compiled our 10 best data storage methods of the past year. Be sure to bookmark our data storage tips page to stay current with all of our data storage tips coming out in 2010.
Data storage for virtual environments: Pros and cons of DAS, NAS and SAN
Data centre consolidation and cost reduction have driven the steady adoption of virtual server technologies in data storage environments. The major players (Microsoft, VMware and Xen) all provide support for using direct-attached storage (DAS), network-attached storage (NAS) or storage area networks (SANs). This tip will help you decide which data storage technology is best for your virtual environment -- DAS, NAS or SAN -- and examine the pros and cons of each.
Data deduplication: The business case for dedupe
Data deduplication is a relatively new technology that has made its way into many storage environments. But what makes it a justified expenditure in one environment will not necessarily hold true in all cases. Data storage vendors are typically better at finding a need for their technology in your environment than at finding a technology that will actually meet your needs. This tip takes a look at the advantages and disadvantages of dedupe to help you get the most value out of this technology.
Evaluating a data deduplication product for data backup and recovery
As more organisations implement disk-based data backup and recovery to overcome the performance and reliability shortcomings of tape-based data backup, data deduplication has emerged to improve the economic feasibility of retaining data longer on disk (possibly eliminating tape). This tip examines the cost, performance and scalability of data deduplication products in terms of backup and recovery.
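The economics the tip describes hinge on the deduplication ratio: how many logical copies of data collapse to one physical copy on disk. As a rough illustration (not any vendor's implementation), a minimal fixed-block dedupe sketch hashes each block and counts unique blocks; repeated backups of unchanged data then dedupe at close to the number of backup cycles:

```python
import hashlib
import os

def dedupe_ratio(data: bytes, block_size: int = 4096) -> float:
    """Split data into fixed-size blocks, hash each one, and return
    total blocks / unique blocks -- the logical-to-physical ratio."""
    seen = set()
    total = 0
    for i in range(0, len(data), block_size):
        seen.add(hashlib.sha256(data[i:i + block_size]).hexdigest())
        total += 1
    return total / len(seen) if seen else 1.0

# Ten "full backups" of the same unchanged 1 MB payload dedupe ~10:1,
# which is why retention cycles on disk become far cheaper with dedupe.
payload = os.urandom(1024 * 1024)
print(dedupe_ratio(payload * 10))  # 10.0
```

Real products vary block sizes (and often use variable-length chunking), so treat this purely as a model of why longer disk retention becomes affordable, not as a predictor of any product's ratio.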
Formulating a remote-office data backup and disaster recovery plan
Protecting data locked inside data centres has always been a central focus for IT, but we seem to be repeatedly reminded of just how porous the data centre has become. Lost backup tapes and misplaced laptops demonstrate that even the most rigorous data protection practices are no longer sufficient. We examine the remote-office backup technology market, including disk, data replication, continuous data protection (CDP) and data-reduction offerings.
Linux data backup and disaster recovery strategies
Linux data backup products are adding new capabilities and becoming even more mainstream. Today, most major storage management vendors such as Hewlett-Packard and Symantec have Linux versions of their storage management tools. With the Linux data backup software market expanding, we look at some of the new Linux backup capabilities vendors are offering, including the ability to back up to the cloud, handle virtualised systems and deduplicate data.
Top reasons why disaster recovery tests fail
Too many disaster recovery (DR) plans end up being a documentation exercise and never rise above the day-to-day priorities of the business. Those that make it to the disaster recovery testing phase often encounter problems that, if not properly addressed, leave a bad mark on the whole DR process. This tip examines the top reasons why disaster recovery tests fail and how you can prevent these problems.
Data archiving: Three key elements
There are many ideas about what a consolidated or unified data archive should look like, and preconceptions can clash when you're considering creating such a solution. However, there are three key elements to any data archiving system: archiving software, storage hardware and management software. We explore these elements and provide best practices for data archiving projects.
How to determine a NAS system's scalability
By design, NAS is simple networked storage. It's easy to set up: just turn it on, set up the file system, mount it on the servers or desktops that will be using it, and it's ready to go. NAS is also effortless to operate, provides easy data protection with snapshots and mirroring, and is painless to manage, with some caveats. However, NAS systems are not without their share of limitations, particularly when it comes to the scalability of a single NAS array. This tip will help you determine a NAS system's scalability by examining managed file limits, IOPS and throughput limits, and capacity limits.
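The three limits named above can be checked with simple arithmetic before a workload is placed on an array. As a hedged sketch (the limit figures below are hypothetical, not any real filer's specification), express each workload requirement as a fraction of the array's published limit; any fraction over 1.0 means a single array won't scale to the job:

```python
def nas_headroom(required_iops: float, required_mbps: float, required_tb: float,
                 max_iops: float, max_mbps: float, max_tb: float) -> dict:
    """Utilisation of each scalability limit on a single NAS array.
    Any value over 1.0 means that limit is exceeded."""
    return {
        "iops": required_iops / max_iops,
        "throughput": required_mbps / max_mbps,
        "capacity": required_tb / max_tb,
    }

# Hypothetical mid-range array limits: 40,000 IOPS, 800 MB/s, 100 TB usable.
util = nas_headroom(required_iops=30000, required_mbps=600, required_tb=120,
                    max_iops=40000, max_mbps=800, max_tb=100)
print(util)  # capacity comes out at 1.2 -- over the limit of this single array
```

In this example IOPS and throughput have headroom but capacity is 20% over, so the decision becomes scale-out NAS, a second array, or archiving before the workload lands.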
Data replication tools for disaster recovery
On the surface, data replication tools represent an efficient strategy for DR. For organisations whose recovery time objective (RTO) and recovery point objective (RPO) for regaining business-critical functionality is a matter of minutes, data replication is a truly workable strategy. However, in researching vendors for replication tools, it soon becomes apparent that there are a few catches with the best and most workable data replication tools. This tip examines the caveats of this market and some replication technologies that can be easily integrated with your existing systems.
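One of the catches the tip alludes to is bandwidth: asynchronous replication can only honour an RPO measured in minutes if the WAN link drains changes at least as fast as they are produced. A back-of-the-envelope sketch (the change volume and burst factor below are illustrative assumptions, and real products batch, compress and dedupe the stream):

```python
def required_link_mbps(daily_change_gb: float, burst_factor: float = 3.0) -> float:
    """Minimum WAN bandwidth (megabits/s) for async replication to keep
    pace with the average daily change rate, with headroom for bursts.
    If the link is slower than this, the replication backlog grows and
    the achievable RPO stretches beyond what the business agreed to."""
    avg_mbps = daily_change_gb * 1024 * 8 / (24 * 3600)  # GB/day -> Mb/s
    return avg_mbps * burst_factor

# 500 GB of daily change averages ~47 Mb/s; with 3x burst headroom,
# plan for roughly 142 Mb/s of replication bandwidth.
print(round(required_link_mbps(500), 1))  # 142.2
```

Running this check against your actual change rate before shortlisting replication tools is a quick way to surface whether minutes-level RPO and RTO targets are realistic on the links you have.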
Disaster recovery and failback: Five tips
Most of the attention in business continuity (BC) and disaster recovery focuses on what to do before and during a disaster. But suppose the event has happened and is drawing to a close; what should you do to resume operations? Aside from the staffing, power and data centre space-related issues you'll face during post-event recovery, we explore five best practices for restoring your IT operations.
More data storage methods and best practices