At times, everything done at the corporate level seems to be designed to make it difficult to ensure accurate and valid back-ups.
The size of both operating systems and applications has driven the need for large-scale local storage. Workstations now ship with disc drives of 40Gbytes or more and even laptops have disc storage capacities of over 20Gbytes.
One of the drivers for large local storage capacities is the number of corporate users with access to the Internet who require somewhere to cache files and store information. The difficulty of having so much storage out in the user community is ensuring that critical information is backed up.
In the past, tape streamers would have been provided to a small number of critical workers, but now the user is pointed to a corporate file server for their data storage and the servers are backed up automatically each night. This is a more secure method of storage, and responsibility for it lies firmly in the hands of the IT department.
With local storage and the number of users with laptops on the increase, recent years have seen a change in this approach.
Mobile users, in particular, have been encouraged to make regular back-ups of their laptops to their home directory on the network. Unfortunately, the majority take this to mean "dump the entire disc to the network", which results in large amounts of operating system, application and garbage files being stored as well.
To make matters worse, desktop users who are used to saving data locally have been encouraged by their line management, often those with laptops, to do likewise to ensure the safety of critical data. The result is that huge amounts of storage, often in the region of terabytes, are wasted. And this hits back-up strategies.
In short, the sheer logistics of moving this amount of data around is destroying the network infrastructure.
The solution is to move data away from central IT centres and back out to the departments using local file and print servers. Careful network design using subnets allows the network administrator to control the amount of data reaching the backbone and, by installing back-up solutions inside the servers, departments can make their own back-ups.
CD-R and CD-RW, while relatively cheap, are only viable for workstation back-ups of limited amounts of data, while magneto-optical solutions are expensive and rarely seen in most offices today. This leaves tape technology as the cheaper option, with the dominant offering still being DAT (Digital Audio Tape) devices.
A DAT tape streamer has a respectable back-up capacity of about 40Gbytes per tape, with compression. But the compression automatically lowers the back-up speed and can increase the problems of recovery if anything goes wrong. Most DAT drives have a transfer speed of about 18Gbytes per hour at best.
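To put those figures in perspective, here is a minimal sketch of the back-up window arithmetic. The 40Gbyte and 18Gbytes-per-hour numbers come from the text; the function name is a hypothetical convenience, not any vendor's API.

```python
# Back-up window arithmetic, using the DAT figures quoted above:
# ~40 Gbytes per tape with compression, ~18 Gbytes per hour throughput.

def backup_hours(data_gb: float, rate_gb_per_hour: float) -> float:
    """Hours needed to stream data_gb at a sustained rate_gb_per_hour."""
    return data_gb / rate_gb_per_hour

# Filling one 40 Gbyte DAT cartridge at the drive's best speed:
print(round(backup_hours(40, 18), 1))  # ~2.2 hours
```

Even at the drive's best-case rate, a single full cartridge ties up the machine for over two hours, which is why DAT remains a workstation rather than a server solution.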
Digital Linear Tape
The next step up from this is Digital Linear Tape (DLT), which is more expensive for both media and drives. The drawback is that, for large-scale server back-up, an autochanger system is a key component, adding further to the cost.
Capacity is generally about 80Gbytes per tape, with compression, and speeds are up to 43Gbytes per hour for single-drive solutions. Today, DLT accounts for the vast majority of back-up solutions in regular use, particularly as a server solution, with sales figures often quoted as up to two million units worldwide.
The problem with DLT is ensuring that tapes are changed regularly, maintained in the appropriate manner and moved off-site securely when the back-up has been completed. In addition, the capacity of the internal solutions is insufficient to back up the local hard drives effectively without tapes being changed during the back-up process.
Such responsibility has created a problem for many IT departments, and data has been lost as a result. Because of this, network attached storage (NAS) solutions, such as the ATL Lanvault, or a standalone DLT library with autochanger, have emerged as the storage method of choice.
NAS enables a departmental solution capable of storing over 200Gbytes, with speeds of about 100Gbytes per hour. However, this is still below the amount of storage required by the average department, and the limits of speed and storage using tape technology have been pushed for some time now.
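As a rough sketch of what those figures mean for a department, the sums below use the 200Gbyte data set, 100Gbytes-per-hour speed and 80Gbytes-per-tape DLT capacity quoted in the text (all compressed figures; real throughput varies with data type). The function name is mine, for illustration only.

```python
import math

# How many 80 Gbyte DLT cartridges a 200 Gbyte departmental back-up
# consumes, and how long it runs at 100 Gbytes per hour.
# (All capacities are the compressed figures quoted in the article.)

def tapes_needed(data_gb: float, tape_gb: float) -> int:
    """Whole cartridges required to hold data_gb."""
    return math.ceil(data_gb / tape_gb)

print(tapes_needed(200, 80))  # 3 cartridges
print(200 / 100)              # 2.0 hours
```

Three cartridges for a single nightly run is exactly the tape-changing burden the article describes, and the reason autochangers become essential at this scale.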
Tape silos have become increasingly popular with IT departments, as capacities of several terabytes can be achieved. Many ISPs are already installing these to protect their customers.
Speed of back-up depends on how the computers and silos are linked together, but with the emergence of Fibre Channel and Gigabit Ethernet, the ability to back up enterprise data over a network is finally matching the amount of data in the average corporate datacentre.
Unfortunately, storage silos are horrendously expensive and, despite the salesperson telling you that you are nothing without your data, finding the money to purchase the solution, redesign the network and implement new procedures is not easy.
To find a middle ground between large-scale storage solutions and departmental requirements has always been difficult. However, recent developments mean that tape capacity and speed are both increasing. What makes it even more interesting is that the two key technologies emerging are both competing for the same market, with major players such as Hewlett-Packard and IBM currently selling both types of system.
The differentiator is unlikely to be cost, as the solutions are almost identically priced, nor is capacity or speed likely to play a major part. The main battle will be multi-supplier development of a standard versus technology owned and licensed by a single supplier.
On one side there is Quantum, which has made significant improvements to DLT technology and has launched Super DLT (SDLT). This technology has been licensed by a number of suppliers, including HP and IBM.
Quantum produces several different SDLT solutions with storage capacities of 160-220Gbytes compressed and back-up speeds of 57-115Gbytes per hour. There is a partial guarantee of backward compatibility of cartridges, depending on the type used, which has the significant benefit of allowing a migration from existing solutions to newer technology.
In competition with SDLT, a collaboration between Seagate Technology, IBM and Hewlett-Packard has resulted in the Ultrium format of Linear Tape-Open (LTO) technology. Unlike SDLT, which is solely owned and licensed by Quantum, Ultrium has a number of contributors, which have all produced their own devices.
With previous back-up solutions, changing supplier could mean that tapes were unreadable, but the Ultrium format consortium is keen to show that cartridges can be read on all compatible drives - although there is no qualification process as yet.
Ultrium format cartridges can hold 200Gbytes of data, assuming a 2:1 compression ratio, and offer back-up speeds of about 100Gbytes per hour on the HP Surestore Ultrium 230 and 115Gbytes per hour on the Seagate Viper 200. Both drives can be configured as internal or external solutions, making them ideal for departmental back-up.
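The quoted speeds can be compared directly. A small sketch, using the drive names and figures as given in the text (1Tbyte taken as 1,000Gbytes for simplicity), shows the time each drive would need to move a terabyte:

```python
# Time to stream 1,000 Gbytes through each drive at its quoted speed.
# Figures are the best-case rates cited in the article; sustained
# real-world throughput will generally be lower.

drives_gb_per_hour = {
    "SDLT (entry level)": 57,
    "SDLT (top end)": 115,
    "HP Surestore Ultrium 230": 100,
    "Seagate Viper 200": 115,
}

for name, rate in drives_gb_per_hour.items():
    print(f"{name}: {1000 / rate:.1f} hours")
```

The spread between the entry-level SDLT drive and the rest is the point at issue: at the bottom of the range a terabyte run takes the better part of a working day, while the faster drives complete it comfortably overnight.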
Initially, it would appear that SDLT is likely to provide a smoother path for companies that have already invested in DLT, especially for IT departments with a large number of sites to support. Yet, moving forward, the higher capacity of the entry-level Ultrium devices, allied with the slightly higher speed, means that the Ultrium format is likely to prove a winner in the long term.
More on storage can be found at www.itnetwork.com