Outsourcing your storage can provide a cost-effective solution to the increasingly tricky problem of data management. Danny Bradbury investigates the options and assesses the risks and rewards
Back in the heady days of the late 1990s, storage outsourcing looked like a big market. The idea of taking storage outside the company and placing it in the hands of a shared storage provider was attractive, because it removed an infrastructure headache from the IT department's hands and allowed IT managers to get on with planning new projects. However, companies such as Storage Networks, which helped pioneer the concept, failed because the market did not grow as quickly as it had hoped.
But outsourced storage still offers opportunities for IT managers interested in managing their storage needs more effectively. Some companies offer primary storage as an outsourced resource, meaning you can connect your applications to a drive at a remote site.
"BT has never abandoned the shared storage provider model," says Correy Voo, head of storage services at the telco. BT offers both types of storage solution, but he says that many applications are not built to deal with primary storage mechanisms located miles away. A core Oracle application can probably be made to work with such storage, but in industries such as finance, where highly customised applications are built on top of a database engine, connection latency can become an issue unless the code is configured properly. For high-data, low-latency requirements such as audio and video editing, Voo says it is better to keep primary storage in house.
Primary storage is limited to being within a metropolitan area network, says Stephen Holmes, head of the technology directorate at outsourcing firm Atos Origin. He adds connectivity costs to Voo's latency concerns. As distances increase, companies not able to use fibre connectivity can look at multi-protocol label switching (MPLS) to help increase quality of service.
The other problem lies in integrating the provider's storage mechanism with a server infrastructure, says Voo. "We will nearly always encounter situations where a customer does not like our flavour of storage, or where there are incompatibilities." This will be especially true when trying to connect to SAN environments using IP-based versions of high-end connection technologies such as Fibre Channel and SCSI. Voo highlights Fibre Channel over IP as a potential integration challenge, and other service providers express similar concerns about iSCSI, the IP version of the SCSI connection protocol. Voo believes that SMI-S, the storage management compatibility standard promoted by the Storage Networking Industry Association, could help solve some of these issues.
For this reason, most people dealing in outsourced primary storage tend to outsource the applications as well, so that the storage and the application can be kept closer together. Service provider Star Technology outsourced the storage for a legal client's document management system, but hosts the document management software on a server in the same datacentre, says strategic technology officer Dan Scoble.
While outsourcing primary storage remains largely a niche application, outsourced back-up offers real peace of mind for IT managers. Storage services company Iron Mountain launched two electronic vaulting services in the UK this year, says business development director Jon Fowler - one handling PC desktop back-ups, the other looking after servers. On the desktop, its client-side software initially synchronises the desktop data (something best done over a weekend or via a physical CD transfer), and then updates the back-up incrementally, sending changes on a byte-by-byte basis every day.
The company solves the problem of intermittent connections when using laptops by including a facility to start a back-up automatically on connection, with a resume feature to pick up where the back-up left off should a connection fail. All uploads happen in the background.
Data is encrypted before it leaves the desktop, he says, and because ongoing back-ups only handle file changes, the bandwidth needed is relatively small. The server product works differently: it backs up continuously, almost like server mirroring, but again deals only with byte-level file changes, and file compression further reduces the strain on the network.
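The byte-level incremental approach described above can be sketched in miniature. This is an illustrative rsync-style block-hashing scheme, not Iron Mountain's actual implementation; the block size and choice of hash are assumptions:

```python
import hashlib

BLOCK_SIZE = 4096  # bytes per block; real products tune this carefully


def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block so changes can be located cheaply."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]


def changed_blocks(old: bytes, new: bytes) -> list[int]:
    """Return indices of blocks that differ between two file versions."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [
        i for i in range(len(new_h))
        if i >= len(old_h) or new_h[i] != old_h[i]
    ]


# After the initial full synchronisation, only the changed blocks
# need to travel over the wire on each daily run.
old = b"a" * 10000
new = b"a" * 5000 + b"b" * 100 + b"a" * 4900
print(changed_blocks(old, new))  # → [1]: only the block containing the edit
```

A resume feature of the kind described falls out naturally: the client records which changed blocks have already been uploaded and continues from that point when the connection returns.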
Josh Krischer, regional vice-president of research in enterprise servers and storage at analyst firm Gartner, says the marketing for outsourced primary storage is of limited value. "The cost of telecommunications is higher than the cost of the storage," he says. He prefers the idea of managed storage, in which storage devices are located at the customer's premises but owned by a third party.
Either way, letting someone else handle storage can be a positive step, because companies do not always want to tie up capital, often preferring to book storage as operating expenditure rather than capital expenditure. Moving to a managed storage arrangement, possibly by entering into a leaseback agreement on your own storage devices, can allow this to happen.
When considering outsourcing either the primary storage system or a back-up system, be sure to think about the service level agreement. Most suppliers offer a 99.99% availability guarantee, although response times are less certain, because some offices may be accessing remote storage systems over non-dedicated, public links.
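As a sanity check on such guarantees, the downtime an availability figure permits is simple arithmetic. This short sketch (plain Python, no vendor-specific assumptions) converts a guarantee into minutes of downtime per year:

```python
# Convert an availability guarantee into permitted downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600


def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of downtime a supplier can incur in a year while
    still meeting the stated availability fraction."""
    return MINUTES_PER_YEAR * (1.0 - availability)


for a in (0.99, 0.999, 0.9999):
    print(f"{a:.2%}: {downtime_minutes_per_year(a):.0f} min/year")
```

So a 99.99% guarantee still permits roughly 53 minutes of downtime a year; whether that matters depends on when those minutes fall.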
Also consider pricing. For its primary storage outsourcing service, BT charges on a per-gigabyte or terabyte-per-month basis, with the price varying according to the type of back-up protection. Iron Mountain charges a flat monthly fee on a per-gigabyte, per-seat basis for its desktop PC back-up service, and on a per-gigabyte basis for its server offering. It only charges for the original file stored, rather than for subsequent changes made to that data. If a file changes numerous times, it will store new versions of that file based on the changes received, but will still only charge for the space taken up by the original version. "Unless the client has a 90% change rate, which is rare, it is pretty much covered," Fowler says.
Like the ASP market, which is being reborn as managed services, the storage service provider market suffered from a tarnished image after the dotcom bubble burst. But there are enough companies still offering the service to prove that it is still viable, and IT managers could benefit from taking another look.
Outsourced storage checklist
Your corporate data is one of your most precious assets. Before handing it over to a third party, there are some questions you should ask. Here are five that should be at the top of your list.
Is your motivation technical or financial?
If you want to move your storage costs from the balance sheet to the profit and loss sheet for accounting reasons, there are other options such as a lease-back agreement on your existing storage equipment.
Management or extra resilience?
If you are considering outsourcing to increase resilience, off-site outsourced storage is a good idea. If reducing management costs is your objective, consider maintaining the equipment on your premises but having it managed by someone else. This will eliminate telecoms costs.
What are the pricing options?
Having a pay-per-use utility storage agreement is attractive if you have volatile storage requirements, because it allows your storage footprint, and your bill, to shrink from one month to the next.
Primary or back-up?
Outsourcing primary storage or mirroring your primary storage in real time gets very expensive. Is this what you need, or do you simply want to transfer data in batch mode to a secure off-site location?
What is the application?
If you are replicating primary storage to an outsourced server or outsourcing primary storage altogether, your application is a key concern. Is it heavily customised with lots of complex code, or is it relatively simple with straightforward read/write capabilities and little extra processing or analysis? This will affect latency issues.
Case study: Atoc speeds up the accounting cycle
There can be few things more complex than the accounting system for a privatised rail network. The Association of Train Operating Companies (Atoc), which represents the private rail franchise holders, needed to revamp its accounting system. The Rail Settlement Plan, an Atoc subsidiary which runs the accounting system, recruited Atos Origin to develop Lennon, an outsourced accounting system that would help the company move from a four-week accounting cycle to a daily one.
Storage was a key issue from the start of the project. The system has to process roughly one million customer transactions every day, using a complex set of business rules to work out how much of the revenue from each transaction is given to each of the rail companies participating in that particular journey. The transaction processing, which is done in batch form between 8pm and 5am every night, produces output data used to populate a datawarehouse. The datawarehouse should reach 5Tbytes at its seven-year point, says Adrian Hepworth, Atos Origin's technical architect for the project. The storage requirement for the system was huge, because as part of the 2001 contract, the company had to store all data for the lifetime of the contract.
The firm uses a Hitachi Lightning 9960 Raid array at one set of offices, and a 1Gbit link is used to mirror all the data to a secondary site 125km away. The systems offer 15Tbytes of storage between them, which is enough to handle the datawarehouse and the transactional system together. But operational constraints are a problem. "A 1Gbit link only gives you about 200Mbps of real throughput, so it is a challenge getting anywhere near 1Gbps down a gigabit link," says Hepworth.
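Hepworth's point about effective throughput is easy to check with arithmetic. The 100Gbyte nightly volume below is a hypothetical figure for illustration, not one from the project; the 200Mbps effective rate is the one he quotes:

```python
# Effective throughput arithmetic for a nightly mirroring window.
# 100Gbytes is a hypothetical nightly volume; 200Mbps is the
# effective rate Hepworth quotes for a nominal 1Gbit link.


def transfer_hours(data_gbytes: float, effective_mbps: float) -> float:
    """Hours needed to move a data volume at a given effective rate
    (decimal units: 1 Gbyte = 8,000 megabits)."""
    megabits = data_gbytes * 8000
    return megabits / effective_mbps / 3600


nominal = transfer_hours(100, 1000)  # what the link label suggests
actual = transfer_hours(100, 200)    # what the link really delivers
print(f"nominal: {nominal:.1f} h, actual: {actual:.1f} h")
# → nominal: 0.2 h, actual: 1.1 h
```

A fivefold gap between nominal and delivered bandwidth matters when the batch run and the mirroring both have to fit inside the same overnight window.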
The company connects the two sites with an IP link running Veritas Database Edition and Veritas Volume Replicator. Mirroring at the disk level rather than the application level over an IP link means the link can also be used for anything else necessary, such as FTP sessions.
To help reduce storage requirements, Atos Origin uses the Analytic Server from Sand Technologies. Analytic Server is data management software that uses bitmapping techniques to compress database data and enables the company to store historical records on primary storage, rather than taking them off to tape or optical storage.
"Because of the compression, we can take 100Gbytes of data and store it in two," says Hepworth. "That means our cost of storage has gone down to 5% of the original cost. And we get the cost savings without having the headache of taking the media offline."