Storing up trouble - Data storage policy

Storage systems are crucial and costly items of IT expenditure. So there must be a policy behind the purchasing decisions, mustn't there?

Fast, fact-packed and functional. What, the IT industry? No. Storage. Once in the background, storage is about to take centre stage. If a server goes down, you replace it. If the network goes belly up, redundant paths re-route around the failure. But storage can't go down. Currently, for every nine pence spent on storage hardware, 91 pence is spent on storage management. Analysts at IDC believe over 55% of the storage budget goes on management. It has to be done well. More original information is expected to be created in 2001-2002 than was accumulated in the entire previous history of mankind (Berkeley study, 2000). Storage is undoubtedly the topic du jour.

Cost cutting is about to drive business, with 70% of IT spend expected to go on storage by 2003. META Group's Phil Goodwin (quoted in 'Remote and mobile data protection: hitting a moving target - part 1') puts storage-related costs at 70-80% of server purchases through 2004 and beyond. But as storage hardware prices continue to fall by 30-plus per cent a year ($10-15 per Gb by 2003/4), the emphasis will shift to storage software, storage area networks (Sans) and services.

Doubling data
However, it's claimed we've digitised only 2% of the world's information so far, and the amount of information being created is doubling every year. Add in Benchmark Storage Innovations' observation: "If EU proposals to make companies keep records of all outgoing/incoming e-mails plus Internet URLs visited by staff for at least seven years become law, this situation will only get worse."

"Key challenges for any organisation will depend on the ability to manage and make use of vast amounts of information," said EMC marketing director UK & Ireland Nigel Ghent. "Accessing, managing and keeping this information secure and mobile is becoming the new century's major business challenge."

All companies need a storage policy.

Storage policy should not be tied to a particular network type or technology: it should be designed to meet the needs of the business and allow improved levels of service to be delivered at lower cost. Customers need to design a system architecture, not simply pick a storage supplier, to ensure scalability, flexibility and manageability. Modular disk architectures allow both capacity and throughput to be expanded; essential if response times are to be preserved as capacity grows. Storage hardware and software should be chosen separately, to ensure one does not lock you into the other and to minimise total cost of ownership. That's the view of Sun storage marketing manager Chris Atkins.

Meanwhile, Fibre Channel Industry Association-Europe director Andy Batty believes that any company assessing which storage solution is most suitable needs to examine certain requirements and prioritise them against business needs. Scalability: how will the choice scale in future and, with capacity expected to grow at 60-100% per year, how will that growth be managed? Add in availability/reliability, performance, configuration flexibility and cost, and businesses have positive pointers.
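The growth rates Batty cites compound quickly, which is worth seeing in numbers. A minimal sketch (the 500Gb starting estate and five-year horizon are illustrative assumptions, not figures from the article):

```python
# Illustrative projection of storage capacity growth at the 60-100%
# annual rates cited by the FCIA's Andy Batty.

def project_capacity(current_gb: float, annual_growth: float, years: int) -> float:
    """Return projected capacity after compounding annual growth."""
    return current_gb * (1 + annual_growth) ** years

# A hypothetical 500Gb estate, five years out:
low = project_capacity(500, 0.60, 5)    # ~5,243Gb at 60% annual growth
high = project_capacity(500, 1.00, 5)   # 16,000Gb at 100% (doubling) growth
print(f"{low:,.0f}Gb to {high:,.0f}Gb")
```

Even at the lower bound, capacity grows more than tenfold in five years, which is why Batty ties scalability so tightly to manageability.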

The 87-plus member, non-profit FCIA operates a content-rich Web site which offers businesses white papers, case studies and lists of integrators, suppliers and consultants.

Increasingly, businesses need to keep data available 24x7. The challenge, with mission-critical data, is how to handle back-up: the system cannot be shut down to do it. What's happening is that Sans are being used to back up in real time; indeed, this is the most prevalent use of Sans to date. It's a point picked up by Brocade, whose Paul Trowbridge maintained that "any data storage policy needs to include the ability to network storage to provide the most efficient utilisation of resources, and high data availability, deliver security and flexibility in provisioning, and enable disaster tolerance architectures."

Batty pointed to a term raised by many in the industry: total cost of ownership (tco), which is claimed to be reduced, or at least contained, by applying San technology. Tco is linked to return on investment (roi), much touted by analysts at IDC, Sneer and Gartner.

The average cost of storage is put at 18 cents per Mb, with follow-on costs bringing this to $3.15 (£2.15) per Mb, according to IBM storage systems group vp Dietmar Wendt and business partner sales manager Frederick Fabricius. Are there any specific money-saving storage policies? Well, consolidated storage will save money in the long term, and centralised management capabilities will save money too. Messrs Wendt and Fabricius maintained that 40% of data loss is due to human error, and research has shown that one person in an NT environment can manage 100Gb decentralised, 200Gb co-located, or 750Gb centralised.
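The IBM figures make the staffing argument concrete. A rough sketch of the arithmetic (the 1,500Gb estate is a hypothetical example; the per-admin capacities are the NT-environment figures quoted above):

```python
# Cost and staffing figures quoted by IBM's Wendt and Fabricius.
RAW_COST_PER_MB = 0.18      # dollars, raw storage
TOTAL_COST_PER_MB = 3.15    # dollars, once follow-on (management) costs are added

# Gb one administrator can manage under each deployment model (NT environment).
GB_PER_ADMIN = {"decentralised": 100, "co-located": 200, "centralised": 750}

def admins_needed(total_gb: float, model: str) -> float:
    """Administrators needed to manage total_gb under a given deployment model."""
    return total_gb / GB_PER_ADMIN[model]

# Managing a hypothetical 1,500Gb estate:
for model in GB_PER_ADMIN:
    print(f"{model}: {admins_needed(1500, model):.1f} admins")

# Follow-on costs dominate: (3.15 - 0.18) / 3.15, roughly 94% of the total,
# is management rather than raw hardware.
```

On these numbers, centralising the same estate cuts the headcount requirement from 15 administrators to two, which is the heart of the consolidation case.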

Cost savings
Pursuing a consolidation and centralisation strategy can result in operational cost savings by reducing the number of NT servers by half, doing more with the same number of people, and even sharing knowledge between Unix and NT staff. Companies can also afford to deploy high availability on a broader number of applications. That's been the experience of Veritas Software's EMEA senior marketing director Dr Chris Boorman.

The consensus is that end-users should not have to manage storage themselves. Pharmaceuticals enterprise Novartis of Switzerland has asked IBM to help it develop a better tco model; Novartis wants to be more cost-competitive in its e-infrastructure. No company will be competitive if it has to wrestle with the management issues outlined here.

To get a view of the software available to help users, Posetiv (an independent storage specialist formed out of Computacenter) has distilled the opinion of independent consultants: best-of-breed can be defined as IBM/Tivoli's storage network manager plus Veritas software. Meanwhile, storage virtualisation technology is being released by Fujitsu Softek now and by Compaq (with Versastore) in the first quarter of 2002. Posetiv marketing director Graeme Rowe and senior storage consultant Jan Bo Larsen equally rate BMC's storage resource manager for application-centric storage management.

Another company singing the praises of storage virtualisation as a way to implement policy-based management is Hitachi Data Systems. With its partners, the company maintained it can help deploy storage virtualisation tools that decouple physical storage management from applications' needs, allowing more fluid use of resources. When information growth is not managed from a central point, more than half the capacity can go unused.

Another approach involves storage service providers (SSPs). Corporates mostly find that 80-plus per cent of their data is file based, and growing exponentially. SSPs such as Scale Eight maintain they can provide an unlimited amount of storage capacity, available on demand without the need to provision, and charge for it on a utility basis. Data is managed and monitored by a 24x7 service management centre, while a Web-based GUI lets customers monitor all storage consumption and provides management tools. The major disadvantage is that an external organisation is managing an in-house system, with the company's physical storage held off-site.
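The utility model described above can be sketched in a few lines. This is a hypothetical illustration only; the per-Gb rate and monthly consumption figures are invented, not published Scale Eight prices:

```python
# Hypothetical sketch of utility-style SSP billing: the customer pays only
# for capacity actually consumed each period, with no up-front provisioning.

def monthly_charge(consumed_gb: float, rate_per_gb: float = 10.0) -> float:
    """Charge for one month's consumption; rate_per_gb is an assumed figure."""
    return consumed_gb * rate_per_gb

# Consumption grows month to month; the bill simply tracks it.
usage = [120, 150, 210]  # Gb consumed in three successive months (invented)
bills = [monthly_charge(gb) for gb in usage]
print(bills)  # [1200.0, 1500.0, 2100.0]
```

The design point is that cost scales with actual consumption rather than with provisioned capacity, which is attractive precisely because file data grows unpredictably.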

Storage is becoming the largest part of the IT budget. Given this, it makes absolute sense to define a policy covering how you use it, manage it and control it. Anything less is simply bad business practice.
