Users at the Storage Networking World conference in Orlando this week said suppliers should work on creating more interoperability among their existing products.
Representatives from suppliers were keen to promote information lifecycle management (ILM), the latest strategy for automatically storing data on different tiers according to its importance.
But users were not convinced they should spend money on products that, some said, could sacrifice reliability for efficiency and cost savings.
Thirty-three per cent of conference attendees indicated that they plan to deploy an ILM strategy in the next 12 months, while another 30% said they would begin implementing it in 13 to 24 months. Eleven per cent said they have already installed ILM components, and 26% said they did not plan to.
Steve Duplessie, an analyst at The Enterprise Storage Group, said that while ILM is a worthy goal, today it is little more than marketing jargon without products to back it up.
EMC chief technology officer Mark Lewis agreed.
"We don't think anyone is doing it well," he said.
EMC's recent acquisitions, including Legato Systems for $1.3bn and Documentum for an estimated $1.7bn, were designed solely for the creation of ILM products - including an integrated set of tools to identify data at the business application level so it can be placed on varying levels of storage for specific periods of time.
EMC is to unveil a product that ties database applications into the backup process in the first quarter of 2004, a function analysts said is paramount to any ILM architecture.
Jerome Wendt, senior storage analyst at First Data, said he is already planning his company's ILM strategy, which will include several types of storage management software, including network-based "virtualisation" applications, or software that pools storage from a heterogeneous storage-area network (San).
Wendt is also testing San management software from EMC, Fujitsu Software Technology and Veritas Software, which automatically discovers hardware on his storage network to get away from unreliable spreadsheet modelling and to understand his server and array utilisation rates.
Wendt said companies should start thinking about keeping data no longer than required by regulators, and automated systems that do that would greatly ease management headaches.
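The automated retention Wendt describes can be sketched in a few lines. This is a minimal illustration, not any vendor's product: the data classes and retention periods below are hypothetical stand-ins for whatever the relevant regulations actually require.

```python
from datetime import datetime, timedelta

# Hypothetical retention windows per data class; real values would come
# from the regulations governing each record type.
RETENTION = {
    "financial": timedelta(days=7 * 365),  # e.g. a seven-year rule
    "email": timedelta(days=3 * 365),
    "log": timedelta(days=90),
}

def is_expired(data_class, created, now=None):
    """Return True when a record has passed its retention window
    and is eligible for automatic deletion."""
    now = now or datetime.utcnow()
    return now - created > RETENTION[data_class]
```

A scheduled job applying such a rule is what would "greatly ease management headaches": expired records are purged without an administrator tracking each deadline by hand.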
He is also working to classify his data to identify which media it should be stored on. Deploying several "tiers" of storage, such as advanced technology attachment disc-based arrays to provide secondary repositories for so-called near-line storage, is also part of Wendt's plan.
"By doing that, you free up space on [primary] storage for mission-critical databases," he added.
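The classify-then-tier approach above reduces to a simple placement rule. The sketch below is an illustrative assumption of how such a rule might look, not Wendt's actual scheme; the field names and the 90-day threshold are invented for the example.

```python
def storage_tier(record):
    """Map a classified record to a storage tier:
    1 = primary storage for mission-critical data,
    2 = near-line ATA disc arrays,
    3 = archive for rarely accessed data.
    Fields and thresholds are hypothetical."""
    if record["mission_critical"]:
        return 1  # keep on fast primary storage
    if record["days_since_access"] <= 90:
        return 2  # secondary, near-line repository
    return 3      # cheapest tier
```

Moving everything that falls into tiers 2 and 3 off primary arrays is what frees up space for the mission-critical databases.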
However, users said they want suppliers to begin adopting standards, such as the Storage Networking Industry Association's (SNIA) Storage Management Interface Specification (SMI-S), to offer centralised management of their heterogeneous storage.
SMI-S is a set of common models and interfaces intended to allow storage management applications to communicate and manage multivendor storage devices. SNIA's interoperability lab focused on demonstrations of SMI-S-compliant software.
Lucas Mearian writes for Computerworld