Over the past decade, the volume of data being stored has risen sharply – and continues to do so. Researchers at Statista predict an average annual growth rate of 42% between 2020 and 2022.
But predicting exactly how much storage any one business needs remains difficult. Businesses risk over-provisioning, leaving valuable capital equipment underused, or under-investing in capacity, making it harder to roll out new applications. This has fuelled the rise of as-a-service and consumption models of storage procurement.
“Organisations need to balance the cost per unit and the total amount of storage required,” says Naveen Chhabra, an analyst covering data storage at Forrester. “IT directors need to find the ‘Goldilocks’ balance between what they need today, and what they need over the next two to three years.”
The storage market has reacted by introducing consumption-based models for storage purchases. This is changing the way enterprises acquire and pay for storage.
The large cloud providers – Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure – have pay-as-you-go models, and the wider market has followed suit, introducing subscription-based offerings and consumption-based pricing for on-premise hardware, cloud and hybrid storage.
The services on offer cover a wide range of capabilities, cost and performance. In this article, we set out some of the most important questions storage buyers should ask suppliers.
1. What is the payment model, and is there a base commitment?
Conventional models for buying storage, including outright purchase of hardware, leasing and other financing vehicles, are now supplemented by a range of pay-per-use and subscription models.
These range from per-gigabyte (GB) pricing for as-a-service offerings, to fixed subscriptions, typically based on one-, two- or three-year terms.
AWS’s S3 Standard storage, for example, costs US$0.024 per gigabyte per month for capacity in the London datacentre. That price applies to the first 50TB stored; beyond that, it falls to US$0.023 per gigabyte. Amazon, along with its hyperscaler cloud competitors, offers a range of storage pricing for long-term archiving and other applications.
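As a rough illustration, tiered pricing of this kind can be turned into a simple cost estimate. The sketch below uses only the two London-region rates quoted above; real S3 bills include further tiers, plus request and data transfer charges, which are not modelled here.

```python
def s3_standard_monthly_cost(gb_stored: float) -> float:
    """Estimate monthly storage cost from the tiered per-GB prices
    quoted above: US$0.024/GB for the first 50TB, US$0.023/GB after.
    Illustrative only - ignores request, transfer and deeper tiers."""
    first_tier_gb = 50_000  # 50TB, using decimal gigabytes for simplicity
    if gb_stored <= first_tier_gb:
        return gb_stored * 0.024
    return first_tier_gb * 0.024 + (gb_stored - first_tier_gb) * 0.023

print(s3_standard_monthly_cost(10_000))  # 10TB: about US$240 a month
print(s3_standard_monthly_cost(80_000))  # 80TB: about US$1,890 a month
```

Note how the marginal rate, not the headline rate, applies once usage crosses the tier boundary, which is why large-capacity estimates based on the headline price alone overshoot slightly.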
Cloud-based storage provides the most granular pricing. Subscriptions, however, allow enterprises to plan ahead for their storage costs. Typically, there will be a base commitment – a fixed amount of storage the organisation pays for – and either a “buffer” or burst capacity that is billed as it is used.
This gives CIOs the flexibility to cope with future storage demands or unexpected peaks without overpaying, especially in the early years. Dell EMC’s Apex Flex on Demand, for example, sets out a “committed capacity” and a “buffer” capacity for likely future use. IBM, for its part, delivers physical storage capacity through over-provisioning, but only charges customers for the storage they use.
Suppliers typically work with customers to calculate the base and burst capacity, so this needs careful negotiation.
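The committed-plus-buffer billing described above can be sketched as follows. The capacities and per-GB rates are hypothetical, chosen purely to show the shape of the calculation, not any supplier's actual pricing.

```python
def monthly_bill(used_gb: float, committed_gb: float,
                 committed_rate: float, buffer_rate: float) -> float:
    """Committed-plus-buffer model: the base commitment is paid in full
    whether or not it is used; any overage into the buffer is metered,
    often at a different per-GB rate."""
    overage = max(0.0, used_gb - committed_gb)
    return committed_gb * committed_rate + overage * buffer_rate

# Hypothetical terms: 100TB committed at $0.02/GB-month,
# burst usage billed at $0.03/GB-month.
peak_month = monthly_bill(120_000, 100_000, 0.02, 0.03)   # commitment + overage
quiet_month = monthly_bill(50_000, 100_000, 0.02, 0.03)   # commitment still due
```

The quiet-month case is the one worth negotiating over: under-use still incurs the full base commitment, which is why getting the committed figure right matters.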
2. How do you measure and pay for storage?
Suppliers that offer storage as a service use monitoring tools to calculate consumption.
This is critical, as the manufacturer is likely to ship more storage than the customer actually pays for, to provide burst capacity and a hassle-free upgrade path. Usage monitoring also helps align storage provision to actual usage, by moving some data to the cloud, for example, or moving files to lower-cost, longer-term storage or archiving.
Most suppliers average out usage and then bill monthly or quarterly. IBM, for example, measures usage daily and averages it over the month; Dell EMC similarly works from daily readings to a monthly average. HPE, for its part, states that it operates a pay-as-you-go model with GreenLake, and points out that it has offered consumption-based pricing since 2006.
CIOs will need to research how charging models affect the pricing. They should pay particular attention to likely overage over time, as this is where costs can add up. “If you just use storage for five minutes, does that count, and how granular is the charging if you go over your committed capacity?” asks Forrester’s Chhabra.
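Metering of the sort described above, daily samples averaged over the billing period, can be sketched like this. The daily sampling interval and the per-GB rate are assumptions for illustration; each supplier's meter works differently, which is exactly why the granularity question matters.

```python
def metered_monthly_bill(daily_usage_gb: list[float],
                         rate_per_gb_month: float) -> float:
    """Average once-a-day usage samples over the billing period,
    then bill the average at the monthly per-GB rate. Illustrative
    of the daily-average models described above, not any vendor's meter."""
    average_gb = sum(daily_usage_gb) / len(daily_usage_gb)
    return average_gb * rate_per_gb_month

# A month where usage doubles halfway through: the customer is
# billed on the average, not the peak.
samples = [100_000] * 15 + [200_000] * 15
bill = metered_monthly_bill(samples, rate_per_gb_month=0.02)
```

Under a daily-average model, a five-minute spike barely registers; under a high-water-mark or hourly model, the same spike could set the whole month's bill, so the sampling rules deserve as much scrutiny as the headline rate.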
3. Can I upgrade, and is there a minimum commitment?
Part of the attraction of subscription-based, cloud and hybrid storage is that these models allow upgrades without the need to swap out hardware. In the cloud, you just spin up more capacity.
For hybrid and on-premise systems, how you upgrade depends on the supplier. Storage can be over-provisioned from the outset, upgraded within the plan, or combined with cloud capacity in the short term or throughout the contract.
Hitachi Vantara, for example, offers two cloud-like models for storage, through its EverFlex Consumption Utility and its Storage as a Service offering. IBM states that it delivers three years’ capacity right at the start of a contract, but only bills for it as it is used. NetApp, which has a wide range of payment models, allows customers to tier data to on-premise or public cloud storage.
Most storage-as-a-service subscriptions run for a minimum of a year, with 24- and 36-month contracts also available. According to Forrester’s Chhabra, this aligns with most CIOs’ upgrade cycles.
Longer contracts are possible, but predicting usage beyond three years is harder, and for very stable usage the balance might swing back towards capex.
4. What other charges need to be considered, and what about SLAs?
Cloud storage suppliers typically charge a per-gigabyte fee for data on their systems, and then charge an egress fee when customers take data out of the cloud. Charges to upload data are less common. Cloud suppliers can also charge separate fees for monitoring and other tools.
For subscription-based models, again the devil is in the detail. Does the pricing include the core operating system (OS) and support, or just hardware? Are customer-oriented monitoring tools part of the bundle, or does the IT team need to license these separately?
For hybrid cloud models, CIOs should also check ingress and egress fees, and any other charges to connect local and cloud-based systems. If a single monitoring tool is not available or if an organisation wants to support multiple supplier offerings, it will need to factor in robust, third-party storage management software.
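Egress fees of the kind flagged above can come to dominate a hybrid design's cost. A back-of-envelope comparison might look like the following; the rates are placeholders, not any supplier's actual pricing.

```python
def cloud_tier_total_cost(stored_gb: float, months: int,
                          storage_rate_gb_month: float,
                          egress_gb: float, egress_rate_gb: float) -> float:
    """Rough total cost of parking data in a cloud tier: per-GB-month
    storage for the term, plus a one-off egress charge when the data
    is pulled back on-premise. Placeholder rates, for illustration only."""
    storage_cost = stored_gb * storage_rate_gb_month * months
    egress_cost = egress_gb * egress_rate_gb
    return storage_cost + egress_cost

# Hypothetical: archive 10TB for a year at $0.004/GB-month, then
# retrieve all of it at $0.09/GB egress. The one-off egress charge
# ($900) exceeds the full year of storage ($480).
total = cloud_tier_total_cost(10_000, 12, 0.004, 10_000, 0.09)
```

The point of the sketch is that cheap at-rest pricing can be misleading: a plan that regularly pulls data back from the cloud needs egress modelled into the total cost, not treated as a footnote.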
Buyers should scrutinise the suppliers’ service-level agreements (SLAs) and contractual terms. Are the SLAs acceptable, especially for availability? How quickly will a hardware failure be repaired? SLAs will also cover areas such as security patches, and penalties in the event of an outage.
5. How does the offer fit the wider ecosystem, including hybrid cloud?
CIOs should also scrutinise how future-proofed any storage contract will be. Although the benefit of storage-as-a-service lies in removing the tie to capex and owned hardware, an inflexible contract might limit the organisation’s ability to make use of new technology, from software-defined storage to higher-performance systems, and especially the cloud.
Being able to tier storage to and from the cloud offers significant benefits in capacity planning, redundancy and cost. Although suppliers have moved into storage as a service, in part to defend their market against the cloud hyperscalers, their real advantage is being able to offer hybrid technology and the best of both worlds.
CIOs need to ensure subscription- and consumption-based storage offerings make use of that flexibility rather than lock them out.
More about storage as a service
- Storage as a service: Consumption models from the big six – we look at the big six storage makers’ consumption model offers, which allow customers to pay for on-premise hardware and cloud storage capacity on a pay-per-use basis, within limits.
- Big storage meets cloud in the datacentre ‘as-a-service’ revolution: Cloud is mainstream but the datacentre’s here to stay – this has resulted in a trend towards as-a-service in the datacentre where big storage array makers and the cloud giants meet.