Silver linings in the server and storage market playbook

As customers move to more agile infrastructures, virtualisation and the cloud could drive higher value sales potential for servers and more attach potential with storage, says Fujitsu's Craig Parker

Recent figures from Gartner showed server shipments in the EMEA region decreasing by 10.4% from the same period last year, whilst revenues were down 7.4%. It is perhaps important to note that this is still a big market. According to Gartner, 630,000 units with a total value of $3.8bn (£2.5bn) were sold across the region in Q4. 

It’s a similar story in storage. Numbers are not yet available for the last three months of 2012, but having put a lot of focus on disk arrays in recent months with our Storage First campaign, we have seen only a relatively modest increase in the number of disk arrays we’ve shipped, while the total value of those sales is considerably higher.

While any decline or flatlining in unit shipments is always seen as negative for the channel, there may be some silver linings in the server and storage market playbook right now. Current trends are creating bigger opportunities for resellers that can put forward good enterprise-class value propositions and back that up with good technical competencies.

Rising ASPs

While unit shipments of servers are down, particularly in developed markets, the average unit price of servers has increased as more customers look to acquire servers with the scalability and resilience to support virtualised and private cloud deployments. We see evidence of this in our own company, both anecdotally and in the Gartner report, which recorded Fujitsu server shipments up by 4.4% in EMEA in Q4 2012, while Fujitsu revenues showed a 16.7% increase.

Overall, shipments in EMEA were down 10.4% and revenues by 7.4%, so right across the market, average selling prices are increasing.

We are also seeing higher attach rates for storage arrays as users look to deploy similarly capable solutions to meet their growing need for efficient capacity and risk management. Big data projects are perhaps having an influence, but the underlying trend here is the explosion of data that is being created and needs to be retained. Storage needs to be fast, efficient, safe and, most of all, scalable.

Agile foundations

According to IDC, total disk storage systems capacity shipped reached 7,104 petabytes in Q3 last year – up 24.4% year-on-year. The increasing requirement for capacity is being driven chiefly by the need to store unstructured data, such as Word files and emails, and customers are increasingly looking to derive some value from this information – that is what is fuelling the need for big data solutions.

It is also driving the need for more agile and efficient solutions that will provide a foundation for coping with capacity growth. Getting this part of the infrastructure right is fundamental now since, if you don’t have a really scalable and effective storage infrastructure, you won’t be able to manage the growing volumes of data you need to retain, and perhaps analyse and interrogate from time to time.

Data also needs to be held safely and securely – so resilience and backup are important and so is encryption. All of this makes management of storage and data more important. Add in considerations such as licensing, warranty and maintenance and the burden mounts for the IT department.

Putting storage first

Getting storage right, putting storage first, is vital to the ultimate success of modern, hybrid, future-proofed infrastructures. For many customers, storage is something they think about only after they have bought the servers. But when you think about it, leading with storage actually makes sense – especially given the rapid growth of data that we are seeing right now.

Deciding on the storage solution later can also cause a few issues and increase costs. Organisations are often forced to implement multiple storage platforms to match different server and application requirements. They can end up with islands of disparate storage arrays, using different protocols and management standards. It’s quite conceivable that a company would end up using a quarter of the capacity of several storage systems that have no ability to talk to one another and share resources.

Put storage first, though, and you cut out all of that potential for squandered capacity and complex management requirements. We’ve seen some good examples of how well this can work – University Campus Suffolk recently went public in Education Today magazine on how it is looking to the long term by doing this and putting a new Fujitsu SAN at the heart of its IT plan.

Enlightened customers

Resellers who can enlighten customers to this reality will create a whole series of opportunities for themselves – not only to design and deliver a storage platform that will meet all their needs and allow them to scale up as capacity requirements grow, but also to provide the resilient, agile and dynamic hybrid infrastructures that will be needed to extract value from data.

Customers are now willing to invest to save in the long term. They are spending more to acquire the dynamic, agile infrastructures they need to deal with the extra demands of BYOD and mobility and the explosion of unstructured data. They may purchase fewer servers and storage arrays, but they expect to get much more out of what they do buy. The silver lining is that they fully expect to pay more to get that long-term business agility.

Craig Parker is head of product marketing at Fujitsu UK and Ireland's Technology Product Group