IT transformation is a hot topic these days as organisations of all sizes seek to put digital technology at the heart of their business.
In a recent survey conducted by 451 Research’s Voice of the Enterprise service, half of those surveyed said their organisation currently had an IT transformation initiative under way.
This proportion rose to two-thirds among large enterprises with more than 10,000 employees. And these are not trivial projects. The average organisation undertaking an IT transformation is applying the initiative to half its entire IT estate.
Of course, “transformation” means different things to different organisations, but these initiatives tend to include the adoption of newer technologies and approaches, such as cloud-based IT models and software-defined infrastructure.
Convergence is important because of the shifts taking place in the way IT departments are being organised.
In essence, 451 Research’s data shows a notable shift away from IT specialists and towards IT generalists. Overall, the data shows that smaller organisations already tend to employ mostly IT generalists who can handle multiple tasks.
This makes sense, because these organisations tend to have a smaller physical environment and don’t have the budget to employ specialists at every stage.
However, 16% of organisations across the entire sample said they planned to move away from specialist-led models and towards more generalist-led approaches. For large organisations, this proportion rose to one-third of respondents.
This shift is significant. Large organisations perhaps face the biggest challenge in moving to more agile and cost-effective IT models.
They typically have huge amounts of legacy applications and infrastructure, much of which is divided into separate silos, often organised around technical functions such as storage, compute and network.
The storage silo is often viewed as the most difficult to transform, mainly because of technical issues. Storage typically doesn’t scale out, different systems don’t interoperate well with each other, and traditional storage uses a range of exotic technologies, such as Fibre Channel, that require highly skilled professionals to manage.
However, senior IT managers know that any IT transformation project that does not consider some element of storage transformation is likely to fail, and this is an area they must address.
The emergence of converged infrastructure is one of the industry’s responses to dealing with some of these issues. Implementing and managing infrastructure simply has to be easier and cheaper, and converged infrastructure is one way to achieve this.
Multiple converged models have appeared in recent years – single SKUs, reference architectures and application-specific appliances – but one model in particular, hyper-converged infrastructure, has emerged as the hottest new approach, as evidenced by hyper-converged pioneer Nutanix’s recent successful IPO.
But convergence is not the only way to transform storage and plot a path to enable more IT generalists to manage storage.
Over the past few years, we have seen a big change in the emphasis of storage innovation, away from raw speeds and feeds and single-feature capabilities such as data deduplication, thin provisioning, flash optimisation and scale-out.
Although all of these capabilities remain important, the missing ingredient for storage has been ease of use. This is the factor that will allow those without deep skills in LUN masking, zoning and Fibre Channel to manage storage.
Hyper-converged infrastructure models definitely offer this, but they are not the only ones. The last five years have seen an explosion of capabilities from a range of startup companies, many of which have focused on tight integration with the hypervisor, thereby placing storage functionality in the hands of the virtual machine (VM) administrator. VMware’s efforts here, with API-based innovations such as VAAI and VVols, are also helping.
Although many of these innovations are aimed at smaller organisations that already tend to employ IT generalists, these newer storage technologies are starting to percolate into larger enterprises.
The timing is good, because budget constraints mean many large enterprises are seeking alternatives to big iron storage that involve multi-million-dollar capital expenditure. They are finding that many mid-range storage systems offer good-enough functionality for all but the most mission-critical applications. If those systems can now be managed by IT generalists or a VM administrator, then so much the better.
Another route to storage simplicity is via all-flash array technologies. Although these were initially deployed for high-performance applications, more customers are starting to adopt them across broader workloads.
In some organisations, all-flash arrays are becoming the standard storage platform for all applications, not just those with random I/O patterns. Of course, most all-flash arrays offer blazing fast performance, but the reason many IT managers like these products is that they remove much of the need to constantly tweak and balance the systems. All-flash arrays offer storage that doesn’t need babysitting, and so can be managed by IT generalists.
SolidFire, the all-flash array specialist recently acquired by NetApp, explicitly embraces this in its messaging.
Founder Dave Wright says the company was designed to provide storage for people who don’t want to manage storage. This idea may be too much for traditional storage buyers, but it gels well with a new generation of IT developers who have grown up with cloud-based IT and have completely different expectations around how the underlying infrastructure should be accessed, provisioned and managed. If the infrastructure doesn’t behave like a public cloud, they are not interested.
So where does this all end? Are the days of the storage administrator drawing to a close?
This is where the discussion becomes much more nuanced – and obviously takes on a more human dimension – but these are issues that many senior IT managers are grappling with.
Certainly, not many organisations have chosen to do away with dedicated storage administration entirely, but it’s a discussion that will continue to take hold as cloud-based IT concepts become increasingly woven into the fabric of the modern datacentre.
Simon Robinson is research vice-president for Voice of the Enterprise, 451 Research.