With the technical and market parameters of SDE still taking shape, CIOs are in no rush to adopt, despite the hype going into overdrive.
Just as everybody starts getting comfortable with the once derided term "cloud", the IT industry marketing machine foists a new one on us.
And although "software-defined everything" (SDE) heralds a return to the tried-and-tested three-letter acronym (TLA) formula, its precise meaning to many CIOs is as nebulous as its more descriptive predecessor used to be.
Like cloud before it, SDE is being used as a catch-all term for all the latest technologies and approaches that promise to take us to that long-touted nirvana of true IT flexibility and agility through advances in virtualisation.
Clearly, it's a vital approach for the giant cloud operators and providers looking to build out vast, scalable datacentres with industrial scale and efficiency. Indeed, it's from developments like Facebook's Open Compute Project (an initiative to drive standardisation and automation right through the datacentre) that SDE has largely taken its cue. But does the approach really translate to enterprises more broadly, and if so is now the time to be doing something about it?
While cloud has today largely come to mean a way of easily spinning up virtual servers, processing power and storage -- and the services and applications hosted on them -- SDE refers to the idea of virtualising everything in the datacentre and beyond, from compute and storage to networks and devices. This is meant to give IT departments the capability to automate all their IT provisioning and management entirely through software, using standard commodity hardware that will effectively become invisible to them. Finally, we're told, CIOs will be able to focus wholly on delivering a fast, efficient, fully scalable IT service to their organisations without the inevitable bottlenecks of integration and manual configuration commonly encountered with today's public and private clouds.
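The "provisioning and management entirely through software" idea at the heart of SDE can be illustrated with a minimal sketch. Nothing here is a real vendor API; the names and the reconciliation loop are purely illustrative, assuming a declarative model in which desired infrastructure is described as data and software computes the actions needed to get there.

```python
# Minimal sketch of software-defined provisioning: infrastructure is
# declared as data, and a reconciler works out what actions would move
# the live environment toward that desired state. All names here are
# illustrative, not any vendor's API.

DESIRED_STATE = {
    "web-01": {"cpus": 4, "storage_gb": 200, "network": "frontend"},
    "db-01":  {"cpus": 8, "storage_gb": 500, "network": "backend"},
}

def reconcile(desired, current):
    """Return the provisioning actions that bring `current` in line
    with `desired` -- the core loop behind software-defined tooling."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name, spec))
        elif current[name] != spec:
            actions.append(("update", name, spec))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))
    return actions

# A live environment that has drifted from the declared state
current = {
    "web-01": {"cpus": 2, "storage_gb": 200, "network": "frontend"},
    "old-box": {"cpus": 1, "storage_gb": 50, "network": "legacy"},
}

for action in reconcile(DESIRED_STATE, current):
    print(action)
```

In practice, tools in this space layer exactly this pattern over commodity hardware: the operator edits the declaration, and the software, not a manual configuration step, closes the gap.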
The analyst's view
While the IT industry is doing its utmost to present the case for SDE in the enterprise as an urgent consideration for CIOs, it's still not a priority for many. Yet the concept -- and what it represents -- is an important one.
Andy Lawrence, vice-president of research for datacentre technology at 451 Research, says: "CIOs aren't fools. They're wise to the cycles of hype that occasionally engulf areas of IT. Our research indicates that actual deployments are still very low -- especially at mission-critical enterprise level -- and are likely to pick up only gradually during 2015. Nonetheless, the number intending to deploy software-defined architectures is rising rapidly despite there also being a substantial group which, for now, has no strategic plans in that direction.
"This slow start is not really surprising, for a number of reasons. Key among them is technical and market immaturity. There's still confusion around standards and, more importantly, the big suppliers are appropriating and integrating the technology. The long-term disruptive potential of these technologies is real and should not be understated. But it is happening alongside the big move to cloud, which is confusing the strategic choices. CIOs need to decide how much of their infrastructure will actually stay in-house, and how they maintain service levels and security, before they decide to carry out big strategic overhauls."
Some analysts are bullish. IDC predicts the market for software-defined networking (SDN) alone will grow from less than $1bn in 2014 to $3.7bn by 2016 and $8bn by 2018. Gartner claims that by 2017 more than half of all enterprises will have adopted an architectural approach similar to that of the cloud giants. Frost & Sullivan, meanwhile, says the hypergrowth starts here -- at least in Asia-Pacific. "2015 is seen to be the year of SDE as the software-defined revolution spreads beyond the boundaries of the datacentre," the analyst recently predicted.
Puff versus prudence
CIOs, though, are more cautious. Steve Pikett, head of IT at Harrods Bank, says: "As with so much in IT, the hype is way ahead of the reality, and seems to exclude history and current reality altogether. Marketers say we should buy the all-new product, but it makes little sense to corporations to walk away from what currently works and bet the farm on something new.
"There are times when IT buys into its own hype too heavily and fails to market sensibly. Cloud was one such example -- we accept and adopt it now in co-existence with other ways of working. It was sold as the answer to all our problems, but it merely introduced new ones while solving others. I've no reason to believe this will be different."
Myron Hrycyk, group CIO of Severn Trent Water, similarly says that while he is proactively pursuing a cloud-centred IT strategy, SDE is still too far from maturity to consider implementing now. "Software-defined datacentres may well come about eventually, but based on some of the conversations I've had on the topic, the suppliers are still trying to describe what it means, reminding me of the early days of the cloud, when it was hyped yet undefined. I think we need to give this new idea more time to develop before we start trying to deploy it," he says.
The hybrid reality
At present, for a software-defined environment to work without bottlenecks, the entire infrastructure has to be virtualised. But most enterprises today operate in a hybrid environment, with a mix of standalone systems, private and public cloud. Until people have transitioned to fully virtualised environments, SDE simply won't bring about the touted, trouble-free benefits. As Andrew Marks, CIO of Tullow Oil, says: "It will be a while before any IT shop with legacy (physical or virtual) platforms buys this."
Besides, as cloud services and solutions continue to mature, an increasing number of organisations may choose to use external providers rather than invest in yet more in-house technology. "If anyone uses cloud, we shouldn't care whether it's hardware or software-defined since we have no role in the platform as long as it works," adds Marks.
Yet despite the reservations, there is an increasing acceptance by CIOs that SDE is coming, and will ultimately have a profound impact on IT infrastructures. It just might take a while to manifest.
Ian Campbell, group CIO at the Highways Agency, says: "I do agree that the future will be SDE, just as I believe that datacentres will become increasingly virtual, with companies buying processing and storage on demand from suppliers providing it in true utility style, like electricity. But at the moment most systems and services are too complex or too customised for SDE. This will gradually improve and become easier as we develop and manage new systems with SDE in mind. But realistically it will be several years before a clear winner emerges, who will then create the standard that other providers must work to."
Likewise, Ian Cohen, group CIO at insurance firm Jardine Lloyd Thompson, says of SDE: "Like so much of the recent vendor/analyst-led tech hype there is an underlying capability that has value and potential. The problem, as ever, is that it's already being oversold. Those who jump too early or move without suitable thought, planning and insight will inevitably get burned."
That said, while 2015 is unlikely to be the year when the technology makes significant inroads into the enterprise, Cohen says CIOs would be foolish to dismiss it. "It's definitely time to test the water, gain some experience and learn fast, because there is an element of genuine pace here. Equally, the world isn't greenfield, one size doesn't fit all -- be it DevOps, private cloud, public cloud or SDE -- and we don't all have the luxury of working in unregulated environments. I think SDE will grow rapidly and become part of the next-generation compute capability, but only part -- for now at least."