The end is nigh for privately owned datacentres. While it will not happen this year, this decade or probably even this generation, their long, slow decline has already begun.
It used to be a given that, at some stage, an organisation would outgrow its datacentre, and concerns about the security, performance and availability of third-party facilities meant few enterprises were willing to go down the colocation route.
As a result, they usually opted to build a new facility or heavily adapt an existing site.
That was before virtualisation and hybrid cloud came along, paving the way for enterprises to downsize their facilities, as increased utilisation rates and equipment densities allow 50% or more of the existing IT equipment to be decommissioned.
The result is a facility with the same overall power distribution and cooling capacity, but supporting a platform half the size – which is uneconomic, and businesses have started to realise this.
It is highly unlikely we will see many enterprises move all their in-house IT to the public cloud, however, regardless of the perceived benefits of doing so.
As a result, many businesses will still need to retain some of their IT infrastructure in-house for a long time yet.
But owning the infrastructure does not mean that an organisation must own the facility, and this is where colocation comes in. A third party builds, owns and manages the facility, and a number of different organisations then share this space to gain greater flexibility and the improved economics of a shared model.
The onus is then on the colocation provider to invest in ensuring its services – primary and backup power, internet connectivity, physical and technical facility security, and so on – are above the levels at which an organisation could economically provide in-house.
To set themselves apart from all the other colocation suppliers out there, a provider also needs to offer additional functions, such as overall facility monitoring, architectural advice, ensuring that those sharing the facility are ‘good neighbours’, and so on.
But how will these colocation suppliers adequately monetise their offerings when faced with what seem to be continual price cuts in the public infrastructure-, platform- and software-as-a-service markets?
The cost models around colocation have traditionally been pretty opaque, based on a mix of the space occupied, the power consumed and the amount of data transmitted across the site’s internet connections.
This can cause the overall cost of colocation to vary considerably from month to month – a problem, as cost predictability is essential to businesses during periods of economic instability.
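As a rough illustration of why that traditional model makes costs hard to predict, the billing components above can be sketched as a simple calculation. The rates and structure here are entirely hypothetical, purely to show how the total moves with usage; they are not taken from any real provider.

```python
# Hypothetical colocation billing sketch: space + power + data transfer.
# All rates are illustrative assumptions, not real provider pricing.

def monthly_colo_cost(racks, kwh_used, gb_transferred,
                      rack_rate=800.0,      # per rack per month (assumed)
                      power_rate=0.15,      # per kWh consumed (assumed)
                      transfer_rate=0.02):  # per GB over the site links (assumed)
    """Return the total monthly charge under this illustrative model."""
    return (racks * rack_rate
            + kwh_used * power_rate
            + gb_transferred * transfer_rate)

# A quiet month versus a busy one: the space charge is fixed, but the
# power and data-transfer charges move with usage, so the monthly total
# is difficult to forecast in advance.
quiet = monthly_colo_cost(racks=4, kwh_used=3000, gb_transferred=5000)   # 3750.0
busy = monthly_colo_cost(racks=4, kwh_used=4500, gb_transferred=20000)   # 4275.0
```

Only the first component is fixed; the other two are metered, which is exactly where the month-to-month variability comes from.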
At the moment, colocation providers are lucky, because calculating the cost of using public cloud platforms is a difficult task, thanks to the wide variety of offerings touted by a single supplier. These can be stressful to wade through, and that is before you even try to build an effective platform.
Things are changing, though, as the likes of IBM move towards becoming more cloud-focused, meaning the cost models of cloud have to become more transparent and easy to understand. This has prompted some colocation providers to take steps to simplify their pricing too.
As colocation and cloud contracts become increasingly similar, this prompts the question as to whether an organisation should own any part of its hardware platform, or move everything to the cloud anyway.
Cloud vs. colocation: Which is better?
This circles back to the beginning of the article. Many decisions will be made based on the perception – rather than the reality – that public cloud is somehow less secure, more outage-prone and harder to control than a privately owned platform. These discussions are likely to drive organisations more to a colocation environment than public cloud.
Other decisions will be driven by business leaders who want to move from a capital expenditure model to an operational expenditure one, or by the need for greater levels of business availability through warm images provisioned across multiple platforms.
This can be achieved via colocation, but means replication of all the hardware across multiple facilities, which can work out expensive.
It may be that a hybrid model could meet the requirements for many – a colocation centre as the primary site with public cloud acting as the failover environment – as well as providing burst resources as necessary.
Whatever approach an organisation opts for, the IT platform they choose has to adequately support the business in its aims. This is increasingly where a fully in-house facility/platform model is failing. As yet, the end game is not decided. Colocation and public cloud both have their parts to play in any system – just do not write one or the other off for any ivory tower reasons.