In the good old days, a data centre facility ‘just’ had to be a place where servers, storage and network equipment could be housed. Sure, it had to ensure that enough power and cooling was available, but beyond that, most data centres were built to last for around a decade, and the IT team didn’t pay much attention to what was going on at the building level itself.
The facilities team had responsibility for the data centre facility – and it didn’t care about what was really going on at the IT equipment level. If problems occurred, then it would react: a bit more power distribution here, a new cooling system there.
However, the chasm between the two teams, facilities and IT, remained pretty much unbridged.
The world has changed. The increasing density and complexity of IT equipment, along with implementation changes such as virtualisation and private cloud computing, have led to a need for a far more dynamic facility to meet the needs of an equally dynamic platform. Software defined constructs aim to abstract the operation of an IT platform further away from the physical level towards a more virtual, software one, yet this requires a far greater understanding of the interplay between all aspects of the platform. A bridge has to be built between facilities and IT – and also with the business, so that the whole system can be made far more responsive to the business’ needs.
The answer lies in data centre infrastructure management (DCIM). By providing tools that monitor everything to do with the data centre, DCIM, in conjunction with existing systems management tools, offers a means of building those bridges. DCIM has been around for some time now, emerging mainly from the facilities management side of existing building information systems (BIS) tools. However, it soon became apparent to DCIM vendors that, to ensure their tools did what was needed, they had to understand the overall data centre – not just the facility, but the equipment within it as well. DCIM tools have therefore morphed to provide an understanding of the impact new equipment has on the facility, and to include ‘what if?’ capabilities, allowing data centre and facilities managers to work together more effectively and ensure that changes do not cause problems.
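In essence, a ‘what if?’ capability boils down to projecting the effect of a proposed change against the facility’s known limits before it is made. A minimal sketch of that idea follows; the class, rack names and budget figures are all hypothetical illustrations, not drawn from any real DCIM product.

```python
# Hypothetical sketch of a DCIM-style 'what if?' check: before new equipment
# is installed, verify the target rack still has power and cooling headroom.
# All names and figures here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Rack:
    name: str
    power_budget_kw: float    # maximum power the rack's feed can supply
    cooling_budget_kw: float  # maximum heat load the row cooling can remove
    draw_kw: float = 0.0      # current power draw of installed equipment

    def can_accept(self, new_draw_kw: float) -> bool:
        """'What if?' test: would adding this kit breach either budget?
        Heat rejected is assumed roughly equal to power drawn."""
        projected = self.draw_kw + new_draw_kw
        return (projected <= self.power_budget_kw
                and projected <= self.cooling_budget_kw)


rack = Rack("R12", power_budget_kw=10.0, cooling_budget_kw=9.0, draw_kw=6.5)
print(rack.can_accept(2.0))  # 8.5 kW fits within both budgets: True
print(rack.can_accept(3.0))  # 9.5 kW exceeds the 9.0 kW cooling budget: False
```

A real DCIM tool would, of course, model far more than two budgets per rack – floor loading, airflow, network ports and so on – but the principle of checking a projected state against facility constraints is the same.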
However, the increasing use of virtualisation and private cloud is forcing further changes to DCIM. Root cause analysis in a virtual world is harder than in a physical one – a workload causing problems in a virtual image could be using resources from a much larger set of physical items than before. A workload in a private cloud environment could be dynamically borrowing resources such as compute power, storage and network bandwidth from, and lending them to, other workloads. DCIM must now be able to understand the contextual dependencies between the virtual environment and the physical equipment, ensuring that the movement of virtual workloads around and beyond the data centre does not lead to unintended consequences that could bring the IT platform – and therefore the business – to its knees.
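The contextual dependency tracking described above can be pictured as walking a chain from a virtual workload down to every physical element it currently relies on. The sketch below assumes a deliberately simplified model – one host per workload, one rack per host, one PDU per rack – and all identifiers are made up for illustration.

```python
# Hypothetical sketch of virtual-to-physical dependency resolution: given a
# workload that may move between hosts, list the physical elements (host,
# rack, PDU) it depends on right now, so the facilities impact of a
# migration can be assessed. All names are illustrative assumptions.

placements = {            # virtual workload -> physical host it runs on now
    "billing-vm": "host-a",
    "web-vm": "host-b",
}
host_to_rack = {"host-a": "rack-3", "host-b": "rack-7"}
rack_to_pdu = {"rack-3": "pdu-east", "rack-7": "pdu-west"}


def physical_dependencies(workload: str) -> list:
    """Walk the virtual-to-physical chain for one workload."""
    host = placements[workload]
    rack = host_to_rack[host]
    return [host, rack, rack_to_pdu[rack]]


print(physical_dependencies("billing-vm"))  # ['host-a', 'rack-3', 'pdu-east']
```

In practice the mappings change continuously as workloads migrate, which is exactly why DCIM has to keep this chain current rather than rely on a static inventory.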
This is leading to a major change in how the DCIM vendors – including Nlyte, Schneider, Emerson and CA, along with smaller vendors such as GreenField Software – are developing their systems. Although the majority are still oriented more toward the physical world, Nlyte and CA have advanced the most in providing a better understanding of how the virtual world affects the physical one.
The advent of ‘software defined everything’ under the umbrella of the software defined data centre (SDDC) is providing DCIM vendors with the opportunity to make more of a play for managing the overall environment. SDDC needs to have a complete understanding of everything to do with the IT platform; it cannot work with just a knowledge of the IT equipment. DCIM can pull everything together and make SDDC more workable through providing the contextuality between the IT equipment and the facility itself: it provides that much-needed bridge that adds that extra intelligence to ensure that everything runs smoothly.
Quocirca has written a paper, “When Data Centre Layers Converge”, covering what a person requiring such a toolset should look for.