PUE - the benevolent culprit in the datacentre

avenkatraman

Internet of Things, big data, and social media are all creating an insatiable demand for scalable, sophisticated and agile IT resources, making datacentres a true utility. This is prompting big tech and telecom companies to drift a little from their core competency and build their own customised datacentres - take Telefonica's €420m investment in its new Madrid datacentre.

But the mind-boggling growth of computing infrastructure is occurring amid shocking increases in energy prices. Datacentres consume up to 3% of global electricity and produce 200 million metric tons of carbon dioxide, at an annual cost of $60bn. No wonder IT energy efficiency is a primary concern for everyone from CFOs to climate scientists.

In this guest blog post, Dave Wagner, TeamQuest's director of market development with 30 years of experience in the capacity management space, explains why enterprises must not be too hung up on PUE alone to measure their datacentre efficiency.


Measuring datacentre productivity? Go beyond PUE
by Dave Wagner


In their relentless pursuit of cost effectiveness, companies measure datacentre efficiency with power usage effectiveness (PUE). The metric divides the total amount of power coming onto the datacentre floor by how much of that power is actually used by the computing equipment.

PUE = total facility energy / IT equipment energy
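
As a minimal sketch of the arithmetic (the function name and the meter readings below are hypothetical, not from the article), in Python:

# Illustrative only: compute PUE from two hypothetical energy readings.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,200 kWh to deliver 1,000 kWh to IT gear has a PUE of 1.2.
print(pue(1200.0, 1000.0))  # 1.2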

PUE is a necessary but not sufficient indicator for gauging the costs associated with running or leasing datacentres.

While PUE is a detailed measure of datacentre electrical efficiency, it is only one of several elements that determine total efficiency. In the bigger picture, the focus should be on more holistic and accurate measures of business productivity, not solely on the efficient use of electricity.

Gartner analyst Cameron Haight has talked about how a very large technology company owns the most efficient datacentre in the world, with a PUE of 1.06. This means that roughly 94% of every watt that comes onto the floor actually reaches processing equipment. But this remarkably efficient PUE says nothing about what is done with all of that power, or how much total work is accomplished. If that power is going to servers that are switched on but essentially idling, not accomplishing any useful work, what does PUE really tell us? Actual efficiency in terms of real-world work could be nearly zero even when the PUE metric, taken in isolation, indicates a well-run datacentre.
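
To make that concrete, here is a back-of-the-envelope sketch in Python. It treats average server utilisation as a rough proxy for useful work (a simplification, since idle servers still draw substantial power), and both figures are invented for illustration:

# Hypothetical illustration of why PUE alone can mislead.
pue = 1.06                     # excellent PUE: ~94% of power reaches IT gear
avg_server_utilisation = 0.05  # but the servers are mostly idling

power_to_it = 1 / pue          # ~0.943
useful_work_fraction = power_to_it * avg_server_utilisation
print(f"{useful_work_fraction:.1%} of incoming power does useful work")  # ~4.7%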


Boiled down, what companies end up measuring with PUE is how efficiently they are moving electricity around within the datacentre.

By some estimates, many datacentres use only 10-15% of their electricity to power servers that are actually computing something. Companies should minimise costs and energy use, but nobody invests in a company solely based on how efficiently it moves electricity.

Datacentres are built and maintained for their computing capacity, and for the business work that can be done with it. I recommend correlating computing and power efficiency metrics with the amount of useful work done, and with customer or end-user satisfaction metrics. When these factors are tuned together in a continuous fashion, true optimisation can be realised.
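
One sketch of that kind of composite reporting, pairing PUE with a business throughput metric rather than reporting either in isolation; the record structure and all figures here are hypothetical:

# Hypothetical composite report: energy efficiency alongside business output.
from dataclasses import dataclass

@dataclass
class DailySample:
    total_facility_kwh: float
    it_equipment_kwh: float
    transactions_completed: int

def report(sample: DailySample) -> None:
    pue = sample.total_facility_kwh / sample.it_equipment_kwh
    work_per_kwh = sample.transactions_completed / sample.total_facility_kwh
    print(f"PUE: {pue:.2f} | transactions per facility kWh: {work_per_kwh:,.0f}")

report(DailySample(total_facility_kwh=24_000, it_equipment_kwh=20_000,
                   transactions_completed=18_000_000))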

I've talked about addressing power and thermal challenges in datacentres for over a decade, and have seen progress made - recent statistics show a promising slowdown in the growth of datacentre power consumption in the US and Europe, thanks to successful efficiency initiatives. Significant improvements in datacentre integration have helped IT managers control the different variables of a computing system, maximising efficiency and preventing over- or under-provisioning, both of which have obvious negative consequences.

An integrated approach to planning and managing datacentres enables IT to automate and optimise performance, power, and component management with the goal of efficiently balancing workloads, response times, and resource utilisation with business changes. Just as the IT side analyses the relationships between the components of the stack (networking, server, compute, and applications), the business side of the equation must always be an integral part of these analyses. Companies should always ask how much work they are accomplishing with the IT resources they have; unfortunately, that is often easier said than done. In the majority of datacentres and connected enterprises, the promise of continuous optimisation has not been fully realised, leaving plenty of room for improvement.

As datacentres grow in size and capability, so must the tools used to manage them. Advanced analytics have become essential to bridging IT and business demands, starting with relatively simple correlative and descriptive methods and progressing through predictive to prescriptive approaches. Predictive analytics are uniquely suited to understanding the nonlinear nature of virtualised datacentre environments.
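As a toy illustration of the predictive step (not TeamQuest's method), one could fit a simple trend to historical utilisation and project it forward; real capacity-management tools use far richer, nonlinear models, and the data here is invented:

# Toy forecast: fit a linear trend to 12 weeks of invented CPU utilisation
# data and project it forward. Real tools model nonlinear behaviour.
import numpy as np

weeks = np.arange(12)
cpu_util = 40 + 2.5 * weeks + np.random.normal(0, 2, 12)  # % utilisation, noisy

slope, intercept = np.polyfit(weeks, cpu_util, 1)  # least-squares linear fit
forecast_week = 20
print(f"Projected utilisation in week {forecast_week}: "
      f"{slope * forecast_week + intercept:.0f}%")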

These advanced analytic approaches enable enterprises to combine IT and non-IT metrics so powerfully that the data generated by the networked computing stack can become the basis for automated, embedded business intelligence. In the most sophisticated scenarios, analytics and machine learning algorithms can be applied in such a way that the datacentre learns from itself, generating insight and models for decision-making that approach the level of artificial intelligence.

 
