Power to the People?


Energy usage is a focus for many at the moment.  For IT it is a particular concern, mainly as organisations become more aware of how much energy is wasted in their data centre facilities.  However, it is likely to come under even greater scrutiny in the not too distant future, as the looming energy deficit becomes more apparent.

A mix of short-sightedness and prevarication by politicians means that the UK is now in a position where, within just a few years, it is unlikely to be able to meet all its consumers' energy needs.  Ofgem, the UK's energy market regulator, predicts that the country's current generation over-capacity of 14% could fall to 4% in just three years.  The failure of only one generation plant - or even the need to take one down for planned maintenance - could then leave insufficient power available for all the country's needs.

Therefore, planned outages will have to be put in place - and the biggest energy users will be targeted first, wherever cutting them back will not adversely impact the country's overall needs.

So - steel and aluminium production is unlikely to be hit.  Retail may be asked to cut down on lighting and heating.  But the one area politicians can really point to is IT - and many organisations could be asked to reduce their energy usage here, or risk having their supply cut off for periods of time.

It is widely accepted that data centres are inefficient in their use of energy - the average utilisation of a server is around 10-20% of CPU, and of storage around 30%.  Sure - a move to virtualisation can drive up these utilisation rates, lowering the amount of equipment in use and so the energy needed - but is this the best way to address the overall need?

To take the bigger picture, it is necessary to look at the whole data centre facility and its energy usage.  A measure of the overall energy efficiency of a facility can be gained through power usage effectiveness (PUE): the total amount of energy used by the facility divided by the amount used to power the IT workloads - i.e. that used by servers, storage and network equipment.  The rest of the energy goes on peripheral areas, such as lighting, cooling and uninterruptible power supplies (UPSs).
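
Expressed as a simple ratio (the symbols E_facility and E_IT below are just shorthand for the two quantities described above):

```latex
\mathrm{PUE} \;=\; \frac{E_{\text{facility}}}{E_{\text{IT}}}
\qquad\text{e.g.}\quad
\frac{2\,\text{MW drawn by the facility}}{1\,\text{MW delivered to IT}} \;=\; 2.0
```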

A theoretically perfect data centre would therefore have a PUE of 1 - all the energy is used in powering IT workloads.  However, in practice, the PUE for an "average" facility is around 2.0 - for each watt of power used for IT workloads, another watt goes on peripheral items.

So - only 50% of the facility's total energy is reaching the servers, storage and networking equipment.  Running at 20% IT equipment utilisation means that, at a rough estimate, around 90% of a facility's total energy input is essentially going to waste.  Upping IT equipment utilisation rates to 40% and getting rid of the excess equipment could mean a saving of 10% of a data centre's energy usage - which is wonderful - but still only means that around 20% of a data centre's energy is being used for useful IT work.
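
A rough worked sum, using the approximate figures above:

```latex
\underbrace{0.5}_{\text{share of energy reaching IT kit}}
\;\times\;
\underbrace{0.2}_{\text{IT equipment utilisation}}
\;=\; 0.1
\quad\Rightarrow\quad
\text{roughly 10\% of the facility's energy does useful work; roughly 90\% is wasted}
```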

However, the majority of data centres route pretty much all of the energy used across the facility through UPSs.  Unfortunately, many of these devices are quite old and will be running at 94% efficiency or less, whereas modern UPSs run at 98% efficiency or greater.  But is a 4% improvement in energy efficiency at the UPS worth the bother when a 10% improvement at the server and storage layers is possible?

Back to the maths.  If all the facility's energy goes through the UPS, then a 4% efficiency improvement there applies across all systems (servers, storage, networking, cooling, lighting) and delivers a 4% saving on the energy bill - without anything having been changed but the UPS.  Now introduce the virtualisation mentioned above.  Server utilisation rates are upped from 20% to 40% as before, saving 10% of the data centre's energy bill.  But because the overall efficiency of the facility has also been improved, the combined saving is greater than either measure alone.  Every time the equipment in the data centre - IT or support - is improved, that extra UPS efficiency is gained on top.
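
As a rough back-of-the-envelope sketch of how the two savings compound - using the approximate figures above and a purely illustrative 500kW IT load:

```python
# Rough, illustrative arithmetic only - the 500kW IT load is an assumed,
# purely notional figure; the other numbers are the approximate ones used
# above (PUE of 2.0, UPS efficiency moving from 94% to 98%, and a 10%
# saving from virtualisation).

it_load_kw = 500.0            # assumed useful IT load (servers, storage, network)
pue = 2.0                     # "average" facility: total energy is twice the IT energy
old_ups_eff = 0.94            # older UPS efficiency
new_ups_eff = 0.98            # modern UPS efficiency
virtualisation_saving = 0.10  # saving from upping server utilisation from 20% to 40%

# Baseline: everything the facility draws passes through the old UPS
baseline_kw = (it_load_kw * pue) / old_ups_eff

# Step 1: swap the UPS - roughly a 4% cut without touching anything else
after_ups_kw = (it_load_kw * pue) / new_ups_eff

# Step 2: virtualise as well - the 10% saving now compounds with the UPS gain
after_both_kw = after_ups_kw * (1 - virtualisation_saving)

print(f"UPS swap alone saves         {1 - after_ups_kw / baseline_kw:.1%}")
print(f"UPS swap plus virtualisation {1 - after_both_kw / baseline_kw:.1%}")
```

On these figures the combined saving comes out at nearly 14% of the baseline energy bill, rather than the 10% that virtualisation alone would deliver - and any future equipment improvement benefits from the better UPS in the same way.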

Modern UPSs also provide a host of other capabilities - as battery technology and battery management systems have improved, a well-implemented UPS can help in bridging some breaks in energy provision without the need for auxiliary generators to switch in.  They can also better deal with low voltage situations ("brown outs"), ensuring that an optimised energy feed gets to all equipment.

Should Ofgem be right, there will be planned brown outs and power cuts around the country within a few years.  Organisations can help in many ways - making their data centres more energy efficient could push this back by a few months.  However, ensuring that their data centre facilities have newer, more efficient UPSs in place can help not only in providing a far more energy-efficient facility, but also in dealing with the problems that an energy deficit could present.

Quocirca has written a report on the subject, which can be downloaded for free here: http://quocirca.com/reports/773/powering-the-data-centre


CA versus Symantec


Two back-to-back events recently saw Quocirca talking to veterans of the software industry: CA and Symantec. The high-level message from both is pretty much the same: we help to secure and manage your data and IT infrastructure. Yet it is rare to find these two head-to-head, because in reality they are more different than they are alike.

True, they are both US-headquartered (more or less) pure software companies with annual revenues of a similar order (CA circa $5B, Symantec circa $7B) and both with profits of around $1B. Their current share prices and market caps are similar, and their stock market histories have followed similar ups and downs over the last decade. Both are now 30-something: CA was founded in 1976 and Symantec in 1982. Symantec's higher revenue is reflected in its headcount - 20K employees as opposed to CA's 14K - but that gives them remarkably similar productivity of about $350K per head.

Furthermore, both sit on similar piles of cash of about $13B. This ability to accumulate cash has been key to the way each has grown, through aggressive acquisition; both have acquired tens of companies over the years, in Symantec's case almost doubling its size when it merged with Veritas in 2004 to move into the storage market.

So, for two companies that appear so similar, what are the differences that allow them to operate side by side in the IT industry without too many dogfights? The most obvious is their legacy: CA comes from a background of providing software for mainframes (the ultimate in enterprise computing), whilst Symantec's origin lies in its consumer-focussed Norton anti-virus technology (probably still a more recognised brand than Symantec itself). The main target market shared by both vendors is supplying software for mid-market and enterprise businesses to manage and secure Windows- and Linux-based systems.

Even here, whilst they may still sound similar, their products have historically not overlapped much. When it comes to management, Symantec's main focus is endpoints (via its 2007 Altiris acquisition) and storage, whilst CA is listed as one of the big four systems management companies (along with BMC, IBM and HP - or five if you include Microsoft), focussed on broad management of enterprise IT (in CA's case including those mainframes).

In security, the overlap has historically also been limited. Many still think of Symantec as primarily a security company, but over the years its acquisitions have taken it beyond its roots in anti-virus to include email security, web security, data loss prevention (DLP) and so on. Few think of CA in the first instance as a security company, but it has always operated in this space too, more focussed on identity and access management (IAM), despite also having its own anti-virus.

However, that is changing - CA has been acquiring more and more security assets, for example moving into DLP in 2009 when it acquired Orchestria. And Symantec is now moving into IAM with its O3 platform, which includes single sign-on (SSO) via a partnership with Symplified, secure web access and compliance enforcement/reporting. Whilst Symantec remains by far the bigger of the two in IT security, it can expect to see more and more of CA going forward.

Both vendors are keen to be seen as innovators (or as keeping up, depending on your viewpoint) with the key IT trends: cloud, mobile, social media, big data etc. However, this week they were both as keen to talk about people as about products and solutions. Symantec has recently replaced its CEO of the last three years, Enrique Salem (whose blood was said to flow yellow, the vendor's corporate colour), with Steve Bennett, who joined the board from Intuit in 2010. In a session on strategy, Symantec had little to say except that the new CEO's pronouncements could be expected in January 2013. John Brigden, Symantec's head of Europe, Middle East and Africa (EMEA) for the last seven years, will be keen to see what that means for his organisation.

CA has already shaken up its EMEA operations, bringing in a new head, Marco Comastri, just over a year ago from Poste Italiane (he has also worked at IBM and Microsoft). Comastri is bringing in new faces and trying to get CA EMEA more focussed on solution selling than on technology.

Whether at the global or the European level, these two software juggernauts have a momentum all of their own, and management may find it frustrating to try to change direction. They should not try too hard: both have huge legacy customer bases and healthy finances, and shareholders will not be happy to see either compromised.
