The CA World conference and exhibition is staged this week in Las Vegas, in a market currently populated by a number of large, enterprise-level, software-centric data management firms, most of them consolidated via corporate acquisition (and, OK yes, some organic growth).
Opening keynotes on the Sunday night of the show featured CEO Mike Gregoire as well as Peter Griffiths, exec VP and group exec in the enterprise solutions and technology group.
The CA VP team suggests that the role of the CIO is changing and getting closer to the so-called “dynamic nature” of real business.
The focus now for CA is the management of applications, infrastructure and security in the face of what is a massive explosion in data quantities…
Cisco predicts that we will use as many as 50 billion online connected devices by 2020 – and that is leading us to produce as much as 7.9 zettabytes of data globally by 2015.

This will mean that between us we will produce 35 zettabytes by 2020… and if you find 1 zettabyte hard to picture, then consider that…

1 zettabyte = 250 billion DVDs
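The DVD comparison is easy to sanity-check. A minimal back-of-envelope sketch (assuming the decimal SI definition, 1 zettabyte = 10^21 bytes, and a single-layer 4.7 GB DVD; the round "250 billion" figure corresponds to assuming roughly 4 GB per disc):

```python
# Back-of-envelope check of the zettabyte-to-DVD comparison above.
# Assumptions: 1 zettabyte = 10**21 bytes (SI), single-layer DVD = 4.7 GB.
ZETTABYTE = 10**21
DVD_BYTES = 4.7 * 10**9

dvds_per_zettabyte = ZETTABYTE / DVD_BYTES
print(f"{dvds_per_zettabyte / 1e9:.0f} billion DVDs per zettabyte")
# → 213 billion DVDs per zettabyte (≈ 250 billion at 4 GB per disc)
```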
Freeform Dynamics technology analyst Tony Lock held a pre-keynote press session at this CA World to explain that he was not interested in predicting major changes, but more interested in seeing what firms are really doing inside their datacentres.
The datacentre evolution happening right now is typified by six major factors, says Lock, reflecting on the CA roadmap:
Virtualisation – as a broad brush trend
Management – driven by increasing levels of accountability
Automation – tools can do much of the routine processing for us
Service level monitoring – how can we monitor end-to-end from the datacentre to see what customers actually get, and whether they get enough… or perhaps not enough
Business reporting – explaining things to the board is not simple, getting reporting right is not simple
Service delivery management – are you managing your IT infrastructure? Or are you managing direct services? The latter allows IT to be managed, and perceived, as more valuable to the wider business as a whole.
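The service level monitoring and service delivery points above amount to measuring what the customer experiences against an agreed target, rather than watching individual boxes. A minimal sketch of that idea (all names and thresholds here are hypothetical illustrations, not CA product APIs):

```python
# Hypothetical sketch: check observed end-to-end measurements
# (what customers actually get) against an agreed service level.
from dataclasses import dataclass

@dataclass
class ServiceLevel:
    name: str
    max_response_ms: float   # agreed ceiling for average response time
    min_success_rate: float  # agreed floor for request success rate

def check_service_level(sla: ServiceLevel,
                        response_times_ms: list[float],
                        successes: list[bool]) -> bool:
    """Return True if the observed measurements meet the service level."""
    avg_response = sum(response_times_ms) / len(response_times_ms)
    success_rate = sum(successes) / len(successes)
    return (avg_response <= sla.max_response_ms
            and success_rate >= sla.min_success_rate)

sla = ServiceLevel("checkout", max_response_ms=300.0, min_success_rate=0.99)
print(check_service_level(sla, [120.0, 250.0, 180.0], [True, True, True]))
# → True
```

The design point is the list's own: the SLA is expressed in customer-facing terms (response time, success rate), not server-facing ones (CPU, disk).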
“The cloud has a problem, very small companies do not understand what third party services can do for them (as they come in the shape of cloud) and if they do to a degree, then they fail to understand the mechanics and process behind how these new services should be INTEGRATED into the company,” said CA’s Griffiths.
Big data news really making the headlines now tends to be “research projects”, in the main, he says.
But the crucial point now is that you have to know WHAT QUESTIONS big data can answer before you start to implement any level of analysis.
Big data needs to be “productionalised”
Big data needs to be “productionalised” so that it can be engineered into the business by means of management tools and automation so that value can be extracted from it on an ongoing basis.
We need “excellent integration” with the way things are already, says CA.
So where IT goes next really comes down to management… getting hold of (and developing) the best practices to be able to get to the point where these technologies can now be productively implemented is our major challenge.
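One way to picture the "productionalised" big data argument above: the one-off exploratory query becomes a repeatable, logged job that management and automation tooling can schedule and monitor, answering a known question on an ongoing basis. A hedged sketch, with purely illustrative function and field names:

```python
# Illustrative only: an ad-hoc analysis turned into a repeatable,
# logged unit of work that automation tooling could schedule.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("analysis-job")

def run_analysis(records: list[dict]) -> dict:
    """Answer a *known question* -- here, revenue per region --
    rather than exploring the data freely."""
    totals: dict[str, float] = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["revenue"]
    log.info("analysed %d records across %d regions", len(records), len(totals))
    return totals

result = run_analysis([
    {"region": "EMEA", "revenue": 100.0},
    {"region": "APAC", "revenue": 40.0},
    {"region": "EMEA", "revenue": 60.0},
])
print(result)  # → {'EMEA': 160.0, 'APAC': 40.0}
```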
So how is CA working to differentiate itself?
“Innovation is extremely important to CA Technologies. We have approximately 5,900 engineers globally who design and support our software. Our engineering organisation focuses on continuous innovation and making sure that customer insight and market intelligence play a key role in our development activities. We spend over $600 million a year on R&D,” says the firm.
NOTE: One of CA Technologies’ largest research and development sites for mainframe technologies is in Prague — the centre has 300 developers and the site also works with three local universities to help promote mainframe knowledge and skills.
This somewhat fragmented summary of first thoughts relating to this year’s conference sets the scene for some interesting discussion, and may also throw up a few questions.
“Analysis is set to become THE big data application of choice,” suggests CA’s Gregoire…
… but surely it already is!
i.e. big data without analysis is just white noise in the form of unstructured or semi-structured data.
Is CA taking us back to basics on this, the first day of its annual convention, or has it taken too simplistic an approach? One suspects it is only the former, but interesting stuff is to come for sure.