When I started my IT career in the early 1980s the deployment of applications was a relatively simple affair: everything lived on the mainframe in the company’s data centre. If you needed to update an application then you knew exactly which environment it was going to run on: a green-screen dumb terminal, with complete certainty about which version of the operating system, database, TP (transaction processing) monitor and compilers you would need to cater for and test. Those were the days.
Today the picture is very different for those developing or implementing applications, whether they are internal IT staff, consultants or software vendors. The spread of desktop computing and distributed systems means that you need to take into account a wide range of potential environments in which your application will run, and in which it therefore needs to be tested. The problem is worse for a software vendor, which has to deal with multiple customer environments, but even within a single enterprise there is likely to be a range of server operating systems, web server software, desktops and now even mobile devices to cope with.
Every time you add another layer of complexity, the task of verifying that the software still works becomes greater: what worked just fine under a certain version of Linux in combination with Apache and Windows 7 may not work in the same way when deployed in an environment using WebLogic and a subtly different release of the operating system.
The complexity of the operating environment in a large enterprise has driven up the cost of deploying software. Seventy per cent of the enterprise software budget now typically goes on maintenance, not on delivering new business applications, and a sizeable chunk of that is spent testing and fixing problems caused by software not working as expected in the spaghetti of operational environments that is the reality in most enterprises.
To add to this issue, IT operations managers have to struggle with forecasting demand for processing capacity, memory and disk in a world where data volumes are booming and where there may be unpredictable spikes in demand. Many companies have peak periods, such as Christmas time in retail, and have to allow for enough capacity to cope with such spikes in demand.
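The cost of provisioning for peaks like this can be made concrete with a little arithmetic. The sketch below uses entirely hypothetical monthly demand figures for a retailer with a Christmas spike; the point is only that capacity bought for the peak month sits largely idle the rest of the year.

```python
# Illustrative sketch: provisioning for peak demand leaves capacity idle
# for most of the year. The monthly demand figures are hypothetical.

monthly_demand = [40, 38, 42, 45, 44, 41, 39, 43, 50, 60, 75, 100]  # servers needed per month

peak = max(monthly_demand)                       # capacity you must own to survive the spike
average = sum(monthly_demand) / len(monthly_demand)
utilisation = average / peak                     # how much of the peak capacity is used on average

print(f"Peak capacity needed: {peak} servers")
print(f"Average demand: {average:.1f} servers")
print(f"Average utilisation of peak capacity: {utilisation:.0%}")
```

With these made-up figures, roughly half of the peak capacity is unused in an average month, which is exactly the waste that elastic cloud provisioning sets out to avoid.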
Enter the cloud
The rise of “cloud” computing promises a way off the treadmill of constant hardware upgrades. Pioneered by companies like Salesforce.com, more and more enterprise software is becoming available via a web browser, with the software vendor assuming the responsibility of dealing with peaks and troughs in demand, and with ensuring that the software actually works in the chosen technical environment.
Connected with this has been a change in the pricing model for such software. Instead of buying the software up-front the software is typically rented, usually based on the usage made of it, whether in terms of number of users, sessions or the amount of data accessed. For an enterprise this pricing model change has many benefits: you no longer have to fork out a large sum of money upfront, only to discover that the project using it gets cancelled or the software does not work as advertised. There are no more issues with capacity upgrades, as that is the vendor’s problem.
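The difference between the two pricing models is easy to sketch as a break-even calculation. All the figures below are hypothetical, chosen purely to illustrate the shape of the comparison: an up-front licence with annual maintenance versus a per-user monthly subscription.

```python
# Illustrative break-even sketch: up-front licence plus annual maintenance
# versus per-user subscription pricing. All figures are hypothetical.

def on_premise_cost(years, licence=100_000, maintenance_rate=0.20):
    """Up-front licence fee plus annual maintenance (a percentage of the licence)."""
    return licence + licence * maintenance_rate * years

def subscription_cost(years, users=50, per_user_per_month=60):
    """Pay-as-you-go: a monthly fee per user, with no up-front payment."""
    return users * per_user_per_month * 12 * years

for year in range(1, 8):
    print(f"Year {year}: on-premise {on_premise_cost(year):,.0f} "
          f"vs subscription {subscription_cost(year):,.0f}")
```

With these assumed numbers the subscription remains cheaper for about six years, and in any case avoids the risk the article describes: a large sum committed before you know whether the project will survive or the software will work as advertised.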
Smaller companies have adopted cloud deployments more enthusiastically than large ones. There are, naturally enough, downsides and risks to cloud deployment to offset its benefits. People worry about the security of their data, and about just how reliable that third-party cloud infrastructure really is. There have been well-publicised glitches, such as the Amazon Web Services outage in June 2012, when storms in the US Midwest caused power failures affecting several data centres and brought down several well-known mobile and web applications that ran on the Amazon cloud.
Whilst there will doubtless continue to be issues like this, companies should consider just how secure their own internal data really is, and how reliable and carefully backed up their own data centres are in the case of a major problem – are you really any better at running a data centre than Amazon and Google? Security of data is a complex issue, and it is at least as likely that important data may be compromised by an executive having their laptop stolen as it is by a security issue in a managed cloud environment. Moreover, albeit at extra cost, it is possible to have a “private cloud” whereby the data centre is run by a third party, but a set of infrastructure is dedicated to a particular customer.
Large enterprises still seem nervous about adopting cloud technologies wholesale, but there are now a few examples of data quality and master data management technologies becoming available in a cloud form, although the vast majority of vendor revenues in these areas are still from the traditional on-premise model.
The success of Salesforce has shown that many companies are willing to put sensitive data (what is more sensitive than their sales pipeline?) into a cloud environment, and the world has not ended yet as a result.
After initial denials that the cloud was relevant, Oracle showed that cloud computing was not mere water vapour by purchasing RightNow in late 2011, with SAP similarly buying a cloud company called SuccessFactors (for a little matter of $3.4 billion) shortly afterwards.
Even that most conservative of institutions, government, is getting in on the act. The UK government plans to shift half of new government spending to cloud computing services by 2015, with its own government cloud-based application store launched in February 2012. Even if it falls short of this ambitious target, it shows that cloud computing has clearly moved beyond early adopters and small businesses.
The combination of shifting infrastructure problems to someone else and the nearly ubiquitous rental model (rather than up-front payment plus maintenance) for cloud software is compelling. Every company needs to do its own cost/benefit analysis and risk assessment, but as the market matures it is likely that more and more organisations will steadily move in this direction.
Andy Hayler is co-founder and CEO of The Information Difference and a keynote speaker at conferences on master data management, data governance and data quality. He is also a restaurant critic and author (www.andyhayler.com).