The trouble with the term cloud computing is that it encompasses such a huge range of technology offerings: software-as-a-service (SaaS), storage on demand and remote server capacity, to name a few. One thing all cloud services have in common, however, is that they deploy or relocate potentially sensitive corporate data beyond the firewall - and that causes serious consternation among information security professionals.
In a recent blog, Amrit Williams, former Gartner analyst and now CTO of security management company BigFix, outlined the inherent dangers: "When we allow services to be delivered by a third party, we lose all control over how they secure and maintain the health of their environments - and you simply can't enforce what you can't control," he writes. "The 'experts' will tell you otherwise, convince you that their model is 100 per cent secure and that you have nothing to fear. Then again, those experts don't lose their jobs if you fail."
Business and IT executives in UK organisations seem to concur with Williams, according to the results of a recent survey conducted by IT consultancy Avanade. It found that, by a ratio of five to one, respondents trust existing internal systems over cloud-based systems, citing fears about security threats and loss of control over data. It is an issue that the pan-industry security thinktank the Jericho Forum has placed high on its agenda for 2009. Paul Simmonds, a Jericho Forum board member, sees this as a natural evolution of the group's work on deperimeterisation and collaborative open architectures, which also focuses on computing that takes place outside the protected boundaries of the corporate infrastructure.
The goal, he says, is to come up with a framework that enables companies to determine how cloud technologies can be used securely. "We don't argue for a minute that there are good business drivers for using cloud services, especially in a downturn, where reduced cost and faster time-to-market are so important," he says. "What we are challenging is the notion that the provider will handle security to your satisfaction as a matter of course. You simply haven't got that guarantee."
These efforts, his Jericho Forum colleague Andrew Yeomans hopes, should lead executives to consider questions overlooked in the headlong rush to adopt cloud computing: when you repatriate data from a cloud provider, taking it back into your own internal systems, how can you be sure that no trace of it remains on the provider's systems? What leaks might exist from the cloud service back into your own infrastructure? Does the provider adhere to the same physical, logical and personnel controls that apply to your own internal systems? And what happens if the provider goes bust?
The results of their work, due to be unveiled in a paper this month, tackle the high-level security aspects of cloud computing. The model takes the form of a three-dimensional cube that maps out in graphic form the key decisions companies will have to make when deciding which tasks and data can be handled in the cloud, which should be confined to internal systems, and how to tie data residing in both together safely and securely. It takes into account the huge variety of forms that cloud computing services can take: open or proprietary; perimeterised or deperimeterised; internal or external.
"People need to be aware that the cloud isn't just one thing," explains Simmonds. "You can have internal, proprietary, perimeterised clouds, for example, or external, open and deperimeterised clouds. The trick will be in deciding which model fits the risk profile of your organisation, depending on the task at hand." In the long term, he says, it may be necessary to tag data with metadata describing where it can and can't reside within the wider cloud model.
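Simmonds' two ideas - a cube of three binary dimensions defining eight cloud forms, and data tagged with metadata describing where it may reside - could be sketched roughly as follows. This is a hypothetical illustration only; all class and field names are invented for this sketch and do not come from the Jericho Forum's paper.

```python
from dataclasses import dataclass
from enum import Enum

# The three binary axes of the cloud cube, yielding eight possible
# cloud "forms" (illustrative names, not the Forum's own terminology).
class Location(Enum):
    INTERNAL = "internal"
    EXTERNAL = "external"

class Ownership(Enum):
    PROPRIETARY = "proprietary"
    OPEN = "open"

class Boundary(Enum):
    PERIMETERISED = "perimeterised"
    DEPERIMETERISED = "deperimeterised"

@dataclass(frozen=True)
class CloudForm:
    location: Location
    ownership: Ownership
    boundary: Boundary

@dataclass
class TaggedData:
    payload: bytes
    allowed_forms: set  # the metadata: cloud forms this data may occupy

    def may_reside_in(self, form: CloudForm) -> bool:
        return form in self.allowed_forms

# Example: payroll records restricted to an internal, proprietary,
# perimeterised cloud; a public cloud sits at the opposite corner.
internal_only = CloudForm(Location.INTERNAL, Ownership.PROPRIETARY,
                          Boundary.PERIMETERISED)
public_cloud = CloudForm(Location.EXTERNAL, Ownership.OPEN,
                         Boundary.DEPERIMETERISED)

payroll = TaggedData(payload=b"salary records", allowed_forms={internal_only})
```

In a sketch like this, a policy engine would consult `may_reside_in` before moving or replicating data, so the risk decision Simmonds describes is made per item of data rather than per system.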
By the time Infosecurity Europe 2009 rolls around in late April, the Jericho Forum hopes to have drafted a self-assessment methodology to enable companies to ascertain if they are fit and ready for cloud computing, based on its previously published "11 Commandments" for deperimeterisation projects. "We hope this will give organisations a handle on the kinds of nasty questions they need to be asking of providers if they are to proceed with cloud computing in a secure fashion," says Simmonds.
Cloud's Illusions: Jericho Forum Future Direction (a paper on cloud computing given by Jericho Forum board member Stephen Whitlock, December 2008)