If you outsource your virtualisation, thoroughly check your provider’s security


How secure is the current practice in virtualisation?

In seeking to provide a detailed response for the above question, views have been sought from the wide community of experts that make up the BCS Security Forum Strategic Panel (SFSP) as well as the BCS membership itself, writes Andrea Simmons, consultant forum manager at the BCS Security Forum.

An obvious search on Wikipedia for a definition of virtualisation, or anything related, comes up with some pointers, but no results directly address the issue of security. This is not surprising given how, sadly, security can still be treated as an afterthought, in spite of the huge focus it has received in the past couple of years.

As ever, in an industry overloaded with acronyms and conflicting terminology, many feel the term cloud computing is preferable to virtualisation - where cloud means the internet or an intranet, and computing means computer technology.

Virtualisation is the process organisations go through to consolidate their numerous physical servers into fewer physical servers using VMware, Virtual Server, XenServer, etc. A similar concept is used to consolidate multiple SANs, firewalls, load balancers, etc, into fewer physical components, using virtualisation to deliver these services. In many cases, virtualisation is done internally within an organisation, to reduce and consolidate its internal IT infrastructure.

The astute IT professional will apply similar concerns to any other information storage system. Several different layers of virtualisation can be looked at, ensuring adequate controls are in each layer:

Network - virtualisation through the network fabric, with the use of virtual firewalls and VLANs (this is common)

Server and application layer using products such as VMware

SAN storage, which has been around for some time but is often overlooked.

Bringing these together in the cloud is, from a security perspective, still an immature area of development. But this will need to be embraced to meet business demands and environmental restrictions, as long as traditional controls are still built in and enforced in virtualised environments.

Many servers are not yet ready to go virtual because their processors do not support the technology, particularly in the case of VMware, which leads the market. A more worrying security issue, however, arises when a hacker gains control of your hypervisor.

Full virtualisation is preferred to para-virtualisation because para-virtualisation requires the OS kernel to be modified, and as we all know, Windows does not accept any modification of its kernel, whereas Unix/Linux does. Studies have found that the hypervisor can be hacked, and when this happens the hacker gains control of all your virtual machines. Although other security controls could be implemented to limit any leak, the real challenge is securing the hypervisor itself.

In terms of implementing cloud data storage, a key element that often gets overlooked is data control: ensuring that contracts with providers have addressed these kinds of issues. This matters particularly if the point of virtualisation is that you, as the data controller, have less kit to manage and have outsourced much of the provision to the cloud.

So if your data is "out there", how is it being looked after, protected, handled, stored, shared... you name it - do you really know what is going on with it? If not, you should get answers to these questions, and you certainly should not leave virtualisation projects solely to IT folk or solely to procurement folk. There has to be some engagement with your resident security specialist - or with an externally engaged consultant - to ensure that all legislative and regulatory requirements have been met in terms of protecting any data that is subject to virtualisation.

Under the Data Protection Act, all data controllers should ensure the data is protected and only used for the purposes for which it was provided. If they don't, they are laying their organisation open to prosecution. For example, one control may be to only allow access to information in the cloud to authorised personnel via a secure SSH access link.
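As an illustration of that SSH control, a minimal sketch of a hardened sshd_config fragment on a hypothetical cloud storage gateway might look like the following - the group name and setting choices are assumptions for illustration, not prescriptions:

```
# Hypothetical /etc/ssh/sshd_config fragment on a cloud storage gateway.
# The group name "cloud-data-admins" is illustrative.
PasswordAuthentication no      # public-key authentication only
PermitRootLogin no             # no direct root access
AllowGroups cloud-data-admins  # restrict access to authorised personnel
```

Combined with key management tied to your joiners/leavers process, this keeps cloud-hosted data reachable only by named, authorised staff.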

When virtualisation is provided as a service, the users often have full privileges over a networked virtual machine. They are on your network with a machine that can be fully compromised. You wouldn't think twice about carefully ensuring that there are access policies and firewall rules in place to make sure two dedicated machines are protected. Why should there be an omission when it comes to two virtual machines on a virtual network? If you outsource your virtualisation to the cloud, can you be sure the virtual hard disc file is stored in a secure place - ie. on a NAS treated as a real PC - with full password protection, firewall rules, updated software, no weak accounts etc?
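By way of illustration, treating two virtual machines exactly like two dedicated machines could mean a default-deny firewall ruleset on each guest. A sketch in iptables-restore format follows, where the peer VM address and application port are assumed values:

```
# Hypothetical iptables-restore fragment for one virtual machine.
# The peer VM address (10.0.0.12) and port (443) are illustrative.
*filter
# Default-deny inbound, exactly as for a dedicated machine
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# Permit only the application port, and only from the peer virtual machine
-A INPUT -p tcp -s 10.0.0.12 --dport 443 -j ACCEPT
COMMIT
```

The point is that the virtual network segment deserves the same explicit access policy a physical one would get.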

The main problem with virtual machines is that they are hard to "see" as dedicated machines and it becomes a case of out of sight, out of mind. You cannot easily split them mentally into logical units and work on isolating the risks to each one.

It's also worth bearing in mind the consequences for your organisation if you are an FSA-regulated entity storing data in an outsourcing arrangement, or if you are storing payment card data. While using virtual infrastructures may consolidate certain aspects of your IT estate, it could increase the scope of your compliance.

On the positive side, it could be used to both consolidate IT and decrease the scope of compliance, if planned for correctly. Get the right people in the room to sign off the design - IT, information security, internal audit, and your QSA - if required to be assessed for the PCI DSS.

If one looks at the various audit standards such as ISO 9000 (quality), ISO27001 (security) and BIP 0008 (evidential weight of electronic information), these are all focused on the processes of the organisation, and require a physical audit of the organisation. Indeed, security gets into questions of the grade of lock on the computer room door and vetting of staff.

Where the contract also provides persistent storage - for example, of e-mails to comply with Sarbanes-Oxley - one should also look at the checklist for repositories. Here is a list of questions that should be considered:

1. What service and support do they provide?

2. What are the service level agreement terms?

3. What guarantees are in place in the event of a problem - whether with data or hardware - and to ensure that information remains the property of the service user if the provider's business fails?

4. Do they provide online access to service data and metrics, such as dashboards? How flexible is this, and what do they monitor (ie. anything important to you)?

5. Does the service conform to the requirements of ISO27001/2?

6. What is the security expertise and reputation of the service provider?

7. At what maturity levels are the service provider's controls (ie. not just box ticking)?

8. What about their business continuity plans?

As is often the case, the challenges faced are not about technology, but about business and law - and not the aspects of law that procurement ordinarily deals with. So this is likely to be a technology that gains more visibility among senior management, given that it is often sold on the basis of the cost savings possible now and of improved ROI on legacy software. The security professional needs to ensure that their voice of reason and sanity is heard above the maelstrom!

Read more expert advice from the Computer Weekly Security Think Tank >>
