
Modern IT underlines need for zero-trust security

The increasing complexity of supply chains and the interconnectivity of IT systems mean the attack surface is widening and security has to evolve accordingly, warns a British computer scientist

Modern IT environments with high levels of interconnectivity, little segmentation and increased use of third-party services are raising the risk of data breaches, according to the head of virtualisation-based security firm Bromium.

“Organisations need to consider adopting a zero-trust approach, applying security right down to the application level,” Ian Pratt, co-founder and president of Bromium, told Computer Weekly.

Bromium’s core micro-virtualisation technology, which was developed to enable users to open any executable file, document or web page without fear, has evolved to enable organisations to wrap critical applications in hardware-enforced virtual cages, so that even if the network, device or third-party partner is compromised, high-value assets remain protected.

However, Pratt emphasises that the first step in applying a zero-trust approach to security should not be about acquiring any new technology.

“The zero-trust model is about following good, time-honoured security practices that every organisation should be encouraged to follow to reduce the attack surface by identifying what things companies care most about and treating them differently.

“The zero-trust model is not enabled by any particular technology or applications,” he said. “It is about moving away from the traditional model of trying to keep the bad guys out at the perimeter, while having lax controls internally.”

Virtual private network

Without investing in new technology, Pratt said, organisations can do things like putting the corporate network outside the firewall and forcing all users to connect through a virtual private network (VPN), getting away from the traditional setup in which, once things get onto the corporate network, they can move freely.

“So many of the problems that we have today stem from the fact we do not have appropriate segmentation or isolation, which dates back 30 years to the days when multiple users logging into the same machine was the big new thing in computing.

“But now we are in this world where users are running all sorts of applications on their machines,” said Pratt. “Those applications have been downloaded from a multitude of different sources across the internet and even browsing to a website is resulting in code from that website running on your machine.”

The problem is that, from the operating system’s point of view, all of that just gets lumped together as running as the user, so there is no fine-grained segmentation of permissions.

“Even at the machine level, as soon as you get system privileges on the machine, you have full control over that machine and can do anything it can do on the network. The networking protocols we have also date back 30 years, and there is nothing to identify the particular application or the authority associated with that application.”

Privilege escalation

As a result, if an attacker succeeds in exploiting a bug in an application, Pratt said that instead of being limited by the permissions associated with that application, it ends up running as the whole user. “Then with privilege escalation it ends up with the power of the whole machine on the network.

“That’s what makes breaches so much worse: attackers have that ability to move laterally and access all of the data on the machine and possibly all the data a legitimate user can access on the network.”

Trying to fix all the bugs in the applications to stop these compromises of the applications from occurring or the escalation of privileges on the operating system is “basically futile”, said Pratt.

“So you have got to look at other ways of doing the containerisation. On the endpoint, that might be virtualisation. From a networking point of view, you want to take micro-segmentation approaches: identifying individual applications and controlling the authority of the network flows associated with each application, to limit the impact of a compromise of that application.”
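The flow-control idea Pratt describes can be sketched in a few lines: each application identity gets an explicit allowlist of the network flows it needs, and everything else is denied by default. The application names, hosts and ports below are hypothetical examples, not Bromium's implementation.

```python
# Default-deny micro-segmentation sketch: an application identity may only
# use flows explicitly allowlisted for it. All identifiers are invented.

ALLOWED_FLOWS = {
    # app identity      {(destination host, destination port), ...}
    "payroll-client": {("payroll-db.internal", 5432)},
    "web-browser":    {("proxy.internal", 3128)},
}

def flow_permitted(app, dest_host, dest_port):
    """True only if this application is explicitly allowed this flow."""
    return (dest_host, dest_port) in ALLOWED_FLOWS.get(app, set())

# A compromised browser cannot reach the payroll database, and an
# unknown application gets nothing at all:
assert flow_permitted("payroll-client", "payroll-db.internal", 5432)
assert not flow_permitted("web-browser", "payroll-db.internal", 5432)
assert not flow_permitted("unknown-app", "proxy.internal", 3128)
```

The point of the sketch is the default: absence from the table means denial, so a newly compromised or unrecognised process starts with no network authority rather than the user's full reach.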

At the most basic level, Pratt said zero-trust is about isolating things to reduce the trust associated with each of the things that are being isolated by giving them access to only the resources that are needed.

“The primary focus has been on the network, because that is the one where, using traditional technologies, you can make the most progress on quickly, but ultimately control needs to be extended into the endpoints, servers and clients using containerisation approaches.”


In terms of technology offerings that support a zero-trust approach, Pratt said most suppliers are offering something that sits on the network that is aimed at controlling flows across the network.

“But ultimately it is about wanting to do it in endpoints, including the server and the client systems you are using to access them. It is about identifying individual applications and individual user roles, so you can associate the authority required by that particular application and role, and enforce it right the way through the network to the server you are talking to.

“It is a far more fine-grained authority you are trying to achieve, but that is the real goal of zero-trust architectures.”

However, for any organisation looking to move to a zero-trust approach, Pratt said the first thing to do is to think about what they are actually trying to protect.

“Identify the crown jewels by going through the whole process of working out what is the most important data to protect and whether you are protecting it against theft, misuse or destruction. Once you have identified those assets you care about the most, it is then a case of designing an architecture where you treat those things differently.

“The next step towards zero-trust might be to take the file server that is holding that information off the standard corporate network and putting it on its own isolated network or putting a terminal server or VDI [virtual desktop infrastructure] session in front of it so that the application accessing that data is running on that terminal server and users are accessing it remotely over an ICA [independent computing architecture] or RDP [remote desktop protocol] session.”

Two-factor authentication

As a follow-up to that, organisations may want to implement another level of authentication using a two-factor method, because this is the most important data they have.
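One common way to add that extra factor is a time-based one-time password (TOTP). The article does not prescribe a particular mechanism, but as an illustrative sketch, the RFC 6238 algorithm can be implemented with only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 of the current 30-second time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8) == "94287082"
```

An authenticator app and the server derive the same code from a shared secret and the current time window, so the server only needs to compare its own result with the one the user submits.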

“After that, it is just a case of applying these kinds of principles repeatedly to reduce the amount of stuff that you have to trust to be working to achieve a given security aim.

“You just keep applying that iteratively, and eventually you will get to the point where you can start treating the corporate network as though it were public Wi-Fi, where you require the machines to individually authenticate themselves and for the traffic to go over a VPN between strongly identified laptops using hardware elements such as a trusted platform module (TPM).”
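The end state Pratt describes, where network location confers no trust at all, amounts to a default-deny admission check on every connection. A toy sketch, in which the device registry and field names are invented for illustration:

```python
# Treating the corporate network like public Wi-Fi: admission depends on
# an enrolled device identity (in practice, one bound to a TPM-held key),
# an authenticated user and a VPN tunnel, never on network location.

from dataclasses import dataclass

ENROLLED_DEVICES = {"laptop-7f3a"}   # hypothetical enrolment registry

@dataclass
class Request:
    device_id: str
    user_authenticated: bool
    over_vpn: bool

def admit(req):
    """Default deny: every condition must hold; being on the LAN counts for nothing."""
    return (req.device_id in ENROLLED_DEVICES
            and req.user_authenticated
            and req.over_vpn)

assert admit(Request("laptop-7f3a", True, True))
assert not admit(Request("unknown", True, True))      # unenrolled device
assert not admit(Request("laptop-7f3a", True, False)) # not over the VPN
```

Each iteration Pratt describes effectively tightens one of these conditions, until nothing is admitted on the strength of its network position alone.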

While a zero-trust approach has obvious security benefits, the main reason most organisations have still to move to that model or have not done so sooner, said Pratt, is that it is easier and more convenient to put everything on the same network.

“Whenever you try to move to a more fine-grained permissions model, you always run into things that break because you had not realised you needed to associate a given permission with a given task, but you have to be prepared to work through that.

“If you are moving to a more fine-grained model, there is always going to be some work associated with it. You have to understand what the flows of information are and what the process workflow is within the company, for example.”

Protecting the most important assets

The key thing is to identify what matters most and then work outwards from there, said Pratt. “It should be obvious to the business that the most important assets should be protected.

“You have to do it around what the business need is and the business assets you are protecting, rather than coming in with a particular security policy that you want to implement, because even if it’s a good idea, that is not the way to build a business case to get it done.”

Most organisations are trying to make things better, but security teams are often overwhelmed with the scale of the problems they have and the number of alerts all of the existing tools are generating.

“They often have no idea whether they are false positives or true positives, and no time to investigate all the alerts to find out. Organisations are trying to do the right thing. It’s all about giving them appropriate resources and also finding good security professionals, which is something all organisations are struggling with.”

While a zero-trust approach will not stop all security incidents, Pratt said it means that when they do occur, they will be contained.

“You still have got a problem if somebody clicks on a malicious link in an email and gets their endpoint system owned, but at least the attacker can’t move laterally and get access to the crown jewels, which buys you time to deal with the issue on the compromised endpoint.” 
