
Overcoming the container security conundrum: What enterprises need to know

Container technologies continue to shake up enterprise IT, but security concerns abound

Container technologies have shaken up the IT world over the past half a dozen years, and the pace of change has created plenty of opportunities for IT managers, who are now enjoying the benefits the technology has brought.

Hand in hand with this, however, comes the effect on existing implementations. Companies adopting containers will often have to cope with new ways of working, which can cause a sense of disruption.

Tim Mackey, principal security strategist at Synopsys, says companies have to start by asking what security improvements containers will bring to the organisation, and what other changes are needed, before going full steam ahead.

What these companies must realise is that there will be knock-on effects, and these can have a deleterious impact on the way a business is run. With proper preparation, however, much of this friction can be avoided.

According to a survey from security company Aqua, when companies were asked about their main concerns when deploying containers, security was the most frequently reported challenge, so it is clear that most organisations are fully aware of the issues facing them.

Mackey says there are some particular steps that could be taken, including:

  • Reviewing patch management, application scalability and disaster recovery plans when containerising applications.
  • Understanding data access and logging changes imposed by container deployments.
  • Ensuring that only explicitly required binaries are included in the container image.
  • Limiting the lifespan and resource allocation for a container replica to only what is required.
  • Ensuring that interactive logins of any form are blocked and that container images operate without elevated privileges.
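Several of these steps can be captured at image-build time. The following Dockerfile is a sketch only – the image and binary names are assumptions, not drawn from the article. It uses a minimal distroless base with no shell, copies in only the explicitly required binary, and runs as a non-root user, so interactive logins and elevated privileges are ruled out by construction.

```dockerfile
# Minimal base image: no shell and no package manager, so no interactive login
FROM gcr.io/distroless/static-debian12:nonroot

# Include only the explicitly required binary (hypothetical name)
COPY ./bin/myapp /app/myapp

# Run without elevated privileges ("nonroot" is predefined in distroless images)
USER nonroot

ENTRYPOINT ["/app/myapp"]
```

The lifespan and resource limits from the list above would then be set in the orchestrator – for example, via Kubernetes resource requests and limits – rather than in the image itself.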

Patching it together

Patch management is particularly important for container security, says Mackey. “Logging into a running container in production introduces an attack vector which doesn’t need to exist,” he says. “Installing patch management software into a container image increases the attack surface and complicates application management. Dynamically patching a running container means that the patch management system is at odds with the orchestration system.”

With this in mind, security managers need to ensure processes are followed in the right order to be effective. “The correct solution in this context is to patch the container image and then have the orchestration system destroy the existing replicas, replacing them with patched versions,” says Mackey.
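This patch-then-replace approach can be sketched as a short command sequence, assuming a Kubernetes orchestrator and Docker tooling – the registry, deployment and tag names here are hypothetical:

```shell
# Build and push a patched image; never patch a running container in place
docker build -t registry.example.com/myapp:1.2.4 .
docker push registry.example.com/myapp:1.2.4

# Let the orchestrator destroy the existing replicas and replace them
# with replicas created from the patched image
kubectl set image deployment/myapp myapp=registry.example.com/myapp:1.2.4
kubectl rollout status deployment/myapp
```

The rolling update means no one ever logs into a production container, and the running replicas always match a known, versioned image.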

But poor patch management is not the only issue that container adopters need to be aware of. Gavin Millard, vice-president of intelligence at cyber risk exposure company Tenable, says the biggest container security threat is the bundling of insecure libraries in containerised applications.

“Containerised applications can include hundreds of open source libraries, each of which represents an attack vector and a potential security vulnerability,” he says.

Millard says these vulnerabilities can be made even more harmful because of the work practices of corporate DevOps teams. “Unfortunately, a favoured container version can be used many times, introducing and deploying these vulnerabilities at a massive rate,” he says.

“Ensuring containers are up to date is critical to reducing the deployment of vulnerable libraries, but this has to be verified and validated continuously to ensure no old issues slip through the build process or emerging flaws are identified and remediated.”
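In practice, this continuous verification is usually a gate in the build pipeline. As an illustrative example – the article does not endorse a specific tool – an open source scanner such as Trivy can fail a build when an image bundles libraries with known serious vulnerabilities:

```shell
# Fail the CI job if the image contains HIGH or CRITICAL CVEs
# (the image name is illustrative)
trivy image --exit-code 1 --severity HIGH,CRITICAL registry.example.com/myapp:1.2.4
```

Running this on every build, rather than once at adoption time, is what catches both old issues slipping back in and newly disclosed flaws in libraries that were previously considered clean.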

One of the major changes facing many organisations has, of course, been the growing acceptance of cloud as a vehicle for managing IT workloads. This development has brought many advantages, but when it comes to containerisation, moving into the cloud means additional risks – not so much from the use of cloud itself as from the need for accurate monitoring and rigid governance.


As Millard points out: “Containerisation within cloud environments poses a range of cyber risks. As a result, it is critical that cloud providers implement identity management policies. Ideally, containerised applications and security precautions, such as container image scanning and software composition analysis, should be baked into the build system.”

Mackey endorses this view. “As with all systems, it is entirely possible to deploy an insecure system into production,” he says. “More often than not, such systems are the result of using simplified or default deployments. This is why a thorough review of the security options available is critical.

“Questions like ‘is SELinux or AppArmor enabled?’ or ‘how does my network fabric white list traffic?’ are effectively configuration questions, the answer to which will govern how secure the deployment is.”
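In a Kubernetes network fabric, for instance, the whitelisting question could be answered with a NetworkPolicy along these lines – the namespace, names and labels are hypothetical. Once a policy selects the application pods, all other ingress traffic to them is denied, so only explicitly listed sources get through:

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: myapp-allow-frontend
  namespace: production
spec:
  # The pods this policy protects
  podSelector:
    matchLabels:
      app: myapp
  policyTypes:
    - Ingress
  # Whitelist: only pods labelled app=frontend may reach port 8080
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: frontend
      ports:
        - protocol: TCP
          port: 8080
```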

But Mackey stresses that the ultimate responsibility lies with the users of cloud-based services. “They will always be responsible for the security of their application,” he says. “That means the entire attack surface of the container and its contents, the default configuration of the application, the threat model associated with that attack surface, and the task of patching the container images.

“Cloud providers will manage the orchestration system and will generally have imposed, or at least minimally made available, a set of strict security policies. By transferring the risk of operating the container system to a cloud provider, users can then focus on the security of their applications.”

Another possibility

The environments discussed above are all based on the premise that containers are server-based, distributed from a host computer. But there is another possibility.

Droplet Computing has pioneered the use of containers within an end-user environment so that there is little danger of a host computer being compromised. Droplet CEO Peter Von Oven says: “Docker and Kubernetes are very different from what we do. We are desktop-focused, so could run Droplet containers offline if necessary.”

This means, for example, that companies can call up genuine desktop versions of popular applications, even if they are for a different platform, he says. “So, an Apple user can use Microsoft’s version of Visio – rather than the online one, which is lacking in some of the functionality. In this way, Mac users can have access to exactly the same features that Windows users do.”

But, says Von Oven, the key difference is the level of security that Droplet offers – and how its approach differs from the one taken by the traditional container companies.

“The key to it is isolation,” he says. “The container runs as an app and runs abstracted from the host device’s underlying operating system, in a similar way to how an OS [operating system] is abstracted from the hardware in a virtual machine environment using a hypervisor. This means the container has no dependencies on that underlying OS and talks directly to the hardware.”

There are certainly plenty of options for companies looking to go down the container route and stay secure. The important thing is to recognise the potential vulnerabilities and act on them. Containerisation is not going to go away and it is good to be prepared.
