
Collaborative security approaches underpin container success

Containers are helping organisations to accelerate long-established software development processes, but success is underpinned by constant, team-wide attention to security


In today’s fast-paced world, the way organisations deploy and scale applications is often fundamental to their success. Containerisation has emerged as a popular means to do those things quickly.

This technology makes it easier to move software between different environments – such as the public cloud, a developer’s laptop or a test network – and ensure it runs smoothly during the process. Containers package all the fundamental components of a working application, including tools, libraries, binaries and a base operating system image.

Adoption of containers is growing rapidly, with Gartner predicting that 50% of businesses will be using them by 2020. But this is a new way of working for many organisations, requiring new skills around provisioning, network management and security.

There’s certainly a degree of risk involved, with a recent study from security firm Tripwire finding that nearly half of companies with containers in production have experienced vulnerabilities.

Preparation is key

Making the transition to container infrastructure is far from an overnight job, but it can prove transformative when done right. For data analytics firm Sumo Logic, 70% of its compute estate now runs on containers – processing more than 100 petabytes of data every day.

Chief security officer George Gerchow explains how the company had to change its approach towards security and software development in a bid to get the most out of the technology.

Sumo Logic’s decision to invest in Kubernetes and Docker containerisation technologies reflected new challenges posed by a growing customer base of more than 1,600 organisations. The firm needed an agile approach for developing and maintaining software, as well as handling vast amounts of data.

“We implemented containers as it was a way to manage scaling up quickly. We process a lot of data every day. More than 100 petabytes of data across multiple companies in aggregate, but each of our customers will produce a different amount of data every day,” says Gerchow.

“Each of those companies has to have their data kept separate, for compliance purposes. Deploying containers was a great approach to designing the services and systems that would process this data over time.”

The move to containers has taken years of intense work for the company. Gerchow admits that, like any new technology, containerisation brings a certain level of risk.

In the Tripwire research, CIOs voiced concerns including inadequate container security knowledge among teams, limited visibility into security of containers and images, and the inability to assess risk in container images prior to deployment.

Gerchow believes that such risks can be mitigated through careful preparation. “To start with, you have to get visibility over the systems that are deployed. Second, you have to look at the chain of custody that can exist around these systems over time,” he says.

“This change in ownership and responsibility can easily lead to potential threats, especially when your software development team is acting on their own without any oversight from your IT security department.

“Next thing you know, a large vulnerability gets discovered and you end up backpedalling trying to get the environment fixed and pushing new processes in after the fact. Thinking about this upfront is so much easier, especially if you start scaling up the number of developers or teams involved.”

Battling new risks

Although there is no silver bullet solution for identifying vulnerabilities caused by containerisation, Gerchow says IT leaders must work closely with developers to understand and tackle challenges. “This can help you bake security into your container images and into your CI/CD [continuous integration/continuous deployment] pipeline from the very beginning,” he says.

“It then enables you to carry out code and vulnerability scans as standard across all your deployed containers, as well as putting strict identity and access management [IAM] processes and policies in place to follow. There is no reason why IAM should not be in place around your container images – this is necessary for control over all those images in production.”
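By way of illustration, the sketch below shows how a vulnerability scan might be wired into a CI/CD pipeline as a gating step, along the lines Gerchow describes. It is a minimal example, not a description of Sumo Logic’s tooling: it assumes the open source Trivy scanner is installed on the build agent, and the image name and severity threshold are placeholders.

```python
import subprocess
import sys

# Hypothetical CI gating step: scan a freshly built image and block the
# pipeline if high or critical vulnerabilities are found. Assumes the
# open source Trivy scanner is available on the build agent.
IMAGE = "registry.example.com/payments-api:1.4.2"  # placeholder image name

result = subprocess.run(
    [
        "trivy", "image",
        "--severity", "HIGH,CRITICAL",
        "--exit-code", "1",   # non-zero exit when findings match the filter
        IMAGE,
    ],
    capture_output=True,
    text=True,
)

print(result.stdout)

if result.returncode != 0:
    print("Vulnerabilities found - blocking deployment", file=sys.stderr)
    sys.exit(1)

print("Image passed the vulnerability gate")
```

Running a step like this on every build is one way to make scanning “standard across all your deployed containers” rather than an afterthought.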

To Gerchow, one of the biggest challenges faced by companies is that security teams are commonly not involved from the beginning of these deployments. “Consequently, they will have no idea what containers are and why they are being used. Because of this, security gets zero visibility into how developers make use of this emerging technology,” he warns.

The end result is that basic security processes tend to get ignored or overlooked. “For example, everything can get deployed as a root or superuser function allowing for the highest amount of access. This makes life easy for developers, but ignores all the benefits that can come from limiting access based on roles. It is harder to bolt all of this on after the fact,” says Gerchow.
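To make the root-access point concrete, the sketch below uses the Docker SDK for Python to flag images whose configuration leaves processes running as root. The image names are placeholders and this is only one possible form such a check could take, not Sumo Logic’s own tooling.

```python
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# Placeholder list of images to audit; in practice this might come from
# a registry listing or from the CI pipeline itself.
IMAGES = [
    "registry.example.com/payments-api:1.4.2",
    "registry.example.com/reporting-job:0.9.0",
]

for name in IMAGES:
    image = client.images.get(name)
    user = image.attrs.get("Config", {}).get("User", "")
    # An empty user, "root" or "0" means processes default to running as root.
    if user in ("", "root", "0"):
        print(f"WARNING: {name} runs as root - consider adding a USER directive")
    else:
        print(f"OK: {name} runs as '{user}'")
```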

A collaborative approach

Since adopting container technology, Sumo Logic has fostered a collaborative approach between security and software development. Gerchow explains that security has been treated as a critical consideration from the start, allowing his team to run standard processes and ensure they are as secure as possible.

“We have full scans on containers as they are being built and tested, and we then monitor each of these images after they are submitted to production. On a practical level, it is important to make collaboration the norm,” he says.
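One simple way to picture the production-side monitoring Gerchow describes is a periodic check that every running container was started from an image that has already passed scanning. The sketch below, again using the Docker SDK for Python, compares running containers against a hypothetical allowlist of approved image digests; the digests shown are placeholders.

```python
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# Hypothetical allowlist of image digests that passed the build-time scan;
# in practice this would be fed by the CI/CD pipeline or a registry service.
APPROVED_DIGESTS = {
    "sha256:placeholder-digest-1",
    "sha256:placeholder-digest-2",
}

for container in client.containers.list():
    repo_digests = container.image.attrs.get("RepoDigests", [])
    # RepoDigests entries look like "repo@sha256:...", so keep the digest part.
    digest_ids = {d.split("@", 1)[-1] for d in repo_digests}
    if digest_ids & APPROVED_DIGESTS:
        print(f"OK: {container.name} is running an approved image")
    else:
        print(f"ALERT: {container.name} is running an unscanned or unknown image")
```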

“My security team has developers on it, and we put time and effort into ensuring they stay way out in front of emerging technology. We can therefore understand what those new technologies offer, and we can then work very closely with our engineering teams to meet their goals. The result here is that we make sure we remain agile, but with control.”

As adoption of containers continues to grow, it is likely new challenges will appear in the foreseeable future. Gerchow notes how large vulnerabilities have already been exploited in Kubernetes through public application programming interfaces (APIs), and he believes this trend will not stop unless security professionals educate themselves on the new stack developing around containers.

“We have to start leveraging better agile processes and tools. On the other hand, container suppliers need to provide richer data via logs for visibility. Without this information, it is difficult to get that accurate picture of what is taking place in real time,” he says.

It is clear that containerisation offers immense benefits for development teams, but close attention to security is paramount.

“Organisations should realise that automation and orchestration at scale can be great. However, if you release unprotected workloads with that kind of agility, with no visibility, you will be at risk. If your process is bad, automation gets you to bad quicker,” says Gerchow.

Read more about container security

  • Inside DevOps, containers and enterprise security.
  • Startups are developing technologies that fill in some of the security gaps, including better controls for container orchestration.
  • Established IT security suppliers add containers to their repertoire, and IT pros must decide between trusted platform integration and the latest features from startups.
  • Despite their name, containers are by no means impervious to threats. With their use expected to grow, the time is ripe for determining the best security for containers.
