
Inside DevOps, containers and enterprise security

Global corporates are waking up to containers and container orchestration as a route to software development that is fast and safe. Computer Weekly looks at the best approach to ensure security is not compromised along the way

The IT infrastructure needed for today’s globalised, agile digital businesses is changing rapidly. Gone – or fast-disappearing – are the days of slow-moving IT development and applications that are rarely upgraded apart from periodic security patching.

In a recent security-focused note by Gartner analyst Neil MacDonald, this adoption of cloud-native architectures – whether running on-premises or on public cloud infrastructure as a service (IaaS) – is painted as particularly significant because it offers a chance for information security leaders to “rethink, reimagine and redesign” their approach to security and risk infrastructure.

“We can’t protect tomorrow’s services with security infrastructure designed for yesterday’s applications. Our goals for security are the same – namely, resilient protection of service delivery, protection of sensitive data and minimisation of the financial impact of attacks when they do occur,” says MacDonald.

“Yet how we deliver these capabilities can be redesigned to take advantage of the unique capabilities of cloud computing.”

This is a good moment, MacDonald says, to reimagine information security in a world of cloud-native and applications developed in a DevOps environment, with a goal of “improving our overall security and risk posture and improving security and IT resilience”.

It sounds promising. But for large enterprises, in particular, reimagining IT is easy to talk about but difficult to do in practice. What is the best way to think about and approach DevOps and containerisation, and to roll it out safely? And why go there in the first place?

In this context, “containerisation” refers to a modular approach to building and deploying software, in which applications are packaged with everything they need to run into lightweight units that share the same host operating system kernel.

Why containers?

The why of containerisation is easy enough to address. Adoption of microservices and containers has been dramatically fast, particularly in the past two years, because the conditions are right.

Corporates already understand the benefits of virtualisation and open source, while cloud applications are firmly in the mix. So the leap to containers is a natural one in which most will instinctively have confidence – in contrast to the way corporates were slow to adopt server virtualisation when it first landed, despite the clear benefits.

The other side of the demonstrable confidence in containers lies in who is doing it already. Google, for example, has been at the forefront of the movement as the original developer of the open-source container orchestration system Kubernetes.

Kubernetes has quickly become a de facto standard for automating the deployment, scaling and management of containerised applications, and is now evolving with purpose in an entirely open way outside of Google.
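To make that automation concrete, the sketch below shows the declarative model Kubernetes enables, using the official Kubernetes Python client to scale a deployment up or down; the deployment name and namespace are hypothetical.

```python
# A minimal scaling sketch using the official Kubernetes Python client
# (pip install kubernetes). The deployment name and namespace are
# invented for illustration.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()  # reads ~/.kube/config, as kubectl does
    apps = client.AppsV1Api()
    # Patch only the replica count; Kubernetes reconciles everything else.
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

if __name__ == "__main__":
    scale_deployment("web-frontend", "production", replicas=5)
```

The point is less the few lines of code than the model: the desired state is declared, and the orchestrator does the work of getting there.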

Adoption expected to grow

Chris Smith, a developer and containers advocate at the fast-growing, technology-led utility company OVO, who also facilitates a knowledge-sharing tech community in Bristol called Bristech, says the case for containers is clear.

“At OVO we use containers a lot, and adoption will grow from here. It’s the same with most progressive technology-led businesses, but it’s entering the mainstream, too.

“Google has been running hundreds of millions of containers a day for many years, long before the rise of container technologies like Docker, and you could say that shows the potential for enterprise adoption. Once an organisation explores the prospects, they find a space that makes sense if you want rapid delivery of software that’s more agile, speedy and cost-effective than running virtual machines.”

When it comes to security, Smith says adoption of DevOps and containers comes down to initially fostering the right organisational culture in a dedicated niche team, while also setting the right restrictions and keeping a watchful eye on how the security tools are maturing.

“At OVO, we use popular open source tools to scan our container images for known vulnerabilities. Our systems and practices are good because one of the threats with containers is breakout.

“If a company were to expose the Docker daemon, for example, that would allow the running of arbitrary code on containerised resources, and that is the kind of risk you can avoid if you have a robust process in place, securing endpoints, before embarking on DevOps,” says Smith.
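Scanning of the kind Smith describes is typically wired into the build pipeline, so that vulnerable images never ship. A minimal sketch, assuming the open-source Trivy scanner is installed; the image name is hypothetical:

```python
# Hedged sketch of CI-time image scanning with the open-source Trivy
# scanner (assumed to be on the PATH). Fails the build on any HIGH or
# CRITICAL finding.
import json
import subprocess
import sys

def high_severity_findings(image: str) -> int:
    result = subprocess.run(
        ["trivy", "image", "--format", "json",
         "--severity", "HIGH,CRITICAL", image],
        capture_output=True, text=True, check=True,
    )
    report = json.loads(result.stdout)
    # Trivy reports one entry per scanned target, each with a
    # (possibly null) list of vulnerabilities.
    return sum(
        len(target.get("Vulnerabilities") or [])
        for target in report.get("Results", [])
    )

if __name__ == "__main__":
    count = high_severity_findings("registry.example.com/app:latest")
    print(f"{count} high/critical vulnerabilities found")
    sys.exit(1 if count else 0)
```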

“You can mitigate vulnerable endpoints with tactics like IP address whitelisting, HTTPS, client certificates and using a shared secret to sign messages. You can also continuously scan infrastructure,” he adds.
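The shared-secret signing Smith mentions needs nothing more than the standard library. A minimal sketch; the secret and message are placeholders:

```python
# Signing and verifying messages with a shared secret (HMAC-SHA256),
# standard library only. The secret here is a placeholder.
import hashlib
import hmac

SECRET = b"rotate-me-regularly"

def sign(message: bytes) -> str:
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    # compare_digest avoids leaking information through timing.
    return hmac.compare_digest(sign(message), signature)

if __name__ == "__main__":
    msg = b'{"action": "deploy", "image": "app:1.4.2"}'
    sig = sign(msg)
    assert verify(msg, sig)
    assert not verify(b"tampered message", sig)
```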

How to be secure

Smith and others say the danger with containers is greatest if segmentation and defence in depth are not implemented up front.

“If a host gets compromised by a rogue package or a rogue container, network segregation, security groups and tight firewall rules will mean the rogue software cannot spread problems.

“So one simple danger is when a corporate leaves some developers to have a play with containers and some projects then come to fruition before protocols are properly set. In other words, every organisation has to think about things up front.”
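In a Kubernetes environment, the up-front segregation Smith describes might start with a default-deny ingress policy, applied before any project comes to fruition. A hedged sketch using the official Python client; the namespace is hypothetical:

```python
# Default-deny ingress for every pod in a namespace: a compromised
# container cannot reach its neighbours unless a later policy allows it.
# The namespace name is invented for illustration.
from kubernetes import client, config

def deny_all_ingress(namespace: str) -> None:
    config.load_kube_config()
    policy = client.V1NetworkPolicy(
        metadata=client.V1ObjectMeta(name="default-deny-ingress"),
        spec=client.V1NetworkPolicySpec(
            pod_selector=client.V1LabelSelector(),  # empty selector = all pods
            policy_types=["Ingress"],  # no ingress rules listed = deny all
        ),
    )
    client.NetworkingV1Api().create_namespaced_network_policy(
        namespace=namespace, body=policy,
    )

if __name__ == "__main__":
    deny_all_ingress("payments")
```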

A training need?

Interest in cloud infrastructure, containers and container security is widespread in the developer community right now, as Smith can see through his involvement with Bristech.

DevOps naturally demands a different skill set of developers. With the traditional set-up, coders code and testers test, and the production environment is somewhere the developer needn’t be involved. But with a DevOps team, everyone writes and tests and spins up and generally takes ownership.

That’s a shift in mindset, and not all developers are ready to make the leap, of course. But there is no doubt a body of individuals out there who can make the transition with the right support – and with a proper understanding of security as integral and built in.

Simon Minton is founder of the DevSecOps start-up Ruggd. Like Smith, he says what matters most in relation to security and DevOps is getting the initial approach and culture right, and then sticking with it.

“Containers are convenient for development,” he says. “They let you stand up any development environment, and give you version control and more. But the security approach has to be baked in, too, because if a container is exposed we all know it could give widespread access.”

At the same time, Minton says this should not necessarily be a cause for concern, because the evidence shows that those companies that have embraced a DevOps culture have also embraced security in a big way. It is understood how the two go hand in hand.

“The research shows how adoption of security tooling for containers is highest in those organisations that have the most mature DevOps culture. On one level you would expect that, but it’s still telling. These mature DevOps companies are typically creating roles for ‘security champions’ who are embedded into dev teams to help build security awareness and best practices. And that’s what’s needed.”

The security smarts of Smarkets

One company that has adopted containers for most of its business functions is the betting exchange Smarkets, with 110 staff, £25m-plus of revenues and offices in London, Los Angeles and Malta. The company’s vice-president of engineering Mika Boström says containers are the right option for speed and agility, but also impose certain disciplines.

“There is the container-related supply chain, for one thing. Pre-built container images are easy and inviting to use, but may come with surprises, so that’s something we make sure never to overlook. There is an attack surface to a container you need to understand. Just because containers are often equated with being lean, it doesn’t mean that stripping them to the bare essentials is trivial. It really isn’t.”
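One way to keep that supply chain honest is to refuse base images that are not pinned to an immutable digest. The check below is illustrative rather than Smarkets’ actual tooling, and is deliberately simple:

```python
# Flag Dockerfile base images referenced by mutable tag rather than
# immutable @sha256 digest. Illustrative only: multi-stage builds that
# reference earlier stage names would need extra filtering.
import re
import sys
from pathlib import Path

FROM_LINE = re.compile(r"^\s*FROM\s+(\S+)", re.IGNORECASE | re.MULTILINE)

def unpinned_bases(dockerfile: Path) -> list:
    text = dockerfile.read_text()
    return [
        image for image in FROM_LINE.findall(text)
        if "@sha256:" not in image and image.lower() != "scratch"
    ]

if __name__ == "__main__":
    bad = unpinned_bases(Path(sys.argv[1]))
    for image in bad:
        print(f"unpinned base image: {image}")
    sys.exit(1 if bad else 0)
```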

Boström says that addressing this challenge from the bottom up within the organisation is down to education.

“We want security and we also want usability, so the challenge is to make learning meaningful and work in a way that pushes our development teams towards better practice,” he says. “That’s how we approach things: ensuring that security is built in, but not an impediment. If you can make the secure option more convenient, you are on the right track.

“So many of our projects do complex things, but hide the complexity. That’s what we are aiming for. Among the container tools, Docker is easiest and has the most momentum, with great tooling. But the future lies in orchestration and complexity. That’s what’s coming and we know we have to learn at every step.”

Embracing risk tolerance

Boström says that for Smarkets DevOps is also about embracing a culture of risk tolerance. “If you engineer systems for reliability in the face of failure it gives you confidence and the ability to experiment more freely.

“Security has a bad reputation among software developers, because it is routinely seen as an active deprivation of something – mostly usability.

“DevOps may be associated with the use of bespoke tools, but anyone who focuses only on technical matters will miss the bigger picture. The tools are, in the end, usability improvements.”

Boström adds that security is not a commodity but the result of an ongoing process. In this it has a lot in common with DevOps, which is essentially a culture of running a development process in a certain way.

“In merging the two, we have to remember that in the end we’re aiming to change how people behave. Not by force, but by gentle nudging. We don’t want to fight human nature; we want to make use of it.”

First steps

So how exactly should an established corporate begin with containers and maintain security? Is there a simple way in?

One man with a plan is Colin Domoney, a senior transformation consultant at CA Technologies who focuses on application security and secure DevOps. Domoney also spent four years working in this area at Deutsche Bank.

“I think there’s a positive message here. Many large enterprises are struggling with the governance and control of their infrastructure already, often deploying obsolete operating systems and patching less than perfectly. The promise of containers and orchestration, in security terms, is good when set against this reality,” he says.


The flexibility of containers brings a wealth of benefits from a security perspective, says Domoney, who believes that with the right approach, an enterprise can deploy images that are verified, tested and hardened.

“If you approach things with this mindset, the opportunity is clearly there to build a more secure infrastructure. Yes, you’ll be placing trust and responsibility with developers, but you do anyway, and orchestration technologies like Kubernetes are solving a lot of issues.”
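As an illustration of what “hardened” can mean at deployment time, the sketch below builds a Kubernetes container spec with restrictive defaults, using the official Python client; the image and name are hypothetical:

```python
# A container spec with restrictive security defaults, so the hardening
# travels with the deployment rather than living in a runbook. Image and
# name are invented for illustration.
from kubernetes import client

def hardened_container(image: str) -> client.V1Container:
    return client.V1Container(
        name="app",
        image=image,  # ideally pinned by digest and pre-scanned
        security_context=client.V1SecurityContext(
            run_as_non_root=True,
            read_only_root_filesystem=True,
            allow_privilege_escalation=False,
            capabilities=client.V1Capabilities(drop=["ALL"]),
        ),
    )
```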

Domoney says organisations should start small, with a plan that combines easy-to-apply policies, points of enforcement to ensure containers run within given parameters, and granular network controls.
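A point of enforcement can start as simply as an audit script. The sketch below, using the Docker SDK for Python, flags containers running outside one hypothetical policy – no privileged mode:

```python
# Audit a Docker host for containers running in privileged mode,
# using the Docker SDK for Python (pip install docker).
import docker

def privileged_containers() -> list:
    client = docker.from_env()
    return [
        c.name for c in client.containers.list()
        if c.attrs["HostConfig"].get("Privileged")
    ]

if __name__ == "__main__":
    for name in privileged_containers():
        print(f"policy violation: {name} is running privileged")
```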

“The message is that the tools are out there now. It’s not perfect. Does it stack up as enterprise-grade security? In some respects, maybe not. But it’s a moving target and enterprise adoption is possible.

“This is a space where the rate of development in the past 18 months has been unparalleled, so there’s lots of reason to believe that any gaps will be plugged fast.”

Containers help Netflix go global

For the media giant Netflix, containers have enabled faster development, scaling and deployment to its globally available cloud platform, which is based on Amazon EC2 [elastic compute cloud] virtual machines.

The container management platform Titus is Netflix’s infrastructural foundation for container-based applications. It provides scalable cluster and resource management, as well as container execution with deep Amazon EC2 integration.

When Netflix switched on Titus in December 2015, it launched a few thousand containers a week across a handful of workloads. By early 2018, it was launching more than one million containers a week, representing hundreds of workloads – a roughly thousand-fold increase in container usage in a little over two years. The pace of growth has not slowed since.

In early 2017, Netflix’s stream-processing-as-a-service team decided to use Titus to enable simpler and faster cluster management for its Flink-based system. This usage has resulted in more than 10,000 long-running service job containers, which are redeployed as stream-processing jobs change.

In addition to a consistent environment, Netflix emphasises how containers help developers to push new application versions faster than before by using Docker layered images and pre-provisioned virtual machines ready for container deployments.

According to Netflix, deployments using Titus can now be done in one to two minutes, versus the tens of minutes they took using virtual machines.

The theme that underlies all the improvements is what Netflix calls “developer innovation velocity”. In practice, this means both batch and service users can experiment locally and test more quickly, and can deploy to production with greater confidence than before. It is this speed that drives how fast features can be delivered to Netflix customers, and it is therefore key to the business.

In terms of security, Netflix says it chooses to “deeply leverage” existing EC2 services. It uses Virtual Private Cloud (VPC) for routable IPs rather than a separate network overlay. It uses Elastic Network Interfaces (ENIs) to ensure that all containers have application-specific security groups.

Titus provides a metadata proxy that enables containers to get a container-specific view of their environment as well as IAM (identity and access management) credentials, according to Netflix. Containers do not see the host’s metadata and Netflix implemented multi-tenant isolation (CPU, memory, disk, networking and security) using a combination of Linux, Docker and its own isolation technology.
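Titus itself is Netflix-internal, but the isolation levers described – CPU, memory, capabilities, networking – are all visible in plain Docker. A hedged sketch using the Docker SDK for Python; the image, limits and network are hypothetical:

```python
# Launching a container with explicit resource and capability isolation,
# via the Docker SDK for Python. Image, limits and network are invented
# for illustration; the named network is assumed to exist already.
import docker

client = docker.from_env()
container = client.containers.run(
    "registry.example.com/worker:1.0",
    detach=True,
    mem_limit="512m",        # memory isolation
    nano_cpus=500_000_000,   # half a CPU
    cap_drop=["ALL"],        # drop all Linux capabilities
    read_only=True,          # immutable root filesystem
    network="tenant-a-net",  # per-tenant network
)
print(container.id)
```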
