Why Kubernetes is driving a groundswell in containerisation

Kubernetes is fast becoming a de facto standard, but CIOs may be unaware of its footprint in their businesses

Kubernetes is establishing itself as the platform of choice for developers who want to embrace DevOps and cloud-native computing.

According to analyst Gartner, the market has chosen Kubernetes as the de facto container orchestration technology, and at the recent KubeCon conference in Shanghai, several businesses joined the user community for Kubernetes.

Retailer JD.com runs one of the world’s largest Kubernetes clusters, managing tens of thousands of bare-metal servers that host containers for online applications, middleware systems, databases and offline computing jobs. The company is the largest user of, and a significant contributor to, the Prometheus, Vitess and Kubernetes projects, alongside its own homegrown internal projects.

New user community members include Amadeus, Atlassian, Mastercard, PostFinance, State Street, Swiss Mobiliar, Walmart, WeWork and Workday.

These companies join the CNCF’s End User Community, which already includes the likes of Capital One, eBay, GitHub, Goldman Sachs, NCSoft, The New York Times, Ticketmaster, Twitter, Vevo and Zalando. The community meets monthly and advises the CNCF governing board and technical oversight committee on key challenges, emerging use cases and areas of opportunity and growth for cloud-native technologies.

Kubernetes adoption in the enterprise is coming from the bottom up, as developers embrace new techniques to develop and deploy code faster and more efficiently. Certainly, the majority of delegates at the Shanghai event, the first KubeCon conference to be held in China, were developers.

From an IT leadership perspective, the first aspect of Kubernetes to tackle is that it comprises a lot of different elements, at various levels of maturity. Some are ready, at least for early enterprise adoption; some are incubator projects that may eventually be promoted to the stable Kubernetes stack. Then there are the projects that fall somewhere between, which tend to be deployed by early adopters in the Kubernetes community, who want to test out and make use of the latest Kubernetes tools and components.

Work in progress

From a CIO perspective, Kubernetes may appear to be a work in progress. Liz Rice, technology evangelist at Aqua Security, said: “The goal is to create a full stack to run applications cloud natively. End users should be able to run scalable applications in a modern, dynamic environment such as public, private and hybrid cloud.”

For Rice, the first step is to containerise applications, then use continuous integration and continuous delivery (CI/CD) for application development and deployment, and finally adopt an orchestrator. This is where Kubernetes fits in.
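
To illustrate the orchestration step, the sketch below uses the official Kubernetes Python client to deploy an already containerised application and keep a fixed number of copies running; the image name, deployment name and replica count are hypothetical.

```python
# Minimal sketch: deploy a containerised app with the official Kubernetes Python client.
# Assumes a working kubeconfig and a hypothetical image "example.com/shop-frontend:1.0".
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config; use load_incluster_config() inside a pod

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="shop-frontend"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # the orchestrator keeps three copies running and replaces failed ones
        selector=client.V1LabelSelector(match_labels={"app": "shop-frontend"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "shop-frontend"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="shop-frontend",
                    image="example.com/shop-frontend:1.0",
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```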

There are a number of container orchestration platforms, but Kubernetes has risen above the crowd to become the preferred choice for many organisations.

According to Todd Moore, vice-president of open technology at IBM, its success is down to adoption by developers, the fact that it is based on open standards, its open API and its governance by the Cloud Native Computing Foundation (CNCF).

From a developer perspective, this means Kubernetes can be regarded as open and cloud agnostic, attributes that give developers the confidence they can deploy code on Kubernetes without fear of being locked in.

Infrastructure provider independence

In its Using Kubernetes to orchestrate container-based cloud and microservices applications report, analyst Gartner stated: “Kubernetes achieves infrastructure provider independence explicitly by engineering support for a wide variety of underlying compute platforms and testing the integrations.”

Although it was originally developed by Google, the Kubernetes project supports Linux variants and Windows, server virtualisation environments such as vSphere, the OpenStack private cloud platform, and infrastructure-as-a-service (IaaS) providers.

In the report, Gartner said Kubernetes has built integrations for GCE, AWS and Microsoft Azure in the public cloud space, and for Vagrant, vSphere and OpenStack, among others, in the private cloud infrastructure space.
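
That provider independence is visible at the API level: the same client code can be pointed at clusters running on different infrastructures simply by switching the kubeconfig context. A minimal sketch, assuming hypothetical context names for two such clusters:

```python
# Sketch: the same workload query runs unchanged against clusters on different providers.
# The context names ("gke-prod", "aks-dev") are hypothetical entries in the local kubeconfig.
from kubernetes import client, config

for context in ("gke-prod", "aks-dev"):
    config.load_kube_config(context=context)  # select the cluster for this iteration
    pods = client.CoreV1Api().list_pod_for_all_namespaces()
    print(context, "is running", len(pods.items), "pods")
```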

But while it offers a great deal of flexibility, Kubernetes has a steep learning curve. Janet Kuo, software engineer at Google, said: “Enterprises don’t want to learn the ins and outs of Kubernetes. They just want it to work.”

Since its inception in 2015, the community behind Kubernetes has been working to achieve this. “Today Kubernetes just works and is good for mainstream users,” said Kuo.

The benefits, according to Kuo, are that Kubernetes focuses on open standards and extensibility; it can run in multi-cloud environments and even provide serverless computing on edge devices, such as for internet of things (IoT) applications. In fact, there are numerous demonstrations on the internet of how to manage a cluster of Raspberry Pis using Kubernetes, which gives developers a way to gain experience of Kubernetes at relatively low cost.
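
One low-cost way to get that hands-on experience is to query a cluster and see what hardware each node reports. The sketch below, assuming the official Python client and a reachable cluster, lists the nodes and their CPU architectures, which on a Raspberry Pi cluster would show up as arm or arm64.

```python
# Sketch: list cluster nodes and the CPU architecture each kubelet reports.
# On a Raspberry Pi cluster the architecture would typically be "arm" or "arm64".
from kubernetes import client, config

config.load_kube_config()
for node in client.CoreV1Api().list_node().items:
    info = node.status.node_info
    print(node.metadata.name, info.architecture, info.os_image)
```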

Hardware as a service

As well as its many uses for managing containers and microservices architectures, Kubernetes can also be used to give developers managed access to expensive hardware, such as graphics processing units (GPUs), as a service, Igor Khapov, head of virtualisation development at IBM, said in a lightning talk session prior to the main KubeCon event.

In fact, GPU resource isolation is essential for applications such as machine learning, Hui Luo, software engineer at VMware, explained during his lightning talk at the same event.

“Everyone wants to run machine learning really well,” he said. “You need to support a lot of hardware resources in Kubernetes, and you take care of the GPU drivers and have GPU monitoring with a rich set of metrics.”
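
In practice, GPU access in Kubernetes is requested as an extended resource on a pod. The sketch below assumes the NVIDIA device plugin is installed so that nodes advertise an nvidia.com/gpu resource; the pod and image names are hypothetical.

```python
# Sketch: a pod that asks the scheduler for one whole GPU via an extended resource.
# Assumes the NVIDIA device plugin is installed so nodes advertise "nvidia.com/gpu";
# the pod name and image name are hypothetical.
from kubernetes import client, config

config.load_kube_config()

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="training-job"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[client.V1Container(
            name="trainer",
            image="example.com/ml-trainer:latest",
            resources=client.V1ResourceRequirements(
                limits={"nvidia.com/gpu": "1"},  # GPUs are requested whole and not oversubscribed
            ),
        )],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```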

As Computer Weekly has previously reported, eBay is among the organisations using Kubernetes for machine learning and AI. Speaking at a KubeCon panel discussion on AI, Xinglang Wang, a principal engineer at eBay, said AI had a high barrier to entry, but packaging tools in a Kubernetes cluster made it easier for businesses to get started on an AI project.

Getting to grips with monitoring

One of the main goals of Kubernetes is to make it easy for enterprises to monitor complex IT infrastructure and to allocate resources across private clouds, public clouds and bare-metal servers dynamically and programmatically, through application programming interfaces (APIs).
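
As a simple illustration of that programmatic control, the sketch below uses the official Python client to rescale a hypothetical deployment through the Kubernetes API.

```python
# Sketch: programmatically rescale a deployment through the Kubernetes API.
# The deployment name and namespace are hypothetical.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# Resize the hypothetical "shop-frontend" deployment to 10 replicas.
apps.patch_namespaced_deployment_scale(
    name="shop-frontend",
    namespace="default",
    body={"spec": {"replicas": 10}},
)
```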

Given the complex environments that can be managed and orchestrated through Kubernetes, IT departments need to assess how best to tackle monitoring. Monitoring a Kubernetes environment needs to go beyond analysing the log files that show what the various software components are actually doing, Priyanka Sharma, director of Cloud Native Alliances at GitLab, said during her KubeCon keynote presentation.

Keeping a Kubernetes environment healthy requires what Sharma described during her keynote presentation as “observability” for today’s enterprises. “Observability is the new name for monitoring, but in a more complex software environment with Kubernetes as a backbone, observability consists of metrics, tracing and logging,” she said.

She said the open source ecosystem is teeming with projects for observability. “The business needs to react to the market. Speeding up the cycle time [for new code releases] is critical. Observability will enable you to ship [code] fast and reliably.” For Sharma, projects like Prometheus, Jaeger and Istio can play a big role in observability.
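
On the metrics side of observability, applications typically expose their own figures in a format that Prometheus can scrape. A minimal sketch, using the prometheus_client library with illustrative metric names and port:

```python
# Sketch: expose application metrics in Prometheus format so a cluster-wide
# Prometheus server can scrape them. Metric names and port are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("orders_processed_total", "Orders processed")
LATENCY = Histogram("order_latency_seconds", "Time spent processing an order")

if __name__ == "__main__":
    start_http_server(9100)  # metrics served at http://<pod-ip>:9100/metrics
    while True:
        with LATENCY.time():                     # record how long the "work" takes
            time.sleep(random.uniform(0.01, 0.2))  # stand-in for real work
        REQUESTS.inc()
```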

From the people Computer Weekly has spoken to, it seems Kubernetes should be regarded as an ecosystem of software components for managing containers on-premise or in private or public clouds.

The open source components that make up this ecosystem are constantly evolving and new tools are being added. As such, there is no such thing as an off-the-shelf “Kubernetes solution”, but open source distribution providers have realised there is a market opportunity to provide a single point of contact above the complexities that sit in the Kubernetes ecosystem.

Arguably, they represent the starting point for any CIO wishing to start building a strategy around Kubernetes and container technologies.
