Containerisation in the enterprise - SUSE: The deployment surface shifted, pack up for portability

As businesses continue to modernise their server estates and move towards cloud-native architectures, the elephant in the room is the monolithic core business application that cannot be rehosted without significant risk and disruption.

These days, it is often more efficient to deploy an application in a container than in a virtual machine. Computer Weekly examines the trends, dynamics and challenges faced by organisations now migrating to the micro-engineered world of software containerisation.

As all good software architects know, a container is a ‘logical’ computing environment in which a guest application runs abstracted away from the underlying host system’s hardware and software infrastructure resources.

So, what do enterprises need to think about when it comes to architecting, developing, deploying and maintaining software containers?

This post is written by Sheng Liang in his capacity as president of engineering & innovation at developer-centric (increasingly cloud-native) platforms and tools-focused open source software company SUSE. 

Prior to SUSE, Sheng was co-founder and CEO of Rancher Labs and CTO of the cloud platforms group at Citrix Systems. He started as a staff engineer in Java Software programming at Sun Microsystems, where he worked on the Java Virtual Machine (JVM).

Liang writes as follows… 

The Covid-19 pandemic has greatly increased the need for organisations to transform their business practices. IT teams are under pressure to develop and deploy new business applications that cater to the needs of a remote workforce and online customer base. Unfortunately, developing and deploying applications is more challenging than ever.

The deployment surface has shifted

Liang: The deployment surface has changed. Welcome to the new planet.

Unlike the old days, when most applications were developed on standard platforms like Windows .NET or Java and deployed on servers running in the corporate datacentre, today applications are written in a myriad of languages and frameworks and can be deployed across a range of computing infrastructures:

  • the on-premises datacentre
  • the cloud
  • the branch office
  • the edge computing environment

How can developers make sure their applications, once written, can run on any computing infrastructure? How can IT administrators gain visibility and control over which developer can deploy which application to which infrastructure? How can IT leaders ensure that no corporate governance and security policies are violated in the process?

Software containers are a standard means of packaging and distributing applications so they can run on any infrastructure. Containers were originally developed on the Linux operating system, although they now work on Windows as well. Docker established the original industry standard for container images. Today there are multiple container runtimes, such as containerd and CRI-O, all compatible with the original Docker image format. Developers today have a wide choice of container packaging and distribution tools.
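
To make that concrete, the sketch below shows a minimal Dockerfile of the kind a team might use to package an application into a portable image. The service name, base images and port are hypothetical, purely for illustration:

  # Build stage: compile the (hypothetical) Go service from source
  FROM golang:1.22 AS build
  WORKDIR /src
  COPY . .
  RUN CGO_ENABLED=0 go build -o /out/hello ./cmd/hello

  # Runtime stage: copy only the compiled binary into a minimal base image
  FROM gcr.io/distroless/static-debian12
  COPY --from=build /out/hello /hello
  EXPOSE 8080
  ENTRYPOINT ["/hello"]

The resulting image is built and pushed to a registry with docker build and docker push (or equivalent tools such as buildah), after which any host with a compatible container runtime can pull and run it unchanged.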

Kubernetes as the orchestration standard

Modern applications often require multiple containers. An application may consist of multiple services, and for a service to scale it may be implemented as many running instances of the same container image. To meet scalability and failover needs, an application often requires multiple servers or virtual machines to host its containers.

Kubernetes is the industry standard container orchestration platform. Kubernetes manages the underlying server and virtual machine nodes, schedules the containers onto these nodes and coordinates the interactions and lifecycle of services.
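
As a hedged illustration of what that looks like in practice, the Kubernetes Deployment manifest below declares that three replicas of a container image should run; Kubernetes then schedules those instances across the available nodes and replaces them if a node fails. The image name, labels and replica count are hypothetical:

  apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: hello-service
  spec:
    replicas: 3                # three instances of the same container image for scale and failover
    selector:
      matchLabels:
        app: hello-service
    template:
      metadata:
        labels:
          app: hello-service
      spec:
        containers:
        - name: hello
          image: registry.example.com/hello:1.0   # hypothetical image built and pushed earlier
          ports:
          - containerPort: 8080

Applying this manifest with kubectl apply -f tells the cluster the desired state; Kubernetes then continuously reconciles the containers it is running against that declaration.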

Originally developed by Google, Kubernetes has gained mainstream adoption. The popularity of Kubernetes has, in fact, made it synonymous with containerisation.

Most enterprise IT organisations lack the skills to implement upstream open source Kubernetes technology on their own. They often turn to a commercial vendor for a software subscription, which entitles customers to the support needed to run mission-critical workloads and provides:

  1. Third-party software vendor and hardware vendor certifications
  2. Necessary security and compliance certifications
  3. Additional certifications needed for highly regulated sectors such as government and telecommunications
  4. Training, consulting and custom development work

As the provider of the enterprise Kubernetes platform SUSE Rancher, we look forward to working with organisations of all sizes as they embark on their Kubernetes journey.
