CloudBees DevOps guru: Do microservices, Docker & DevOps make a perfect trio?

This is a guest post for the Computer Weekly Developer Network, written by Brian Dawson in his capacity as DevOps evangelist at CloudBees. CloudBees describes itself as the ‘hub’ of enterprise Jenkins and DevOps, providing software solutions for continuous delivery.

Dawson’s areas of focus centre on tools, technology and pipeline development, project management, licensing, business development and process improvement.

Dawson writes as follows…

More organisations recognise the value of adopting DevOps practices to accelerate delivery cycles while improving quality, reliability and security. As a result, the disparities between organisations that have begun a DevOps transformation and those that have not are becoming more pronounced, as highlighted in the most recent State of DevOps Report.

In addition to noting that DevOps practices “improve organisational culture and enhance employee engagement”, this report cites findings on high-performing organisations: compared to lower-performing peers, they have better employee loyalty and spend less time on unplanned work and rework. They deploy 200 times more frequently, recover from failures 24 times faster and have 2,555 times shorter lead times for changes.

Many of the organisations surveyed started down the road to DevOps by bridging the chasms that exist between upstream development and downstream delivery across three planes:

  • People and culture
  • Process and practice
  • Tools and technology

These planes are interdependent, and establishing an effective DevOps culture requires addressing all three.

Figure 1. Bridging the chasm between upstream (development) and downstream (operations) across the three planes of the DevOps trinity.


On the third plane – tools and technology – high-performing organisations have long recognised the value of using tools to automate the delivery process, but automating legacy architectures with legacy technologies only goes so far. Accordingly, there is growing interest in microservices architecture and container technology.

Not your father’s architecture: microservices are not a mashup of SOA

Because the microservices concept has its roots in Service Oriented Architecture (SOA), the two share many characteristics, but there are important distinctions. Like SOA, microservices work by decoupling components of a complex system and defining interfaces or contracts between them.

With microservices, the communications between components tend to be lighter weight and the interfaces and contracts less rigid, often implemented through RESTful APIs. Many also view microservices as more focused on user-facing functionality rather than back-end services, but that is not a hard-and-fast rule.

Microservice components can also be deployed independently, making it easier for relatively small teams to apply iterative processes to build, test and deliver a microservice as an individual component.
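As a minimal sketch of what such a contract looks like in practice (the service and function names – “orders”, `get_order`, `billing_status` – are illustrative, not from the article), the consuming side depends only on the agreed JSON shape, never on the service’s internals:

```python
import json

def get_order(order_id):
    """Handler that might sit behind, say, GET /api/v1/orders/<id>."""
    # The implementation (database, framework, even language) can change
    # freely, as long as the response shape below stays stable.
    order = {"id": order_id, "status": "shipped", "items": ["sku-123"]}
    return json.dumps(order)

def billing_status(payload):
    """A consuming service parses only the fields the contract guarantees."""
    return json.loads(payload)["status"]

print(billing_status(get_order(42)))  # prints "shipped"
```

Because the contract is the only coupling point, the team owning the “orders” component can rebuild, test and redeploy it independently without coordinating a release with its consumers.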

Google Trends shows increasing interest in microservices


Several factors contribute to this acceleration in interest. Small components can be built independently by teams of 8-12 people (“two-pizza teams”) who have end-to-end control over development and delivery.

Decoupling system functionality into smaller components makes it possible to reliably and frequently update individual components with reduced risk and impact on the overall system.

A cross-functional scrum team with development, QA and operations expertise can rapidly develop, test and deploy a complete microservice component, and then react faster to unexpected issues once it is deployed.

How do containers fit in?

Docker has revitalised decades-old container technology and this has captured everyone’s attention. It is difficult to find a mainstream development and delivery tool provider that has not adopted some level of Docker support.

The appeal of using Docker is that a Docker container lets you encapsulate an entire environment into a single lightweight image rather than building and configuring a new physical server. Docker containers provide fast access to infrastructure, a fundamental requirement of DevOps and continuous delivery practices.
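As a sketch of what that encapsulation looks like (the base image, file names and port below are assumptions for illustration, not from the article), a Dockerfile describes the whole environment as code:

```dockerfile
# Illustrative Dockerfile: versions, paths and the entry point are assumptions.
FROM python:3-slim                    # OS and language runtime in one base image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies baked into the image
COPY . .
EXPOSE 8080                           # the port the service listens on
CMD ["python", "service.py"]
```

Building it (`docker build -t my-service .`) produces an immutable image that runs identically on a laptop, a CI agent or production (`docker run -p 8080:8080 my-service`) – which is precisely what gives teams fast, repeatable access to infrastructure.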

As the industry continues to move towards the ideal of building and testing software with every change made, we need environments available on-demand to support the increased number and frequency of builds.

Containers are a perfect fit for small agile teams because they provide fast access to immutable infrastructure without interfering with other development streams. Containers are also a perfect fit for microservices because they are well-suited to hosting smaller, self-contained components.

There are, of course, some caveats to consider before jumping in with both feet with microservices and containers. Container technology is maturing and evolving rapidly and new Docker releases arrive frequently.

Aggressive adoption

If you’re going to be aggressive about container adoption, keep vigilant about changes that may affect your specific use cases. Rather than spending time and resources breaking a legacy monolithic system into microservices, consider leaving that software in place and use a microservices architecture only when implementing new capabilities, gradually replacing legacy architecture.

DevOps requires a marriage of culture and process as well as tools and technology. An organisation’s ability to employ tools (including Docker) and technologies (including containers and microservices) in support of a collaborative culture and proven practices is a leading indicator of its ability to differentiate itself: developing software more quickly, from concept to customer, and delivering that software with increased quality, reliability and security.