Essential Guide

Essential guide to application modernisation

A comprehensive collection of articles, videos and more, hand-picked by our editors

Microservices: Small parts with big advantages

Microservices go hand in hand with containers and the idea of software portability – but how do they work and why should you care?

The microservices concept is not really anything new – it is the implementation that has evolved.

The idea is to break down traditional, monolithic applications into many small pieces of software that talk to each other to deliver the same functionality.

This will give those who lived through component-based software, web services and service-oriented architecture (SOA) in the early 2000s a sense of déjà vu.

They were meant to do something similar. So what’s the difference? “Microservices are much lighter-weight than SOA, with all of the standards which that entailed,” says Ben Wootton, co-founder of Seattle-based DevOps consultancy Sendachi.

SOA was a supplier-driven phenomenon, with an emphasis on complex enterprise service buses – a form of middleware needed to communicate between all the services.

“Message standards are looser and are exchanged over lightweight message brokers,” says Wootton. “The tooling has evolved from the open source community rather than big enterprise.”
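As a rough illustration of that looser style, the sketch below publishes a plain JSON message to a queue on a lightweight broker using the pika RabbitMQ client library. Everything here is assumed for illustration: the broker running on localhost, the queue name and the message fields are invented rather than taken from any particular system.

```python
import json

import pika  # lightweight RabbitMQ client, assumed to be installed

# Connect to a broker assumed to be running locally.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="orders")  # hypothetical queue name

# The "standard" is just an agreed JSON shape between two services.
message = {"order_id": "12345", "currency": "GBP", "amount": 42.50}
channel.basic_publish(exchange="", routing_key="orders", body=json.dumps(message))

connection.close()
```

There is no enterprise service bus or formal contract involved; the consuming service simply reads the same queue and parses the JSON.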

Speed and agility

Companies are interested in microservices because they can bring speed and agility and encapsulate relatively small business functions, says Wootton. A currency conversion service is a good example, or an e-commerce shopping cart.

Companies can develop services like these more quickly and can change them more readily, because they are dealing with smaller code bases. This is not something that traditional, monolithic applications with code millions of lines long were designed for.
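To make that concrete, here is a minimal sketch of a currency conversion microservice, written with the Flask web framework (an assumption; any HTTP stack would do). The /convert endpoint, the hard-coded exchange rates and the port are invented for illustration; a real service would pull live rates from a feed.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hard-coded rates purely for illustration; a real service would query a rate feed.
RATES = {("GBP", "USD"): 1.27, ("USD", "GBP"): 0.79}

@app.route("/convert")
def convert():
    src = request.args.get("from", "GBP")
    dst = request.args.get("to", "USD")
    amount = float(request.args.get("amount", "0"))
    rate = RATES.get((src, dst))
    if rate is None:
        return jsonify(error="unsupported currency pair"), 400
    return jsonify({"from": src, "to": dst, "amount": amount,
                    "converted": round(amount * rate, 2)})

if __name__ == "__main__":
    app.run(port=5000)  # the whole business function fits in one small process
```

A team can change, test and redeploy a service like this on its own, without touching the rest of the e-commerce estate.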

Read more about microservices and containers

Microservices are independently deployable bits of code that can be used to build agile applications. We look at some of the main players.

Applications built out of independently deployable modules are the future of flexible solution development

The testing overhead is immense when changing such vast code, because of all the interdependencies involved.

The other advantage is scalability. Microservices are designed to work in cloud environments, which can increase and decrease the computing resources needed for particular applications at will. If you need more computing power, simply start up another instance of the microservice on another cluster of computers. You can’t do that easily, if at all, with monolithic software that is designed to scale up on one piece of hardware.

Distributed computing like this also makes it easier to recover from infrastructure failures. Microservices are designed to be easily replicable, so many of them can be run to pick up the slack should a particular service stop working.
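As a sketch of what that replication looks like from a caller’s point of view, the snippet below tries several replicas of a hypothetical shopping-cart service in turn and only fails if all of them are down. The hostnames and the /carts path are invented, and in practice a load balancer or service registry usually does this job rather than the client.

```python
import requests  # assumed to be installed

# Hypothetical replicas of the same microservice.
REPLICAS = [
    "http://cart-1.internal:5000",
    "http://cart-2.internal:5000",
    "http://cart-3.internal:5000",
]

def get_cart(cart_id):
    """Try each replica in turn so one failed instance does not break the caller."""
    last_error = None
    for base in REPLICAS:
        try:
            resp = requests.get(f"{base}/carts/{cart_id}", timeout=2)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException as err:
            last_error = err  # this replica is down or slow; try the next one
    raise RuntimeError(f"all replicas failed: {last_error}")
```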

Cloud-native software model

All of this makes microservices useful for cloud-native applications, which are designed to run in cloud environments with lots of commodity hardware resources that can dynamically respond to fluctuations in demand for certain applications.

These infrastructures are designed to fail over quickly. If a server dies, there is another one in the infrastructure to take its place. For microservices to operate that way, they need to interact differently with the IT infrastructure, says Wootton. “You need to lean on automation a lot more,” he adds. “You might find your previous application turns into 50-100 independent services. Maybe they have to be duplicated for resilience. You are quickly left with hundreds of processes to be managed.”

To automate the management of the microservices and the provisioning of the infrastructure supporting them, the whole computing stack needs to change. The microservices software itself, or the software layer that manages it, must talk to the IT infrastructure to provision CPU cycles, networking and storage.

This calls for software-defined infrastructure, which has underpinned the management of cloud-based resources for a while. Companies and open source projects ranging from IBM to OpenStack are proposing IT environments in which computing, networking and storage resources are accessed and controlled via software application programming interfaces (APIs), rather than from a systems management tool or command line.
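As one hedged example of what “controlled via software APIs” can look like, the sketch below uses the openstacksdk library to ask an OpenStack cloud for a new virtual machine to host another service instance. The cloud name and the image, flavour and network identifiers are placeholders, and the same idea applies to any provider with an infrastructure API.

```python
import openstack  # openstacksdk, assumed to be installed and configured

# Placeholder identifiers; real values come from your cloud.
IMAGE_ID = "<image-uuid>"
FLAVOR_ID = "<flavor-uuid>"
NETWORK_ID = "<network-uuid>"

# Credentials are read from clouds.yaml or environment variables.
conn = openstack.connect(cloud="mycloud")  # "mycloud" is a placeholder name

# Provision compute through the API rather than a management console.
server = conn.compute.create_server(
    name="cart-service-4",
    image_id=IMAGE_ID,
    flavor_id=FLAVOR_ID,
    networks=[{"uuid": NETWORK_ID}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```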

How can you prepare your infrastructure for this? Suppliers of converged and hyper-converged systems would like you to throw away your expensive storage array and replace it with dumb drives that their software will control for you. But that isn’t necessary, says Donnie Berkholz, director of development, DevOps and IT operations at analyst firm 451 Research. You can typically integrate existing supplier-specific infrastructure with these new cloud management systems, he says: “A lot of cloud environments already have plug-ins. OpenStack networking has all different kinds of back ends.”

Automating the creation and deployment of applications like these requires some form of software container that shields them from any idiosyncrasies in the platform.

A microservice may be deployed on a server running a different network driver, Linux distribution or version of Python than the one on which it was developed, but the container shields it from that. “Microservices can be built in any language and any stack, as long as the boundaries can be defined,” says Kamesh Pemmaraju, vice-president of product marketing at Mirantis, which creates software and services to help IT departments using the OpenStack private cloud management system. “They can be thrown into a container, and it is portable.”

Talk about Docker

Docker is what most people mean when they talk about containers. This open source project shares elements of the operating system between different containers, but bundles all the application’s dependencies and libraries in the container itself.

Docker isn’t the only show in town, though. Containers have been around for a while, from Solaris Containers (Zones) to the Linux-based LXC containers on which Docker was originally based. RunC is a container runtime designed to implement a container specification standard created by the Open Container Initiative, while Virtuozzo has its own Linux-based container technology called OpenVZ. And then there’s CoreOS, which has its own rkt container runtime.
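As a sketch of how that portability is driven from code rather than by hand, the snippet below uses the Docker SDK for Python (an assumption, along with a running Docker daemon) to start a container from a hypothetical currency-converter image. The image name and port mapping are placeholders.

```python
import docker  # Docker SDK for Python, assumed to be installed

# Talk to the local Docker daemon.
client = docker.from_env()

# The container carries its own libraries and runtime, so the host's
# distribution or Python version does not matter to the service inside it.
container = client.containers.run(
    "example/currency-converter:1.0",  # placeholder image name
    detach=True,
    ports={"5000/tcp": 8080},          # map the service onto host port 8080
)
print(container.short_id, container.status)
```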

Suppliers have been quick to jump on this. VMware launched its own technology, called vSphere Integrated Containers (VIC), in August 2015, which is designed to let developers connect to virtual container hosts using a Docker command line interface. VMware containers run alongside standard virtual machines.

Microsoft announced support for Docker containers on Linux virtual machines (VMs) from within Azure in June 2014. Since then, it has worked on supporting Docker containers on Windows Server, and also announced its own container technology for the Hyper-V hypervisor, along with a Nano Server minimal footprint installation of Windows Server designed for container use.

Where to start?

It may all sound exciting, but ripping out and replacing existing applications with microservices is not a realistic proposition for anyone, so where should a firm start?

Gently does it, says Sendachi’s Wootton, who recommends an iterative approach. “You just dip your toe in and get used to that lifecycle,” he says. “I would pick off new functions and slowly bring them into microservices. I would never re-architect a whole application for the first time.”

A company with an established inventory management system might steer clear of replacing it straight away, but it might consider implementing other new functions on its website as microservices, such as a customer chatroom or a product recommendation service.

Cut code quickly

Areas where this makes sense are where you need to cut code quickly and innovate rapidly. Mobile apps are a good example, as are customer-facing services that you expect to be used at scale. The legacy batch order processing software that has been doing its job reliably for years may not need the microservices treatment, however.

This practice phase is important, because microservices involve a deep change to the software development and deployment process. The biggest mistake, says Wootton, is companies trying to implement microservices without changing their old ways of working. “You want to move to a DevOps model where people are working more collaboratively,” he adds. DevOps involves a meeting of minds between developers and operations staff, says 451 Research’s Berkholz. “It involves operations staff learning what it looks like to be software developers, but also developers learning what it looks like to do production. You can’t do microservices until you’ve done both of those.”

Work together

This approach enables the two parties to work together in a world where infrastructure is provisioned through software interfaces at a moment’s notice, tailored for the development, testing and production deployment of many tiny applications. In practice, that means operations staff might be checking configuration instructions out of GitHub instead of just writing their own batch files.

And while operations staff may be responsible for the platform that code is running on, the developers become responsible for their own code’s operation. “They don’t get to hand it off any more and be done,” says Berkholz. “They have to say ‘I’m on pager duty, and if my code breaks at 2am, I get woken up’.”

This might give developers and operations staff alike pause for thought. Microservices are not a free lunch. They need sophisticated technical infrastructure, along with a highly mature IT team. Many firms will have their work cut out before they are ready.

This was last published in February 2016


