Cloud holds the promise of changing the way IT supports the business by using hybrid private/public cloud services.
The days of the enterprise application are numbered. Buying enterprise resource planning (ERP), customer relationship management (CRM) or other monolithic apps has not served many organisations well.
Some have found that the only way to get the systems to work is either to go for massive modification (and then suffer the consequences when it comes to upgrading) or to change the way the business operates to fit the application.
The SAPs and Oracles of this world have realised that the future is more componentised. Even they are moving away from the monolithic application.
Cloud changes the way IT supports the business.
In essence, every action within an organisation is part of a process. These processes need to change to reflect market conditions and the needs of the business. Any monolithic application will struggle to meet this overriding requirement – and this is where cloud comes in.
A well-architected cloud platform enables services to be picked up from across a hybrid private/public cloud ecosystem. By the correct use of application programming interfaces (APIs), data can flow across the service boundaries to fulfil the needs of the overall process. As needs change, any one or more of the services can be unplugged and replaced with a different one.
However, there are lots of ‘devil in the detail’ issues here. The first is that not all services will be equivalent. For example, do Google and Bing Maps use the same API? The answer is, unsurprisingly, no.
Ensuring that one service can be unplugged and another plugged in requires knowledge of the APIs in advance – so you will still need a library of ‘glue’ code to allow each service to be used effectively.
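One way to sketch that ‘glue’ library is the classic adapter pattern: the process layer codes against one common interface, and a thin adapter per provider hides each API’s shape. The provider client classes below are hypothetical stand-ins (neither Google’s nor Microsoft’s real SDK), assumed only to illustrate that the two APIs differ:

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Coordinates:
    lat: float
    lon: float


class GeocodingService(Protocol):
    """The common interface the business process codes against."""
    def geocode(self, address: str) -> Coordinates: ...


# Hypothetical stand-ins for the providers' real SDKs, with
# deliberately different call signatures and return shapes.
class GoogleMapsClient:
    def lookup(self, query: str) -> dict:
        return {"lat": 51.5074, "lng": -0.1278}


class BingMapsClient:
    def find_location(self, q: str) -> tuple:
        return (51.5074, -0.1278)


# The 'glue': one adapter per provider, normalising each API.
class GoogleGeocoder:
    def __init__(self, client: GoogleMapsClient):
        self._client = client

    def geocode(self, address: str) -> Coordinates:
        result = self._client.lookup(address)
        return Coordinates(result["lat"], result["lng"])


class BingGeocoder:
    def __init__(self, client: BingMapsClient):
        self._client = client

    def geocode(self, address: str) -> Coordinates:
        lat, lon = self._client.find_location(address)
        return Coordinates(lat, lon)


def process_order(geocoder: GeocodingService, address: str) -> Coordinates:
    # The process only sees the common interface, so the underlying
    # service can be swapped without touching this code.
    return geocoder.geocode(address)
```

Swapping Google for Bing then means constructing a different adapter; the process code itself is untouched.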
This then means that you also need a library of services that people are allowed to use. Someone, or a group of people, will have to ensure that each service is not only capable of fulfilling the needs of the process, but also does so securely and meets performance requirements.
You will also need to ensure that a full audit path is maintained of which services have been used at any time. You may have to prove how a customer was sold to, or offered advice, some time in the future.
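A minimal sketch of such an audit path is an append-only log recording which service (and which version of it) fulfilled each step of a process. All class, field and identifier names here are illustrative assumptions, not any particular product’s API:

```python
import json
from datetime import datetime, timezone


class ServiceAuditLog:
    """Append-only record of which service handled which process step."""

    def __init__(self):
        self.entries = []

    def record(self, process_id: str, step: str, service: str, version: str):
        self.entries.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "process_id": process_id,
            "step": step,
            "service": service,
            "version": version,
        })

    def history(self, process_id: str) -> list:
        """Everything that happened to one process, for later evidence."""
        return [e for e in self.entries if e["process_id"] == process_id]


log = ServiceAuditLog()
log.record("order-1001", "geocode-delivery-address", "BingGeocoder", "2.3")
log.record("order-1001", "quote-shipping", "AcmeShippingQuotes", "1.1")
print(json.dumps(log.history("order-1001"), indent=2))
```

Recording the service version as well as its name matters: proving how a customer was sold to may depend on which revision of a pricing or advice service was live at the time.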
Then there is the question of granularity. Just what is a ‘service’? Looking at how some organisations (and service providers) have implemented their cloud platforms, it is tempting to regard a whole hosted application as a ‘service’. Then there is the whole raft of ‘XaaS’ offerings: storage, disaster recovery, systems management – you name it.
Future based around microservices
From Quocirca’s point of view, the future will be based around microservices. These are still discrete pockets of functionality, but carry out single functions, rather than trying to take on too much. As a process is broken down into its constituent parts, you are left with a set of tasks – and it is surprising how often these tasks appear in other processes.
As a simple example, let’s take a calendar function. You want to order something and have it arrive by a certain day; you need a calendar. You want to book some vacation time; you need a calendar. You want to time-stamp a contract; you need a calendar. By using the same microservice each time, everything is tied together, and there is less chance of confusion or error between functions. Also, should the function need updating, this can be done in one place, and all dependent processes get the update automatically.
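The single-calendar idea can be sketched as one small service holding the working-day logic, with every process calling it rather than keeping its own copy. The `CalendarService` name and its methods are assumptions for illustration:

```python
from datetime import date, timedelta


class CalendarService:
    """One shared calendar microservice used by ordering, holiday
    booking, contract time-stamping and any other dependent process."""

    def __init__(self, holidays=None):
        self.holidays = set(holidays or [])

    def is_working_day(self, d: date) -> bool:
        # Monday-Friday, excluding declared public holidays.
        return d.weekday() < 5 and d not in self.holidays

    def add_working_days(self, start: date, days: int) -> date:
        d = start
        while days > 0:
            d += timedelta(days=1)
            if self.is_working_day(d):
                days -= 1
        return d


cal = CalendarService(holidays=[date(2025, 12, 25)])

# Ordering process: when will a 5-working-day delivery arrive?
print(cal.add_working_days(date(2025, 12, 22), 5))  # → 2025-12-30

# The holiday-booking process calls the same instance, so adding a
# new public holiday here is reflected in every process at once.
```

Because the holiday list lives in one place, the update-once-propagate-everywhere property described above falls out for free.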
Dynamic creation of an overall microservice-based approach to process requires complex orchestration. Here, Quocirca is seeing a range of options coming through that show promise.
In the early days of virtualisation, the move was towards virtual machine (VM) images. These took the required application and stack parts and created an image that could be implemented more easily, although a high degree of manual intervention was still required in many cases. This then moved on to virtual appliances – again, a self-contained stack of bits and pieces that could be implemented (in an easier manner) against a virtual infrastructure.
Enterprise application model
However, these approaches were still pretty much stuck in the enterprise application model. Newer approaches include Docker containers, which offer finer granularity, as well as Parallels Virtuozzo containers. These enable most of the configuration tasks required by a service to be wrapped within the overall image, making implementation far easier.
In most cases, such containers have still been focused at the larger end of the problem – full applications are still being put together with the rest of the required software stack and used to provision an overall capability more rapidly.
The move has to be toward the capability to rapidly identify and pull together smaller functional components on the fly – to be able to create a means of fulfilling the dynamic process through a fully audited and managed composite application.
Suppliers are emerging in this space. StackIQ offers its Boss 5, which enables what it calls wires (a programmatic framework) and pallets (an automated approach to a container-style system) to provide a warehouse-grade automation capability. The system is based on highly granular capabilities – a pallet can be built up from a set of smaller components – and each of these smaller components can be managed separately, with any pallet that is using the component being updated automatically.
Virtustream xStream offers cloud automation and management capabilities using what Virtustream terms its micro-VMs (µVMs). These micro-VMs are pretty much what they appear to be – much smaller functional components that can be brought together, as needed, in an automated manner to fulfil a particular need.
OnApp provides a system aimed mainly at service providers that enables the providers’ functions to be packaged and implemented more easily – and to be maintained flexibly. This approach is becoming more important as the role of a cloud aggregator or cloud broker emerges, where a cloud provider or a master contract manager (such as Cisco or Dell, respectively) needs to be able to pull together multiple offerings from other service providers to meet customers' needs.
For this approach to succeed, it is imperative to have a master orchestrator that enables self-service and fully managed dynamic integration.
The incumbents are not letting the impact of the new suppliers pass unnoticed, however. BMC, CA, Dell, CSC and others have their own offerings that build on their heritage to provide tools that can create, monitor, manage and move packages and workloads around a hybrid cloud, as needed.
IBM, now working far more to a ‘cloud first’ message, is using its SoftLayer acquisition to provide a functional platform for creating and using microservices alongside full APIs to make the integration of private and public cloud services easier to manage.
In some ways, we are harking back to the 1990s and the much-maligned business process re-engineering approach to business. The problem in the 1990s was that it was too difficult to break down the monolithic applications and to plug and play with new functions. Organisations found that all they were really doing was tinkering round the edges.
With a microservices-based approach to facilitating an organisation’s process needs, it becomes far easier to try something out; to see if a change of just one part of a process – a single task – can have a major beneficial impact on the overall outcome of the process.
Another example of harking back is to the use of a service-oriented architecture (SOA). Again, this was a concept that was probably a little too early for the world. Quocirca saw far too many implementations that replaced hard-coded monolithic applications with hard-coded integrations of large service modules – something that was not far enough removed from the original monolithic application itself.
Microservices need to be loosely coupled. The secret will be in how well the APIs work alongside each other, and how well the orchestration tools can use those APIs. By providing a true capability for libraries of microservices to be utilised in a dynamic and open way, organisations may find that the concepts and promises of BPR and SOA can now be fulfilled – as long as they can break away from the mindset constraints of the monolithic application.
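Loose coupling through a service library can be sketched with a simple registry: processes name a capability rather than binding to an implementation, so swapping a service touches only the registry. This is a toy sketch of the idea, not any orchestrator’s real API, and all names in it are assumptions:

```python
class ServiceRegistry:
    """A library of approved microservices, looked up by capability
    name rather than by concrete implementation."""

    def __init__(self):
        self._services = {}

    def register(self, capability: str, fn):
        self._services[capability] = fn

    def swap(self, capability: str, fn):
        # Replacing a service changes only the registry entry;
        # no dependent process needs to be recompiled or edited.
        self._services[capability] = fn

    def call(self, capability: str, *args):
        return self._services[capability](*args)


registry = ServiceRegistry()
registry.register("tax", lambda amount: amount * 0.20)


def invoice_total(registry: ServiceRegistry, net: float) -> float:
    # The process asks for a capability by name; it never binds
    # to a particular tax-calculation implementation.
    return net + registry.call("tax", net)


print(invoice_total(registry, 100.0))               # → 120.0
registry.swap("tax", lambda amount: amount * 0.19)  # tax rule changes
print(invoice_total(registry, 100.0))               # → 119.0
```

This is the plug-and-unplug behaviour described earlier: trying out a change to a single task of a process is one `swap` call, not a re-integration project.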