
What ‘born in the cloud’ means for developers

When it comes to building cloud-native apps, developers must code with an inherent appreciation for where processing, storage and analytics happen

There’s an emotional and technological leap of faith taking place among the software application developer community. The need to embrace the cloud model of service-based computing is pushing developers into a space where they will increasingly need to demonstrate cloud-first architectures and application execution paradigms. The age of cloud native is here.

But the software world has been here before, in a way. Around a decade ago, programmers were compelled to realise their applications had to be mobile first – or at least mobile optimised – to serve the new generation of smartphones and tablets driven by innovations at Apple.

The shift to cloud-native applications is similar in some senses. Developers must declare themselves cloud-first, and build code with a new inherent appreciation for where processing, storage and analytics happen – and in most cases that is not on-premise, or even on virtual servers hosted in the public cloud.

Rather, developers must get used to an entirely new stack of eminently composable and compartmentalised application components based on concepts such as microservices and containerisation. 

In fact, Sumo Logic’s State of modern applications in the cloud report, based on a survey of 1,500 of its customers, found not only that Docker on Amazon Web Services (AWS) is the most popular container technology in production, but also that its installed base grew from 17% in 2016 to 24% in 2017.

Computer Weekly has spoken to a number of industry experts for the CW Developer Network (CWDN) blog, to uncover some of the grittier truths being experienced at the coalface of cloud application development.

Mental and programmatic abstraction

Chris Gray, technical director at Amido, which consults on customer identity, search and cloud services, says cloud native means being serverless: “You can essentially move into a serverless architecture and choose to go with a managed service provider like Azure App Service or AWS Elastic Beanstalk/ECS. By doing so, you remove the overhead of having multiple teams – a team in-house writing code to create the application or solution, another team making sure it operates well – and instead have a number of skilled workers who can develop and operate simply by using platform tools.”
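To make that concrete, here is a minimal sketch of the kind of handler a developer might write for a serverless platform, following the AWS Lambda convention for Python functions. The event fields and response shape are illustrative, not taken from any of the services Gray names.

```python
import json

def handler(event, context):
    # The platform, not an in-house operations team, takes care of servers,
    # scaling, patching and restarts; the developer supplies only this function.
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```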

Coping with monolithic code when taking the containerisation route

Monolithic applications, favoured by traditional enterprises, are not as well suited to containerisation, because they were not built for the considerably different tooling and architecture that microservices require. Containers are far better suited to a microservices environment, where large projects can be broken down into a set of manageable, independent and loosely coupled services.

For some legacy or monolithic applications, the decision to containerise software needs to be considered carefully. 

Containers are valuable when monolithic applications can be split into smaller components which can be distributed across a containerised infrastructure. 

But that is not to say every application will work – care needs to be taken to assess whether it is suitable for the containerised method of deployment.

Source: Chris Gray, technical director at Amido
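As a rough illustration of the kind of carved-out service the boxout describes, the sketch below shows one slice of a hypothetical monolith – stock checking – extracted into its own small service. The service name, URL and JSON shape are invented for the example.

```python
import json
import urllib.request

# Hypothetical stock-checking microservice, deployed in its own container.
STOCK_SERVICE = "http://stock.internal:8002"

def in_stock(sku: str) -> bool:
    # Loose coupling: the caller only depends on the service's HTTP contract,
    # not on its database or implementation, so the stock service can be
    # rebuilt, scaled or redeployed independently.
    with urllib.request.urlopen(f"{STOCK_SERVICE}/stock/{sku}", timeout=2) as resp:
        return json.load(resp)["available"] > 0

if __name__ == "__main__":
    print(in_stock("SKU-123"))
```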

As a result of this shift to managed services, Gray says IT departments no longer need to hire as many specialists to implement the more complicated elements of data processing – such as data pipelines or machine learning algorithms, which are now available through drag-and-drop interfaces – so developers can focus on the insights the data brings, rather than the infrastructure and algorithm build.

But it can be difficult to remain supplier-agnostic when embracing some cloud-native platforms because they work in very different ways and do not offer feature parity. A cloud-native application can be built around microservices to isolate chunks of functionality. 

In Gray’s experience, the path to adopting a microservices architecture does not require businesses to make a wholesale digital transformation as it doesn’t have to be an all or nothing proposition. For instance, he says it is possible for retailers to dip their toes in the microservices world without starting from scratch. “This is likely to be music to businesses’ ears as they look to keep up with, and exceed, their customers’ growing expectations and demands when it comes to online capabilities,” he adds.

The use of microservices tends to go hand in hand with containerisation. Rather than deploy a virtual machine (VM) for every instance of an application, a business can use containers, which are lightweight and use fewer operating system resources than VMs. “This combination is powerful, as it means we can issue an upgrade/change that will take effect almost immediately, without disruption to the general use of the portal,” says Gray.
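One common way to support that kind of disruption-free rollout – a sketch under assumptions, not taken from Gray’s implementation – is a readiness endpoint that an orchestrator such as Kubernetes or ECS can poll during a rolling upgrade, so new containers only receive traffic once they report ready and old ones drain cleanly. The path and port here are illustrative.

```python
from wsgiref.simple_server import make_server

READY = True  # flipped to False while a container is warming up or draining

def application(environ, start_response):
    if environ["PATH_INFO"] == "/healthz":
        # The orchestrator polls this path to decide whether to route traffic.
        status = "200 OK" if READY else "503 Service Unavailable"
        start_response(status, [("Content-Type", "text/plain")])
        return [b"ready" if READY else b"draining"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"portal response"]

if __name__ == "__main__":
    with make_server("", 8080, application) as server:
        server.serve_forever()
```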

Remove server drudgery

Ben Newton, analytics lead at Sumo Logic, a provider of cloud-native machine data analytics, describes how the old world of IT used to comprise servers that administrators gave names to. “In the pre-cloud world, Oracle clusters were lovingly configured and nurtured – hours were spent deciding on network device specifications and hardware configurations,” he says. 

The cloud-native world does not understand such drudgery. “Why would you build a database cluster when you can stand one up in seconds with a simple command – and scale it by the mere power of thought? In the time it took you to ponder that thought, the cloud native has already scaled their database cluster to three continents,” says Newton.
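A sketch of that “simple command”, using the AWS SDK for Python (boto3). It assumes credentials and permissions are already configured, and the identifiers, region and engine choice are illustrative.

```python
import boto3

rds = boto3.client("rds", region_name="eu-west-1")

# One API call stands up a managed database cluster; replicas in other
# regions can then extend it, rather than weeks of hand-built configuration.
rds.create_db_cluster(
    DBClusterIdentifier="orders-cluster",
    Engine="aurora-postgresql",
    MasterUsername="dbadmin",
    MasterUserPassword="change-me-please",
)
```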

Scaling in the cloud-first world can also be taken for granted, he says, as can the location of physical datacentres. “In the olden days, moving to a new datacentre was like building a new town from scratch. It was painful and lasted for years. The cloud native sees no boundaries to their international ambitions.”

Reinvent no more

Eric Sigler, head of DevOps at digital operations management platform PagerDuty, says most businesses he talks to are moving to the cloud.

Sigler notes that cloud-native means developers no longer have to keep reinventing the wheel, and “going cloud native acts as a ‘forcing function’ for how applications are built on top of infrastructure”. By standardising on the behaviour of lower-level components such as compute and networking, he says businesses are effectively telling individual teams working on these smaller, more agile units of software to stop wasting their time on changing everything below the application layer.

This, says Sigler, is different to the approach previously taken with traditional or virtualised application designs, where developers tended to spend lots of time reinventing how they would ship the software. Not only is this a painful process, it is one that does not often result in useful business value, he says.

Sigler urges developers to build a tolerance to failures into their applications to make them more resilient. “If, for instance, the network breaks, it’s assumed that the software had nothing to do with that failure – but that’s not the case,” he says. “Failure happens at all parts and in all aspects of operating a cloud-native infrastructure. In my experience, you ignore this reality at your own peril.”
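In that spirit, here is a minimal sketch of failure tolerance around a single network call – retries with exponential backoff and a timeout, rather than assuming the network always works. The URL, attempt count and delays are illustrative.

```python
import time
import urllib.error
import urllib.request

def fetch_with_retries(url: str, attempts: int = 4) -> bytes:
    for attempt in range(attempts):
        try:
            # Always bound the call with a timeout; never wait forever.
            with urllib.request.urlopen(url, timeout=2) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError):
            if attempt == attempts - 1:
                raise  # surface the failure instead of hiding it
            # Exponential backoff (1s, 2s, 4s ...) avoids hammering a
            # dependency that is already struggling.
            time.sleep(2 ** attempt)

if __name__ == "__main__":
    print(len(fetch_with_retries("https://example.com/")))
```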

Rethink data distribution

One of the architectural considerations IT departments should think about in cloud-native applications is the location of data, says Patrick McFadin, vice-president of developer relations at data platform provider DataStax.

McFadin says cloud scale means dealing with hundreds of thousands or millions of users, all creating data all the time. “Storing data in one place on a relational database can be difficult when it involves sharding data into multiple locations, all of which are filling up rapidly. So a new approach around data is needed to help applications running in the cloud work,” he says.

McFadin urges developers to assess how data can be distributed consistently and look at whether data should be stored in multiple locations, with copies of each record available in these locations. “Without such an approach, it will be difficult to scale out successfully,” says McFadin.
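A toy sketch of that idea: decide deterministically which locations hold a copy of each record, so the same record always lands in the same places and every location carries a share of the load. The region names and replica count here are illustrative.

```python
import hashlib

REGIONS = ["eu-west", "us-east", "ap-south"]
REPLICAS = 2  # each record is stored in two locations

def placements(record_key: str) -> list[str]:
    # Hash the key so a record always maps to the same starting region,
    # then take the next REPLICAS regions round-robin for redundancy.
    start = int(hashlib.sha256(record_key.encode()).hexdigest(), 16) % len(REGIONS)
    return [REGIONS[(start + i) % len(REGIONS)] for i in range(REPLICAS)]

if __name__ == "__main__":
    print(placements("customer:42"))  # the same key always gives the same regions
```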

Nurture a collaborative culture

Beyond the architectural considerations, there are also cultural issues developer teams must take into account. 

“Shifting to a cloud-native approach absolutely changes an organisation’s collaboration among its developer and operational teams,” says PagerDuty’s Sigler. 

“With cloud native comes the move to a consistent, standardised set of primitives a team can use. This is closer to total ownership of that service, which is uncomfortable for some teams.”

These primitives may not necessarily be internal to the company. “You can’t ignore external application dependencies, such as DNS providers or SMTP delivery services,” Sigler warns.

Understand lightweight development

Did things just get easier? Will computing on cloud-native terms be more straightforward? The answer, unfortunately, might be both yes and no. 

Cloud offers (almost) infinitely more flexibility, power and scalability, but it also requires balancing platform and tooling features against new integration demands in an increasingly hybrid world. At the same time, with so much more analytics and processing muscle at the datacentre back end, and with microservices exposing functional, discretely definable chunks of computing power, there’s a lot of heavyweight firepower on tap for those developers entrusted with enough responsibility to use it properly.

Moving into cloud-native territory requires a clear appreciation of the term “lightweight”. With abstracted service layers sitting above any number of precision-engineered, high-performance engines all tuned for specific terrains, keeping the front-end chassis – or in this case, the virtual machine build – truly lightweight will be key to getting cloud native applications running on the open road.

Founded in technology precepts and algorithmic logic, cloud native is also a cultural shift for virtual teams working with virtual tools inside virtual workflows. It’s time to drink from a virtual water cooler.
