Cloud Native Series: Interana on a case of the common cloud

This is a guest post for the Computer Weekly Developer Network, written by Lior Abraham in his role as founder of Interana.

Interana is a specialist in interactive behavioural analytics software for digital businesses — its technology works to process the ‘temporal nature of data’, with tools designed to visualise how data metrics change over time in order to understand complex user behaviour.

As a former Facebook software engineer, Abraham has a strong background in app development.

He is the creator of SCUBA (Facebook’s most popular internal tool for data analysis) and after tailoring Facebook’s infrastructure to accommodate its growth rate, he turned his attention to behavioural analytics on a broader scale. 

Abraham writes as follows…

With each passing month, going cloud native seems to get easier.

But before I go any further, let me give my definition of a cloud-native app.

“A cloud native app is an ‘aware application’ that, whether a backend or user-facing service, is written to be deployed in a virtualised environment rather than on specialised hardware.”

[Looking at the industry we can see that major players like] AWS, Google Compute Engine (GCE) and Azure definitely do their best to make moving to the cloud as simple as possible — minimising the extent to which you [the developer] have to rethink your entire approach to developing software around microservices and containers.

Cloud means optimality

But even so, there are some key things to keep in mind if your organisation is adopting a cloud-native approach: you must be aware of what virtual resources you have and how you can use them optimally.

Sometimes that’s easier said than done, however.

The big public cloud providers often won’t tell you exactly what’s under the hood, so it’s not possible to write software specifically to known quantities of CPU, memory and other variables.

If you’re writing your own cloud-native applications, the more cloud-aware you make them, the better able they are to take advantage of these environments — whether it’s AWS, GCE, Azure or another public cloud stack.
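As an illustration of what “cloud-aware” can mean in practice (my own sketch, not Interana’s implementation), a service might probe the CPU and memory limits its container has actually been given at startup, rather than assuming fixed hardware. The cgroup v2 paths below are Linux conventions and the fallbacks keep the sketch portable:

```python
import os

def detect_cpu_limit():
    """Probe the effective CPU limit, preferring the container's cgroup quota."""
    # cgroup v2 exposes "<quota> <period>" (or "max <period>") in cpu.max (Linux only)
    try:
        with open("/sys/fs/cgroup/cpu.max") as f:
            quota, period = f.read().split()
            if quota != "max":
                return max(1, int(int(quota) / int(period)))
    except (OSError, ValueError):
        pass
    # Fall back to the host's visible CPU count
    return os.cpu_count() or 1

def detect_memory_limit_bytes():
    """Probe the memory ceiling; None means no explicit limit was found."""
    try:
        with open("/sys/fs/cgroup/memory.max") as f:
            raw = f.read().strip()
            return None if raw == "max" else int(raw)
    except (OSError, ValueError):
        return None

print("CPU limit:", detect_cpu_limit())
print("Memory limit (bytes):", detect_memory_limit_bytes())
```

A cloud-aware service would size its worker pools and caches from values like these instead of hard-coding them.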

Workload matters matter

At the same time, you need to know what sorts of workloads you’re going to be moving to the cloud. For example, we all know that providers like AWS can lower the cost of starting things up, so if I have a simple microservice running on 50 instances and demand for that service (let’s say it serves a particular article on a news site) goes up by 500 percent, it’s quite easy just to expand those resources.
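The arithmetic behind that kind of scale-out is simple — a hedged sketch, using the 50-instance figure from the example above. The formula mirrors the one Kubernetes’ Horizontal Pod Autoscaler documents (desired = ceil(current × currentMetric / targetMetric), clamped to bounds); the parameter values here are illustrative:

```python
import math

def desired_replicas(current_replicas, current_metric, target_metric,
                     min_replicas=1, max_replicas=500):
    """Horizontal-scaling rule of thumb: grow replica count in proportion
    to observed load relative to the target load per replica."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    # Clamp to configured bounds so a spike can't scale without limit
    return max(min_replicas, min(max_replicas, desired))

# 50 instances serving the article; observed load is 5x the per-instance target
print(desired_replicas(50, current_metric=500, target_metric=100))  # 250
```

Because each instance is a stateless copy, the answer really is just “run more of them” — which is exactly what breaks down in the analytics case below.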

Certain applications, such as serving a news site or handling marketing form submissions, are fairly simple and as such easy to replicate.

Other cases are not so simple.

Consider analytics, where usually it’s not as straightforward as creating 50 copies of a server.

Workloads in analytics are more complex and interconnected.

In an enterprise use case, such as running some AI or analysis for the business, you don’t know on any given day what the demand for those resources will be. But when demand does spike, all of these servers need to be interacting with each other given the complexity of the task.
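To make that contrast concrete, here is a toy scatter-gather sketch (my own illustration, not Interana’s engine): a behavioural query has to fan out to every shard of event data and merge the partial results, so adding servers adds coordination between them rather than independent copies:

```python
from collections import Counter

# Toy event shards: in a real cluster each list would live on a different server
shards = [
    [("alice", "click"), ("bob", "view")],
    [("alice", "purchase"), ("carol", "click")],
    [("bob", "click"), ("alice", "click")],
]

def partial_count(shard, event_type):
    """Map step: each server counts events in its own slice of the data."""
    return Counter(user for user, evt in shard if evt == event_type)

def query_clicks_per_user(shards):
    """Reduce step: the coordinator must hear from *every* shard before it
    can answer -- this cross-server dependency is why analytics workloads
    don't scale as simple independent copies."""
    total = Counter()
    for shard in shards:
        total += partial_count(shard, "click")
    return dict(total)

print(query_clicks_per_user(shards))
```

Doubling the shard count here doubles the number of partial results the coordinator must wait for and merge, which is quite different from doubling a fleet of stateless web servers.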

Optimising options

On bare metal hardware you can optimise for resources you know quite well; elastic clouds, on the other hand, allow you to optimise in other ways — particularly with distributed, uneven and interconnected workloads.

This is the advantage of cloud native and one that we don’t always fully exploit.

An elastic approach allows you to optimise globally in a way that goes potentially far beyond optimising bare metal for single nodes or consistent workloads.

If you’re a startup and building an application from scratch, realise that even companies you’d expect to be on bare metal — like Apple and others — have already begun a large scale shift to the cloud.

So you should assume that the users of your app will be running an elastic environment, even if it’s Open Compute Project instead of AWS.

As such, you have to think about the programming architecture of the applications you’re developing, because your competitors certainly are. Cloud native may be easier than ever but it also offers opportunities to optimise further than anyone had expected.