From code to cloud: securing your software supply chain starts with cloud observability

This is a guest blogpost by Ash Kulkarni, CEO of Elastic.

As organisations increasingly move their infrastructure, applications, and services to the cloud, managing information, maintaining security, and protecting data integrity all become more challenging.

At no time in history has this been more apparent than during, and now in the aftermath of, a global pandemic. Just a few months into the pandemic, Microsoft CEO Satya Nadella said that the company had seen two years of digital transformation in two months as its customers raced to adopt cloud solutions.

It’s clear this momentum is continuing. Recent survey data from Gartner confirms this trend, noting that 51% of IT spending on infrastructure, software, applications, and services will shift from traditional solutions to the cloud, and 66% of spending on application software will be directed toward cloud technologies in the next three years.

This evolution marks a sea change for the IT function: security and developer teams can no longer remain siloed.

Protect while you observe

For both developers and security pros, having access to the right data at the right time is necessary to make decisions about priorities.

Observability — a term familiar to developers and now coming to light among security practitioners — gives teams an understanding of what code does in production, how it works, how it fails, and how end-users are affected.

Observability data gathered while assessing application and infrastructure performance and availability can double as a key resource for cybersecurity initiatives. For developers, observability data shows the performance of an application and can help diagnose issues within an application. For security analysts, this same data can be analysed to show anomalies that may represent a security threat.
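As a toy illustration of that dual use, even a simple statistical check over a metric stream can surface outliers that matter to both teams: a developer might read a latency spike as a performance regression, while an analyst might read it as probing or abuse. The latency figures below are invented, and a real pipeline would use far more robust detection than a z-score:

```python
from statistics import mean, stdev

def zscore_anomalies(samples, threshold=2.5):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:  # a flat series has no outliers
        return []
    return [x for x in samples if abs(x - mu) / sigma > threshold]

# Hypothetical per-minute request latencies (ms); the spike could signal
# either a performance regression or malicious activity.
latencies = [120, 118, 125, 122, 119, 121, 980, 117, 123]
print(zscore_anomalies(latencies))  # [980]
```

The point is not the algorithm but the data source: the same metric stream answers both a performance question and a security question.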

Making strides towards integrating security with the development, deployment, and monitoring of technology — in essence, protecting while observing — offers mutual benefits to developers, security teams, and the business overall.

So why is this more important now than ever before? Software supply chains.

The rapid shift to the cloud means modern IT applications now run on high-velocity, container-based microservices architectures, which isolate functions to decouple components and speed application development. Developers can ship updates quickly and frequently, but the trade-off is a vast increase in the number of moving parts and a monumental increase in the attack surface of these applications.

The big headlines earlier this year around the Log4j vulnerability shone a spotlight on how deeply third-party software is woven into the systems companies rely on to run their operations.

The vast majority of developers today don’t develop software from the ground up. They rely on third-party resources when creating applications, including internally developed components, pre-built libraries, and open source code, to fast-track development, reduce production costs, and bring products to market faster.

The challenge for developers and security teams is to understand the processes and dependencies to keep their software secure over time.
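One small piece of that challenge is simply knowing whether any pinned dependency matches a known-bad version. The sketch below uses an invented advisory list; real tooling would consult a maintained vulnerability feed such as OSV or a vendor database:

```python
# Hypothetical advisory data; in practice this would come from a
# continuously updated vulnerability feed, not a hard-coded dict.
KNOWN_VULNERABLE = {
    "log4j-core": {"2.14.0", "2.14.1"},
}

def audit(dependencies):
    """Return the (name, version) pairs pinned to a known-vulnerable version."""
    return [
        (name, version)
        for name, version in dependencies.items()
        if version in KNOWN_VULNERABLE.get(name, set())
    ]

deps = {"log4j-core": "2.14.1", "guava": "31.0.1"}
print(audit(deps))  # [('log4j-core', '2.14.1')]
```

Keeping such a check in the build pipeline means the dependency inventory is re-examined every time the software changes, not just at release time.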

Maximising the possibilities of data with DevSecOps

Many organisations are adopting DevSecOps frameworks to address these challenges and integrate monitoring and security tasks into their application development workflows. Whether the context is maintaining system uptime and availability or investigating suspected malicious activity on a network, developers and security teams need to work fast to identify and respond to issues.

Quickly investigating an anomaly requires data that tells a complete story of what happened. Too often, these teams need to piece together the story by manually correlating and analysing metrics, logs, and traces – losing precious time as they struggle to find the root cause and sift through disparate data from multiple tools.
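The manual correlation described above amounts to a join on a shared identifier such as a trace ID. The records and field names below are hypothetical; real observability backends expose this data through their own query APIs rather than in-memory lists:

```python
from collections import defaultdict

# Hypothetical records pulled from separate logging and tracing backends.
logs = [
    {"trace_id": "a1", "level": "ERROR", "msg": "payment declined"},
    {"trace_id": "b2", "level": "INFO", "msg": "user login"},
]
spans = [
    {"trace_id": "a1", "service": "checkout", "duration_ms": 840},
    {"trace_id": "a1", "service": "payments", "duration_ms": 790},
    {"trace_id": "b2", "service": "auth", "duration_ms": 35},
]

def correlate(logs, spans):
    """Group log lines and trace spans by their shared trace ID."""
    story = defaultdict(lambda: {"logs": [], "spans": []})
    for entry in logs:
        story[entry["trace_id"]]["logs"].append(entry["msg"])
    for span in spans:
        story[span["trace_id"]]["spans"].append(span["service"])
    return dict(story)

print(correlate(logs, spans)["a1"])
# {'logs': ['payment declined'], 'spans': ['checkout', 'payments']}
```

When the platform performs this join automatically, both the developer chasing a failed request and the analyst chasing an anomaly start from the same assembled story instead of rebuilding it by hand.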

The ideal state for both teams is automatic correlation and advanced analytics, easy to access from a common source, whether that is a single operational store for a developer or a security information and event management (SIEM) system for a security analyst.

Imagine the potential benefits if these teams and processes were more collaborative.

Observability data could add more context for security teams as they work to quickly detect and respond to threats. At the same time, developers could reduce friction in development by securing applications from the start.

Breaking down silos and simplifying workflows across developer and security teams can help these professionals, whose work depends on speed, meet their own objectives and those of the business.

The development and continued uptime of secure, reliable technology ensure that an organisation can continue to serve its customers. At the same time, securing IT can help prevent a data breach and all the challenges that entails, from the compromise of valuable assets to potential damage to a company’s reputation.

Creating a virtuous cycle

If the pandemic has taught us one thing, it’s that resilience requires adaptation. This starts with understanding the key questions and challenges an organisation wants to address and identifying the insights needed to solve them.

Whether a company is born in the cloud or managing the migration of legacy systems, creating a real-time continuous feedback loop forged by observability and underpinned by security provides a foundation for IT leaders to address cloud complexity before it stifles innovation.
