Insurance company Generali is decommissioning its datacentres. It is moving to the cloud and will use Amazon Web Services (AWS) and Microsoft Azure partners to host its non-core IT systems.
Generali aims to develop and maintain in-house the software that differentiates its business. These applications will be hosted on Azure.
Like many organisations, Generali is adopting a multicloud strategy for its IT infrastructure. But this approach requires IT decision-makers to have the right information in place to understand where best to run a given workload.
“I am seeing movement in the industry to simplify workload management,” says Yanna Winter, CIO at Generali. This, for Winter, is about deciding where to place the pieces on the chess board that represents Generali’s IT infrastructure.
“My approach is to analyse the value chain, retain IT where we have a competitive advantage,” she says. For everything else, the company will use an outsourcer or software-as-a-service (SaaS) provider, says Winter.
Such conversations are commonplace in IT. IT departments now look to run multiple public clouds alongside private clouds and on-premise systems. Businesses want to reduce the risk of buying everything from one public cloud provider – they don’t want to be “locked in”. They may have applications that need to remain on-premise, and some workloads can take advantage of functionality that only one public cloud offers. They also want to run workloads cost-effectively.
This means that deciding where best to place a workload on the virtual chessboard that represents hybrid, multicloud IT becomes increasingly complex. Even where a company has just one cloud provider, monitoring workloads can help to prevent budget overspend.
For example, Dominic Maidment, technology architect at Total Gas & Power, says the company used a tool called Beam from Nutanix to identify AWS costs. “We had a significantly underutilised set of AWS resources left over from a professional services engagement,” he says. “Using the Beam reports as a guide, we deactivated those resources and reduced the monthly spend by roughly 50%.”
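Beam’s internals are proprietary, but the underlying approach – flag resources whose sustained utilisation falls below a threshold so they can be reviewed and deactivated – can be sketched in a few lines. The resource names, utilisation figures and threshold below are purely illustrative:

```python
# Sketch: flag cloud resources whose average utilisation falls below a
# threshold, as a cost tool such as Nutanix Beam might. All resource
# names and figures here are hypothetical examples, not real data.

def flag_underutilised(resources, threshold=0.10):
    """Return resources whose average utilisation is below the threshold."""
    return [r for r in resources if r["avg_utilisation"] < threshold]

def monthly_saving(flagged):
    """Estimated monthly saving if every flagged resource is deactivated."""
    return sum(r["monthly_cost"] for r in flagged)

# A toy usage report, loosely modelled on the leftover professional
# services resources described above.
usage_report = [
    {"name": "vm-prof-services-01", "avg_utilisation": 0.03, "monthly_cost": 450.0},
    {"name": "vm-prof-services-02", "avg_utilisation": 0.05, "monthly_cost": 450.0},
    {"name": "vm-billing-api",      "avg_utilisation": 0.72, "monthly_cost": 900.0},
]

flagged = flag_underutilised(usage_report)
total = sum(r["monthly_cost"] for r in usage_report)
print(f"Deactivating {len(flagged)} resources cuts spend by "
      f"{monthly_saving(flagged) / total:.0%}")
```

In this toy example, deactivating the two idle virtual machines halves the monthly bill – the same order of saving Maidment describes, though a real tool would draw utilisation data from provider metrics rather than a hand-built list.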
According to Nick McQuire, vice-president, enterprise research at CCS Insight, the market for hybrid and multicloud management and orchestration tools will accelerate over the next few years. “We are in the early phase,” he says.
Looking at what the public cloud providers offer in terms of a control plane for managing workloads, McQuire says Azure focuses on hybrid to edge and on-premise workload management. “Google Cloud has been a latecomer in the enterprise and is going full bore with cloud management,” he adds.
For the moment, says McQuire, the focus is on orchestration and control, security and governance. He says this is a reflection of where IT organisations are in terms of how they are using multiple public clouds. “There is a need to understand the economic impact of moving workloads around,” he says.
“Not only do you have a need to understand the performance of different IT environments, whether to deploy on-premise, in a private cloud or use one of the three public clouds, there is also a requirement to understand the economics associated with those decisions.”
It is now not uncommon for IT decision-makers to standardise on one public cloud for specialist workloads such as artificial intelligence (AI), and use another for infrastructure as a service (IaaS). McQuire adds: “Two years ago, companies started running machine learning workloads with a single cloud provider. But in the last year, over half of the companies that have AI in production are taking a formalised multicloud approach, to bring in innovation with advanced technologies.”
Cloud management platforms
Tools that support intelligent multicloud management come from a number of different areas, including traditional IT asset management, expense management and data lifecycle management.
In January 2020, Gartner published its Magic Quadrant report on cloud management platforms. It covers the tools that manage multicloud services and resources and offer governance, lifecycle management, brokering and automation for managed cloud infrastructure resources.
Gartner’s assessment looked at functionality, including: provisioning and orchestration; service request; inventory and classification; monitoring and analytics; and cost management and workload optimisation.
What is interesting from Gartner’s analysis is that a number of major software companies have acquired cloud management technologies to bolster their product portfolios. But there are several much smaller, lesser-known companies that appear in the Magic Quadrant for cloud management platforms.
Flexera acquired RightScale in 2018, and in 2019, Snow Software acquired Embotics. Both Snow and Flexera are known for their software asset management tools, and the acquisitions gave both companies a foothold in cloud operations management.
According to Gartner, Embotics gives Snow Software a tool that can automatically deploy workloads to ideal cloud environments. Similarly, Gartner’s Magic Quadrant report notes that RightScale’s Optima provides Flexera customers with a cost management and resource optimisation product that can be used standalone or integrated with other cloud management functions.
VMware is the other major software company in Gartner’s Magic Quadrant for cloud management platforms. In 2018, it acquired CloudHealth for cost optimisation. This built on VMware’s 2017 acquisition of Wavefront, which provides a real-time metrics monitoring and observability platform.
As Computer Weekly has previously reported, VMware also provides vRealize Operations, which it describes as AI-based support for IT admins who need help managing applications and workloads across private, hybrid and multicloud environments.
Scalr and Morpheus Data are two lesser-known companies recognised by Gartner as leaders in its Magic Quadrant for cloud management platforms. Scalr is marked down by Gartner for having less developed plans to expand outside North America, while Morpheus Data loses points because of a lack of brand recognition compared with more established rivals.
Other options for workload management
As well as cloud management platforms, some third-party monitoring tools, typically distributed as SaaS products, are able to manage workloads running in the public cloud. For instance, AppDynamics, Dynatrace and New Relic provide application monitoring for the public cloud.
Assuming that the trajectory for cloud-native architectures, containerisation and microservices continues to climb, IT departments will find it increasingly difficult to grasp all the different components that make up their corporate IT infrastructure. In the Forrester report Unify application portfolio management and cloud tagging: hybrid systems require common management, principal analysts Charles Betz and George Lawrie discuss how IT departments can tag IT assets to improve visibility.
Whether it is a cloud service, an application, a virtual machine or a container, using a descriptive tag can help IT administrators quickly identify who owns a workload, and its role. This information can drive automation of certain IT admin tasks.
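As a minimal illustration of how tags can drive such automation, the sketch below groups assets by an “owner” tag and surfaces any untagged assets for follow-up. The tag keys and asset identifiers are hypothetical, not drawn from any particular cloud provider:

```python
# Sketch: use descriptive tags to identify workload owners and to flag
# untagged assets. Tag keys and asset IDs are hypothetical examples.

def assets_by_owner(assets):
    """Group asset IDs by their 'owner' tag; untagged assets go under None."""
    grouped = {}
    for asset in assets:
        owner = asset.get("tags", {}).get("owner")
        grouped.setdefault(owner, []).append(asset["id"])
    return grouped

inventory = [
    {"id": "vm-001",        "tags": {"owner": "finance",  "role": "reporting"}},
    {"id": "container-api", "tags": {"owner": "platform", "role": "gateway"}},
    {"id": "vm-legacy",     "tags": {}},  # no owner tag: needs follow-up
]

grouped = assets_by_owner(inventory)
print("Untagged assets to chase up:", grouped.get(None, []))
```

An automation task – shutting down a team’s development machines out of hours, say – could then iterate over one owner’s bucket rather than the whole estate.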
As IT infrastructure grows in complexity, businesses are likely to find that no single product will provide all the data they need for intelligent workload management. Increasingly, data will need to be pulled out of different reporting tools. This information would then drive systems that automate the migration and redeployment of IT workloads.
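No common schema exists across such reporting tools, but the consolidation step might look something like the sketch below, which sums per-provider cost exports into a single view keyed by workload. The field names and figures are invented for illustration:

```python
# Sketch: merge cost reports from different providers into one view per
# workload. All provider names, field names and figures are hypothetical.

def consolidate(reports):
    """Sum monthly cost per workload across all provider reports."""
    totals = {}
    for provider, rows in reports.items():
        for row in rows:
            workload = row["workload"]
            totals[workload] = totals.get(workload, 0.0) + row["cost"]
    return totals

reports = {
    "aws":   [{"workload": "claims-app", "cost": 1200.0}],
    "azure": [{"workload": "claims-app", "cost": 300.0},
              {"workload": "policy-api", "cost": 800.0}],
}

print(consolidate(reports))
```

A per-workload total like this is the kind of signal that could feed an automated placement decision, though in practice each provider’s billing export would first need mapping onto the common fields.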
Rob Tribe, senior SE director at Nutanix, says: “Automation is the only realistic way of delivering a service to the scale at which many organisations are now operating workloads in the public cloud.”
Case study: How Total Gas & Power avoids costly errors
Total Gas & Power has been using Nutanix Beam to gain insight into expenditure, security and governance in the public cloud. Dominic Maidment, technology architect at Total Gas & Power, says: “The first steps were about knowing what we have and quantifying that in terms of provisioned services, knowing where it is and then laterally how much that costs and its position relative to our security posture.”
In Maidment’s experience, public cloud usage can be hard to visualise. He says: “The amount of detail which operational teams need to know is difficult to absorb and see in context.”
This complexity is increased as organisations buy services from multiple cloud providers. Maidment describes this challenge as “fog of war”, where IT admins can only drill down on details on the part of the IT estate they are currently looking at, which can lead to costly errors, both financially and reputationally.
In regulated environments, says Maidment, confidence around public cloud usage is crucial: businesses must be able to prove that the IT environment meets a standard and that operating environments can pass an audit. IT decision-makers also need to assess whether the infrastructure they select is the most economical place to deploy a workload. Non-functional requirements, which are commonly overlooked, should also be included in any assessment.
“You may trust all the third parties that are working on your cloud estate, but errors occur, actors change and the vision doesn’t necessarily match the integrity of the end result, so technical assurance remains of paramount importance for delivery projects in cloud environments,” says Maidment.
For Maidment, a tool like Nutanix Beam effectively removes the “fog of war” – the parts of the map in a strategy game which are obfuscated from the viewer until they are explored. He says: “The ability to bring all this information together in an abstracted, visual format with the ability to report and tune into higher levels of fidelity is highly desirable and has helped us eliminate waste, maintain our security posture and plan ahead for project resource consumption.”