Enterprises are increasingly adopting Red Hat’s OpenShift Virtualization, driven by Broadcom’s VMware licensing changes and a desire for a stable, AI-ready platform, company executives revealed at the Red Hat Summit this week.
Red Hat is seeing greater adoption of OpenShift Virtualization as enterprises look for alternative infrastructure platforms, company executives said at the Red Hat Summit in Boston this week. The open source giant is positioning its virtualisation offering as a key component of its broader hybrid cloud and artificial intelligence (AI)-ready platform.
“We’ve seen incredible progress in OpenShift and OpenShift Virtualization and the customer adoption we’ve seen in that area,” said Red Hat CEO Matt Hicks during a briefing for Asia-Pacific media. He noted that these platforms, augmented by Ansible automation, are setting customers up to pursue AI initiatives.
The growing customer interest in OpenShift Virtualization has been driven by recent moves by Broadcom, particularly its VMware licensing changes and price hikes. “Once the Broadcom impact happened in the market, there was a lot of uncertainty among customers, and obviously concerns around price increases,” said Ashesh Badani, Red Hat’s chief product officer, noting that these concerns have spurred customers to seek alternatives.
Badani said that as customers adopt OpenShift Virtualization, they are looking for capabilities similar to those they had in other virtualisation environments. He added that Red Hat has been enhancing OpenShift Virtualization to meet these requirements, with the latest version of the platform offering improved networking support, storage live migration and a more administrator-centric view.
“Customers are also used to having choices for storage, backup and disaster recovery, so we’ve been working closely to make sure we’ve got first-class integration with those technologies,” he added.
Red Hat executives talked up several customer successes, including Ford and Emirates NBD, one of the top banks in the United Arab Emirates, to underscore the platform’s maturity and the benefits to users. For example, Badani noted that customers that moved from other virtualisation platforms to OpenShift Virtualization running on AMD chips can expect improvements in total cost of ownership of as much as 77%.
Andrew Brown, Red Hat’s senior vice-president and chief revenue officer, noted a shift in customers’ strategies as they adopt OpenShift Virtualization. “A year ago, customers were saying, ‘I’m going to move at all costs’,” he said. “Now, they’re taking a learned approach and saying, ‘I’m moving, but I’m going to do specific workloads that I can modernise, and I’m going to pick a platform that I’m going to stick with for a significant period of time’.”
Brown noted the trust customers have placed in Red Hat, stating: “They see that there is a trusted brand in Red Hat that gives them confidence that we can scale as they scale, and with a platform that they can have a trusted, multi-year tenure with.”
The OpenShift platform is critical to Red Hat as it expands its AI offerings. “The OpenShift platform is the same one that we build our OpenShift AI on top of,” said Badani. “So, if you’ve made this leap and say, ‘I’m going to combine my worlds of virtual machines and containers on OpenShift’, we’re going to give you a solid advantage for the next step in your AI journey.”
Red Hat made two key announcements during the summit to bolster its AI capabilities. The first is vLLM (virtual large language model), an open source inference server that speeds up the output of generative AI applications by making better use of graphics processing unit (GPU) memory. “You can think of that as running a model on a single machine and being able to ask it questions and get answers at a really low unit price,” said Hicks.
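To make that idea concrete, the short sketch below shows the kind of single-machine serving Hicks described, using vLLM’s offline Python API; the model name and prompt are illustrative assumptions rather than anything Red Hat cited.

    # Minimal vLLM sketch: load a model on one machine and generate an answer.
    # The model below is an illustrative assumption, not one named by Red Hat.
    from vllm import LLM, SamplingParams

    llm = LLM(model="facebook/opt-125m")  # any locally runnable Hugging Face model
    params = SamplingParams(temperature=0.8, max_tokens=64)

    outputs = llm.generate(["What is OpenShift Virtualization?"], params)
    print(outputs[0].outputs[0].text)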
Hicks also pointed to llm-d, a new open source project that can “take a single instance of an LLM and run it across a cluster”.
“The power of that is that it allows our customers to run their own models on any accelerator and in any environment that they want,” said Hicks, likening it to the company’s previous progression from Linux to Kubernetes. “This is now the LLM to llm-d for us in AI.”
Read more about open source in APAC
SUSE CEO Dirk-Peter van Leeuwen warns against suppliers diluting open source to lock in customers, and touts the company’s commitment to providing choice and support across multiple Linux and Kubernetes distributions.
Elastic’s chief product officer, Ken Exner, talks up the company’s expansion into observability and security and how it balances innovation with community contributions and monetisation.
Nutanix’s APAC CTO outlines the lessons learned by enterprises adopting Kubernetes, from avoiding cloud lock-in to mastering lifecycle management, and bridging the gap between legacy and cloud-native skillsets.