With growing adoption of containers, companies can no longer rely on proprietary software to manage the application microservices powering critical business transactions, according to Red Hat CEO Jim Whitehurst.
“There are also hurdles to be cleared that we don’t even know about yet that will rise up as container adoption becomes even more mainstream,” Whitehurst wrote in a blog post.
He said while containers open up limitless opportunities for applications to interact with each other, they also create problems in maintaining and supporting such dynamic environments.
“If you have four million microservices talking to and updating each other on a rapid-fire basis, how do you monitor those interactions and perform application performance management? How do you diagnose issues when something goes wrong? In short, it will require a fundamental rethinking of all the technology and functions involved in running an application portfolio.”
To address these issues, Whitehurst said Red Hat is investing in the infrastructure behind the data and application control centre of the future, “because we recognise how the adaptability of open source can play a critical role in this regard”.
“We also continue to take leadership positions in open-source communities to help drive the roadmaps that will help solve these problems and pain points organisations will soon face,” he added.
Red Hat unveiled OpenShift Application Runtimes at the Red Hat Summit in Boston. It not only enables developers to build container-based applications using microservices, but also helps them transition from delivering monolithic Java applications to creating microservices in Java.
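To illustrate the monolith-to-microservices shift described above, here is a minimal sketch of a single-purpose Java service using only the JDK's built-in HTTP server. This is a hypothetical example for illustration, not OpenShift Application Runtimes' actual API; the class name, endpoint and port are assumptions.

```java
// Minimal sketch of a container-friendly Java microservice, built on the
// JDK's com.sun.net.httpserver package -- no external frameworks required.
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class GreetingService {
    public static void main(String[] args) throws Exception {
        // Read the port from the environment, a common pattern in
        // containerised deployments; default to 8080.
        int port = Integer.parseInt(System.getenv().getOrDefault("PORT", "8080"));
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);

        // One narrowly scoped endpoint: the microservice ideal of doing a
        // single job, in contrast to a monolith bundling many functions.
        server.createContext("/greet", exchange -> {
            byte[] body = "{\"message\": \"hello\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```

Because the service is self-contained and configured via environment variables, it can be packaged into a container image and scaled independently of any other service.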
To assuage concerns about the security and performance of containers, Red Hat also introduced the industry’s first Container Health Index, which inspects and grades all of Red Hat’s container products as well as those from certified software suppliers.
Red Hat said it will certify 20 software products, including those from F5 Networks and Cloudbees, in the next 90 days.
Read more about container technology
- Risk aversion could slow the growth of container technology in Asia, but a dual-mode operating environment comprising production and beta testing might help.
- The promise of portability has many IT shops experimenting with containers. But enterprises have yet to deploy complex containers, and migration remains difficult.
- Docker containers can help secure cloud applications, but malicious traffic can still move to and from those containers.
- Docker’s service-centric approach to container-based virtualisation takes virtualisation technology to a whole new level, but still has some serious drawbacks.
Experts have pointed out that organisations in Asia tend to take a more cautious approach towards containers and microservices, in a region weighed down by legacy systems and competing business priorities.
“In the context of Asia, swapping legacy systems for new technology requires significant effort, and the business – not just the IT department – must be willing to prioritise this with other potential initiatives that help drive business expansion or growth,” CIO Academy Asia’s Glen Francis told Computer Weekly in February 2016.
That said, he noted that CIOs in the region want to be agile – a key benefit of containers and microservices – when managing business expectations, although the technology may not be well understood by IT leaders.
In April 2016, Red Hat announced that it was launching open innovation labs around the world – including one in Singapore – to help enterprises develop and integrate applications using microservices, deploy them in containers, and deliver them using DevOps methodologies across physical, cloud and mobile environments that can quickly scale up or down on demand.
Read our in-depth history of containers to learn how the technology has evolved over time