Alibaba: Containers will power the future of AI

This is a guest post for the Computer Weekly Developer Network written by Mark Yi in his position as Alibaba Cloud principal engineer, director of engineering for container service.

Yi writes in full as follows…

While it’s well known that cloud computing and artificial intelligence (AI) are fundamentally changing how business is done, less attention has been given to the role of containers in fuelling this trend. As a portable, self-contained way to deploy applications, containers are well suited to meet the needs of businesses looking to get the most out of emerging innovations.

Container technology isn’t new, but it’s finding novel value in the era of cloud computing and AI.

Originally introduced in the 1970s as a sandbox tool to help developers test their work without disrupting other services, container technology is now one of the most widely used in software development, driven by record-high demand for better application development capabilities and rising expectations around computing efficiency.

Prioritising portability

Containerisation owes much of its popularity to the rise of microservice architectures, which require developers to prioritise application portability, efficient resource utilisation, cost savings and standardised deployment processes in their work.

Because containers excel at breaking large applications into smaller services, they enable developers to update specific parts without having to overhaul the entire system. This means that if a particular service fails, it won’t necessarily stop the broader application it supports from running.

Pair this benefit with developers’ ability to “write once, run anywhere” and containers become especially well suited to support the digital transformation ambitions of today’s businesses.

In addition, the emergence of serverless containers is further simplifying the hardware side of application development. As enterprise applications grow in complexity, this method of deployment decouples workloads from the underlying hardware, allowing developers to focus on building applications without worrying about server or device configurations.

Another aspect of enhanced efficiency lies in resource optimisation.

The packaging of applications and their environments into portable images enables enterprises to scale up usage as needed while saving on overhead costs. Moreover, during cloud migrations, which can otherwise be slow and costly, containerisation helps encapsulate and deploy legacy applications in new cloud environments.
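The packaging idea can be sketched with a minimal Dockerfile; the base image, file names and start command here are illustrative assumptions, not a specific product’s configuration:

```dockerfile
# Illustrative sketch: package a small Python service and its
# dependencies into one portable image.
FROM python:3.12-slim

WORKDIR /app

# Install pinned dependencies first, so this layer is cached and
# only rebuilt when requirements.txt changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself.
COPY . .

# The resulting image runs unchanged on a laptop, an on-premises
# server or any cloud with a container runtime.
CMD ["python", "app.py"]
```

Built once (for example with `docker build -t myapp .`), the same image can be pushed to a registry and run anywhere a container runtime is available, which is what makes lift-and-shift migration of legacy applications practical.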

Containers… and AI

The era of AI is opening up new applications for container technology.

Examples abound, with containers being used to power everything from autonomous driving vehicles to digital medical diagnosis to e-commerce recommendations.

Given that the machine learning and training models that underlie these applications rely on specific libraries, frameworks and software across different environments, standardised containers are ideal for addressing incompatibility and version control issues. They also give enterprises a high level of scalability while managing costs as training parameters expand. And they make debugging easier by isolating issues, streamlining the path from application development to deployment.
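The scalability point can be sketched as a Kubernetes Deployment for a model-serving container; the image name, replica count and resource figures below are illustrative assumptions (GPU scheduling via the `nvidia.com/gpu` resource also assumes the NVIDIA device plugin is installed in the cluster):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server            # hypothetical inference service
spec:
  replicas: 3                   # scale out by raising this number
  selector:
    matchLabels:
      app: model-server
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
      - name: inference
        image: registry.example.com/model-server:1.0   # illustrative image
        resources:
          requests:
            cpu: "2"
            memory: 4Gi
          limits:
            nvidia.com/gpu: 1   # one GPU per replica
```

Because the model’s libraries and framework versions are frozen inside the image, every replica runs an identical environment, which is what makes horizontal scaling predictable and debugging reproducible.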

These benefits can be seen in the rapid increase in the use of this technology.

Already, almost 50% of AI deployments utilise containers, with Gartner predicting this to rise to 75% by 2027.

On top of this, AI is also being used to improve the efficiency and performance of containers – a true win-win. For example, cloud service providers are offering AI container images with built-in hardware acceleration libraries, AI runtimes and AI frameworks to meet deployment requirements in various scenarios.

Matching expectations

New capabilities often come with higher expectations and this is certainly the case for containers. Developers are looking for seamless, scalable, cost-effective and secure solutions that enable them to work quickly and efficiently.

Serverless delivery models, as one example, allow users to spend less time on underlying node and cluster management, freeing their attention for application development.

Another issue is standardisation, which can be complex when applying containers to AI workloads. Kubernetes’s steep learning curve and intricate operations remain a bottleneck for many enterprises and developers interested in using containers. Innovations such as the Alibaba Cloud Container Compute Service (ACS) have sought to overcome this by incorporating cloud computing as an underlying layer in Kubernetes software. This allows businesses to move beyond mere container orchestration and experience a new paradigm in container computing.

Finally, using container computing technology for AI workloads enhances the agility, scalability and cost-efficiency of each deployment while simplifying operational management. This aligns well with DevOps best practices and cloud-native architecture, further reinforcing containers as an ideal option for deploying and managing complex AI systems.

Uncontained growth

Enterprises and developers everywhere are seeing the value of containers, which explains the rise in the adoption of this technology.

One example is Moka, an HR management Software-as-a-Service company in China, which relies on the maintenance-free nature of ACS clusters to focus on core business priorities rather than infrastructure management. Additionally, the on-demand scaling and pay-per-use model gives Moka the flexibility it needs, especially during peak recruitment seasons.

ACS was designed to enhance ease of use, elasticity and cost-efficiency for typical enterprise-level workloads such as web applications, CI/CD pipelines, big data computing tasks and AI inference. It also supports diverse scenarios for enterprises across various sectors, including internet, gaming, retail, automotive, transportation and manufacturing.

It’s safe to say that the market for container technology is only going to expand, especially given the explosive demand for AI applications. For those on the front lines of application development, it’s certainly an exciting time to be building.