Singtel, Nvidia to help scale enterprise AI deployments

Singtel and Nvidia have teamed up on a multimillion-dollar facility to help organisations scale enterprise AI deployments, tackle extreme datacentre power densities, and prepare for the era of embodied AI

Singtel has teamed up with Nvidia to launch an artificial intelligence (AI) centre of excellence (CoE) to help organisations overcome the infrastructure and skills bottlenecks that stand in the way of scaling up AI deployments.

Announced today, the multimillion-dollar facility is expected to provide a deployment pathway for organisations struggling to move their AI initiatives beyond the experimental phase into full-scale production.

Bill Chang, CEO of Singtel Digital InfraCo, the telco’s digital infrastructure business, said unlike other AI CoEs in Singapore, Singtel’s CoE is focused on applied AI, where enterprises bring real-world problem statements and collaborate with an ecosystem of large language model (LLM) makers, application providers, and systems integrators.

To help customers scale up their AI deployments, Singtel has also architected the CoE to serve as a testbed that mirrors its commercial infrastructure, which Chang likened to a national power grid comprising AI datacentres acting as generators, fixed networks as transmission lines, and edge locations as substations.

“Think about this centre of excellence for applied AI as a micro AI grid,” said Chang. “Not only do you experiment and solve bottlenecks, but when you go for full-scale deployment, you seamlessly flip over to the main AI grid and get the resources.”

Marc Hamilton, senior vice-president of solutions architecture and engineering at Nvidia, described the partnership as providing the five-layer foundation for AI deployments first mooted by Nvidia CEO Jensen Huang.

The foundation comprises physical land, power, and datacentre facilities provided by Singtel’s Nxera datacentre arm, followed by Nvidia’s graphics processing units (GPUs). The third layer is the broader AI infrastructure, including networking and cloud orchestration, followed by AI models, and finally, the applications.

Building AI applications

Hamilton said Nvidia plans to tap into its network of 40,000 AI startups to help Singtel’s customers build AI applications. He also stressed the importance of open models, such as Nvidia’s Nemotron, for sovereign AI, ensuring that “Singapore’s data and unique competitive advantage stays in Singapore, and controlled by Singapore companies”.

In addition, Nvidia will work with Singtel to prepare datacentres for extreme power densities without running afoul of strict new sustainability metrics.

Singtel’s Nxera datacentres currently operate Nvidia GB200 Blackwell systems running at roughly 200 kW per rack. However, Chang noted that the CoE is already preparing for the 2027–2029 deployment of Nvidia’s next-generation Rubin Ultra chips.

“The next generation I’m talking about is 600 kW to one megawatt per rack,” he said. “That’s insane in terms of power density – 60 to 100 times more than the average datacentre today.”

Hamilton pointed out that this extreme density requires less datacentre real estate. Because Blackwell chips are 50 times more energy-efficient at running AI models than the previous Hopper generation, massive compute power requires vastly less physical space. “The GPU datacentre of today, versus many football fields, is much more like a basketball field,” said Hamilton.

The efficiency of AI datacentres will be critical as Singapore prepares to tighten regulations on the sector. Later this year, the government is expected to table the new Digital Infrastructure Act in parliament.

The proposed legislation seeks to establish baseline energy efficiency requirements for all datacentres, including new and existing facilities, as well as mandatory cyber security measures and incident reporting requirements for major cloud service providers and datacentres to ensure economic resilience.

Beyond helping enterprises deploy and scale AI applications in AI datacentres, the CoE will also focus on edge AI and low-latency 5G networks as the industry moves beyond generative AI into embodied and physical AI, such as autonomous robotics, humanoids, and drone swarms.

The CoE will be located at the Punggol Digital District, which is currently being transformed into a precinct-scale testbed to help companies trial and commercialise real-world robotic applications in live operational environments.

Chang said the CoE is expected to open its doors in about three months. In the interim, large customers in sectors such as government, healthcare, banking, and transportation are already defining use cases and tapping into Singtel’s existing GPU reserves to kickstart their AI journeys.
