They need us to buy their debt-fuelled AI dream
So it seems Nvidia boss Jensen Huang is telling financial analysts that datacentre spending will increase 10-fold to 4 trillion within four years. And those who heard his keynote at last year's GTC event will have seen a presentation in which he claimed the most efficient way to process tokens for AI is to use the company's most powerful (and most expensive) computer.
Industry commentators have even coined the phrase “Huang’s Law” to describe the phenomenal 10-fold increases in compute power being delivered, in what seems like a perverse attempt to entice AI developers to create ever more powerful models. Yes, this is the field of dreams that exists in the minds of the hardware makers: “If we build it, they will come up with an AI model that pushes our hardware even harder.”
At least in 1965, when Intel co-founder Gordon Moore spoke about the power of integrated circuits doubling every 18 months to two years, the industry had a degree of certainty: the performance of existing software would improve. The latest relational database application simply benefited from the extra processing power that became available. And while providers of such software eventually built in the hardware optimisations new processors offered, they did not expect every database application, and the hardware it ran on, to be upgraded to the latest systems.
Yet, just a few weeks ago, during the company’s AI Tour London event, Microsoft CEO Satya Nadella spoke about the performance gains the newest systems provide in terms of AI acceleration.
Look at the GB200 Grace Blackwell high-end AI accelerator, which Nvidia describes as a “superchip” offering a unified memory architecture designed for large language models. Judging by Nadella’s remarks, such hardware is giving AI developers the power to do more. “There’s an unbelievable renaissance happening with these systems and workloads, whether they’re training workloads or inference workloads, they are unlike anything we’ve seen in the past,” he said.
First in line for these systems are the hyperscalers, who are trying to convince their shareholders that AI offers a massive growth opportunity for the business. The financial markets are responding to the massive capital expenditure needed to fund AI-heavy datacentre builds, which is fuelling a surge in the debt these firms are taking on. Bloomberg has likened the situation to the dotcom bubble. And we all know how that ended for the tech businesses that bought into the latest craze.
Hyperscalers’ business models are predicated on customers paying more for the AI capabilities of these new datacentres. Their CEOs each want us to believe we need those capabilities, but I’m not sure we really do.
