Photonics - WeAreDevelopers: Dawn of a major new computational power, or just faster plumbing?

This is a guest post for the Computer Weekly Developer Network written by WeAreDevelopers CEO, Sead Ahmetovic.

Ahmetovic writes in full as follows…

There is a popular narrative that frames photonics as a successor to silicon.

The short story: Moore’s Law is stalling, GPUs hit an energy wall, light-based computing takes over.

It sounds clear, but it does not match what is actually happening.

GPU realities

Current GPU architectures deliver dramatic efficiency gains with every generation, and silicon is not dying, at least not any time soon.

But here is what I find telling: the same companies building those increasingly powerful chips are simultaneously investing heavily in photonic interconnect for their datacentre architecture – Nvidia included.

When the dominant GPU maker treats photonics as essential infrastructure, the “real compute vs. plumbing” debate kind of answers itself. This is not photonics versus electronics. It is both converging into a single systems architecture.

A jaunt with Jensen

I spent last week at Nvidia GTC in San Jose… and one thing was obvious across dozens of conversations: energy is the constraint everyone is working around. Not compute power, not model architecture – energy.

Chips keep getting faster and more efficient, but AI compute demand is growing even faster than those efficiency gains. You can build the most powerful processor in the world. If you cannot move data between thousands of those processors fast enough and at sustainable energy levels, the system bottlenecks. That is a physics problem, not an engineering problem. It is exactly the problem photonics solves well: light moves data at higher bandwidth, lower latency and a fraction of the energy cost of electrical interconnect.

At the scale AI infrastructure operates today, that gap is not marginal and it changes the economic game.
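To make that gap concrete, here is a back-of-envelope sketch in Python. The picojoule-per-bit figures are illustrative assumptions, chosen only to show the order of magnitude at stake, not vendor measurements, and `interconnect_power_watts` is a hypothetical helper:

```python
# Back-of-envelope: aggregate interconnect power at cluster scale.
# All pJ/bit figures below are illustrative assumptions, not vendor data.

def interconnect_power_watts(num_gpus: int,
                             tbps_per_gpu: float,
                             picojoules_per_bit: float) -> float:
    """Estimate total interconnect power for a cluster.

    num_gpus           -- accelerators moving data concurrently
    tbps_per_gpu       -- sustained off-chip bandwidth per accelerator, in Tb/s
    picojoules_per_bit -- energy cost of moving one bit across the link
    """
    bits_per_second = num_gpus * tbps_per_gpu * 1e12
    return bits_per_second * picojoules_per_bit * 1e-12  # pJ/s -> W

# 10,000 GPUs, each sustaining 10 Tb/s of off-chip traffic:
electrical = interconnect_power_watts(10_000, 10, 5.0)  # assume ~5 pJ/bit
optical = interconnect_power_watts(10_000, 10, 1.0)     # assume ~1 pJ/bit

print(f"electrical: {electrical / 1e6:.1f} MW")  # 0.5 MW
print(f"optical:    {optical / 1e6:.1f} MW")     # 0.1 MW
```

Even with these rough numbers, interconnect alone consumes hundreds of kilowatts continuously, and the absolute saving scales linearly with cluster size and bandwidth, which is why the gap matters economically at datacentre scale.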

On the compute side, performing actual operations with light, we are at an even earlier stage.

The path to coexistence


Photonic processors that run real AI workloads exist and are beginning to ship, but they still lack optical memory, require conversion between light and electricity at certain stages and are not replacing GPU clusters anytime soon. The more realistic path is coexistence: photonic acceleration for specific operations alongside conventional processors, similar to how GPUs themselves became a co-processor tier next to CPUs over the past decades.

For the enterprise software stack, most of this will stay invisible. And frankly, that is the point. If photonic interconnect and acceleration get integrated at the infrastructure level by cloud providers, enterprises benefit through lower cost and better energy economics without touching their applications. The technology disappears into the platform. That is actually a strength and not a weakness. Aren’t seamless infrastructure shifts what we actually want?

Looked at pragmatically, photonics is already inside the roadmaps of the companies that define enterprise compute. The question worth asking is not whether photonics matters, but whether we stop treating it as a standalone category and recognise that high-performance computing is going hybrid.

The future needs to be both light and electrons, each doing what it does best.

WeAreDevelopers is a global platform and community for software developers that hosts a developer congress, providing expert-led training, networking and career opportunities for technology engineering talent.