Computer simulation at work for the future of nuclear fusion

Nuclear fusion works – just not yet well enough. Learn how software simulations running on modern supercomputers, together with data science, are lighting up possible paths forward

“A sun of our own and it’s made in Britain!” crowed the headline. The UK Atomic Energy Authority (UKAEA) believed its 120-ton experimental reactor Zeta was almost certainly generating neutrons from fusion, the nuclear reaction that powers the sun.

It claimed it could soon start generating electricity safely and cheaply from deuterium extracted from seawater, producing the heat of 10 tons of coal at a cost of just two shillings. “In 20 years, with luck, Zeta will herald in the beginning of a social millennium for mankind,” said the Daily Sketch – on 25 January 1958.

Humanity’s luck was not in. Zeta was not generating the hoped-for reaction and, despite much research progress, the advent of fusion – as opposed to fission used in existing nuclear power stations and weapons – as a commercially viable power source has remained two or three decades in the future ever since. But there are reasons to hope it is getting closer, with computing playing an essential role.

Fusion works – just not yet well enough. In December 2021, the Joint European Torus (Jet), run by UKAEA at Culham in Oxfordshire since 1983 on behalf of a group of European countries, sustained 11MW (megawatts) of fusion power for five seconds, releasing 59 megajoules of energy – more than double its own world record from 1997. At the heart of Jet is a tokamak, a doughnut-shaped vacuum chamber where extreme heat and pressure turn hydrogen into plasma, which can start a fusion reaction that throws out heat.

Jet’s record-breaking run required more energy to get the reaction started than it produced, but it was built for experimentation and gathering data – a role in which computing’s importance has greatly increased during its four decades of operation. Rob Akers, head of advanced computing at UKAEA, likens it to the two Voyager probes launched by Nasa in 1977, which after numerous close encounters with planets in the 1970s and 1980s are still sending data back from interstellar space.

“It doesn’t really matter that the instrument is not shiny and new, it’s still producing data that has huge, unique value,” says Akers. “Modern data science techniques allow us to extract more value than we could have done 10, 20 or 30 years ago.”

Jet’s recent experiments support design work for Iter, a much larger experimental tokamak under construction in southern France. Funded by 35 countries, Iter aims to be the first fusion device to generate more power than it uses, with operations phased in from 2025 to 2035. The UK government is an Iter partner, but is also funding UKAEA to develop and build Step, a prototype tokamak fusion power station planned to open by 2040 at the site of the soon-to-be-closed West Burton coal-fired power station in Nottinghamshire.

Computing is central to these efforts. “We can’t deliver commercial fusion without computing,” says Akers.

In the early days of fusion, small-scale physical experiments helped to establish how it could work, but the route to viability has involved a process of test-based design, constructing a series of increasingly large tokamak reactors.

“We can’t just keep building lots of them and seeing what works,” says Akers. “The only way we can progress now is through huge amounts of simulation using modern supercomputers and through huge amounts of data science.”

This includes the development of so-called digital twins of fusion systems, as well as the use of artificial intelligence and natural language processing – for example, to mine valuable data and information from decades of research held in big data repositories and in text-based formats such as journal papers.

Simulations have a number of applications. In both plasma and materials science, ensemble runs – simulating the same processes many times with different variables – can be used for uncertainty quantification, which shows how accurately a reactor’s behaviour can be predicted and helps to decide which physical prototypes to build. This can help in working out how to prolong the life of the steel used within a fusion reactor so it lasts for decades, something essential for making fusion economically viable.
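
To make the idea concrete, here is a minimal sketch of an ensemble run in Python. The surrogate_model function, its scaling and the sampled input ranges are all invented for illustration – a real ensemble member would call a full plasma or materials simulation, not a one-line proxy.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def surrogate_model(temperature_keV, density, confinement_s):
    # Hypothetical stand-in for an expensive reactor simulation: a toy
    # power proxy that rises with temperature, density and confinement.
    # A real ensemble member would run the full simulation code instead.
    return density**2 * temperature_keV**2 * confinement_s * 1e-42

# Each ensemble member draws plausible inputs around a nominal operating
# point (the nominal values and spreads here are assumptions).
n_runs = 10_000
temps = rng.normal(15.0, 1.5, n_runs)   # ion temperature, keV
dens = rng.normal(1e20, 5e18, n_runs)   # particle density, per cubic metre
taus = rng.normal(3.0, 0.5, n_runs)     # energy confinement time, seconds

powers = surrogate_model(temps, dens, taus)

# The spread of the ensemble is the uncertainty quantification: it shows
# how far a single prediction can be trusted.
lo, mid, hi = np.percentile(powers, [5, 50, 95])
print(f"median prediction {mid:.2f} (arbitrary units), 90% interval [{lo:.2f}, {hi:.2f}]")
```

A narrow interval suggests the model can stand in for a physical test; a wide one flags where a prototype is still needed.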

For plasma physics, simulation can help improve performance, such as through managing turbulence, something Akers describes as “an exascale problem” requiring the very fastest supercomputers – those that can carry out a quintillion (10¹⁸) operations a second – to handle the calculations required.

Simulations should also help with the overall design of what amounts to a massively complex piece of engineering, one that has to cope with radiation and electromagnetic fields as well as enormous structural forces.

“If you change one small part of the design, one subsystem, it has the potential for knock-on effects,” says Akers. “Indeed, managing the complexity of the design is itself a computing challenge that’s never been tackled at the scale needed to engineer a fusion power plant.”

Work needs to be done to democratise this effort, making it easier for engineers to run simulations on supercomputers, he says, adding: “At the moment, there are also too many human beings in the loop – automation in engineering design is urgently needed for compliance and repeatability.”

Simulations can also support operational work within nuclear environments, both fusion and fission. UKAEA’s Remote Applications in Challenging Environments (Race) facility, which opened on the Culham site in 2014, built a remote handling system with haptic feedback for 12-metre booms equipped with cameras and tools inside Jet’s vacuum vessel. This lets operators carry out rehearsals, then see what they are doing from virtually generated viewpoints during actual operations, using Epic Games’ Unreal Engine, a 3D graphics package originally designed for gaming.

Race hosts a robotics test facility for Iter and is supporting decommissioning at Fukushima in Japan, a nuclear station which leaked radiation over a wide area after being flooded by a tsunami in 2011, as well as at the Sellafield nuclear site in Cumbria.

Matthew Goodliffe, a control system engineer, says simulations meant that those working on Jet expected it would break its record. “They had modelled it so much that they basically knew it was going to do it,” he says. “It gives you such confidence we are heading in the right direction.”

Open source model

To support its work, UKAEA uses some of the world’s largest supercomputers administered by other organisations such as the UK’s research councils, as well as Microsoft’s Azure cloud service and its own on-site clusters. Although it uses some commercial software, most of its software tools are open source, with programming languages such as Python and C++ being ubiquitous.

Akers says collaborating on software development is the only way to cope with the increasing scale of data from fusion work, with Iter eventually expected to generate as much data daily as Jet has produced in 40 years. “It’s effectively the Linux model, that’s the way science is done – we leverage an army of contributors around the world,” he adds.

Data screens from Jet’s 100,000th pulse

This includes power companies: EDF manages its own “simulation stack” of software, provided in a similar fashion to open source to encourage its use and development. Things may change as fusion technology is commercialised and intellectual property becomes increasingly important, but Akers adds: “There is still sufficient work needs to be done that we’ll be open source for some time to come. We’ve got to be.”

First Light Fusion, a company spun out of the University of Oxford, is developing an approach that fires projectiles at immense speed into carefully designed targets to produce the heat required for fusion reactions.

The company also takes a different approach to computing from UKAEA, keeping its hardware – including a recently acquired high-performance Dell cluster with 10,368 cores – on-site and offline to safeguard the intellectual property of its work, including its target designs. And although it makes use of open source software, including the OpenVDB library developed by film studio DreamWorks to animate water and clouds, First Light has custom-written some 400,000 lines of code.

Nathan Joiner, head of numerical physics at First Light Fusion, describes computing as “absolutely critical” to its work. “It underpins the current aim of the company and will also be part of the core business,” he says.

The value of computing is clear given the scale of First Light’s equipment for physical experiments. Its base on an industrial park a few miles north of Oxford is home to the Big Friendly Gun, a £1.1m, 22m-long tubular launcher that can accelerate a 38mm diameter projectile to 6.5km per second (14,500mph). In November 2021, this was successfully used to generate a small fusion reaction, a result later validated by UKAEA.

Nearby, a 16-metre by 16-metre enclosure houses Machine 3, a £3.6m 40-tonne hexagonal array of 192 capacitors which channel pulses of power lasting a couple of microseconds into a central vacuum chamber. This electromagnetically accelerates a projectile towards a nearby target at speeds of up to 20km per second (45,000mph).

Vital as it is, using such equipment is expensive and time-consuming: Machine 3 can only run a test every other day, given the need to clean and check components and re-establish a vacuum. Joiner adds that there are limits to the information that can be collected from high-energy events that in some cases take place in nanoseconds across nanometres. For each physical experiment, First Light may carry out as many as 10,000 or 20,000 virtual ones.

“Simulation models allow you to probe in and understand in much more depth what’s going on in these systems,” says Joiner, although physical work is needed to validate them.

The company carries out extensive verification work on its simulation software, with regression tests to check the impact of code changes, cross-platform tests to ensure the same answers come out regardless of the hardware used, and the “method of manufactured solutions”, where a problem is constructed so the code’s correct output is known in advance.
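
The method of manufactured solutions is a standard verification technique and is easy to illustrate. In the Python sketch below – an invented toy example, not First Light code – a solution u(x) = sin(πx) is manufactured for a 1D Poisson problem, the source term that makes it exact is derived by hand, and a simple solver is checked for the second-order convergence it should deliver.

```python
import numpy as np

def solve_poisson(n):
    # Solve -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0 using
    # second-order central differences on n interior points.
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)   # source manufactured so u = sin(pi x) is exact
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f)

def mms_error(n):
    # Max-norm error against the known manufactured solution.
    x, u_num = solve_poisson(n)
    return np.max(np.abs(u_num - np.sin(np.pi * x)))

# Halving the grid spacing should cut the error by ~4x for a second-order
# scheme (n = 49 and n = 99 give h = 1/50 and h = 1/100).
e_coarse, e_fine = mms_error(49), mms_error(99)
order = np.log2(e_coarse / e_fine)
print(f"coarse error {e_coarse:.2e}, fine error {e_fine:.2e}, observed order {order:.2f}")
assert 1.8 < order < 2.2, "solver is not converging at the expected rate"
```

Because the exact answer is known in advance, any drop in the observed convergence order immediately exposes a bug introduced by a code change or a new platform.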

“Testing, getting the tests set up and getting them working, probably takes the majority of our time, compared with actual algorithm implementation and development,” says Joiner.

First Light uses two separate simulation codes. The first, Hytrac, works in two dimensions and focuses on material interfaces to test designs for targets. The second, B2, works in three dimensions and was originally developed to model electromagnetically launched projectiles, but has been rebuilt to model targets as well.

“It’s useful that we can get a second opinion from in-house code,” says Joiner, particularly for unusual cases. If the two codes produce different results it can indicate a problem, but if they agree, “that’s a huge tick of confidence”.
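As a loose illustration of that second-opinion check – using an invented toy model, nothing like Hytrac or B2 – the sketch below compares a closed-form result against an independent numerical implementation and flags any disagreement beyond a tight tolerance.

```python
import numpy as np

def code_a(v0, drag, t):
    # "Code A": closed-form velocity of a projectile under linear drag, v' = -drag * v.
    return v0 * np.exp(-drag * t)

def code_b(v0, drag, t, steps=10_000):
    # "Code B": an independent numerical solution of the same model,
    # using classic fourth-order Runge-Kutta time-stepping.
    dt = t / steps
    v = v0
    f = lambda v: -drag * v
    for _ in range(steps):
        k1 = f(v)
        k2 = f(v + 0.5 * dt * k1)
        k3 = f(v + 0.5 * dt * k2)
        k4 = f(v + dt * k3)
        v += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return v

# Illustrative numbers only: launch speed in m/s, drag coefficient, time in s.
v0, drag, t = 6500.0, 0.8, 2.0
a, b = code_a(v0, drag, t), code_b(v0, drag, t)
rel_diff = abs(a - b) / abs(a)
print(f"code A: {a:.6f} m/s, code B: {b:.6f} m/s, relative difference {rel_diff:.2e}")
assert rel_diff < 1e-8, "codes disagree beyond tolerance - investigate"
```

Agreement does not prove either code is right, but because the two implementations share no numerics, a disagreement almost always points to a real defect.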

So how much longer does the world have to wait? In August 2021, the US National Ignition Facility at the Lawrence Livermore National Laboratory in California focused lasers onto a small target, generating fusion energy at a level approaching “ignition”, the point at which a nuclear fusion reaction becomes self-sustaining. Joiner says that although First Light uses projectiles, not lasers, it works along similar lines: “We saw that as a proof of concept.”

UKAEA’s Akers says he hopes fusion will be providing energy commercially in the 2050s, with rising energy prices and an increasing focus on decarbonisation leading to more funding. “If we invest properly, we’ve got a good shot at that,” he says. “It has sometimes been said that fusion will be ready when it is needed.”

As well as supporting work to develop tokamaks and other technologies such as projectiles and lasers, computing may breathe new life into techniques that were attempted but not fully explored decades ago. Meanwhile, the digital technologies required, including engineering simulation, will benefit other industries, including those that also contribute towards the global aim of halting the rise in atmospheric greenhouse gases.

“Fusion is a high-risk but massively high-gain endeavour. I think as a species we’ve somehow lost our ambition to do the big things,” says Akers, pointing to the moon shots of the 1960s.

Modern life depends on abundant, reliable, cheap energy, and many see cutting greenhouse gas emissions as preventing people from leading the lives to which they are accustomed. Instead, he says, people should look to technology to help create a cleaner, greener world and drive economic growth: “That’s where computing is absolutely essential, because ultimately net zero is a ‘system-of-systems’ problem that can only be delivered using 21st century data science and supercomputing.”
