High-performance computing drives high-performance F1 cars to success

Jennifer Scott investigates how Formula 1 team Marussia uses a supercomputer to design and test its cars on a budget

At the Marussia Formula 1 team base in Banbury, high-performance computing is helping the team compete on a budget. Oxfordshire and the surrounding area have been synonymous with racing for some time – numerous high-profile car companies have been ploughing money into Petrol Valley for years and you can even find the cast of Top Gear loitering behind motorway service stations.

But now a new kid has come to town. Marussia is a Formula 1 team born of Richard Branson’s move into the sport in 2009. Following the purchase of a controlling stake by Marussia Motors in 2010, it has raced under its new name throughout the 2012 season and, despite its strong Russian ties, has its headquarters in Banbury.

F1 on a budget

Technology plays a massive role in any F1 team, but for Marussia it means a lot more. The team has what is believed to be the largest CFD (computational fluid dynamics) array in Europe – a supercomputer capable of running complex algorithms that model the flow of air over a race car’s body, effectively simulating a wind tunnel in a computer, to improve aerodynamics. Aerodynamics has the biggest impact on a race car’s performance, which means the supercomputer plays a critical role in the team’s success on the track.

Computer Weekly was invited to Marussia headquarters for a tour of the site and to speak to F1 legend Pat Symonds, now technical consultant for Marussia, and previously Michael Schumacher’s race engineer and the executive director of engineering for Renault, on how technology helps Marussia approach F1 in a different way. 

“When we entered [the championship], we agreed to operate throughout a season for £30m,” said Symonds. “It costs £100m more than that for a medium-level team, and a top-level team would be working on budgets the wrong side of £150m.

“My role was about what we have to do different [to compete]. We needed to mimic existing teams in some ways, while being innovative in other ways,” he said.

The first step was to find a partner. The rules of Formula 1 racing mean teams cannot share or sell exact designs to one another, but they can reveal their processes to help set others on the right path.

Marussia signed a deal with F1 champion McLaren before the 2012 season kicked off to try to emulate the team’s success on a lower budget.    

“We needed to get the heritage of IP [intellectual property] in a mature F1 team and we needed to do that quickly,” said Symonds. “We couldn’t go to McLaren and ask for the IP of a car, but we could ask it to help us with the IP of the process.”

However, with a small budget, Symonds and the team at Marussia realised they could not just throw staff at the process and see what came out. The team is made up of roughly 150 people, compared with rivals which have between 500 and 600.

On the IT side, Marussia uses services firm CSC to provide IT expertise, both at the Banbury head office and trackside.

Wind tunnels in silicon

“With F1 today, the majority of the performance comes from aerodynamics,” said Symonds.

“It is probably true to say other things come into the mix – engines, drivers, chassis – but generally speaking, nothing contributes more than aerodynamics.”

Most teams work out the best designs and validate them in wind tunnels. The largest teams have their own wind tunnels at their headquarters, while others spend millions renting space at one.

With just £30m on the table, Marussia did not have the luxury of renting wind tunnels for 60 hours every eight weeks, as allotted by the FIA (Fédération Internationale de l’Automobile), the governing body of F1 racing.

The team decided to invest the majority of its money in CFD. This works as a digital wind tunnel, with software simulating airflow over a digital model of the car. It enables engineers to see any weak points in the design and tweak the car accordingly, without having to put it into production or spend huge amounts on rent.
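To give a flavour of what a digital wind tunnel does, the sketch below solves an idealised (inviscid, incompressible) flow over a crude block sitting on the floor of a 2D channel – nothing like the Navier-Stokes solvers and multi-million-cell meshes a real F1 CFD run would use. The grid, obstacle and numbers are illustrative assumptions only.

```python
# Toy "digital wind tunnel": inviscid flow over a bluff body on the floor of a
# 2D channel, solved for the stream function with a Jacobi iteration. Purely a
# conceptual illustration of testing a shape in software rather than in a
# physical wind tunnel.
import numpy as np

NX, NY, U, ITERS = 200, 80, 1.0, 8000
y = np.arange(NY).reshape(-1, 1)
psi = np.tile(U * y, (1, NX))            # uniform free stream: psi = U * y

body = np.zeros((NY, NX), dtype=bool)    # crude "car" block sitting on the floor
body[0:20, 80:120] = True
psi[body] = 0.0                          # no flow through the body

for _ in range(ITERS):
    new = psi.copy()
    # Jacobi update of Laplace's equation on interior cells
    new[1:-1, 1:-1] = 0.25 * (psi[1:-1, :-2] + psi[1:-1, 2:] +
                              psi[:-2, 1:-1] + psi[2:, 1:-1])
    new[body] = 0.0                      # hold the body boundary fixed
    psi = new                            # outer walls keep their free-stream values

u = np.gradient(psi, axis=0)             # horizontal velocity = d(psi)/dy
print("Peak speed over the body: %.2f x free stream" % (u.max() / U))
```

A production CFD run turns the same idea into full pressure and force maps over the entire car geometry, which is where the enormous computing demand comes from.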

Named after the Swahili word for wardrobe, the team’s Kabati supercomputer for CFD has four clusters with just over 600 servers. These provide 72 teraflops of power – the equivalent of 72 trillion floating-point calculations per second.

The supercomputer runs the equivalent of 6,500 cores, with an internal InfiniBand network and 130TB of usable high-performance storage. However, F1 regulations prevent the team from exceeding 40 teraflops at any one time. It is the 10th largest supercomputer in the UK, the 230th largest in the world and far more powerful than that of any rival F1 team.

“If you want to have a creative team, the computing needs to be on tap like a utility,” said Ian McKay, HPC services manager at CSC. “It is the equivalent of computing power for 90,000 iPhone 4Ss. We do big data [and] big problems, and we try to do it fast.”
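As a back-of-envelope check on the figures above – the totals come from the article, and the iPhone 4S number is simply what CSC’s comparison implies rather than a measured benchmark:

```python
# Sanity-check the published figures: 72 teraflops across roughly 600 servers
# and the "equivalent of 6,500 cores", set against CSC's 90,000-iPhone claim.
total_flops = 72e12            # 72 teraflops = 72 trillion calculations/sec
servers, cores = 600, 6500
iphones = 90_000               # CSC's comparison figure

print(f"Per server: {total_flops / servers / 1e9:.0f} gigaflops")
print(f"Per core:   {total_flops / cores / 1e9:.1f} gigaflops")
print(f"Implied iPhone 4S: {total_flops / iphones / 1e9:.1f} gigaflops")
```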

All this power, plus a partnership with a technology services firm, may sound expensive, but it still makes the testing phase for the car’s aerodynamics significantly cheaper than its rivals’, allowing Marussia to save money while fielding a hugely competitive car.

“It is not just cheaper, but also quicker,” said Symonds. “Rather than a seven-week run in a wind tunnel, with CFD… it is 18 hours for a normal run.”

Trackside technology

CFD is a very significant factor, but Marussia still has to do some physical testing, and CSC has also helped with what it calls the “correlation” phase.

Once the CFD allowance has been used – 40 teraflops averaged over the eight-week period – there is time to do a small amount of wind tunnel and track testing. The correlation team then combines these results, accounting for any factors that only show up in physical testing, and relays the findings to the design team.
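The article does not describe the correlation process itself, but as a purely hypothetical illustration of the idea, the snippet below compares CFD predictions against wind-tunnel measurements and derives correction factors for the design team. The metric names and values are invented.

```python
# Hypothetical sketch of the correlation step: compare CFD predictions with the
# limited wind-tunnel measurements and turn the differences into correction
# factors. The metric names and numbers are invented, not Marussia data.
cfd_prediction = {"downforce_N": 12400.0, "drag_N": 3100.0}
wind_tunnel    = {"downforce_N": 11950.0, "drag_N": 3180.0}

correction = {k: wind_tunnel[k] / cfd_prediction[k] for k in cfd_prediction}

for metric, factor in correction.items():
    print(f"{metric}: scale CFD by {factor:.3f} "
          f"({(factor - 1) * 100:+.1f}%) to match the tunnel")
```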

The final role CSC plays is at trackside. We met two of its employees who scored their dream jobs when they were asked to travel alongside the Marussia team to each race to set up the IT environment needed in the garage.

“It may sound easy, trundling off to the races, going racing and then coming back and drinking champagne, but it is not quite like that,” said Ian Jackson, trackside IT engineer for Marussia. “[Members of the team] want to work in the same way as if they were in [headquarters]. They want their systems to be transparent, wherever they are – they just need the data and analysis tools to work.”

Each F1 car has hundreds of sensors embedded to relay information to the team in the garage, both to figure out ways to improve the race immediately and to help with design features later on.

As such, Jackson and a number of other CSC employees need to set up a sophisticated local area network that is connected to the huge supercomputer at headquarters and able to withstand the elements, with races moving from sandstorms to snow every two weeks.
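The article gives no detail of how that trackside network is built; the following is just a hypothetical sketch of the general idea – car sensors relaying readings across the garage LAN to the engineers’ tools. Real F1 telemetry runs on standardised ECUs and proprietary software, and every address, port and field name below is invented.

```python
# Hypothetical trackside telemetry sketch: a car-side process sends sensor
# readings over the garage LAN as JSON datagrams and a garage-side listener
# decodes them for the engineers. Addresses, ports and fields are invented.
import json
import socket
import time

GARAGE_ADDR = ("192.168.10.5", 9000)   # assumed garage server on the local network

def send_sample(sock: socket.socket, lap: int, speed_kph: float, brake_temp_c: float) -> None:
    """Pack one sensor sample as JSON and send it to the garage listener."""
    sample = {"t": time.time(), "lap": lap,
              "speed_kph": speed_kph, "brake_temp_c": brake_temp_c}
    sock.sendto(json.dumps(sample).encode("utf-8"), GARAGE_ADDR)

def garage_listener(port: int = 9000) -> None:
    """Receive datagrams and print each decoded sample for the race engineers."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        while True:
            payload, _addr = sock.recvfrom(4096)
            sample = json.loads(payload)
            print(f"Lap {sample['lap']}: {sample['speed_kph']:.0f} km/h, "
                  f"brakes {sample['brake_temp_c']:.0f} C")

# Car side, for example:
#   with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
#       send_sample(s, lap=12, speed_kph=287.4, brake_temp_c=640.0)
```

UDP is used here purely for the sketch, on the assumption that a live stream tolerates the odd dropped sample better than it tolerates delay.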

“We need 100% availability,” said Symonds. “You can’t ask Bernie Ecclestone to hold a race for five minutes because a server has gone down. We need excellent communication, data coming off cars, and people being able to look at it no matter where the race is. This is what we have with CSC and our supercomputer.”

Race to the finish

So, if it is the back-end technology, and specifically the CFD, that is driving the success of the Marussia F1 team, will all the other constructors follow suit?

“I think it will all go that way,” Symonds told Computer Weekly. “We have a parallel in vehicle dynamics where we used to do all our experimentation of vehicle handling on the track. It is very expensive and has many variables so in the vehicle dynamics domain, we have moved nearly totally over to simulation. I think that is the trend we will see in aerodynamics.”

The head of Marussia has told the team he wants to be on the podium for the inaugural Russian Grand Prix in 2014. It may be a lot to ask, but with the technology on offer, we cannot wait to watch the race and find out.
