Feature

Case Study: How the Met Office supercomputer offsets its carbon footprint

When the Met Office's latest £30m supercomputer was switched on in May, it drew a fair amount of criticism, and a cloud of suspicion has hung over the organisation's Exeter-based headquarters ever since. Isn't it odd, critics asked, that an instrument for combating climate change has such a huge carbon footprint?

Despite the achievements of the UK Met Office, the nation has been characteristically eager to rain on its parade. Much was made of the fact that the new supercomputer draws 1.2 megawatts of power - enough to supply a small town.

Yet for that power draw the machine delivers more computing muscle than 100,000 PCs - in computing terms, a bargain, as the rough comparison below suggests. The economies of scale achievable mean that the carbon footprint of the Met Office computer is much lower than a town's worth of desktop machines.
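As a back-of-envelope illustration (the per-desktop wattage here is an assumption made purely for the sake of the comparison, not a figure from the Met Office):

```python
# Rough comparison: one 1.2MW supercomputer vs a fleet of 100,000 desktop PCs.
# The 1.2MW figure comes from the article; the 150W per-PC draw is an
# assumed typical value, used purely for illustration.

SUPERCOMPUTER_MW = 1.2
PC_COUNT = 100_000
WATTS_PER_PC = 150  # hypothetical average desktop draw

pc_fleet_mw = PC_COUNT * WATTS_PER_PC / 1_000_000
print(f"100,000 PCs:   ~{pc_fleet_mw:.1f} MW")   # ~15 MW
print(f"Supercomputer:  {SUPERCOMPUTER_MW:.1f} MW")
print(f"The PC fleet draws ~{pc_fleet_mw / SUPERCOMPUTER_MW:.0f} times the power")
```

On those assumed numbers, a town's worth of desktops would draw more than ten times the supercomputer's power for the same nominal computing capacity.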

And yet the public finds it difficult to understand why an organisation that is leading the fight against climate change is itself producing 12,000 tonnes of carbon dioxide every year. Perhaps it would help if they knew how many tonnes of carbon dioxide the Met Office's 400 scientists are helping to prevent from being created as a result of their heightened understanding of global weather conditions.

"The more computing power we have, the better predictions we can make," says Steve Foreman, the Met Office's chief technology officer.

We can only save the planet from climate change when we understand it completely - and our full comprehension of the innumerable variables that determine our weather patterns is a long way from being realised. But the more detailed the Met Office's study of the planet's constantly shifting elements becomes, the more accurate our understanding will be.

The Met Office's systems analysts have divided the planet's atmosphere into boxes 20 kilometres along each side; from the surface to the top of the atmosphere, the model stacks 70 of these boxes.

Within each of these boxes, all the variables that make up our weather - pressure, the mix of gases, the motion of the air - are measured. In total, there are 300,000 of these points in each horizontal layer of the globe. Data is collated from hundreds of weather stations, balloons, satellites and atmospheric observations around the world, as the sketch below suggests.
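A minimal sketch of what that grid implies for the model's bookkeeping (the counts are the article's; the variable names are illustrative, not the Met Office's):

```python
# Scale of the grid, using the figures quoted in the article:
# 300,000 points per horizontal layer, 70 vertical layers.

POINTS_PER_LAYER = 300_000
VERTICAL_LAYERS = 70

total_boxes = POINTS_PER_LAYER * VERTICAL_LAYERS
print(f"Grid boxes in the model: {total_boxes:,}")  # 21,000,000

# Each box carries a small set of state variables - for example:
box = {
    "pressure_hpa": 1013.2,   # atmospheric pressure
    "temperature_k": 288.0,   # temperature
    "humidity_pct": 60.0,     # relative humidity
    "wind_u_ms": 5.0,         # west-east wind component
    "wind_v_ms": -2.0,        # south-north wind component
}
```

Twenty-one million boxes, each updated many times per forecast, is what turns a weather model into a supercomputing problem.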

Tracking weather variations is a task that requires 1,000 billion calculations every second, the results of which are fed to the Met Office's 400 scientists. In this context, 12,000 tonnes of CO2 a year shared among 400 scientists works out at 30 tonnes per scientist - which, given the scope and importance of the work they do, could seem a reasonable price to pay if we are to understand the threat of climate change.

The scale of the Met Office's achievements tends to be overshadowed by the fact that it has yet to achieve perfection. Why, the Daily Mail asked, does the Met Office still get the weather forecasts wrong?

If the Met Office is to achieve 100% accuracy for every square kilometre on the planet, it must step up the thoroughness of its research. This will inevitably call for more calculations, as the earth's atmosphere is studied in even greater detail than before - which, in turn, will call for more powerful computing platforms.

Now the Met Office has upped the ante and increased the granularity of its studies. Under test is a new version of its model, which sharpens the horizontal resolution over the UK by cutting the spacing between measurement points from 40km to 25km. (Resolution is the distance between points of measurement; four points are needed to define the patterns of atmospheric variation.) The upshot is that each horizontal layer of the earth's atmosphere will no longer be divided into 300,000 points but, in the near future, will consist of 790,000. So the 744 CPUs that make up the Met Office's IBM supercomputer will be number-crunching more furiously than ever: the workload will edge ever nearer the machine's 145-teraflop capacity and make increasing demands on its 15.5TB of memory. The sketch below shows how quickly point counts grow as the spacing shrinks.
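A back-of-envelope check, assuming (purely for illustration) that measurement points are spread uniformly over the earth's surface:

```python
import math

EARTH_RADIUS_KM = 6_371
surface_area_km2 = 4 * math.pi * EARTH_RADIUS_KM ** 2  # ~510 million km^2

def points_per_layer(spacing_km: float) -> int:
    """Approximate grid points in one horizontal layer at a given spacing."""
    return round(surface_area_km2 / spacing_km ** 2)

print(f"40km spacing: {points_per_layer(40):,} points")  # ~319,000
print(f"25km spacing: {points_per_layer(25):,} points")  # ~816,000
```

Shrinking the spacing from 40km to 25km multiplies the point count by (40/25)², roughly 2.6 - which is why these estimates land close to the article's 300,000 and 790,000, and why each step up in resolution demands a disproportionate leap in computing power.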

Add to this the two giant halls (each the size of two football pitches) that house the supercomputer, which will draw ever vaster amounts of electricity from the national grid, and these CPU monsters will need even greater efforts to cool them. The system generates 24TB of data every day, and since the tape robots that store it all start to suffer at about 23°C, the heat produced will draw on yet more resources for cooling.
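To give a sense of the archiving burden, simple arithmetic on the article's 24TB-a-day figure:

```python
# Annual data volume implied by 24TB/day of model output.
TB_PER_DAY = 24
tb_per_year = TB_PER_DAY * 365
print(f"{tb_per_year:,} TB/year (~{tb_per_year / 1_000:.1f} PB)")  # 8,760 TB, ~8.8 PB
```

That is nearly nine petabytes a year flowing into the tape library - storage that must itself be powered and kept cool.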

Steve Foreman says he is rising to the challenge of lowering the system's carbon footprint. "We're looking at more efficient systems of cooling, possibly even free cooling," he says. Currently, chilled water is used to cool the CPUs, but this system could be replaced when the time for an upgrade arrives. Another excess that troubles Foreman is the heat that is generated by the CPUs.

The Met Office's headquarters in Exeter is so economically designed, according to Foreman, that it only ever needs heating for three days each year. So, until they can find somewhere that needs it, the excess heat generated by the supercomputer will have to be dissipated into the atmosphere. A more tangible energy saving could soon be achieved by tailoring the power supply. "We could use direct current, which would significantly lower the power use," explains Foreman. "You lose less power than with AC and you get a better rate of power conversion."
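To illustrate the kind of saving Foreman is describing (the per-stage conversion efficiencies below are assumptions, not Met Office figures - each AC/DC conversion stage typically loses a few per cent, and DC distribution removes stages):

```python
# Illustrative comparison of conventional AC distribution vs DC distribution.
# Stage efficiencies are hypothetical; the 1.2MW load comes from the article.

LOAD_MW = 1.2

AC_STAGES = [0.96, 0.95, 0.94]  # e.g. UPS rectifier, UPS inverter, server PSU
DC_STAGES = [0.96, 0.97]        # e.g. single rectifier, DC-DC step-down

def grid_draw(load_mw: float, efficiencies: list[float]) -> float:
    """Power drawn from the grid to deliver load_mw through the chain."""
    power = load_mw
    for eff in efficiencies:
        power /= eff
    return power

print(f"AC chain: {grid_draw(LOAD_MW, AC_STAGES):.2f} MW from the grid")  # ~1.40
print(f"DC chain: {grid_draw(LOAD_MW, DC_STAGES):.2f} MW from the grid")  # ~1.29
```

On those assumed efficiencies, dropping the redundant conversion stages would save on the order of a hundred kilowatts of continuous draw.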

The challenge would be in finding an uninterruptible power supply that works with direct current. Given the scale of the challenges the Met Office has already faced, it should come as no surprise if it were to solve this one.

There are still many riddles and conundrums to be solved before we can predict the British weather, let alone climate change. Some involve the Met Office itself - logically, one would expect the Met Office to be a fan of cloud computing.

Computing power at the Met Office 
The Met Office embraced the computer age in 1959, when a Ferranti Mercury - nicknamed Meteor - was purchased. Capable of completing 30,000 calculations per second, it was a major step forward in the evolution of weather forecasting. For the first time, scientists were able to use numerical methods routinely to forecast weather patterns.
From an understanding of the way the atmosphere works, scientists create equations that seek to mirror its processes. The equations, built out of lines of computer code, combine to form models that attempt to recreate the dynamics of the atmosphere through mathematics. They work by taking all the current weather observations available and running the model forward to see what might happen next - a toy sketch of the idea follows below.
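A deliberately simple sketch of the principle - not the Met Office's model - in which a one-dimensional "weather" field is carried along by a constant wind using a basic finite-difference scheme:

```python
import math

# Toy numerical forecast: advect a pressure-like anomaly along a 1-D grid.
N_POINTS = 100      # grid points
DX_KM = 40.0        # spacing between points
WIND_KMH = 36.0     # constant west-to-east wind
DT_H = 0.5          # timestep, in hours

# Initial "observations": a smooth bump centred on point 20.
state = [math.exp(-((i - 20) / 5.0) ** 2) for i in range(N_POINTS)]

def step(field: list[float]) -> list[float]:
    """One timestep of a first-order upwind scheme (periodic boundary)."""
    c = WIND_KMH * DT_H / DX_KM  # Courant number; must stay <= 1 for stability
    return [field[i] - c * (field[i] - field[i - 1]) for i in range(len(field))]

# The "forecast": run the model 24 hours ahead and see where the bump went.
for _ in range(int(24 / DT_H)):
    state = step(state)
print(f"Anomaly peak is now near point {state.index(max(state))}")  # drifted east
```

The real model does the same thing in three dimensions, across millions of points and far richer physics - which is where the trillion calculations per second go.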
As our understanding of the atmosphere improved, and the number of weather observation inputs increased, the need for more computing power also grew. Thus the Met Office bought a new supercomputer in 1965, an English Electric KDF9, which could complete 50,000 calculations per second. This leap in speed - more than 60% - allowed for faster, more complex forecasts to be made.
This pattern of advancing technology and increasingly complex models continued, with the Met Office buying successively speedier computers every five to 10 years. By 1982, its CDC Cyber 205 could do 200 million calculations per second, and by 1997 its Cray T3E was doing more than a trillion calculations per second.
The Met Office now uses an IBM supercomputer that can do more than 100 trillion calculations per second. Its power allows it to take in hundreds of thousands of weather observations from all over the world, which it then uses as a basis for running an atmospheric model containing more than a million lines of code. Necessarily, the supercomputer requires a large amount of energy to run and maintain - a continuous draw of about 1.2MW of electricity. While everything possible is being done to minimise this, the power consumption remains small in comparison with the socio-economic benefits delivered, including CO2 emissions reductions. For example, the Met Office's global aviation forecasts allow airlines to save fuel by using the wind to aid their flight to their destination. The Met Office has estimated that this alone helps save approximately 20 million tonnes of CO2 each year through increased efficiency.

 

This article first appeared in DatacenterDynamicsFOCUS magazine. DatacenterDynamics runs a series of global conference and expo events for professionals who design, build and operate datacentres. DatacenterDynamics London takes place at the Lancaster Hotel, 10th and 11th November 2009. Computer Weekly is a media partner.



This was first published in October 2009

 
