When one of the top IT security labs in the US set about deconstructing the Stuxnet virus that attacked Iranian nuclear facilities, its experts estimated that the code was so sophisticated it would have taken a team of 12 highly knowledgeable programmers more than a year to write.
Without doubt, Stuxnet marked the point at which we moved into a new phase of cyber threats. The danger posed by spotty teenagers writing viruses in their bedrooms is under control, and while the industrialisation of hacking by organised crime groups continues, it is no longer the main focus of IT security defenders.
As the government’s recent £650m commitment to cyber security testifies, this is now a battleground contested by nation states and backed by the serious money and resources of government defence departments and intelligence services.
As such, it is increasingly clear that a global approach is needed – there is no Geneva Convention for cyber warfare. Already, numerous influential bodies are talking of the need for an international “cyber peace treaty”. The International Telecommunication Union has called for just such an agreement; Nato is known to be considering the principle. And earlier this year, the UN rejected a Russian proposal for a treaty on cybercrime because it was unable to reconcile differences between developing countries and the most advanced capitalist countries, led by the EU, US and Canada.
So the UK armed forces minister Nick Harvey is right to call for agreed international laws governing cyberspace. Even if such laws might prove immensely difficult to police – the IT security community still cannot prove who created Stuxnet, or where – it is important to agree a baseline from which to start.
One word of warning, though: as we have seen in the UK with attempts to collect internet traffic data under the guise of anti-terror laws, any cyber treaty needs to strike the right balance between openness, privacy and security. An Orwellian internet would be no solution to the threat of cyber warfare.