Universal Credit bodged its bodge

Iain Duncan Smith looked like a fall guy when he started work on Universal Credit. He stumbled forward to take the stage for his grand social project, with a little shove from his superiors. He looked so earnest, with his army officer poise. They waved him on: yeah you go for it, Iain.

Yet Universal Credit was just the sort of IT project his superiors had long since pilloried for always being bodged, wasting billions in tax... you know the words by now.

Leave aside, for a moment, the fact that Universal Credit will give government more power to redress inequality than any disheartened socialist toiling under the post-Thatcher yoke might dream possible.

There is still this knee-jerk IT bodge story, which for now is all Universal Credit is to most people. That story is a matter of great hypocrisy.

The prime minister and his most grandiose lieutenants had promised they wouldn't do big IT projects like Universal Credit. They wanted to break the big government computer systems up, and government as well.

They effectively won power by doing so much jeering at the last, Labour government's ambitious, computer-powered social reforms that their faces got stuck in rictus.

And they wanted to cut social security payments to the poor - to make it harder for people to get help from the state, not easier.

Yet when Smith, work and pensions minister, said he wanted to build a huge computer system to improve the way his huge government department gave a huge amount of money in welfare payments to the poor, they said yeah you go on Iain. They said they would make an exception to their rule that there would be no more big state IT.

Not only that, they said they would do this perilously ambitious project in just two years and for just £2.4bn, as though they had banged their heads and forgotten everything they had said before about big IT bodges. The surest way to create an IT bodge was to overestimate how cheaply and quickly you could put a system together, and then to make a big political parade out of it, leaving yourself no choice but to drive all your resources recklessly into it. So that's what they did.

They were letting Smith attempt to build a grand, national benefits system of unfathomable proportions against a very specific, goofy budget and breakneck schedule. Just like the last lot.


Francis Maude, the Cabinet Office minister responsible for ensuring the government didn't do any more IT bodges, then lumbered Smith's Department for Work and Pensions with three other ambitious IT reforms at the same time.

The biggest told-you-so story in politics consequently became that Universal Credit was as big an IT bodge as ever.

But when you've got an organisation like the DWP, with 90,000 staff making £70bn of payments to 7m households under a variety of regimes, through a national network of 700 job centres, 10 call centres and local government as well, your IT systems are going to be big and complex; and you are going to build them only after trial and error, unexpected pitfalls and climatic changes. Other civil engineering projects would come close to the same volatility if they did their construction out of Tetris.

Universal Credit's troubles did seem to prove the Tory case against big state organisations. It got knees jerking all round. But it wasn't quite so.

Its trouble was that the Cabinet Office's 'independent' Major Projects Authority approved its grossly unrealistic schedule. And that the extra requirements the Cabinet Office piled on the programme's shoulders quadrupled its chances of failure. It made unprecedented demands of the contracting model, the design methodology, the transaction model, its security precautions, the complex rules by which it administered benefits to people, and the organisation itself. This all caused the DWP so much trouble that the Cabinet Office, with its MPA hat on, had to freeze the programme. In the two years of chaos that ensued, with the Cabinet Office ripping up DWP's plans and removing the extra burdens it had earlier imposed, the public perception was another big state IT bodge. The programme consequently went two years over schedule, as if to prove the case.

When Universal Credit was at last resumed in November, Cabinet Office had forced DWP to alter some of its big state designs. Its revised solution was more in keeping with its own favoured model of public IT: something more amenable to reform; and particularly amenable to its own favoured reform, which was to automate government functions and make their staff redundant.

The Cabinet Office's usurpation of DWP's project was partial, however, according to the NAO report on the affair in December, because its own designs were unproven at the high levels of scale and dependability demanded by the social security system.

DWP had been reluctant to move too quickly to the Cabinet Office's radical tune. It had the nation's £70bn social security resting on well-rooted, dependable computer systems. The Cabinet Office's own alternative designs turned out to be premature. It was not even possible to say whether they would be reliable enough to run the social security system in 2019, said the NAO in November.

The transformation of DWP from an organisation of 90,000 people into a Cabinet Office web app has consequently slowed.

The transformation was already inevitable to some degree, as people transact more of their lives online. The question has been how soon: whether it would be accelerated for the sake of a Tory strategy to break up government and cut social security.

DWP now expects up to 74 per cent of people to process their social security claims online under Universal Credit in five years' time. That's down from the 100 per cent aspiration of the Cabinet Office's 'Digital by Default' ambition.

Domesday computer to make equality automatic

Fire a few Kalashnikov rounds into the air over a crowded square when Universal Credit is completed in 2019. Perhaps rattle a few volleys from the windows of cars on roundabouts. At the very least, throw some confetti.

Because the Universal Credit social security system will give government power to deliver equality at the press of a button. The only question is whether it or indeed any government will use it.

The Credit will be the final piece in a jigsaw of computer systems that will give government a totalitarian view of people's incomes, and extraordinary powers to act on its intelligence by levying tax and paying supplements.

Its intelligence gathering system is a Domesday Book for the 21st Century: the Real Time Information system, built by Her Majesty's Revenue and Customs almost without anyone noticing, and just coming online.

It happened with a surprising lack of public alarm, after all the fuss about ID Cards. But Universal Credit has commanded everyone's attention with another knee-jerk IT bodge story. HMRC meanwhile built links with employers' payroll systems, and the wider electronic banking system, to get live data about everyone's earnings.

This would be creepy but for Universal Credit. By establishing its own data feed from HMRC's realtime earnings intelligence, the Department for Work and Pensions has gained the power to correct pay inequalities in realtime as well.

What used to be a system of social security has consequently become a system of social justice.

Get your own

It just depends what you call social justice. Prime minister David Cameron campaigned in the last election with the idea that fairness was a middle class luxury: what's mine is mine, and if you want some too, well fair on you if you can get your own.

While the UK has become the most unequal country in Europe, his ministers strive to cut the amount of money government redistributes from the rich to the poor on whose backs they stand. Universal Credit has been justified as a way to make sure the state paid the poor no more than their allotted handful of alms. Commentators have been more concerned with clawing back "overpayments" made by the approximated system of Tax Credits that preceded Universal Credit, than with increasing the paltry amount of redistribution that the system already manages to perform.

Universal Credit was presented to the public however as a system to administer social justice. It was supposed to encourage people to take grossly low paid jobs they would otherwise shun by topping up their wages with social security payments.

The point of Universal Credit was to "make work pay", as its creator Iain Duncan Smith pitched it in 2010.

Without Universal Credit, the tax system worked against them. A family living on social support and a part-time wage could earn £7,500 more if their wage earner went full-time on the minimum wage. But the government would claim £7,000 of that in tax. That's what they call 94 per cent 'marginal taxation', for someone who doubles their hours on poverty wages.

But as BBC bruiser Andrew Neil put it to Smith in December, if that same poor family got Universal Credit, 83 per cent of their extra income would still be consumed by tax. If on the other hand they were wealthy, and their wages went above £150,000, the government would tax only 45 per cent of the extra. So much for Cameronian fairness.
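The arithmetic behind those figures can be checked with a rough sketch in Python, using the article's round numbers. The function is illustrative, not the DWP's or HMRC's actual calculation:

```python
def marginal_rate(extra_gross, extra_kept):
    """Share of each additional pound of earnings lost to tax
    and benefit withdrawal - the marginal deduction rate."""
    return 1 - extra_kept / extra_gross

# Pre-Universal Credit: going full-time on the minimum wage earns the
# family £7,500 more, of which the state claws back £7,000.
print(f"{marginal_rate(7_500, 7_500 - 7_000):.0%}")  # the roughly 94 per cent cited

# Under Universal Credit the family keeps 17p of each extra pound;
# above £150,000 the top rate leaves 55p of each extra pound.
print(f"{marginal_rate(1.00, 0.17):.0%}")  # prints "83%"
print(f"{marginal_rate(1.00, 0.55):.0%}")  # prints "45%"
```

The contrast Neil drew is the whole story in two calls: the poorest workers face a withdrawal rate nearly twice the top rate of income tax.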

If Smith wanted to "make work pay" he might have simply made employers pay people a proper wage in the first place.

He had instead created a benefits system that subsidised employers who paid low wages, he admitted to Neil.

But the point of Universal Credit was that succeeding governments could tweak the Universal Credit algorithm to increase the amount of money it redistributed to the working poor, Smith said in not so many words. Once the system was set up, social justice was just a matter of calibration.
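What that 'calibration' would mean in practice can be sketched as a toy withdrawal formula: a maximum award tapered away at so many pence per pound earned above a work allowance. The function and every number in it are hypothetical, for illustration only, not the DWP's actual rules:

```python
def uc_award(max_award, earnings, work_allowance, taper):
    """Toy Universal Credit style award (monthly, in pounds): the
    maximum award is withdrawn at `taper` pence per pound earned
    above the work allowance. Illustrative parameters only."""
    excess = max(0.0, earnings - work_allowance)
    return max(0.0, max_award - taper * excess)

# Same hypothetical family, same wages - only the taper changes.
for taper in (0.83, 0.65, 0.55):
    print(f"taper {taper:.0%}: award £{uc_award(500.0, 800.0, 300.0, taper):.2f}")
```

In this sketch, lowering the taper from 83 per cent to 55 per cent leaves the family £140 a month better off. Redistribution becomes a single parameter a government can turn up or down, which was Smith's point.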

The problem it claimed to address wasn't merely caused by companies not paying staff enough money. It was company directors not paying staff enough while paying themselves too much. And it was Übermenschen such as IT professionals demanding so much money that many of them have long since enjoyed a work regime of six months' work and six months' holiday, or investing their surplus in property that they then rent out at exploitation rates to the people whose wages they've hogged, who they then employ to clean their cars and wipe their toilet seats.

Universal Credit's own problems were exacerbated by an inability to recruit enough IT Übermenschen to build it, and an inability to pay enough money to retain them. It presumably capitulated and did for them what the social security system has failed to do for those beneath them: topped up their wages.

When the system is up and running, it will hopefully do for the IT Übermenschen what they programmed it to do: get their over-inflated wages back in tax and give to people more in need of it.

Magna Carta

The basic concept diagram of Universal Credit illustrated how it would happen: an almost virtuous circle of data, money and labour.

HMRC would get data from banks and employers about the money people were earning, and DWP would feed back fair rebates to those rewarded but meanly for their labours.

But the flow of money had not been automated between HMRC and DWP. The Treasury and parliament dammed the flow and, under the current arrangement, would channel just enough on to DWP for its Universal Credit to let a trickle back into the pay packets of the working poor.

Any government could open the taps. But they would not likely be under much pressure to do it, because the circulation of data was broken too. Employers knew what everyone was earning in their domain. The banks knew all they knew. All their incomes data went to HMRC. It and DWP would know what everyone was earning. But that's where the circulation of incomes data would stop. Only employees themselves would be kept in the dark.

This has created a quandary in this 800th anniversary year of Magna Carta, the celebrated tax reduction the Medieval Barons muscled out of King John.

As HMRC cranks its totalitarian tax collecting system up this year amidst the commemorations of Magna Carta, it might seem appropriate to smash it up*.

For while the Carta is celebrated for empowering common people, it was the barons wot got the best deal. What's trickled down since has evaporated as soon as replenished. HMRC's Domesday computer now gives common people the means to muscle a better deal out of the barons, in these days of zero hours and obscene inequality. You might say Universal Credit was their Robin Hood. All they need now is a government prepared to use it. That would surely follow if the government took its defining computer-age policy to its logical conclusion - its promise of open data and transparency for the sake of democracy and liberty: so free our incomes data then, so we can see that justice is done.

* That's what the Conservatives said they would do with big government computer systems anyway, and yet here they've built another one.

UK drone net got torture-grade CIA comms

A computer network that the US Central Intelligence Agency began using a decade ago to conduct the kidnap and torture of terrorist suspects has become an integral part of the system now operating drone strikes in the Middle East and Africa.

The means to send 'above top secret' intelligence communications around the globe without exposure empowered the CIA's Rendition and Detention Program to snatch and interrogate suspects in the US 'war on terror'. The same network system became the principal mechanism behind the intelligence-led "targeted killing" of suspected enemies using drone strikes today.

The technological link between the two sinister programmes, signposted in passing detail by the US Senate Select Committee on Intelligence Study of the Central Intelligence Agency's Detention and Interrogation Program last week, further confirms that a US military network routed via the UK carried intelligence vital to the US targeted killing programme. It also presents evidence that may sway officials deciding whether contractor British Telecommunications Plc should be held to account for building a part of the network used to transmit drone targeting intelligence since 2012.

In both programmes, secure global comms gave the CIA unprecedented, computer-driven power to collate, combine and analyse information about individual suspected people, and to pursue their subjection or assassination in other countries rapidly, with dreadfully focused vigilance.

Now that the US has been pulled up for torturing the 'wrong' people, for assassinating the 'wrong' people, and for straying beyond the scope of international law, its newfound intelligence powers have been exposed as a grotesque.

Information dominance

The Senate Committee Study and other investigations of the US' misuse of power have focused on its effect. Its mechanism and its means, however - its source - remain unquestioned. That is the information dominance the US has striven for since embarking at the turn of the millennium on its ambitious strategy to create an intelligence-led, computer-driven, globally-networked war machine focused on pin-point actions against individual people.

The Committee report nevertheless gave a peek at the source of this terrible power - a thread to be unravelled.

Describing how contractors who had masterminded the CIA's deranged interrogation programme set up a private company to do it once the operation had become well established in 2005, the report noted that the CIA had given the company (called "Company Y" in the heavily redacted, previously classified report - later identified as Mitchell, Jessen & Associates) access to its 'above top secret' computer network, so they could use its intelligence sources in their work tracking and snatching terrorism suspects, 'rendering' them to one of a network of secret prison bases in countries desperate or weak enough to permit them, and 'interrogating' them.

"The CIA also certified Company Y's office in [REDACTED] as a Secure (sic) Compartmented Information Facility (SCIF)," said the Senate report, "and provided Company Y access to CIA internal computer networks at its facility."

It was really only a passing reference. But getting access to Sensitive Compartmented Information (SCI) was tough enough that government agencies still paid for it out of their capital budgets.

They had to build specially secured buildings and rooms called Sensitive Compartmented Information Facilities (SCIF), just so CIA-grade intelligence could be handled, and even discussed.

This had been the case since 1999, when a formal order from the office of the CIA director established SCI as the designation for data relating to CIA intelligence sources, and set the precautions that would secure its transmission over US military and intelligence networks.

It effectively extended the CIA's hush-hush, clean room secrecy over the network and to wherever its intelligence went, so the information could move freely among those permitted to know: software, network pipes, computer facilities, people, would all be locked down.

Intelligence network

Inevitably, such CIA-grade intelligence found its way onto the Global Information Grid (GIG), a US military and intelligence network that has over the course of the 'war on terror' become the data-fuelled engine of US operations, especially drone strikes. Likewise the Defense Information Systems Network (DISN) - the global network of high-capacity comms cables that formed the backbone of the GIG.

This was not a simple undertaking. The National Security Agency, the US' network intelligence centre, extended the CIA's secure, compartmented realm over the DISN/GIG by starting a programme to build architecture good enough to carry CIA-grade intelligence.

The NSA's crypto-modernization programme guided US military contractors in their production of devices such as the KG-340, a high-capacity encryption device that has become one of the principal building blocks of the GIG.

As a 'Type 1' encryption device, the KG-340 was certified by the NSA to carry any data up to the level of Top Secret / Sensitive Compartmented Information (TS/SCI).

That meant it would securely transmit government and military information classified by the usual trio of designations - Confidential, Secret and Top Secret - using encryption algorithms developed under the NSA's Commercial COMSEC Evaluation Program. But it would also encrypt data to the level required to transmit data within the CIA's compartmented realm.

The result was that the DISN and the GIG would incorporate CIA-grade intelligence into their operations, allowing it to be combined with data from other sources in systems such as those the US used to pick, track and attack targets like people on its terrorism suspect list.

Off-the-shelf spying

This was not done lightly. No less a figure than the Director of National Intelligence, an office created in 2005 as over-arching head of intelligence agencies including the CIA and NSA, dictated how SCI would be handled by the DISN/GIG.

As it was put in a definition of SCI agreed in 2010 by a committee of military and intelligence agencies: "Classified information concerning or derived from intelligence sources, methods or analytical processes, which is required to be handled within formal access control systems established by the Director of National Intelligence".

On being established, the DNI brokered an agreement between the CIA, its sister agencies and the Department of Defense to collaborate on network security. Their subsequent work established means for assuring the transmission of SCI across the GIG.

The NSA ensured the devices that would do this did so in conjunction with established network technology: devices such as the KG-340, which would turn a standard, high-capacity network into one capable of sending sensitive compartmented, CIA-grade intelligence. Supplied by industry, they would rely on proven technology. The KG-340 was designed by SafeNet, a long-standing military networking pioneer that was later bought by Raytheon, one of the larger US weapons manufacturers. It was designed to be a standard "off-the-shelf" network component that would work with other standard, off-the-shelf network components.

That was where BT came in. The US Defense Information Systems Agency (DISA) contracted the telco to build a high-capacity DISN trunk line between the UK and a US military base in Djibouti, in the Horn of Africa. As part of the DISN/GIG, DISA set out in its contract specification to BT that it would cap the line at either end with KG-340 encryptors. The BT line would thus carry CIA-grade intelligence as well as other Top Secret information for military operations such as drone strikes.

As DISA itself said of the DISN in its 2015 budget statement to Congress: "The DISN provides secure voice, video, and data services over a global fibre-optic network that is supplemented by circuitry obtained from the commercial sector.

"DISN subscription services are described as follows: compartmented information communications services for the DoD Intelligence Community and other federal agencies."

BT has tried to portray this network as an infrastructure comprised of unexceptional features, built for banal purposes. It did so in an effort to discourage UK officials looking into the question of whether the British telco ought to be called to account, under international rules for corporate social responsibility, for its part in the DISN, after US intelligence-led drone strikes became an international human rights scandal.

Officials have spent 18 months deciding what to do because, they have said, there has been a lack of evidence that the BT network was anything more than BT said it was: a trivial network connection of no significance and of no interest even to its own corporate ethics board. The DISN, however, was built to be the foundation of all US military operations.


NSA encryption no smoking gun, says drone net contractor

An NSA encryption box that secures the US military's global drone network has become the focus for UK officials deciding whether a telecoms contractor should be pulled up under international rules on corporate ethics.

British Telecommunications Plc, which faces a formal investigation of its contract to supply part of the US military network, told UK officials they should disregard the NSA encryptor and drop the case.

It had been cited in evidence by legal charity Reprieve, in a bid to make BT meet an obligation to assess whether it was responsible for human rights atrocities after supplying part of the network the US has used to conduct a calamitous drone assassination programme against suspected armed opponents of its military offensives in the Middle East.

BT told officials the NSA encryptor - called a KG-340 - was the only part of Reprieve's case that had not already been thrown out as insubstantial, and it was irrelevant anyway.

"BT did not provide or install any KG-340 encryption device," said a legal brief BT solicitor Miles Jobling sent to UK officials at the Department for Business, Innovation and Skills.

"The KG-340 encryption devices were to be provided by and managed by the US government," it said.

Officials deciding whether to let BT off an international obligation to look into human rights atrocities have hinged their judgement on whether it looked likely on the face of it that BT's work for the US military was related to drone strikes.

They threw the case out once already, saying the evidence was thin. But Reprieve made them look again after a Computer Weekly investigation showed BT had provided a major trunk of the global military network the US used to conduct drone operations.

The KG-340 encryptor had been just one small part of the extraordinary "network-centric warfare" system the US has been building for the last 15 years, and which culminated with its "targeted killing" programme: which pulled disparate intelligence sources together on a network to remotely track and kill people on its suspect list.

Its design overseen by the US National Security Agency (NSA), the KG-340 was built to transmit classified intelligence and military communications down this network's high-speed backbone - and at "ultra-low" latency, so it could transmit critical combat mission communications without delay.

As a "Type 1" encryptor, NSA certified the KG-340 under its Commercial COMSEC Evaluation Program (CCEP) to transmit comms classified "Top Secret". It had been a primary building block of the Global Information Grid (GIG) - the network system that drove the US net-centric warfare systems - and the high-grade backbone that underpinned the whole thing: the Defense Information Systems Network (DISN).

The US military's contract notice for BT's part of that backbone said it would be capped with KG-340 encryptors, removing any doubt about its purpose.

BT told officials: "The KG-340 is an off-the-shelf product that simply protects the integrity of secure communications."

"Mere knowledge that the US government uses KG-340 cannot of itself impose any burden on BT," it said.

The British telco had insisted it should be able to remain ignorant of what its customers did with its services, contrary to international rules on corporate social responsibility.

The KG-340 was acquired by defense contractor Raytheon in 2012 when it bought the government division of SafeNet, its original manufacturer. The latter had developed it under the NSA's Commercial COMSEC Evaluation Program.

Ignorance is defence for drone death net corp

Computer Weekly quizzed BT on a plea of ignorance it sent to officials trying to decide whether the telco should be held to account for a network the US military used to conduct a controversial programme to assassinate suspected terrorists with drone strikes in the Middle East and Africa.

It had told UK officials they shouldn't investigate its contract under international rules for corporate social responsibility because, as a telecommunications firm, it expected to be able to turn a blind eye to what its customers did with its services.

It pleaded two other sorts of ignorance as well, in a statement it has issued repeatedly in the last year to escape an obligation to investigate human rights abuses in its supply chain.

It first pleaded the ignorance of the chemical weapons widget manufacturer. Like the weather-worn businessman on an industrial estate outside Dover who happens to make the only sort of widget that can control the release of anthrax from a warhead, but usually sells them to companies that make mobile disco smoke machines, and turns a blind eye when an order comes in from the bursar of a Syrian military laboratory.

BT likewise claimed it had no responsibility for what its customers did with what it sold them, and so said UK officials should throw out a complaint by legal charity Reprieve that it had neglected its obligations under an international agreement called the OECD Guidelines for Multinational Enterprises.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system. It has not been specifically designed or adapted by BT for military purposes," it told officials on 8 October.

Evasive manoeuvres

The system in question was a high-grade comms line it supplied as part of the US military's Defense Information Systems Network (DISN), the global fibre-optic backbone of US intelligence and military operations.

The telco seemed to be trying to imply the system it supplied would not be suitable for military purposes. And it could therefore never have had any reason to suspect its equipment might be used in military operations, let alone drone assassinations. And so it shouldn't have any obligation to look into US human rights abuses under OECD rules.

But saying it wasn't specifically designed for military purposes was not the same as saying it couldn't be used for military purposes. Oddly, BT did not claim its system would not be used or could not be used to operate US drone strikes.

Computer Weekly put this to BT in February, after it had persuaded UK officials to bury the complaint. It made the same arguments then as it sent to officials last month, after a CW investigation showed BT's DISN connection did operate drone strikes, and was the major trunk line for communications between its hub at RAF Croughton in Northamptonshire and Camp Lemonnier, a combat base in Djibouti, in the Horn of Africa.

A BT spokesman replied in writing that the system was used by the military, but not necessarily for military operations.

"Camp Lemonnier is a United States Naval Expeditionary Base. So the fibre optic connection will be used by the military," he said.

But he added: "The fibre connection could be used at the base for a wide range of day-to-day activities, such as general housekeeping/internet browsing, email, communications, stores ordering, data functions and voice communications."

Unanswered questions

It was misleading to say the connection could be used for managing the NAAFI internet. That was handled by a local firm under another contract. The BT line also happened to be part of a military network so powerful that to use it for domestics would be like using a flame thrower to toast your bread. BT wanted to create the impression domestics were as likely as drones. But they were most unlikely.

BT's rebuttal had also left the original question unanswered. So CW put it again: would such a BT connection have to be specially adapted for military purposes? Would it be suitable only for laundry dockets if it wasn't specially adapted? What sort of line would you need for drone operations anyway?

BT pleaded ignorance again: "BT has no knowledge of the reported US drone strikes and has no involvement in any such activity," it said.

But it couldn't say its connection services hadn't been used in drone operations. This was because it formed the very foundation of US drone operations. But if it could claim it didn't know this, it might be able to persuade officials that it shouldn't be held to account for it under the OECD rules on corporate social responsibility (CSR).

Corporate social responsibility

When UK officials reported on the affair in February, it emerged that BT had done a cursory assessment of the risk of human rights abuses in its DISN contract.

But it hadn't addressed the drone question specifically. So it hadn't officially addressed the specific human rights controversy that had been raging over the US drone assassinations.

That didn't mean it didn't know. And it might have been true that BT did not operate the drones itself. And it might not have been contracted to supply in direct support of drone programmes (so it had "no direct involvement"). But its network infrastructure was still the vital component of drone operations. This was a matter of public record for people in defence comms circles.

So CW put it to the telco: the reason BT cannot say is because it doesn't know - and it doesn't know because it hasn't asked. But it did know about the drone strikes, because everyone knew about them. It just hadn't acknowledged them officially.

"Yes, you're absolutely right," said the spokesman.

"To be exact: 'BT has no knowledge, other than from general press reporting, of US drone strikes and has no involvement in any such activity.'"

Yet OECD rules said companies should assess the human rights risks specifically when they didn't know. That was the whole point of corporate social responsibility. CW put this to BT: surely it had breached the rules by trying to ignore its DISN contract's association with the controversial drone programme?

Its spokesman wrote: "I've double checked and there's nothing more to add."

There was one more thing it said, and has said repeatedly since February, and again since Reprieve resubmitted its complaint with evidence that its line was part of the drone network.

"BT is glad that UK NCP has assessed Reprieve's complaint and rejected it," it said.

It could not refute the allegation because it was true. But it would persist in trying to discredit any attempt to make the allegation stick under OECD rules. It thus hoped to go on ignoring its connection to the US drone programme.

The NCP rejected the first complaint because without BT's due diligence there wasn't any evidence. Now that CW had shown the evidence had been there all along if you knew where to look, BT was trying to imply officials had already decreed that the allegation was not true.

BT wrapped all these statements of ignorance up into a single statement it has issued repeatedly - to CW, to other publications, and to the UK representative itself last month: a rebuttal made of pettifoggery; legal obfuscation in place of corporate social responsibility.

Drone net contractor claims telecoms opt-out

Telecoms contractor BT pleaded ignorance of controversial US drone operations to avoid accounting for its work on them under international rules of corporate social responsibility.

The telco has issued the same statement repeatedly since legal charity Reprieve complained about its contract to supply a major trunk line for the US global military network under OECD rules last July.

It sent the statement most recently to UK officials, who are about to decide whether BT should be held to account for the contract under soft laws on corporate ethics established by the Organisation for Economic Co-operation and Development.

BT told them it should be excused from the rule that companies must determinedly avoid their work contributing to human rights abuses.

The US has operated a drone assassination programme from its network, killing suspected insurgents without trial and slaughtering civilians in attacks gone wrong. But the telco said it had reasons to be ignorant of the programme and reasons to remain ignorant of it under OECD rules.

BT pleaded a special sort of ignorance in a legal document it sent to UK officials, a copy of which was obtained by Computer Weekly and published exclusively here.


It claimed the same principle of ignorance telcos use to resist government attempts to censor people's communications on the internet. Telcos see it as their duty not to pry on people's communications.

"BT cannot monitor or control the content that is carried on the system," it told officials at the UK Department of Business, Innovation and Skills.

"BT cannot properly enquire into or know what customers do with the equipment it provides to them," it said, arguing that telco ethics did not permit it to know whether the US used its network to launch controversial drone strikes.

World Wide Web founder Sir Tim Berners-Lee and others have made the same argument against state surveillance of people's communications. Their principle is that electronic comms are like the postal service: postal couriers wouldn't open people's envelopes to judge whether a message was worthy of being passed on. Telecoms carriers likewise must not pry into people's digital comms.


BT turned this into an argument for not asking questions about iffy business activities. It wasn't to know about drone strikes over its network, it argued, because it wasn't allowed to know.

BT might have justifiably refused to snoop on a customer's comms. But its argument looked disingenuous. Even if this sort of intrusion into personal affairs was forbidden, it wouldn't stop a comms company making ethical judgements on commercial contracts. Telcos might ordinarily refuse to supply commercial customers if, say, their credit ratings looked too risky.

Whether telcos discriminate against people when providing general communications lines to the general population has been an issue of civil liberties. Comms firms nevertheless routinely sell advertising and tailor their services according to algorithmic analysis of people's messages. And people have had their phone cut off for lesser things than drone death. Like not paying their bill on time. Or making hooky music and software downloads.

Similarly, if a known despot asked BT to install a line between his control room and his death squad, the telco might reasonably ask questions. This was just the sort of situation where OECD rules normally require companies to ask questions, instead of just judging business opportunities according to their own financial interests.

As BT chairman Sir Michael Rake put it in the firm's annual statement on corporate social responsibility in May: there could be "no compromise between financial results and social returns". There's more to business than simply making money, he said.

The OECD had clarified this recently, for example, for companies extracting minerals from war zones in central Africa. They were meant to take special care over due diligence on things that looked iffy.

BT answers drone death complaint

You would think British Telecom might have been terribly grateful when legal charity Reprieve pointed out that the US military was using one of its network systems to assassinate suspected insurgents in the Middle East, without trial and using drone strikes that had been going tragically wide of their target.

It must have been deeply distressing to BT, a vocal advocate of human rights and corporate social responsibility. On publishing its annual statement of corporate principles in May, Sir Michael Rake said its very purpose was "to use the power of communications to make a better world". As a signatory to the UN Global Compact, it had pledged to make sure it was "not complicit in human rights abuses".

So you would think the telco might have overflowed with gratitude when Reprieve pointed out its error - when it pointed to the horror being delivered down its fibre-optic cables. It would act urgently, surely.

There was no doubt about what it should do. It was set out in an international agreement on corporate social responsibility (CSR) - also supported by BT - that the UK and other members of the Organisation for Economic Co-operation and Development signed in 1976, and updated with stronger human rights rules in 2011.

BT would have to do a full, honest and faithful assessment of the situation. And if it found that its activities had contributed to human rights atrocities, it would have to do something about it. It would stop them from happening. Failing that, it would withdraw its support. And failing that, it would have produced from its assessment a reasoned justification for carrying on as before, stating why what seemed like an atrocity was in fact perfectly necessary because, it might say, it was necessary to commit atrocities to create a "better world".


But BT didn't do this. Instead, it said it didn't believe the allegation was true. It didn't actually know whether it was true either. It didn't know if the high-grade comms line it had been contracted to supply between the US military communications hub at RAF Croughton, in Northamptonshire, and the Camp Lemonnier drone combat base in Djibouti, North Africa, formed the operational backbone of a drone assassination programme that had been a raging international controversy.

Even BT's military comms experts had apparently not known about it. So it said it couldn't be held to account for it. So much for corporate social responsibility.

The allegation was true though. Computer Weekly exposed this in weeks-long investigations between March and June this year.

You would think BT might be overcome with gratitude on learning this. US drones had mistakenly killed hundreds of people. They had also killed thousands of suspected insurgents in Pakistan, Yemen and Somalia, all without trial. Campaigners said it was illegal. The US said it helped fragile states fight terrorists. Others said it stoked the fire. With BT's help, the US was using this terrible power to pick people out of the crowd and execute them from on high. The United Nations, picking over charred remains, said these targeted killings threatened world peace. It called for the restoration of international human rights law.

So you would think BT might look into this, in respect of its own CSR rules if nothing else. Instead, it told officials to bury the complaint. But it didn't refute the allegation. It just tried to discredit it.

Its legal department made an official rebuttal under OECD rules in October. Published here exclusively, it ran to eight pages without refuting the allegation. It pleaded ignorance of the allegation as its defence. So much for corporate social responsibility.


You would have expected otherwise because the whole point of the OECD rules was that when things looked iffy, corporations did a formal assessment (called due diligence, in the parlance). BT has said repeatedly that it fully supported these rules - the OECD Guidelines for Multinational Enterprises.

But BT didn't take an official look at the drone controversy when it took the contract. It didn't even do the due diligence after Reprieve said there was cause for it. It thereby preserved its ignorance. It then pleaded ignorance as its defence: how could it be expected to answer allegations nobody could say were true?

That was enough to bury the complaint at first. The UK National Contact Point for the OECD Guidelines, the OECD's representative at the UK Department for Business, Innovation and Skills (BIS), rejected it in February. It wasn't prepared to use the OECD rules firmly. It said this had nothing to do with BT CEO Lord Livingston becoming minister of trade and industry at BIS. The rules worked by mediation not prosecution. They were gentlemen's rules (soft law, as it's known). But perhaps not old boys' rules. And anyway, the UK NCP had been moved from trade and industry to consumer affairs and under a different minister, Jo Swinson. Perhaps consumer affairs would be more in keeping with its light touch.

Either way the UK representative was caught in a catch-22 of its own making: it wasn't prepared to spend any time investigating the allegation until it looked substantial; but it depended on BT's due diligence for substantiality; and BT hadn't done due diligence on it. So much for the OECD Guidelines on corporate social responsibility.

Law to leave wiggle room for public data profiteers

Forthcoming data laws will put only a loose cap on public bodies seeking to sell public information for profit, according to the head of a UK consultation on their scope.

Yet while the law change will also boost the amount of public information being sold for profit, the UK insists it will become harder for the majority of public bodies to get away with it.

Officially, by boosting the amount of public data that must be made freely available to the public, the data rule will fulfil a promise made by the coalition government's biggest honchos: prime minister David Cameron, chancellor George Osborne, and Cabinet Office minister Francis Maude.

Alternatively, a severe compromise weakened the coalition government's defining technical policy so much that the data experts behind it have cried foul and increased their calls for a total ban on public data profiteering.

These differences will be settled when the National Archives publishes draft regulations in December, just months before a general election that will force information-age electors to consider how well the coalition has looked after the public assets as it transfers them to the digital realm.

The government had in fact been using its open computing policy to drive a private road through the public sector. But its open data policy held that public assets should stay in public hands. That was until it turned that policy into law, and its former backers in the open data community reacted by saying it would actually help public data profiteers.

Creep (mission)

It extended the digital profit permit to public libraries, museums and archives - allowing them to operate online like commercial bodies, charging fees to digital visitors and paying some of the proceeds to people who owned cultural assets. Benefactors of the National Maritime Museum, for example, might get a constant trickle of money for old rope by copyrighting its likeness and displaying it online. The museum would take a cut. Other public bodies could similarly get permits to sell public data for profit under the new rules.

The UK extended the profit motive into these areas by pushing its partners in the European Union. It insisted on this after the European Commission sought to implement the total profit ban that Cameron's open data backers wanted.

But the UK and other EU countries would have to define their own conditions for public bodies to profit from data. The resulting European Directive on Public Sector Information left a blank space for countries to fill in the details.

The model for this was the partially-privatized public bodies created by previous Conservative governments in the UK. These "Trading Funds" were under orders to trade their public assets to cover their costs, and to make a profit as well - a "reasonable return on investment".

Now libraries and so on could do that sort of profiteering too. Most public data would be prohibited for trade though because taxpayers had already paid for it. It was just when costs went beyond convention that public funding would fall short. Such as when libraries found they could no longer afford to stock long-held periodicals because their publishers had moved them online and started charging data subscriptions. Other public bodies would be permitted to trade data too, perhaps if their costs went up similarly and central government refused cover.

The blank in the EU directive would let countries decide how public data profiteers calculated their data fees. And how much profit made a "reasonable return on investment".

National Archives put that crucial question to public consultation. But the resulting regulations will dodge the question of profit.

The open data lobby reckoned the blank bit of the law would create more room for public bodies to profit from data. They wanted it prohibited. But they couldn't even find out how much profit the UK was going to permit public bodies to make.

Mouth (horse's)

Howard Davies, who has led the consultation as National Archives strategy manager, told Computer Weekly the regulations would leave the answer blank so the government could change the rules easily.

"I am sure what we are going to be looking at is that there will be a form of words that directs public sector bodies that are needing to make a charge to some externally referenced piece of guidance," said Davies. "I don't see that it's going to be chapter and verse in the regulation."

Before now a Treasury document called Managing Public Money had put a loose cap on the profits of Trading Funds.

It said government services competing with private sector suppliers should try and earn an equivalent rate of profit to their competitors. That would be about five or ten per cent. But they might earn up to 15 per cent if they were in a risky business.

Davies said the regulations would likely not even refer to Managing Public Money. They would just call upon external guidance.

The existing profit cap therefore looked looser than before. But Davies insisted so few public bodies would be permitted to charge profit that the rules would rarely apply.

"If you go outside the Trading Funds, the exceptions are going to be few and far between," he said.

"This is all public sector information. [Public bodies] are already funded to collect, hold and manage that information. So the cases where you need to cover costs and make an additional charge is going to be so rare I can't think of an example," said Davies.

Shop (private)

The amount of public data to be sold for profit had nevertheless been increased dramatically. And it had been open-ended.

This was dire in the open data lobby's view: it played into the hands of Trading Funds like Ordnance Survey, which makes money selling geographical data from the national mapping database. OS had, according to Ellen Broad, policy lead at the Open Data Institute, been particularly opposed to a profit prohibition.

As though to prove their point, OS didn't bother responding to the consultation, not even to contribute to the question most pertinent to its health - whether it should continue charging profit. A spokesman for OS said it "hadn't felt the need to respond on this occasion".

The law change secured its future broadly. But it has been exploring further steps toward privatization. Its spokesman said it was talking to the Department for Business, Innovation and Skills (BIS - its parent) about releasing its assets as open data after all.

Portrayed by open data advocates as an obstruction to progress and prosperity, OS nevertheless employed 1,027 people last year. The case against it and other public European data bodies has long been that many small private companies are better than one big public one. That was the premise for the partial privatization of the Trading Funds in prior decades as well.

Meanwhile, half of Ordnance Survey's £144m sales came from other government bodies last year. And two thirds of its profits - £32m - went back to where they came from, in a dividend payment to BIS.

Geo data geared for stage-2 privatization

The UK's public mapping data has been put on the road to privatization after the Ordnance Survey opened talks to become a govco.

The controversial plan contradicts official claims that the UK is committed to turning its most valuable public data sets into open data. Such data is already being sold for profit by quasi-public bodies - called Trading Funds - set up under the privatization programmes of consecutive Conservative governments over the last 40 years. The govco plan continues that effort.

Ordnance Survey slipped out that it was in talks to become a govco in the summer, when a public furore over the government's other govco plans was reaching full boil.

Sir Rob Margetts CBE, non-executive chairman of Ordnance Survey, said in its annual accounts on 18 June it was looking at govco plans because it wanted more flexibility.

His office was unable to say what that meant. Other Trading Funds have sought govco status to shed the generous terms and conditions they must give their civil service staff. This would give them flexibility to undergo "digitization", which involves replacing costly manual processes with computer automation. The coalition's Digital Transformation programme aims to automate 80 per cent of operational civil service jobs. Trading Funds seeking govco status have meanwhile claimed a need for flexibility to do private sector mergers without civil service constraints. Employment laws usually force private companies acquiring public assets to give affected staff the same generous terms as they get in the civil service.

An Ordnance Survey spokesman said one advantage of the deal would be that its staff would no longer be civil servants. He was unable to elaborate. But he insisted it would remain wholly government owned.

A spokeswoman for the Department for Business, Innovation and Skills, which acts as shareholder of the partially-privatized Trading Funds, refused to discuss the plans.

She said BIS couldn't talk about the reason for the talks until the talks were concluded. They had not yet concluded. Ordnance Survey said in June it was aiming for a decision by 30 September.

That was before the govco furore forced BIS to put its other plans on ice.

Business secretary Vince Cable tore up similar plans for the Land Registry after their leak to The Guardian newspaper sparked staff protests, and a parliamentary committee criticized his department's earlier privatization of Royal Mail.

The Royal Mail deal, with the national address database at its heart, created such a stink about squandered data assets that the Advisory Panel on Public Sector Information (APPSI) stepped in with guidelines to stop any more public data being sold off carelessly.

APPSI Chairman David Rhind - formerly chief executive officer of Ordnance Survey - said it was "entirely proper" for government to monetize its data assets.

He said no more public data assets should be sold off so carelessly. They should be leased to the private sector for up to 10 years at a time instead.

But private investors might make a claim for ownership when they were required to keep leased public data sets up to date. So he imagined the public might not always keep ownership of its data after all.

Rhind explained his position on privatization with reference to Ordnance Survey's current search for a CEO to lead it into an emerging global market for mapping data.

The new CEO would lead Ordnance Survey through a time of transition by commercialising data for global markets, said the job advertisement on the Sunday Times website, on the day its annual report was published in June.

Global ambitions

Markets for geospatial data services were booming, said its annual report's statement on the govco proposal.

"Becoming a GovCo would enable Ordnance Survey to become more responsive and flexible, keeping pace with the rapidly changing markets," it said.

Ordnance Survey made about £45m last year selling data to energy and property firms with an obvious need to map their assets, said its annual report. It also started selling data to wealthy insurance and banking firms. They had been feeding its geographical data into risk and market intelligence systems.

This contrasted with Companies House, another Trading Fund that declared in July it would make all its data open. Its announcement followed Cable's veto of the Land Registry privatization. But the companies registrar's own accounts exhibited a database with little proven monetary value and minimal growth potential.

The political backdrop of the affair, however, was complicated. Cable was reported to have stopped the Land Registry privatization against the wishes of Conservative Party ministers. He told his Liberal Democrat Party conference on 6 October that his Tory coalition partners were obsessed with cuts because they detested public services. His department was responsible for the Royal Mail data privatization and the Companies House data release.

Ordnance Survey's spokesman said this week: "We are working with BIS on reviewing the Trading Fund business model. We are looking at being a govco. It's a project that's still being explored."

But he insisted the govco would remain wholly public. Only its data would be sold. He talked up its open data releases. But its minimal releases were all intended to introduce potential customers to its premium-priced data.

UK needs "mature debate" on open data, says senior official

The UK's open data programme looks like it has hit a wall after a senior official intervened to call time on its release of the nation's most valuable public data assets.

As the campaign bugle call for prime minister David Cameron and chancellor George Osborne, open data was at the heart of their plans to downsize government. But their open data programme came up hard against those quasi-public bodies permitted to sell the UK's most valuable public data assets for profit - bodies called Trading Funds.

That changed in the summer when Companies House, the Trading Fund that manages the public financial statements of private companies, said it would release all such statements as open data by June 2015. The public have had to pay a nominal fee to search its records.

But while it looked like the open data programme was at last achieving its radical ambition, the Companies House announcement was not the breakthrough it seemed. And now the Companies registrar has said that might be as far as it goes.

"For too long the debate has been an all-or-nothing, black-or-white situation," Tim Moss, chief executive of Companies House, told Computer Weekly.

"I've heard people say, open data: good, charge-for: bad. The world doesn't work that way.

"We need to have a mature debate on open data, as to where it works and where it doesn't. I don't think we always get that," he said.

Moss was responding to the open data advocates around the prime minister, David Cameron, who had protested that the UK had neglected to use an update of regulations to impose open data conditions on all public data.

For Moss, the debate was not a matter of whether the public should be able to access public data without condition. It was up to Trading Funds to decide for themselves whether it was in their own interests to do it.

"The open data model absolutely works for us," he said. "And that's why we've built it into our strategy.

"But the business of other Trading Funds is different and has to be looked at differently," he said.


On the face of it, open data looked devastating for Companies House. It made £64m last year, and employed 967 people.

Formerly public bodies, Trading Funds such as Companies House were set up to charge fees for public services with commercial potential. Created by previous Conservative governments, they were partial privatizations of public services not rich enough to be sold off. The fees were meant to cover their costs. Open data would cut their income and invalidate their reason for existing as Trading Funds. They have been reluctant to sanction this.

The coalition government's "digital" reforms have meanwhile been replacing manually-intensive government functions with automated processes. That has cut the cost of managing and distributing public information drastically. It made their open data programme feasible. If public data could be distributed free of charge for the public good, then why charge for it?

That question has been answered partially by its moves to privatize its most valuable public data assets.

The Trading Funds have meanwhile been undergoing their own digital transformations. But their cost-savings have been boosting their bottom lines. Other Trading Funds such as the £144m turnover Ordnance Survey have so far refused to invest their cost savings in open data. Though they have made token releases, their livelihoods still depend on selling public data.


So open data sounded radical for Companies House - meaning either renationalisation or dissolution. Moss, too, told Computer Weekly it was a radical move.

But what would it mean for the £64m Companies House and its 967 people? What had Companies House forecast would be the consequences if it started giving its assets away?

Moss said he didn't have the numbers to hand.

But if he did, they would have revealed another surprise: open data will have minimal impact on Companies House. It won't be hollowed out by open data.

More than 80 per cent of Companies House's income came from company registration fees last year. The law obliges companies to pay to file their financial documents at Companies House.

Its public search fee had been token. It raised £15m from search fees last year. But its search function made a loss. It used to run separate computer systems to receive company filings and provide public access. Now that it has merged them, companies will make their filings as data and the same system will give public access at almost zero extra cost. The process will be automated or self-service in keeping with the coalition government's digital strategy.

Digitization might cut its costs. But open data won't undercut its own business. The considerable public interest in opening access to its archives was already reflected in its token fees. Digital transformation removed any need to charge by removing the cost of providing the service. Making its data open was a trivial matter because it did not concern its core income-generating business as registrar.

But digitization has not removed all incentive to charge for public data assets where they have greater commercial value. Contrast Companies House with Ordnance Survey, which has made limited open data releases to entice customers to buy access to its full database. Ordnance Survey is a data business. Its public data assets have immense commercial value. They are consequently less likely to be turned into open data.

Francis Maude, the Cabinet Office minister responsible for the UK's open data policy, said in July that Companies House proved the government was committed to turning data "of most value to citizens and businesses" into open data.

It rather demonstrated how the government was committed to turning only that data of least value to private investors into open data.

UK open data "failed opportunity" says founding academic

The UK has sent the world's open data movement up a wrong turn, fear its greatest advocates, by easing restrictions on profits that can be earned from the sale of public data.

Their fears were roused by a reboot of regulations that will make it easier for public bodies to sell data. This was contrary to the campaign promises of the coalition government's Conservative leadership, who said they would make public bodies publish their databases as open data. Public data was to be put in public hands for the public good.

The new rules, to be drafted in December, will permit those quasi-public bodies who hold the most valuable public data to continue selling it for profit.

Open data advocates linked to prime minister David Cameron's initiative raised the alarm after the National Archives put the new rules out to routine consultation in the summer. One key part of the new rules was how much profit people should be permitted to make from public data. The Archives left this part of its proposals blank.

The government has meanwhile been selling choice public data assets off, eyeing others for long-term lease to entrepreneurs, and putting others at the heart of international expansion plans for those quasi-public data bodies, which the government has subjected to a systematic programme of privatisation since it came to power.

Ellen Broad, policy lead at the Open Data Institute (ODI), said in a submission to the UK consultation earlier this month that the new rules were a "backwards step" for the government because they ran contrary to its open data policy.

"What the UK is proposing would make it easier to charge more," she told Computer Weekly.

Her submission was backed by some of the key people behind the government's open data policy: Rufus Pollock, the Cambridge academic and president of the Open Knowledge Foundation whose work on open data was headlined in the Conservative Party's 2010 election manifesto; Mark Taylor, the open source software pioneer who helped write that manifesto; and Tim Berners-Lee and Nigel Shadbolt, world-wide web pioneers whom the prime minister, David Cameron, honoured by establishing the ODI and the Queen honoured with knighthoods.

Pollock, Berners-Lee and Shadbolt had also formed the Public Sector Transparency Board, inaugurated by Cameron in June 2010 to oversee the transition of public data assets into open data, freely available for public use.


Now five years on, the rebooted open data regulations were a "failed opportunity" to bed that policy down, Pollock told Computer Weekly.

The open data movement's founders were dismayed because the UK had been an exemplar for open data to the rest of the world. But the new rules had spoiled its visage and obstructed their efforts before their work was done.

The new rules were set by an update to the European Union's Public Sector Information Directive, an agreement to replicate the UK's open data policy across the rest of Europe. They were implemented by Neelie Kroes, European Commission vice president who was already pushing the open computing agenda in Brussels when Cameron was using it to campaign for office in 2008. The regulations will be the culmination of all their work.

But Pollock suspected the UK had watered the EU agreement down.

"There was a failed opportunity," he said. "My sense was the government didn't take it. The EU wanted a stronger line than was in the directive. That was toned down."

That stronger line would have been what Cameron's open data team wanted: a strict prohibition on profit from public data.

The EU agreement contained a provision to accommodate a unique aspect of the UK data landscape: those quasi-public bodies entrusted with looking after the most valuable public data. Called Trading Funds, they include organisations such as Companies House, Ordnance Survey, the Land Registry and the Met Office.

The government permits the Trading Funds to look after valuable UK data on condition that they charge access fees to cover their costs. These costs were higher in the pre-digital age, when collecting, managing and distributing public information were manual processes.


But Cameron's open data policy relied on digitization - the automated collection and distribution of public data sets. This would cut the cost of distributing public data - called its "marginal cost" - to as good as zero.

The government had always allowed the Trading Funds to charge a small profit on top of the cost of distributing their data. Now the marginal cost of public data was almost zero, the open data movement had pushed for all charges to be removed: there was no longer any reason for making a profit on something that didn't even need to be sold, especially when it was a public asset.

But that may not happen. Pollock suspected the UK was torn between short-termism and open data. Should it generate tax income by selling public data? Or should it stick to its guns and treat public data like public investment, to be injected into the economy as a long-term growth stimulant - pump-priming for the digital age?

Pollock said governments were failing to see the bigger picture. He suspected HM Treasury was more interested in tax revenue than open data.


The Treasury had however led implementation of the coalition's open data policy. The Cabinet Office open data team worked from the Treasury after the coalition came to power and the Cabinet Office was staffed by officials put in place by the last, Labour government. Chancellor George Osborne had campaigned with Cameron for open data. The new rules are being implemented by the National Archives.

Howard Davies, who as standards manager at National Archives helped write the UK consultation, said: "We are not wanting to see a situation where excessive charges are made. That's just not what government policy is about at all."

"Where there are going to be charges, they will be limited to marginal cost of reproduction, provision and dissemination of documents. That's for the vast majority of public sector information," said Davies.

Profit could be made from public data "only in those exceptional cases, where bodies have to generate revenue to cover part of their own costs", he said.

Those exceptional cases would be the Trading Funds, he said, and other cases "where a public body is required to cover its costs".

Southwest One earns breather on £50m IBM debt

| 1 Comment
| More
Somerset's controversial Southwest One outsource is trying to square £50m of unpayable debt with parent IBM after the County Council cut its contract.

But while talks dragged on last year, the joint venture made its first operating profit, scraping £70,000 thanks largely to a legal settlement and service cancellation fees paid by Somerset, according to 2013 financial results it published this week.

It scraped a pre-tax profit with just four years left to run of its 10-year outsource contract with Somerset, Taunton Deane Borough Council and Avon and Somerset Police Constabulary, all of which took minority shares in the venture with IBM in late 2007.

IBM kept it afloat with guaranteed loans after years of losses and an admission this week that it would reap only a trickle of profits from its final stretch.

Led by a Somerset County Council that became hostile to Southwest One after Conservatives took it over in 2009, the venture's public partners renegotiated its contract to bring some of their services back in-house. The last of those changes won't show up in its accounts till this time next year, when its profit will be balanced precariously again.

Their change of heart left Southwest One with £48.8m losses to be lumped on IBM in 2018 if there wasn't also a change in fortunes.

"The directors are in discussions with IBM," said Southwest One chairman Derek Pretty in the company's 2013 report this week.

"There are insufficient cash flows to be generated in the remainder of the contract to settle this loan balance.

"IBM does not feel there is any immediate need to restructure this debt," he said, with talks about a final loss having already gone on for more than a year.

IBM was holding out for a late comeback. Pretty said they had already begun talking with their public partners about extending the contract beyond 2018.

Southwest One had meanwhile not let go of its original ambition, to consolidate the backoffices of other public authorities in the region.

"I am very hopeful that there will also be opportunities for profit and service improvement to be found as the company continues to engage in both our clients' change programmes and perhaps become involved in wider public sector restructuring initiatives," said Pretty, a former Kwik Save finance director.

But he gave only vague details on what amounted, after all the bluster from Somerset Council, to a meagre renegotiation.

Mouth and trousers

The central government's Universal Credit scheme had forced Southwest One to hand Revenue & Benefits back to Taunton Deane, said the report. That was a crown jewel. And it gave back Design and Print services, and advisory staff for finance and human resources. It handed Property and Facilities Management services back to Avon Constabulary, perhaps ahead of a cost-cutting consolidation of police buildings.

It didn't amount to much though.

Somerset had taken mostly scraps back in-house as well. But not before it had been forced to square its differences with its joint venture subsidiary.

Southwest One wrote off £4.4m of invoices that had gone unpaid since 2010. The council's campaign against Southwest One had also involved neglecting to implement the backoffice savings schemes that were the venture's raison d'etre, damning it in public, and contesting its contract.

But Somerset was forced to pay Southwest One £5.8m compensation for the trouble. The council's initiative appeared to have achieved little but to damage Southwest One.

The 2013 results neglected to recount what services Somerset had actually taken back. But they didn't amount to much either.

The County Council said in 2012 its cull had taken pensions admin, health and safety, finance and HR advisory, and some accounting, business development and staff training back in-house.

But Southwest One would retain the juicy back-office stuff that formed the bulk of its purpose: accounts payable, accounts receivable, recruitment, HR admin and payroll. The Somerset documents read by Computer Weekly said nothing about the other mainstay of its Southwest One venture: its procurement office, which still processed £26m last year, though it was down 15 per cent.

"The contract renegotiation last March returned to the Council a number of the more strategic functions originally placed with Southwest One," said a February review of the renegotiation overseen by Somerset cuts supremo and Southwest One nemesis (and board member, till last year) David Huxtable.

Somerset retained other bulky services as well, including IT.

Somerset's negative obsession with Southwest One didn't make any sense unless interpreted as a Conservative Council's determination to discredit a deal set up by its close Liberal Democrat rivals, or the Conservative coalition government's determination to undo big outsourcing contracts and sell off their public services. Pretty said Southwest One had already fulfilled Avon Constabulary's 10-year target for procurement savings. Somerset had handicapped its own procurement savings when it began its campaign against Southwest One in 2010.

Somerset slams Southwest One again

| More
Somerset's Conservative Council has reprised its campaign against Southwest One, its own joint-venture that had obstructed its efforts to privatize council services.

Its Audit Committee published a critical report on Southwest One this week, digging up a long list of old complaints about the venture, which the Conservative Council's Liberal Democrat predecessors set up with IBM and neighbouring public authorities in 2007.

But the committee report neglected to mention how Somerset had itself undermined Southwest One in an effort to break it up.

Though the report was produced by David Huxtable, the councillor who led Somerset's cuts programme and who has represented Somerset on the Southwest One board since March 2010, it neglected to mention how the council's Conservative leadership put a freeze on Southwest One after the coalition government came to power in 2010, preventing the venture from delivering the savings promised when the LibDems set up the deal. Somerset's Conservatives had deposed the LibDems in 2009 with a campaign promise to fix Southwest One, which was then still setting up and already delivering promised savings. The council had been trying to renegotiate the contract so it did not obstruct its plan to privatize council services. The council's squeeze undermined Southwest One's commercial performance, which the council then hailed as justification of its opposition to the venture.

This week's report - "Update on Lessons Learnt from the South West One Contract" - neglected to mention any of that. It also neglected to give any but the most vague description of any lessons that had been learned.

It instead gave a long list of teething problems Southwest One faced in its first years of operation, and a brief list of problems participating public bodies encountered while learning to work in a joint venture together, and a few complaints about how the Conservative council had been unable to tear up the 10-year, £198m LibDem contract after it took over in 2009.

Huxtable is not a member of Somerset's audit committee, but nevertheless leads its scrutiny of council finances, which he manages as Cabinet member for Resources. He told Computer Weekly this was not a conflict of interest. Cllr Sam Crabb, opposition lead on the Audit Committee, who is said to be Somerset LibDems' expert on Southwest One, was not at the meeting that produced the critical report. He said he had been away on business.

Jane Lock, LibDem opposition council leader, stood in for Crabb. She told Computer Weekly she was not familiar with the subject. She agreed the report seemed short of lessons learned. But she was familiar with Conservative opposition to Southwest One: "They wanted rid of it," she said.

The report's more recent complaints concerned the Conservative council's inability to get rid of it. And it complained that Somerset had not had the power to publish Southwest One documents and data against the wishes of its partners. This had hampered transparency, a mechanism of the coalition government's programme to break up such contracts and replace the public services that rely on them with private providers.

Huxtable told Computer Weekly in 2012 how his council had been trying to cut the contract since 2009 so it could cut council services.

"In the harsh political world it was a contract set up by the Liberal Democrats and we've spent the last three years trying to renegotiate it to get more flexibility," he said.

"Every part of our organisation has a part of Southwest One appended to it, whether it be buying financial services, property advice or whatever. But local authorities can't afford to do all this stuff they used to do.

"So if we for instance shut down a department, if we privatized school meals - the Southwest One overhead cannot be removed, so we are still paying a Southwest One overhead on something we haven't done for almost two years.

"Going forward, pretty much everything we want to change now, we are almost precluded from doing because we will be carrying this overhead forever - unless we can renegotiate parts of this contract.

"We are reviewing all our services. I think we've got 170 services. Some of those could be moved into a trust or not-for profit. 65 per cent of our organisation is already run by somebody other than us. We are not atypical of a local authority. WS Atkins do all our highways. Somerset Care do all our homes. Southwest One was just another contract to run all the back-office services. Unfortunately, it was far too complex and nobody envisaged the day when money in local authorities would actually physically go down," Huxtable told Computer Weekly in 2012.

Yet published accounts suggested Somerset could have made the financial savings it sought without cutting council services, if only it had not frozen relations with Southwest One and begun trying to close its services.

Southwest One had in 2010 already made £48m back-office savings Somerset would recoup over the 10-year life of the contract. Southwest One had found another £59m the Conservative council then refused to approve, trying instead to get out of the contract and cut £40m of council services under pressure from central government. Somerset stopped approving Southwest One savings schemes when it started trying to renegotiate the contract.

Huxtable told Computer Weekly today he did not recognise this account.

"My understanding is different. I don't think we froze savings," he said.

"We asked for more cashable savings, and that put us into conflict with Southwest One. My understanding was that Southwest One were in some instances claiming to have made savings when they were in fact made completely independently from them. That's when we started falling out with each other.

"I think what we all realised in local government post-2009 was the days of County Council budgets constantly growing was over. And in fact they started to go into sharp reverse. So we started to ask for more savings, cashable savings. And having to reduce the amount of money we gave them which, as you know, put us into conflict over their contract," he said.

Somerset settled out of court with IBM last March, after the IT company challenged the council's refusal to recognise savings it had made and pay it an agreed share.

The Audit Committee published its critical report on Thursday, the day of a council by-election the Conservatives narrowly won over the LibDems. It appeared the day before in trade reports sympathetic to the Conservative critique of Southwest One.

Huxtable produced the Audit report at the suggestion of Grant Thornton, the district auditor whose own account of Southwest One was unusually sympathetic to the Conservative account of its failure.

Microsoft gets flak over "rubbish" UK data

| No Comments
| More
Data experts and government officials have fingered Microsoft's popular Excel spreadsheet as a source of gremlins troubling the UK government's reform programme.

Four years after the coalition government began its attempt to create the most open and transparent democracy in the world, technical problems persist.

With only half a year until the end of its term, the coalition may leave government with its flagship transparency reform in a bodge.

It sought simply to publish public records on the internet as open data: a form people could scrutinize easily using computers. It would create an army of "armchair auditors" who would hold public bodies to account.

But the scheme has languished since its launch four years ago: irregularities still plague the data, making it difficult for all but computer experts to scrutinize it.

Traced to its source, the bug leads to the same problem that upset other coalition transparency reforms.

That was vested interests. As it happens, prime minister David Cameron's plan for government was all about dismantling vested interests. His reforms were humbled by the same vested interests they sought to undo.

The gremlins that infested his government's data came from a forsaken backwater of computing called character encoding (apparently overlooked in the coalition government's plans). The Cabinet Office admitted character encoding was still a problem for the most crucial part of its transparency reforms after the issue was exposed in Computer Weekly last week.

Even the World Wide Web Consortium (W3C), the high temple of the global computing movement that inspired Cameron's transparency reforms, has struggled with encoding.

In a nutshell

The encoding problem was thus: the government had no standard way to encode data - no standard way to take the letters and numbers people read on screens and to represent them in codes computers could handle. So its attempts to publish public spending were flawed by the incompatibility of the data they released.

The Department for Work and Pensions, for example, publishes around a quarter of a million spending records every year.

The whole point of doing this was to help private companies compete over public services and contracts; and to help patriotic citizens terrorize public bodies with awkward questions about the minutiae of their budgetary records.

The entire initiative would be futile if the public could not easily draw meaning from that data. But the data was being released in batches that were incompatible with one another.

One of the reasons for this shambles was encoding. It was illustrated by a similar problem W3C had with video formats. Software vendors cornered the digital video market with "proprietary" video codecs. That is, they claimed property rights over the codes that represented to a computer the images people see. Market forces gave their codecs power. That power forced people to use their video encodings rather than anybody else's. And that undermined the principle of the web that communication would be uninhibited by vested interests. That was where the coalition government was coming from with its own policy to deliver open data. It had to be uninhibited.

Cameron and W3C wanted it to be like pen and paper. Imagine if it was the other way around. Imagine a world that had "proprietary" pens.

You would sit down to write in lament of vested interests only to find your Bic pen, say, would only write on Bic paper. So instead of writing poetry you'd be shaking out your piggy bank or breaking rocks to get money to buy Bic's proprietary paper.

The Conservative Party formulated a computer-led reform plan that would tolerate no proprietary claims over the vehicles of digital communication.

In reality, their plan presumed to defy global commercial forces: companies like Adobe and Microsoft, with a global base of customers already tied into using their proprietary formats. A laissez faire legislator like Cameron would move these vested interests as he might move a sand dune with his hands, and some blowing and coaxing.

But that was a bit of a side-show really. It got more attention than it deserved because office documents were something everyone could relate to.

Cameron's liberation policy was concerned primarily with the only thing his government had rights over itself: its own data.

Labour <=> Conservative

The idea, as presented by the prime minister: transparent budgets and open data would make government more efficient and accountable. Costs would be cut. Plebs would be empowered.

That was actually how Gordon Brown, the last prime minister, put it just before he got voted out of office in 2010.

You could swap his name with Cameron's and (largely) not tell the difference in what they said.

You could trace Brown's data liberty spiel to the same wellspring as Cameron's: national nerd hero and world-wide web founder Sir Tim Berners Lee.

Both prime ministers took up the cause Berners Lee had dedicated his life to: the common basis of communicating, sharing and combining data that was the foundation of the world-wide web. In respect of him, they made this a cause of national pride and the basis of reform.

Hence prime ministers Brown and Cameron planned for government data to be "linked", in the way Berners Lee had been urging it should be.

That meant it had to be possible to take any bucket of data and combine it with any other, and to arrange the lot in any way your fancy chose. That meant the data had to be good quality. It had to be comprehensible to a computer.

Between them, Brown and Cameron set Berners Lee up with a £10m office called the Open Data Institute, to aid UK policy implementation of his ideas. A sort of colonial office of the W3C (of which Berners Lee is founding director), it was going to make sure UK data lived up to the reform spiel.

Cameron kicked it all off in 2010 by publishing public spending records as open data. Four years later, that data is effectively incomprehensible. The ODI is still trying to make it linkable. Britain's aspirations to be the most open and transparent government in the world, the world leader in open data, the most efficient, open and responsive government in the world, are therefore still work in progress.

Some of the most prominent government data experts confirmed what the data itself had already said about its own poor quality.


UK spending data was "horrendous", Jeni Tennison, technical director of Berners Lee's Open Data Institute, told Computer Weekly.

"It's ridiculous," she said.

Even when computer experts tried to link this data they had to jump through such hoops that it was "shocking", said Tennison, who got an OBE for her work last year and sits on the Cabinet Office Open Standards Board and Open Data Panel.

"It isn't like we are in a state where the data is basically okay and it just takes a bit of effort to put it together. We are talking about a state where it's basically rubbish," she said.

Companies that set themselves up to do innovative things with UK spending data had to spend 80 per cent of their time simply tidying it up so they could even start to work with it.

UK spending data was rubbish because it had incompatible encoding. Staff were largely powerless to do anything about it, because their software was at fault.

Microsoft's Excel spreadsheet has got most of the blame for this.

The problem, according to Tennison and other experts, and just about any forum that addresses the subject online, was Microsoft's atrocious handling of UTF-8, the character encoding widely favoured as the lingua franca of open data.

UTF-8 became encoding-of-choice for the UK government as well as the world wide web. But most of government was using Microsoft software. Microsoft's UTF-8 incompatibilities have long been condemned by experts. The problem was inherent to both Microsoft Windows and its applications, most notably Excel. Users could circumvent them by following complicated instructions. But the workarounds were arduous. This was problematic for government, where most staff use Microsoft software but were apparently not shown how to get it to work with UTF-8. More recent versions of Microsoft software employed encodings related to UTF-8 but not compatible with it.

"Popular spreadsheet applications", as Tennison put it, made it hard for users to encode their data in a format that would be universally compatible.

Technical obstacles

"When you export from popular spreadsheet applications you don't get control over encoding and it usually chooses a bad one," she said. "It usually won't be UTF-8. It will usually be something like Windows 1252."

Windows 1252 was an old, proprietary Microsoft encoding. The result, said Tennison, was the data contained characters incomprehensible to other people and programs. Their systems - unless they were using Microsoft Excel on a Microsoft Windows computer - interpreted the incomprehensible characters as "garbage".

"It can cause problems matching stuff up," she said. "If you have the name correct in some data and not in other data then you can't match those two names together. And therefore you can't put the data together accurately."
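The mismatch Tennison describes is easy to reproduce. The sketch below (Python, purely illustrative; the supplier name is invented) shows how text written as UTF-8 but read as Windows 1252 turns accented characters into "garbage", so the same name in two releases no longer matches - and how bytes genuinely written in Windows 1252 may not even be valid UTF-8:

```python
# A supplier name as one department might publish it, encoded as UTF-8
name = "Café Royal Ltd"
utf8_bytes = name.encode("utf-8")

# A consumer wrongly assuming Windows-1252 sees mangled characters
mojibake = utf8_bytes.decode("windows-1252")
print(mojibake)  # CafÃ© Royal Ltd

# The two versions of the same name no longer match, so the records
# cannot be joined across datasets
assert mojibake != name

# Going the other way, a pound sign written in Windows-1252 is a single
# byte (0xA3), which is not a valid UTF-8 sequence at all
try:
    "£1,250".encode("windows-1252").decode("utf-8")
except UnicodeDecodeError:
    print("not valid UTF-8")
```

The asymmetry matters: UTF-8 bytes misread as Windows 1252 decode silently into nonsense, while Windows 1252 bytes misread as UTF-8 often crash the consuming program outright.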

Ian Makgill, managing director of Spend Network, a start-up trying to clean up government spending data, concurred with the ODI.

"A lot of the problems are with Microsoft Excel not being able to output open [data] because it likes proprietary formats," he said.

"It's damaging. Microsoft's handling of these things is a problem. Different versions of Microsoft Excel have different formats.

"They default to proprietary formats... because that makes data available in other products," said Makgill, who is regularly cited by other prominent UK data experts as the leading authority on government spending data quality.

Makgill and other experts said the Microsoft problem was not only its handling of UTF-8, but the difficulties it created for people who wanted to publish their open data in a universally compatible file format. HM Treasury said in 2010 its open data should be published in .csv file format (comma-separated values). But Microsoft didn't handle this most simple of file formats well. This had further helped degrade the UK's open data quality.
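Producing a .csv that is both UTF-8 and palatable to Excel is nonetheless possible. A minimal Python sketch (the field names and figures are invented for illustration) uses the "utf-8-sig" encoding, which prepends the byte-order mark that Excel relies on to recognise a file as UTF-8 rather than a legacy encoding:

```python
import csv

# Hypothetical spending records; field names are illustrative only
rows = [
    {"supplier": "Café Royal Ltd", "amount": "1250.00"},
    {"supplier": "O'Brien & Sons", "amount": "300.00"},
]

# "utf-8-sig" writes a byte-order mark before the data; without it,
# Excel tends to assume a legacy encoding and mangle the pound signs
# and accents on import
with open("spend.csv", "w", newline="", encoding="utf-8-sig") as f:
    writer = csv.DictWriter(f, fieldnames=["supplier", "amount"])
    writer.writeheader()
    writer.writerows(rows)
```

A file written this way opens cleanly in Excel and in any UTF-8-aware tool, which is precisely the universal compatibility the Treasury guidance was after.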

Hushed words

Computer Weekly learned through an unofficial government channel that the UK Cabinet Office, which is responsible for the UK's open source, open data and open standards policy, also blamed Microsoft's software for hindering its work.

"There are several issues with saving UTF-8-compliant .csv files from Excel," said a source close to the Cabinet Office.

Another Cabinet Office source said government data was going out with mistranslated pound signs after being exported by Excel. Government guidance in 2010 said departments should leave pound signs off their payment amounts. But departments still put them in. So their output was garbled. Makgill said apostrophes caused similar problems.
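The mistranslated pound signs follow a recognisable pattern: "£" written as UTF-8 occupies two bytes, and a reader assuming Windows 1252 renders them as two characters. A small Python sketch (illustrative only) shows the garbling, and that it can sometimes be reversed when the encoding chain is known:

```python
# "£10,000" written as UTF-8 but read as Windows-1252 gains a stray Â,
# because the two UTF-8 bytes 0xC2 0xA3 decode as two separate characters
garbled = "£10,000".encode("utf-8").decode("windows-1252")
print(garbled)  # Â£10,000

# When the chain of encodings is known, the damage reverses cleanly
repaired = garbled.encode("windows-1252").decode("utf-8")
assert repaired == "£10,000"
```

Data-cleaning outfits lean on exactly this kind of round trip to repair published spending files, though it only works when no lossy step has intervened.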

These hushed words, by the way, were from officials in a government that stands for transparency. Its transparency only applies in areas where it is in its own interest to cause disruption. That does not extend to itself.

"Microsoft data files are always a bit of a challenge," said Harvey Lewis, head of data analytics at consulting firm Deloitte.

But data quality was not a big issue for Lewis.

The government had rushed its data out in 2010 in respect of Sir Tim Berners Lee's famous geek plea for "raw data now!", made at a 2009 conference for the sci-tech elite in California.

The government had always intended to get its data out first and then clean it up later.

And, said Lewis, open data had been for government primarily an innovation policy - a means to stimulate the economy. For companies like Spend Network to thrive by selling linked-data services built from government data that had to be cleaned up before it could be linked, the public might have to accept that government will go on spewing out raw data.

Treasury oopsy

HM Treasury did indeed tell civil servants they should publish data now and perfect it later. It even referenced Sir Tim's own advice.

"The focus of the guidance is on how, pragmatically, to make the data available quickly rather than seeking to achieve full alignment across every entity," it said.

"Publishing raw data quickly is an immediate priority, but we are working towards producing structured, regularly updated data published using open standards," it said.

People in and around the Cabinet Office said the ongoing problem is that people don't know how to persuade their Microsoft software to output in a universally compatible format. Four years on, they still needed training. And UK data was still rubbish.

But HM Treasury, overseen by the National Archives, established the conditions for its own data initiative to struggle when it issued the guidance that set it off in 2010. It instructed government officers to publish their data in a standard Microsoft Windows encoding. It assumed they would be using Microsoft software. It imagined alternative encodings as a future possibility.

Bigger picture

Even the W3C has meanwhile struggled to establish UTF-8 as a standard way of encoding .csv files on the web.

It set up a working group last December that won't publish its conclusions until August 2015. It does have more to contend with than character encodings. But character encoding was one of its most thorny issues, said Tennison, who co-chairs the CSV on the Web Working Group that is addressing the issue for the W3C.

"We are leaning in the direction of UTF-8," she said. "It should be UTF-8".

ODI has simultaneously been trying to persuade government departments to clean their data up using a tool it produced, and to join a certification scheme to improve other elements of their data publications. Departments have shown little interest, despite the poor state of government data.

Vested Interests

Some departments have been so reluctant to even release data that Spend Network had to demand its release under Freedom of Information law. Wigan Council would only release spend data after the Information Commissioner intervened. The Ministry of Justice fought all the way to an Information Tribunal.

A similar initiative by London Mayor Boris Johnson floundered for six years because civil servants refused to allow their data to be published. The open data initiative was part of the Conservative Party's plan to break up the public sector. Gordon Brown's proposals were not dissimilar.

Johnson put it in his 2008 manifesto with help from Cameron's campaign team. His Greater London Authority's Oversight Committee said last June something ought to be done about London's poor spending transparency. Civil servants were not co-operating. It traced the problem to the vested interests of the companies whose business dealings were exposed in the spending records. Civil servants might also have had an interest in not co-operating with the means of their own demise. The coalition plan has aimed for 80 per cent cuts in operational jobs in the civil service.

The coalition claimed on coming to government that its primary interest was challenging the vested interests of corporate IT suppliers. Those interests have prevented it from even publishing its own data effectively. Its grander plan to challenge vested interests it saw in the public sector was consequently obstructed.

Microsoft would not talk about either UTF-8 encoding or its problem with .csv files.

"Modern versions [of Microsoft software] support the most popular standard document formats including PDF, ODF, and Open XML," it said in a written statement.

This, it said, meant applications such as Excel would export "to other programmes which use open standards". It said people should contact their Microsoft supplier if they had any issues.

Computerized job cuts hitch up hokey austerity figures

Tens of thousands of job cuts allowed the UK coalition government to firm up £14bn of otherwise flaky claims for austerity savings it said proved its frugal, "hard-headed" government was succeeding.

But while it used the numbers to call the last government soft and wasteful, the cost-cutting Cabinet Office Efficiency and Reform Group could not substantiate most of its claims because of poor accounting.

Much of what it could substantiate derived from practices established long before it came to power. And its numbers were inflated with claims for ongoing savings years after the original cuts were made. Aside from civil service job cuts and usual procedures, reliable claims for efficiency savings amounted to almost nothing.

The Cabinet Office said in June it had cut £2.4bn last year by making 70,000 civil servants redundant. That and other staff-related cuts accounted, at £7bn, for the largest share of the £14bn total cuts. But £2.3bn of that came from cuts in pension contributions for those civil servants who remained. That was not really an efficiency saving at all, said the National Audit Office in July. The rest came from cuts in temporary staff and premises that were no longer needed.

Most of the coalition's claimed efficiency savings were simply dubious. The NAO said in July that £5.2bn of them had only weak accounting evidence to back them up.

Of the other £3.7bn savings, £1.5bn was claimed for centralised procurement the last government had been doing since 2000 and had already been earmarked for tightening after the Treasury's 2009 Efficiency Review. Computer Weekly has since raised significant questions about dubious claims for savings made under such arrangements, even under the current government.

Rough numbers

So discounting £4.7bn of staff and related cuts, £9bn of flaky and irrelevant numbers, and £1.5bn for old hat, that left just £613m to celebrate. That included the Common Infrastructure Programme, a continuation of the last government's work to consolidate the public sector computing infrastructure, a £378m cut in advertising and marketing from cuts made as long ago as 2010, and £119m cut by forcing public bodies to close their websites and merge them into a single Gov.uk domain.

"It's a conservative figure," said Cabinet Office minister Francis Maude of the £14bn savings claim when he announced it in June.

Conservative - or coalition - cuts policy relied on the idea that the last, Labour government ran costly, wasteful IT projects. And it would save money by cleaning up the last government's mess. The big story was all about the big stick it was going to wave at big suppliers.

But repeated claims it made for savings it squeezed out of big suppliers were dubious, according to work by the NAO and Computer Weekly last year. NAO said last month the latest numbers were still flaky.

The flakiest numbers were those attributed to the two arms of the Cabinet Office most central to the coalition's big IT savings strategy: the Major Projects Authority and Government Digital Service. The Cabinet Office had ignored NAO advice to clean up its accounting and even had to withdraw savings claims it submitted to the auditor after it transpired they were hokey.

Major projects

Even at face value it was hard to accept MPA claims that it had saved £2.5bn by cancelling and curtailing major projects last year.

ERG published some account of these cuts in a "Technical Note" in June. The NAO commended the effort. But the note's evidence was so scant as to be almost worthless for its assumed purpose under the coalition's "Transparency and Accountability" programme.

It attributed only £220m of its project savings to cancelled projects. That included forecast spending for projects cancelled in 2010.

The coalition made a hoo-ha about imposing a "moratorium" on large projects when it came to power in 2010. But what projects, cancelled when, by whom, for what reason had given these numbers? Without the much trumpeted transparency, it is impossible to say.

Other chunks of its £2.5bn project savings included £270m for "re-scoping", where unnamed departments axed unnamed, low priority parts of unnamed major projects for unspecified reasons.

These unspecified savings might have seemed valid if they had come from projects re-scoped in 2012 or 2013. But Maude's moratorium occurred in 2010. Projects approved recently would not need re-scoping because the MPA would already be advising what was in and out of scope.

The MPA's biggest project saving looked hokey as well. It claimed £1bn for cuts made in "ongoing expenditure" by the Department of Health (DH) Modernisation Programme. But the Modernisation Programme was begun in 2001. The coalition was presumably taking credit now for a savings scheme made by the last government.

Closing up shop

For more substantial evidence of cuts you have to look to the coalition's "Big Society" attempt to close public sector operations and encourage private groups - such as charities - to do the work instead.

Maude said in June the government had distilled its reform programme into five principles: automating public services, squeezing public budgets, releasing public assets as open data, helping private companies step in, and catalysing the whole thing with instructions for public officials to take risks and embrace change.

That story was played out in ERG's report, with public jobs cut where the Government Digital Service could automate them, and public websites cut and merged into the single Gov.uk domain. Public sector advertising and marketing were being cut at the same time. The details were again desperately scant.

Nevertheless, a government that doesn't want government doesn't need to do marketing about what government does, unless what government does is cut government. So you get marketing for government cuts dressed up as transparency, with little more information than you would need to be reassured cuts were being made, and nothing on what and why.

Ultimately, forbidding public bodies their own websites, and cutting advertising and marketing, ties public bodies' hands behind their backs and creates market opportunities for private interlopers.

By closing their websites, the Cabinet Office has cast their punters into the soulless expanse of Gov.uk, an award-winning website with the presentational panache of a Tesco Value tin of gruel.

Grey marketing effectively stops public bodies from competing on the same terms with the private companies coming to eat their babies. It makes government look like the Conservative Party has said it is: colourless, bland, ineffectual, dull, stupid, unloved.

Or as prime minister David Cameron put it in 2010: "top-down, top-heavy, controlling... sapping responsibility, innovation and civic action. It has turned many motivated public sector workers into disillusioned, weary puppets of government targets.

"It has turned able, capable individuals into passive recipients of state help with little hope for a better future. It has turned lively communities into dull, soulless clones of one another," he said.

Perhaps they're right. But where do the dullards, the downtrodden, down-at-heart, the sceptics, the neglected, the elbowed, the broken, the quixotic, the meek and the timid go when all the public sanctuaries have been over-run with brash jostling entrepreneurs and dynamic do-gooders who get things done?

They will end up as poverty-wage scrubbers of course. Underpaid, under-housed, underfed, under-educated, under-nourished, early-death workhouse-pension no-hope lifers such as the people who wipe your toilet seats, and freelance journalists.

Maude's rhetoric is as hearty as an old milk maid rousing morning-after dormas for their spring dawn chores. But what use being enthusiastic about a binary future when you're a zero and not a one.

Data retention snoop law - all you need to know

When the British government claimed emergency snooping powers in July, it did so in contempt of a court ruling that said they ought to be illegal.

Ought to be. The government got its snooping powers all the same. But it stirred up a stink in the process. And rightly so: the affair has exposed how it has taken possession of a part of cyberspace and begun administering it with the civil sensibilities of a colonial power.

The European Court of Justice had outlawed these snooping powers - called data retention - in April. But it outlawed them only on a technicality of European law. It didn't actually revoke the UK powers.

The result was like a sharp telling off. But its repercussions could be momentous. It could decide one of the most important and most difficult questions of our time: where to draw the line on surveillance.

Theresa May, home secretary, responded to the court ruling by staging an emergency that effectively forced parliament to reassert her snooping powers. If she hadn't done this, she may have been forced to give some powers up.

No matter that British and other EU police had been using these data retention powers routinely since 2009. Their credibility had been severely damaged by the hullabaloo struck up by NSA whistle-blower Edward Snowden last June. Snowden's first revelation had been a "Top Secret" US data retention programme just like the one in Europe. What had been routine was suddenly alarming.

Snowden's revelations had coincided with a European Court hearing about data retention. Six months later, after continuous revelations about the US data retention programme, the court released a preliminary opinion condemning it in Europe. It outlawed data retention three months later. It might have been a technical matter that had no direct influence in UK law, but the court used its authority to set a powerful example.

That example was a list of principles to distinguish good data retention law from the kind of regime that would characterize a police state.

Carry on snooping

May's emergency legislation accommodated most of the court's principles. So it was not quite the all-or-nothing stand-off it was widely assumed to be, with the court revoking data retention law and May defiantly reinstating it.

The court's principles were also not as strict as they might have been. It had even applauded data retention in principle. In practice as well, it applauded the basic mechanism, by which the Home Office ordered telecommunications companies to retain records of people's emails, telephone calls and web browsing so that police could look at them.

The home secretary did, however, defy the court on the most fundamental of all the principles it set out to make sure police didn't abuse these powers.

It was hard to understand why she did this. The court had merely said, these data retention laws are fine in principle but they should not be so totalitarian. It merely wanted police to get a warrant before they started rifling through someone's communications records. Just as they must get a warrant before they search someone's house.

May's response to the court was basically, let's carry on snooping and talk about it later. She commissioned a review to soak up any complaints. And she cast her emergency powers as temporary legislation, called the Data Retention and Investigatory Powers Act, that would expire in 2016.

Yet she could easily have implemented the court's most important recommendation if she had a will to do it.

Part of the stink was that May's emergency legislation looked like it hadn't been produced in an emergency at all. It strengthened the Home Office's snooping powers with intricate legal filigree. But it overlooked warrants, which would have been simple to draw up in law.


It may not have been so simple in practice. Police have grown accustomed to doing data searches under a lax regime. It has been so lax for so long that it might have become difficult to impose more stringent rules.

The lax rules do in fact require police to ask permission before they search people's comms records. But they have only to ask a designated police officer. They have consequently been making many more snoops than if they had to get a search warrant. They have been making about half a million comms data snoops a year.

Contrast that with US police, who must get warrants. They have been making roughly 40,000 to 60,000 comms data searches a year, according to David Davis, one of the few members of parliament who opposed May's emergency legislation.

US police have been searching comms records at a rate, per head of population, that was just 2.5 per cent of the UK rate. French police have to get data search warrants as well. They make about 36,000 searches a year - 7 per cent of the UK number in a country of about the same population.
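Davis's per-head comparison can be roughly checked with a few lines of arithmetic. The population figures here are assumptions (approximate 2013 numbers: UK 64m, US 316m, France 66m), and the US figure takes the upper end of his 40,000-60,000 range:

```python
# Comms data searches per year, per the figures cited in the text.
uk_searches, us_searches, fr_searches = 500_000, 60_000, 36_000
# Approximate 2013 populations (assumed, not from the source).
uk_pop, us_pop, fr_pop = 64_000_000, 316_000_000, 66_000_000

uk_rate = uk_searches / uk_pop  # searches per person per year
us_rate = us_searches / us_pop
fr_rate = fr_searches / fr_pop

print(f"US rate as a share of the UK rate: {us_rate / uk_rate:.1%}")
print(f"French rate as a share of the UK rate: {fr_rate / uk_rate:.1%}")
```

On these assumptions the US comes out at roughly 2.4 per cent and France at roughly 7 per cent of the UK rate - consistent with the figures Davis gave.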

His point was that police were making too many comms searches in the first place. So while the high volume of searches seemed to make data search warrants impractical, warrants might prove feasible if they discouraged the unnecessary searches presumed to make up much of that volume.

Sir Anthony May, the Interception of Communications Commissioner, gave some support to this idea in his 2013 annual report. UK police might, he said, have been using unwarranted comms data searches so routinely that they had grown to presume any snooping they did was justified if they thought it was.

"It seems to me to be a very large number," said his report.

"It has the feel of being too many. I have accordingly asked our inspectors to take a critical look at the constituents of this bulk to see if there might be a significant institutional overuse of the powers," it said.

Crime data

Might have been. There was however a rough correlation between police searches on comms data and numbers of serious crimes. The founding principle of the data retention regime was that police could use its powers when investigating serious and organised crime.

In 2013, when police did about 500,000 comms data searches, their tally of the most serious crimes - homicide, rape and violence with injury - totalled 343,873, according to the Office of National Statistics.

And then the rest: it does depend how you define serious crime, but in the 12 months to March 2014, another 20,620 people were caught with a weapon. Vandals and arsonists struck 506,190 times, and thieves stole 75,330 cars. Police also recorded about 400,000 drugs and public order offences. And 211,344 fraud cases, mostly cheque, card and online banking.

Police still might have been doing 40 times as many snoops as they ought to - the implication of the frequency of US searches cited by Davis. But the government has shown no desire to draw a line. The UK doesn't even have a legal definition of 'serious and organised crime', according to the ONS. Even the National Crime Agency, which is charged by the Home Office with tackling serious and organised crime, couldn't say what it was when CW asked. The Home Office had a stab in its Serious Crime Strategy last October. But it was debatable.

A warranting authority would therefore have the same wide discretion as the police seemed to give themselves. It would face the prospect of going from a standing start to 500,000 warrants a year. That might be handled by the same magistrates who issue conventional search warrants. But both the Home Office and the Ministry of Justice refuse to release numbers of search warrants issued. Their capacity is unknown. The way is unprepared. The Home Office has shown no apparent interest, beyond some pretence.

The European court ruling nevertheless condemned the whole data retention regime as disproportionate. That there was too much snooping was, in other words, official.

This conclusion, however, derived primarily from the court's other most substantial criticism of the regime: it retained everyone's communications data indiscriminately, regardless of whether they were suspected of a serious crime.

This had been the whole point of data retention in the first place. As Jack Straw, who as home secretary helped establish these powers, put it to parliament in July: you can't know whose communications you need to snoop before you know you need to snoop them - so watch everyone. It would be too late to go round retaining suspects' comms records once a murder was already done.

Accordingly, the Home Office justified its legislation by citing serious crimes where retained comms data had helped police catch the culprit.

Mobile phone records had helped undermine the alibi Ian Huntley had used to try and evade conviction for the murders of Soham school girls Holly Wells and Jessica Chapman in 2002. They had helped convict a gang of youths for the cross-fire killing of 11-year-old Rhys Jones in Croxteth, Liverpool, in 2007. Likewise, Oxford and Rochdale child grooming gangs were rounded up with the help of comms data in 2013 and 2012.

It sounded obvious when they put it that way. But the court thought the power to track everyone's comms all the time just in case they committed a crime was not a trivial matter. This used to be what totalitarian regimes were supposed to do.


Used to be. Now computers have made it more feasible for liberal states to intrude in people's private lives, they have established a regime that captures everyone's comms data. The ultimate justification of this was that it was possible. It is the age-old vindication of power: because.

Totalitarian policing is not normally so easy to justify. Conventionally, if police had the resources to put a roadblock on every street, taking statements from every person they stopped about the who where when of their journey, they would no doubt catch the odd child murderer. If they conducted house-to-house searches on every street, kicking in every door and rifling through every drawer, they would no doubt catch some more.

If the Home Office ever trialled such a scheme, it would catch a paedophile and say, see how this totalitarian power is justified? It would thwart a terrorist attack and say, this would not have been possible without an omniscient state.

So society has assumptions against things like blanket surveillance and heavy-handed policing. Human rights law made principles of them in the real world. But in cyberspace, the data retention regime has dispensed with them swiftly.

This suggests those assumptions were never so much principled as pragmatic. The "right" merely marked the limit of all that the authorities found it was practically possible to do. As human rights lawyer Geoffrey Robertson QC has put it when describing the frailty of those rights we do have, states usually find it more convenient to be pragmatic than principled.

The data retention regime for this reason extended police powers into areas of cyberspace where they would normally seem oppressive.

Computers made blanket surveillance possible and harder to refute.

You can rake 6m people with a dragnet stitched from algorithms that look for certain traits, and patterns of behaviour and biography, where before such activities were only possible by the use of extreme and massively overwhelming force.

Totalitarian police powers may have been tolerated only by fascist, military and imperial states, because only they had no qualms about using force to impose more stringent rules than would otherwise be pragmatic.


Cyberspace is meanwhile becoming a fully-fledged correlate of the real world. In terms of assembly, you can pretty much do in cyberspace what you can do in person. You can meet. You can talk. You can plot a socialist revolution if you happen to have realised it is the only way to get equality. In the physical world, if you plot revolution in a public square, the authorities can watch you: because they can. In a house they need a warrant.

In replicating these powers in cyberspace, the Home Office claimed special circumstances: it worked on the premise that the usual rules didn't apply.

It was able to do this because the data retention regime established an in-between space - a limbo between cyberspace and the real world, where matters such as who owns what and who can do what are still being worked out.

This limbo was constituted of meta-data, a cloud of attributes that describe things in abstract, impersonal terms that allow them to be categorised and processed by computers: colour, shape, size, time, duration, name, preference. Meta-data gives computers a handle on the real world.

The data retention regime captures meta-data about people's communications. It records who spoke to who, when, from where, and for how long. It doesn't record what was said. It just records the major attributes of the communication.
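The distinction between a communication and its attributes is easy to make concrete. This is an illustrative sketch only - the field names are invented, not the actual retention schema - of the kind of record the regime keeps: who, when, where, how long, but never what was said.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CommsRecord:
    """Illustrative comms meta-data record: the attributes of a call,
    as a data retention regime might store them."""
    caller: str        # originating subscriber identifier
    callee: str        # receiving subscriber identifier
    started: datetime  # when the communication began
    duration_s: int    # how long it lasted, in seconds
    cell_id: str       # rough location via the serving cell tower
    # Note there is deliberately no 'content' field: the regime
    # retains the attributes of the conversation, not its words.

record = CommsRecord("07700900001", "07700900002",
                     datetime(2013, 6, 5, 23, 0), 5400, "cell-4721")
```

Even without content, a database of such records for all people at all times yields the who-where-when picture the article goes on to describe.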

The regime raised a question: who should have rights over this comms data? Before data retention law demanded its retention, it was so ephemeral it almost didn't exist at all. Your old emails might be stored in your inbox archive for a while. Your old texts might be in your phone memory. The audit trail of these comms was not something that was purposefully retained. It was like Eric Dolphy, the outlandish jazz musician, said of music: "After it's over, it's gone, in the air. You can never capture it again."

Civil liberties campaigners have been trying to keep it that way. But the data retention regime effectively put comms data in limbo by ordering its capture.


The regime staked its claim by contrasting comms meta-data with the actual private comms it described. Private conversation had migrated to cyberspace with its assumptions of privacy intact. Police can't listen to them in either domain without a warrant. But comms meta-data was different. That someone said something is less private than what they said, so it should be afforded less protection: that's the idea.

Hence the data retention regime treated comms meta-data as though it were the sort of information police would get in the real world by watching over a public square with a pair of binoculars, or by sitting in a car outside somebody's door: who goes in, who comes out, how long they stay.

Anybody can sit outside a house to see who goes in and out. This is a civil liberty because it is possible for most people to do it, and difficult for anyone else to stop them. This sort of snooping occurs entirely in the public domain, the social domain, the world outside: in the street - that place where the mob rules when civil society doesn't.

So police tend to do this sort of surveillance in the physical world, lest vigilantes and lesser thugs do it. They do it without a warrant, sitting in their cars with their coffee and doughnuts, and their walkie-talkies or whatever.

They can do this sort of surveillance without a warrant in cyberspace as well, thanks to the data retention legislation. The regime has assumed for police the same sort of power.


This view, however, is false. Searching somebody's comms data is more like seeing inside their house than watching their door.

It is not like knowing merely that two people are in the same house. It is like knowing that two people in the same house were in the same bed talking together, say, from 23:00 hours until 24:30 - and knowing who they are, and who else they spoke to that day, where they spoke and for how long, and for all people, in all houses, at all times.

Cyberspace has its public squares, its forums and mail lists, where police and anybody else can watch at will. But data retention law has given government agents - who are almost entirely police - snooping rights over private meta-data as though it resided in a public space. It assumed snooping powers as a fundamental right of police. In the real world though, police don't have a right to snoop: it is a privilege extended to them under warrant, in respect of their work as agents of civil society.

By defining the comms meta-data limbo as less than private, the Home Office allowed police to circumvent those rules that normally govern their access to private data, and it made this sphere a place where only police could go. It was like they had put a digital panel on everyone's front door that delivered up the who where when of intimate conversations but only to people employed by a security agency.

The home secretary and her supporters made a song and a dance about the old-world rules when they put their emergency legislation through parliament. Police would still have to get a warrant to actually listen in on people's conversations. But their boisterous assurances distracted attention from the fact they had let their meta-data regime sidestep the rules.


And even the system of interception warrants looked dysfunctional. The home secretary administers those herself. Every single time a police officer wants to listen in on somebody's conversation, they have to ask permission not from a magistrate but from the home secretary.

May claimed her oversight of this was rigorous, when she laid out her justification for the regime on 24 June.

"The warrant application gives me the intelligence background, the means by which the surveillance will take place, and the degree of intrusion upon the citizen," she said in a speech laying out her plans at Mansion House, the City of London's municipal centre.

"I do not take my responsibilities lightly. I approve warrants only on the basis of detailed intelligence and a reasoned explanation of their likely benefit. Sometimes I demand more information before taking a decision or I make my approval conditional. On some occasions I refuse the application.

"On the basis of a detailed warrant application and advice from officials in my department I must be satisfied that the benefits justify the means and that the proposed action is necessary and proportionate," she said.

She did this on average eight times every day in 2013, according to statistics in the Interception of Communications Commissioner's 2013 annual report. She approved 2,760 interception warrants that year, it said.

The oversight of the regime looked a mess from this perspective, from a conventional view of computer-led policing: where automated methods create activity in such high volumes that it overloads people-powered circuits designed in an age when magistrates made polite enquiries before signing pieces of paper that said yay or nay to a gentlemanly request for permission to snoop. That is of course assuming the police aren't simply doing too many snoops: that the benevolent oversight of an actual person with judicial authority is not just a memory of bygone times, bygone civilisation.


The court ruling seemed, though, to imply a solution to this problem: a legal framework for a system of oversight so complex that it could only be computer-powered. The home secretary's emergency legislation was in such close accord on this that there can have been no real disagreement between her and the court at all.

Indeed, with the exception of data search warrants, and when it came down to the letter of the law, the court judgment and May's legislation were effectively identical.

The court's over-riding concern was that the data retention regime should not be indiscriminate. But it did not quite say the regime should not retain everyone's comms data: just that there was no need to retain all data about all people for equally as long.

It wanted the regime to treat people's calls, text messages and emails differently. Likewise data that identified a suspect and that identified their buddies. Or suspect and non-suspect. It wanted the confidential work of lawyers and journalists to be afforded some sort of protection from police snooping. The trouble until now was it didn't matter if you were Ian Huntley or Joan of Arc: it was all fair game, and all stored for just as long, and all made accessible to police under the same lax terms.

Civil liberties campaigners and humanitarian-minded members of parliament like the Green Party's Caroline Lucas liked this idea that the regime should discriminate. Blanket surveillance has become emblematic of totalitarian oppression, of pre-crime science fiction where smart, wealthy people decree that you should watch everyone else just in case they upset the established order. Some MPs were concerned this legislation might not respect their own professional right to privacy.

Blanket surveillance sounded for a long time like the worst of all possible worlds. It threatened deranged vigilance over every nook and cranny. But it transpires that the alternative is no maypole either. The alternative is medieval.

The court's alternative would impose gradations of vigilance over people's communications. The degree of vigilance would reflect the risk that they were up to no good: whether somebody was connected to a public security threat, or perhaps associated with a certain place at a time of threat, or just a place of special importance, or circle of people, or particular known individuals - people perhaps already identified as suspects in a serious crime, or who were otherwise identified for whatever reason as people at greater risk of being involved in a serious crime than, say, the last person but maybe not as much as the person before that. These were all distinctions specified in the European court ruling.


So if you had the good fortune of an upbringing, sailed through school, got a decent job, happy marriage, prosperous children, nice house, safe town, glowing reports, community work, public commendations, right place, right time, antibacterial acquaintances, the Home Office would retain your comms data for the briefest time of all, and this would effectively make it less likely that police would search it. To you, the data retention regime would seem to be defined not by gradations of discrimination but gradations of liberty.

The court said these gradations should be given the force of law. To the common thinking humanitarian, this implied that every gradation of vigilance would be written down in law: a tight and therefore just definition: who got watched more closely and who got let off, with each category of liberty hammered out by public consultation, committee, draft definitions and hours of parliamentary debate.

But the court wanted the regime to discriminate by fluid factors such as time and social group. Police, immigration, intelligence and military agencies have meanwhile long been developing computer systems that discriminate between people according to a statistical measure of risk they might cause trouble.

The Home Office's 2010 to 2013 strategy for Science and Innovation in the Police Service, for example, said: "In the last two decades, the police service has developed new strategies of crime prevention and control such as problem orientated policing and hot spots policing.

"It has pioneered new analytical and statistical techniques to support these. Crime analysis is now an established part of core police business and new methods are being explored to increase its predictive power."

The data surveillance regime might likewise satisfy the court's impossible demand for granular discrimination by employing computer algorithms to classify who should be watched vigilantly.

Such a system might match biographical and security databases against social patterns and behavioural indicators. It would retain more communications data for longer from those people who appear most likely to murder children or hang themselves.
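The tiered regime described above can be sketched in code. Everything here - the factor names, weights and thresholds - is invented purely for illustration; nothing reflects any actual Home Office system:

```python
# Hypothetical sketch of risk-tiered data retention. All indicator
# names, weights and retention tiers are invented for illustration.

def risk_score(indicators: dict) -> float:
    """Combine weighted boolean risk indicators into a score in [0, 1]."""
    weights = {  # invented weights
        "linked_to_active_operation": 0.6,
        "present_at_flagged_location": 0.25,
        "associates_with_suspects": 0.15,
    }
    score = sum(weights[k] for k, v in indicators.items() if v)
    return min(score, 1.0)

def retention_days(score: float) -> int:
    """Map a risk score to a retention period (tiers invented)."""
    if score >= 0.6:
        return 730   # two years: closest vigilance
    if score >= 0.25:
        return 365
    return 90        # briefest retention for the 'squeaky clean'

profile = {"linked_to_active_operation": False,
           "present_at_flagged_location": True,
           "associates_with_suspects": False}
print(retention_days(risk_score(profile)))  # -> 365
```

The point of the sketch is only that such gradations are trivially mechanisable: once the weights exist, the hierarchy of liberty writes itself.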

Dissent

It would also collect more comms data - and keep it for longer than average - from people deemed most likely to shoplift from Tesco, splash paint up an investment banker's Bentley, sneak a spliff in the park, keep vigil over a policeman connected to a death in custody, abseil down a coal-fired power station, sabotage an arms fair, be the subject of a slander etched in the police rumours database, have sex in a public toilet, go naked or just go incognito.

And it would employ general definitions as well, like all those travelling on high-speed trains, walking within the square mile of the City of London, attending protest rallies, visiting Pakistan or Yemen. And maybe those such as members of the Communist Party, school janitors, buyers of teenage porn, whoever associated with anyone who attended a certain radical mosque during a certain period of time; or maybe all mosques at any time, only with different gradations of risk according to how radical they are. There is most probably a radicalism indicator used to classify mosques. When the UK's "Terrorism Threat Level" edges up, so will the degree of vigilance taken over data related to anyone travelling by air, and so on. Everyone probably fits into some serious risk indicator somewhere. So watch everyone.

This was the sort of regime envisaged by the court judgment. And for all the hullabaloo in parliament when the home secretary shoved her emergency measures through, this is just the sort of granular discrimination her legislation imagined: as much on paper at least as the court demanded.


The real trouble with data retention seems to have been beyond either the wit or remit of the court. For the regime did pose a serious humanitarian threat like those instinctive critics of the surveillance state expected. But the threat was contained in the very solution the court proposed, and that the Home Office was so ready to deliver.

The trouble was that the court left the Home Office to define the gradations of vigilance that determined who it watched more closely than who. Its only condition was that the home secretary herself should make the final decision. She would do this by issuing retention notices to communications providers, just like the one Snowden leaked last year. These would say what data should be kept and for how long.

So the Home Office stitched gradations of vigilance into its definition of a retention notice. Very specifically, it gave the home secretary power to create a retention notice around any desired category of data, and to describe it with any criteria she desired - whether that be time, place, type or whatever else. And it said she must set for each such category a shelf life: how long it should be retained for police use.

Thus the regime would accommodate changing situations, as it must if it was going to be granular. The home secretary could issue blanket notices if she wanted, telling comms companies to treat all phone records the same way regardless of where, when and who. More likely, she would use blanket notices to form a basis of retention that applied to all people, and further notices to impose greater vigilance over people who fell under the domain of specific security alerts or police operations.

Her notices could even be computer-generated: produced from deep analysis of crime trends, risk barometers and sociological intelligence, such as has preoccupied police science and technologists for "decades". Perhaps the notices would grow over time a distinctive shape, as police operations and statistical experiment determined that one set of people - perhaps defined by their biographical or psychological misfortunes - was always more suspect than another: a hierarchy of liberty.

The operational result would anyway always be the same, whether the gradations were determined by state-of-the-art, computer-powered risk analysis or by a Home Office team of actuaries using slide rules and tables of social logarithms. They would produce a retention notice for the home secretary to sign and send to a communications provider, and presumably a supporting report of evidence to reassure her pen.


This all made the parliamentary hullabaloo look farcical. It was fed by human rights groups who raised an alarm over the emergency legislation because, they said, it did not discriminate. But it did. So May's opponents were protesting against legislation they said they wanted, demanding instead a law just like the one they opposed.

They were led by Isabella Sankey, policy director of human rights group Liberty, who said May's legislation was in "direct contradiction" of the court, which had said "blanket indiscriminate retention of communications data breached human rights."

The court ruling did beg to be interpreted this way. The court was alarmed that the regime watched "practically the entire European population". But it did not say it shouldn't happen.

On the contrary, it merely said people's data should be retained only when it could be clearly justified in the pursuit of serious criminals. That might mean people's data was retained because they were suspected of being connected to a serious crime. But, the court said, it might also apply to anyone at all if that might contribute to the "prevention, detection or prosecution" of crimes for some other reason.

The court ruling would in other words allow a home secretary to order the retention of everyone's data on the basis that you wouldn't know in advance whose data you needed to retain until after a crime had been committed.

Thus the home secretary, the shadow home secretary and her predecessors argued in parliament for a base retention period - for blanket retention. But it would still discriminate: non-suspect people's data would be retained, but not for as long as suspects' data. That was no less discrimination than the court demanded.

On the face of it, the humanitarian position was that there is a class of people who shouldn't have their records retained at all. That we should not all be treated as suspects implies either that nobody should have their data retained or only some of us should.

The data retention debate was therefore a squabble over how briefly the average person's data should be retained, and whether there was a class of people so squeaky clean that their data should not be retained at all. But since government, court and humanitarians were all in agreement that the regime should discriminate, they didn't really disagree about much. Their squabble was over the extent of liberties enjoyed by the privileged.

Class discrimination

The trouble with the data retention regime was not then that it didn't discriminate enough. Its trouble was it did not disclose enough about the discrimination it did.

It would not disclose the reasoning by which it set its lines of discrimination. Police and intelligence agencies would not publish the algorithms that do their social sorting as open source computer code. The Home Office even refused to publish the actual retention notices it issued. So the public would have no idea what data was being retained and for how long. They would not know what class of people - in both the mathematical and the sociological sense - the Home Office was watching more closely than who.

Their justification for such secrecy was that if people knew how their algorithms determined what was suspect, criminals would know how to act to avoid being netted by them.

But this was a demented logic. It would keep a shadow over the limbo-world their legislation made of meta-data. Because meta-data might have formed the base substance of this limbo. But its structures - its landscapes and thoroughfares - would be defined by the algorithms that categorized the things the data described. A class of comms meta-data whose attributes described patterns of statistically normal, law-abiding behaviour would in effect form the grand parades of this region of cyberspace, where people who tick all the boxes stroll unmolested by civil authorities. And a class of meta-data that described behaviours however more likely associated with crime would form the back alleys more than usually frequented by those who in the language of security service computing are called anomalies.

The regime's secrecy casts a shadow over the grand promenades as much as the seedy alleyways. In the real world, of course, the streets are lit. Criminals, when they're not doing their shopping and taking the kids to school, operate in the shadows. Everyone knows where the High Street is. Everyone knows those places where it's less usual to hang about. Everyone knows what constitutes normal behaviour in any given place. The patterns are familiar. Everyone knows the rules and the risks.

Civil authorities don't cast their streets in shadow to bring criminals out. They turn lights on so everyone can see.

It is police state logic that keeps the algorithms of mass surveillance secret. It is fearful, suspicious and mean. It is report your neighbour and bug your friend. It is a phone call to the police from behind a lamp post, a complaint to the council from a gap in the curtain, and satisfaction from the drone sound that follows. It is a convenience for a society atomized by techno-powered individualism: watch everyone so they don't have to look after one another. It is the ultimate convenience of the consumer society.

The shadows, however, are also where police and state abuses go unchecked.


The likelihood of police abusing their powers was elaborated last week in a radio interview with Gareth Peirce, a solicitor famous for her defence of people wrongfully imprisoned.

She had exposed how police fabricated evidence against striking miners in the 1980s. She put their actions down to institutional prejudice.

"The miners' strike was one example, which is you have made a whole community suspect," Peirce said on the Radio 4 programme, A Law Unto Themselves.

"One can see it happen again and again in this country. The West Indian community in Notting Hill was made such a suspect community. It's frequently said the whole of the Irish community, over 25 or 30 years, was similarly criminalised. And it's said now that the Muslim community in this country is en bloc made suspect."

This came up when parliament debated emergency data retention. Katy Clark, Labour MP for North Ayrshire and Arran, asked the government what it had done to stop the state abusing these snooping powers for political ends. She feared a repeat of the past, where the security services had taken sides against unionised workers in industrial disputes, as they had in the miners' strike.

"The miners were considered to be the enemy within, and much of the rhetoric we hear from Government Members considers trade union activity and people who use democratic means to assert their rights to be a threat to the state," she said.

James Brokenshire, the Home Office minister handling May's legislation, seemed to treat the whole idea as a joke.

It is even likely that social vivisectionists at the Home Office had already determined that union activists were a greater risk to society than, say, managers who paid themselves too much.

But Brokenshire couldn't answer Clark's question. He had no assurances to give against his regime being used for political repression. He would only - in unfortunately patronising parliamentary tones - say that the Interception Commissioner would keep an eye on it.


The aristocratic manner of the Interception Commissioner's oversight was one of the more substantial complaints campaigners made against May's emergency legislation. His office inspects just a fraction of police searches on comms data. It does this by reviewing police records of data search applications after the searches have been done. This is a far cry from the court's demand for individual searches to be vetted beforehand by a third party. They are vetted beforehand by a dedicated police officer.

Jo Cavan, head of the Interception Commissioner's office, told Computer Weekly it reviewed 15,000 such police applications last year. That was 2.9 per cent of total requests made. Each request moreover might contain a much larger number of individual requests to search comms records.

The civil liberties groups were concerned that such lax oversight had missed innumerable erroneous snoops. The Commissioner himself said (vaguely) that he received 869 reports last year of errors police made when searching people's communications records. He found another 101 erroneous snoops in his routine inspections.

Of those errors the Commissioner did handle, half involved snooping on the wrong people, and two led to raids being made on the wrong people's houses.

"I don't think we would say it's a high rate of error," said Cavan. "Because when we inspect small public authorities, we inspect 100 per cent of everything and in larger authorities we inspect 10 per cent."

The Commissioner inspected only 75 of 214 authorities permitted access to communications records last year.

He nevertheless found that of those cases he did review, a majority of errors were made by police.

This detail was obscured in the Commissioner's report. While it did say 99.2 per cent of search requests were made by police and intelligence agencies, the majority of errors - 87.5 per cent, it said - were made by public bodies.

Other public bodies such as local authorities and central government departments have data retention powers as well. The Home Office blamed them for the errors. But Cavan told Computer Weekly that the majority of errors were made by police agencies.


Introducing her "safeguards" to address problems raised by the court and her Commissioner, May said in June: "We have stopped local authorities using electronic communications data and other surveillance techniques to deal with a raft of relatively trivial problems."

She later told parliament her safeguards would protect public privacy and "reduce the number of public authorities able to access communications data."

A single police agency was responsible for 60 per cent of the 101 errors unearthed during the Commissioner's routine inspections, said Cavan. The source of the other 869 reported errors is not known because the Commissioner has not broken it down.

"We need to be vague," said Cavan. "There's huge problems with the record keeping requirements. The overall numbers are flawed."

Still, she said: "The majority [of errors] were from police forces or local authorities. Local authorities were statistically quite high last year. But they weren't this year or we would have made a specific comment.

"So it would suggest there are more police force cases in there," she said.

Local authorities meanwhile did less than one per cent of all comms data searches recorded last year. Though they comprised 62 per cent of those public bodies permitted to search retained data, they were responsible for just 1,766 of 514,608 searches done.

Other public bodies like central government departments, which account for 12 per cent of authorised bodies, did less than 1 per cent of searches as well.

The 54 police agencies, which account for just 25 per cent of authorised bodies, did 88 per cent of all searches. With the three intelligence agencies (MI5, MI6 and GCHQ), police and intelligence agencies accounted for 99.2 per cent of searches.
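The shares quoted above can be cross-checked with simple arithmetic (all figures as reported by the Commissioner's office):

```python
# Back-of-envelope check of the comms data search statistics quoted above.
total_searches = 514_608          # all comms data searches last year
local_authority_searches = 1_766  # local authorities' share

la_share = 100 * local_authority_searches / total_searches
print(f"{la_share:.2f}%")         # well under one per cent

police_share = 0.88               # police did 88 per cent of searches
print(round(police_share * total_searches))  # roughly 452,855 searches
```

So the bodies that made up 62 per cent of those authorised accounted for about a third of one per cent of the searching - the disparity the Home Office's blame-shifting relied on nobody checking.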


There was a greater problem with the Commissioner's oversight, however, than its being slight. It was short-sighted as well.

It was short-sighted not only because it did not scrutinize the reasoning - the algorithms - the Home Office used to generate its retention notices and determine who should be watched more closely than who.

It was short-sighted because it did not oversee people's comms data after police had obtained it, bar some cursory interest noted in the Commissioner's annual report. The oversight primarily scrutinized the way police acquired comms data. Granted, it was concerned with proportionality, a principle of human rights law: that police didn't get more data than they really needed. But it verified in effect the accuracy and efficiency of police data searches and strayed no more.

The court had raised an issue about this, momentarily. An old maxim of data protection law held that someone could only use any data they acquired for the purpose they originally acquired it. And when they had finished with it for that purpose, they were supposed to delete it. They were supposed to tell people what they were doing with it as well. The court said this meant police should tell people what had been done with their data when there was no longer any need for stealth.

This might mean, for example, police couldn't stuff suspect comms records into a database for future reference. They couldn't share them with another intelligence or police agency. They couldn't load them into a system that analysed social networks and patterns of behaviour.

When Pedro Cruz Villalón, ECJ Advocate General, delivered the court's preliminary opinion on data retention last December, he upheld these rules against the regime. But the court watered them down for its final ruling in April.

The ruling said only that police should have good reason for putting data to subsequent use. The only reason they could keep hold of retained data, it said, was for detection and crime prevention. And only for "precisely defined serious offences". The UK Criminal Justice Act 2003 defined these as offences punishable with 10 years or more imprisonment - crimes such as the Soham murders that now justify the collection of all comms data for all people in case it might be used to detect and even prevent such crimes in the future. But the UK legislation used the justification of organised crime as well.

People data

With British police doing about 500,000 comms data searches a year, they would have done approximately three million comms data searches by the time the emergency legislation ran out in 2016. Actual numbers of accesses are not published because, as the Commissioner said, police don't keep proper records. But with many police applications for data access containing many more individual data requests, the number of people searched could be much higher than records initially suggest.

Those people made subject to police data searches were thus disregarded by the system of oversight that was meant to protect them. The system recorded no tally of people data-searched. It made no attempt to report the categories of people whose data is retained and searched. The oversight was conducted from the perspective and for the sake of police operations.

Even the Commissioner's proposed reforms of police search statistics did not address this problem.

"We have consulted with the Home Office and set out the revisions and enhancements of the statistical requirements that we believe are necessary both to assist us with our oversight role, and, to inform the public better about the use which public authorities make of communications data," he said in his report.

The Home Office and Commissioner concluded from this non-public consultation that their oversight of data searches would be improved if police were required to keep numbers of the total applications they submitted. Not people, but the applications that each contain many data requests that may concern numerous people.

The situation might also be improved, they concluded, if police were required to record the number of items of data they requested, and whether it was done for crime detection or prevention, or for the sake of national security. But not people.

They did want police to record what type of crime had justified their data search - whether it was for the sake of murder and whatnot. But they didn't care to record even numbers of people.

Back of an envelope

A conservative estimate would equate three million comms searches to 6m people, on the basis that most comms occur in two-way conversations. But an audit trail of somebody's comms for police purposes would cover more than one conversation. It might include all emails sent in a day, all mobile phone calls in a given week. It is not absurd to imagine police and intelligence agencies populating their statistical crime analysis and prediction systems with enough comms data to map the social networks of a significant portion of the UK's 64 million people - perhaps even all suspect people. From another perspective, perhaps, all the underprivileged, stooping lower now under the weight of one less liberty.
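That estimate works out as follows - bearing in mind the two-people-per-search multiplier is this article's own rough assumption, not an official figure:

```python
# The back-of-envelope sum, made explicit.
total_searches = 3_000_000   # ~500,000 searches a year, up to 2016
people_per_search = 2        # most comms occur in two-way conversations

estimated_people = total_searches * people_per_search
print(estimated_people)      # -> 6000000 people touched by searches

uk_population = 64_000_000
print(f"{100 * estimated_people / uk_population:.1f}% of the UK")
```

Even before counting the extra conversations swept up in each audit trail, the naive lower bound is nearly a tenth of the population.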

The fundamental problem with the data retention regime, said the court, was that it allowed police to draw an exhaustive map of people's private lives, an intimate portrait of their private identity.

The effect would be to cast a chill over people - a "feeling that their private lives are the subject of constant surveillance", it said. The problem with this was that it suppressed people's freedom of expression.

It would encourage them to conform to the narrow world view conceived by social actuaries in Home Office uniforms and knitted into algorithms that describe a pattern of officially sanctioned human behaviour. That is just the effect of the data being collected. That is even before the authorities put their collected data to use.

Sex and drugs

The court ruling implied a need for dissent to be possible. It implied a time when homosexuality was still illegal, or when women were still denied the vote. It implied those boundary regions of social comprehension where in recent decades transsexuals have made a stand against discrimination. Or those margins of the law where ongoing widespread transgression may yet force the state to concede people should be granted the liberties they already take for themselves, such as sex workers and drug takers. It is currently possible that drugs and sex work will be decriminalised, but only because large numbers of people persist in defying the law - just as persistent illegality eventually defeated the oppression of homosexuality, because people could.

The court ruling implied prejudices that are yet so ingrained we cannot see them. It implied perhaps the gross inequalities of wealth that might yet only be redressed if enough people use their freedoms to defy the state, in open dissent. But only if they can.

"You do think sometimes that society learns lessons. But that isn't so. All there is is the ability to be constantly alert that all the danger signs are there," said Peirce, the human rights solicitor, on the Radio 4 programme dedicated to her last week.

Asked about her work defending victims of wrongful imprisonment such as the Guildford Four and the Birmingham Six, she drew parallels between the internment (and torture) by British police of Irish Catholics in the 1970s and again of terror suspects in Belmarsh Prison in Woolwich, South East London, in the last decade.

"... internment ... Having said it would never be used again in this country, we lock people up indefinitely without trial, and our government lawyers argued in addition that the government should be allowed to rely on evidence derived from torture.

"In the 21st Century, we were having to argue that it shouldn't be used, against our government's lawyers," she said.

That is not to tar the entire establishment with the same brush. Nor should the exception(s) prove the rule. But Peirce's career demonstrates how those with power cannot always be trusted.

Us and us

The data retention legislation admitted only that those without power cannot be trusted. It gave police granular insight into our lives while subjecting them only to the most cursory oversight. The legislation itself makes us into us and them and them.

The court ruling implied a right for us to protect ourselves from them. But the home secretary's justification of her legislation, set out in her Mansion House speech in June, was based on a desire to protect them from us.

Most of the reasons she gave for doing more powerful comms surveillance were the threats of terrorism that had been caused or worsened in the first place by their military invasions and their drone assassinations.

The legislation was necessary to protect us, she told her City of London audience, from the disaffected people of "Iraq... Afghanistan... Syria... Yemen... Pakistan... Libya".

Blinded by this idea that she was with us against them, the home secretary gave operational oversight of comms data snooping to the same police and intelligence agencies doing the snooping.

These were the police authorities in whose custody 17 people died last year. These were the intelligence agencies who helped fabricate the case for war against Iraq. These were the overseers who held the British flag aloft while people were kidnapped and tortured under it.

And this has been overseen, in confidence, by a house of parliament that gave us gross inequality, expenses fiddling, cash for questions, cash for honours, cash for lobbying, cash for policy, the surveillance state, an alleged paedophile-ring cover up, back-room security deals, an imperial legacy, and war, war and more war.

It all comes down to trust ultimately. Police like to be trusted. As people, like us, they deserve to be trusted. It is safe to assume that like us, they mostly mean well.

Even those police who have already abused their data access powers were most likely trying to do good. Like those tabloid journalists at the News of the World who hacked the mobile phone of Milly Dowler, the 13-year-old who was abducted and murdered on her way home from school in 2002. Anyone who knows journalism knows those journalists were doing their bit, in this case at least, to look for the murderer. They just let their crusading zeal carry them over the line.

Carried away

The home secretary may have been similarly carried away when she laid out her justifications for claiming comms data surveillance powers.

She illustrated her case by citing how police had arrested 2,500 people on terrorism charges between September 2001 and 2014. This was supposed to warrant more unwarranted snooping. But she forgot to say the conviction rate for arrested terrorist suspects was just 16 per cent. In 2009 it was 4 per cent. In comparison, the average UK conviction rate for all reported crimes - from theft to murder - since 2003 was 80 per cent.
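Setting the cited arrest figure against the omitted conviction rate makes the gap concrete (all numbers as quoted above):

```python
# The arrest figure May cited, set against the conviction rates she omitted.
arrests = 2_500                  # terrorism arrests, Sept 2001 to 2014
terror_conviction_rate = 0.16    # 16 per cent convicted
general_conviction_rate = 0.80   # ~80 per cent for all reported crime

convictions = round(arrests * terror_conviction_rate)
print(convictions)  # -> 400 convictions from 2,500 arrests
```

Roughly 2,100 of the people whose arrests were offered as justification were never convicted of anything.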

Similarly, May and her supporters in parliament - such as shadow home secretary Yvette Cooper, who backed the emergency legislation eagerly - used misleading information to persuade other MPs to accept it.

They tied the warrants police must seek before placing a wire tap so closely to their case for warrantless comms data searches that parliament was effectively reassured warrants would cover everything, when they would not.

Confusingly, they also made a case for warrantless comms data searches that was morally impregnable but factually false: life or death emergencies.

They used the example of a child who told a telephone helpline he was going to end his life. Police got his internet address and arrived at his house just in time to cut the rope before he died.

But they forgot to mention that police already have powers to act on their own discretion in an emergency. They can even tap someone's comms without a warrant in an emergency. They just need verbal confirmation. In fact, police used their emergency powers to search people's comms records 42,293 times last year. That was roughly the same as the total number of comms accesses done by US and French police in an entire year. They would have the same discretion to search comms data in an emergency even under a warranted system.

Spurious evidence

Another spurious fact the Home Secretary used to sell her legislation was that police had used comms data in 95 per cent of all serious crime cases. Therefore, the argument went, it was vital.

But police have had unwarranted access to this data since 2009. Of course they will have used it in the majority of cases. The behaviour does not justify itself. May's justification was simply because they can.

Take middle-aged women, for example. They earn between 18 and 30 per cent less than men in similar jobs. The gap was about the same 10 years ago.

Overall wage inequality has meanwhile been worsening for 30 years. Inequality is rife. But that does not justify inequality. Those with power permit inequality because they enjoy its fruits. Because they can.

So while a rough correlation between the number of police comms accesses and the number of serious and organised crimes might suggest their snooping was roughly in the line of duty, it seems foolish to allow the police to take unbridled power, and to allow them to wield it in secrecy.

But if 500,000 comms data snoops is the product of reasonable policing, that raises a question about how to do oversight at such a large scale.

Of course, if police have had the resources to manually vet their own snoops, it stands to reason that magistrates' offices could find the manpower to vet them as the court intended. Unless it has been more feasible for police to do the vetting because the volumes are too high for manual oversight to be done properly. Computers would then be the obvious answer, for oversight of high-volume snoops on a high volume of information age comms conducted by a high population of people. The oversight would have to be risk assessed to make it manageable - to make it comprehensible to a computer. It might even give the public the same powers of oversight over police as vice versa: a public watch, more stringent on any police officer ever associated with a death in custody, say, or a complaint about racial prejudice or a shoot-to-kill: like Big Brother, only in reverse.

Some police activity is already risk assessed, behind the scenes at the Home Office, by the same systems that determine who is suspect enough to be pulled up at airport check-in, or who has their communications records retained for longer than anyone else in the first place.

Such systems would know the degree of certainty with which a particular person or group has been targeted by surveillance. The statistical measure of risk that might determine that known anarchists within spitting distance of Whitehall, say, or known paedophiles within sniffing distance of a school, were a greater risk than vice versa: that measure would also signify the degree to which the estimate could be trusted. So the data accesses might end up justifying themselves: by the probability of the risk estimate. The Home Office might therefore think it need be answerable to no-one. But the public will only have proper oversight when they are told where the lines of discrimination have been drawn and with what degree of certainty, and when they are permitted to scrutinize and contest the intelligence systems and the algorithms themselves.

Drone kill communications net illustrated

SIPRNet Backbone - Europe - Secret Internet Protocol Router Network - 2004.png
Computer Weekly can illustrate how a UK network connection forms part of a US weapons targeting system that has slaughtered civilians in anti-terrorist attacks gone wrong.

The illustrations add credibility to a legal challenge begun last month over a 2012 contract BT won to build the UK branch of the system - a fibre optic network line between RAF Croughton in Northamptonshire and Camp Lemonnier, a US military base in Djibouti, in the Horn of Africa.

British officials had been slow to finger the BT contract under human rights rules because they said there was no evidence to suggest the UK connection was associated with US drone strikes, let alone any that had gone wrong.

There is however clear evidence that the UK connection is part of a global intelligence and weapons targeting network that operates US drone missions like a hand operates a puppet.

The network was meant to make drone weapons targeting more accurate, and catch fewer innocent people in the cross fire.

But this "network-centric" targeting also became the means of a chilling new type of warfare called targeted killing: computer-driven, intelligence-led, extra-judicial assassinations of suspected terrorists like those who kidnapped school girls in Nigeria and massacred shoppers in Kenya.

The UK connection was part of this because under the targeted killing programme, the network is the weapon.

Designed to be utterly discriminate but in practice not completely accurate, it had, according to the Bureau of Investigative Journalism, accidentally killed hundreds of civilians in 13 years of drone strikes on insurgents in fractured states in the Middle East, Asia and North Africa.

The strikes have all but ceased, and the mistakes reportedly led the US to rein in the programme. But the role of the UK connection remains a burning question.

It will not only determine the outcome of a judicial review being sought in the UK by a Yemeni education official whose civilian brother and cousin, Ali and Salim al-Qawli, were killed by a drone strike on their car, in the Yemeni village of Sinhan on 23 January 2013.

It will force the UK to face its part in the killing programme. And it will illuminate a frightening growth in the combined power of military and intelligence services: to use the power of domineering surveillance to feed systems of automated targeting and killing.

Network killing

The mechanism of net-centric warfare makes the idea that the UK connection has not facilitated US drone strikes absurd.

The network made targeted killing possible. The network was also the basis of the mechanism that drove the actual strike operations. It carried the intelligence that selected the targets. It ran the software that directed the operations. It incorporated the drones that carried out the strikes.

The drones do not exist as separate entities called in to finish the job. The drones are nodes on the network. They are a part of the network. The network is the weapon.

The US has been building its network up to drive the systems and weapons - and particularly the drones - that support its strategy of network-centric warfare. The UK connection is part of this strategy.

Drones rely on the network like trains rely on tracks, like puppets rely on strings. The network gives the drones their directions, distributes their surveillance, targets their weapons.

Network map

The blue map at the head of this article shows the fibre-optic core of this global network, the Defense Information Systems Network (DISN), as it stood in 2004.

US Military Command regions - GAO - Defense Headquarters - 2013.png
Showing the European branch of the network, the blue map depicts RAF Croughton, in Northamptonshire, as a major junction of the DISN - then essential for carrying classified military communications for the Secure Internet Protocol Router Network (SIPRNET).

It shows how Croughton is connected to Landstuhl/Ramstein in Germany, the regional hub for the Stuttgart headquarters of US Africa Command (US Africom). It shows Croughton also links to Capodichino, the communications hub in Naples, Italy, where the US Navy has its European and African command centres.

Capodichino, like Landstuhl/Ramstein, connects on to Bahrain, the base for US Central Command (Centcom) in the Middle East.

The blue map pre-dates the UK connection to Djibouti, which BT was contracted to provide in October 2012.

The contract specified a high-bandwidth line between the UK and Capodichino (effectively an upgrade), and then an extension to Djibouti, where Camp Lemonnier was having bandwidth problems.

When the blue map was published in 2004, the US military was working intensely to turn the DISN into the global surveillance and weapons targeting system it is today.

Their efforts turned the DISN into the backbone of a more extensive Department of Defense network called the Global Information Grid (GIG).

Drone net

Scientists and engineers from places such as the Massachusetts Institute of Technology Lincoln Laboratory (MIT-LL) and the National Security Agency (NSA) modeled the GIG on the internet. It was a network of networks, like the internet.

They joined the DISN with satellite and radio to form a single, seamless network. The US Department of Defense then strove to plug every device into it - every vehicle, every system, every drone - to form one all-encompassing net.
High-Level C4 Infrastructure Operational Concept Graphic - Department of Defense - Unmanned Systems Roadmap 2013 to 2038.png
The plan had both prosaic and transcendent aims. At the workaday level, US military and defence agencies didn't have enough bandwidth to support growing fleets of drones, let alone their emerging divisions of unmanned sea and land vehicles.

Drones needed the DISN to carry their control signals, as illustrated in the concept diagram above, from the US Department of Defense Unmanned Systems Roadmap 2013-2038.

The diagram illustrates how defence and intelligence agencies rely on the DISN as well, to gather data from drone sensors such as video and infra-red. The DISN carries drone data to systems such as the Distributed Common Ground System (DCGS), the common store of imagery intelligence for US military and intelligence agencies.

Satellite connection

The DISN/GIG became more essential to drones as demands for their dense imagery intelligence outgrew the satellite and terrestrial network's ability to deliver it.

GIG Basis of intel over TSAT - Office of the Under-Secretary of Defense - Integrating Sensor-Collected Intelligence - 2008.png
The US Under Secretary of Defense used this diagram in 2008 to illustrate how military and intelligence agencies used the GIG to communicate via satellite with drones and deployed forces.

High-bandwidth satellite constellations were part of the plan. The one shown (TSAT - Transformational Satellite) was later supplanted by Wideband Global Satcom (WGS).

DISN basis of Teleport - Military Satellite Communications - Space-Based Communications for the Global Information Grid - John Hopkins University Advanced Physics Laboratory - 2006.png


The DISN was connected to satellite constellations through antenna facilities called Teleports, illustrated on the left by the Johns Hopkins University Applied Physics Laboratory.

The US subsequently built Teleports in eight locations: Bahrain; Wahiawa, Hawaii; Fort Buckner, Okinawa, Japan; Lago Patria, Italy; Landstuhl/Ramstein, Germany; Guam; Camp Roberts, California; and Northwest, Virginia.

Thumbnail image for DISN basis of JTRS and TSAT - Office of the Under-Secretary of Defense - Integrating Sensor-Collected Intelligence - 2008.png
Internet radio

Deployed forces were also connected to the GIG, with a software-defined radio system communicating using the internet protocol.

The Joint Tactical Radio System (JTRS) was developed by the Massachusetts Institute of Technology Lincoln Laboratory (MIT-LL), a university department unique in being wholly military-funded, which in 2004 was responsible for developing major components of the GIG and the surveillance and weapons targeting technologies that would run over it.

Thumbnail image for Predator Operating in Deployed Mode - Unmanned AirCraft Systems Roadmap 2005-2030 - DoD - 2005.png
Predator

The Predator strike drone relied on the GIG/DISN to target its weapons, as illustrated in the US Department of Defense Unmanned Aircraft Systems Roadmap 2005-2030.

There were two ways of controlling a Predator: from a local station or one far away. Both relied on the GIG/DISN.
Predator Operating in Remote Split Operations - Unmanned AirCraft Systems Roadmap 2005-2030 - DoD - 2005.png
When a remote pilot housed with deployed forces in the theatre of operations (FOL - Forward Operating Location) did the controlling, the drone relied on the DISN to "reach back" to core military computer systems essential to its mission.

The most essential computer system was the Distributed Common Ground System (DCGS).

The DISN and DCGS were also essential in the other primary Predator control mode.

In a Remote Split Operation, the Predator would be launched from a "line-of-sight" control station near to its mission. But control would pass over to a remote pilot back in a fixed military base, such as Nellis Air Force Base in Nevada.

The drone's control signals would be routed over the DISN to a Predator Primary Satellite Link (PPSL). Having pre-dated development of the GIG, the Predator used proprietary network technology and outmoded asynchronous transfer mode (ATM) communications. This was handled by the DISN Asynchronous Transfer Mode System (DATMS).

But the old Predator comms systems were a hindrance to the GIG strategy. They were not internet-enabled. That meant they couldn't be assimilated into the GIG.

The aim of the GIG was for every sensor, every weapon, every comms system and every software program to operate using the internet protocol. Any military resource would then be available for control or observation, for attack or just for intelligence, to anyone with access to a GIG terminal anywhere in the world, in real-time.

The difference between drones communicating over DATMS and drones communicating over the internet-enabled GIG/DISN was like the difference between talking via walkie-talkie and running apps on your smartphone.

UA Progression From Circuit-Based to Net-Centric Comms - Department of Defense Unmanned AirCraft Systems Roadmap - 2005-2030.png
DoD had a 10-year plan to get around the problem by gradually upgrading its drone comms infrastructure. The first step would connect drones to the GIG by turning their satellite links into GIG gateways. That was to be done by around now. The drones would act as though they were an integral part of the network.

The drones would ultimately become internet-enabled themselves. They would communicate as internet nodes. Their on-board internet routers would use any spare bandwidth to route other people's GIG traffic. They would become part of the network.

By the time the Department of Defense Unmanned Systems Integrated Roadmap FY2013-2038 was published in December, US drones were operating in the way illustrated in stage 2 in the diagram above.

Drones were using network gateways to get their command instructions over the DISN, it said. The DISN likewise disseminated the surveillance data they picked up on their missions.

GIG Communications Infrastructure - GIG Architectural Vision - DoD CIO - 2007.png
The power of the GIG plan would follow when every drone, soldier, satellite, ship, truck, gun and so on was sharing its intelligence and surveillance sensor data over it, as illustrated in the concept diagram above, from the Department of Defense's 2007 GIG Architectural Vision.

They were all part of the GIG. They all communicated using the same internet protocol. They were all integral to the network.

Network-Centric Enterprise data & software building blocks for GIG - DoD Net-Centric Services Strategy - 2007.png
Building blocks

A common communications protocol laid the foundation for common data formats and common software interfaces.

This was the transcendent aim of the GIG. It allowed military assets to be available on the network as building blocks. The GIG would be greater than the sum of its parts.

This would theoretically make every chunk of intelligence data, every surveillance camera, every weapon and every software system a building block on the GIG.
Global Net-Centric Targeting - Key Technologies for DoD Net-Centric Computing - Computer Technology Associates - 2007.png
Net targeting

That was the basis of net-centric warfare - making everything available as a software service on the military internet.

Its most characteristic application was net-centric targeting.

That involved combining different surveillance sensors and intelligence databases on the fly, to get an automated fix on a target.
NCCT Process Example - C2ISR for Air Combat Command - US Air Combat Command - 2006.png

The power of net-centric targeting became apparent in simple tests that combined just two airborne surveillance sensors.

Each sensor had limited ability even to spot a target on its own, let alone get a fix, according to this graphic from a 2006 presentation by Colonel Tom Wozniak of US Air Combat Command.
NCCT Process Example - Network-Centric Sensing - C2ISR for Air Combat Command - 2006.png
The US Network-Centric Collaborative Targeting System (NCCT) takes sensor readings that individually give only a middling probability of a correct fix, and combines them to create high-probability fixes where they match.
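
The arithmetic behind that claim can be sketched with a naive fusion rule. This assumes the sensor detections are independent, a simplification the real, classified NCCT algorithms certainly do not make; it only illustrates why matching readings multiply confidence.

```python
# Sketch of sensor fusion: each sensor alone gives only a middling
# probability that its detection is genuine, but when independent
# detections coincide, the chance that ALL of them are wrong shrinks.
# Assumes independence - invented numbers, not real NCCT logic.

def combined_confidence(readings):
    """Probability that at least one coinciding detection is genuine."""
    p_all_wrong = 1.0
    for p in readings:
        p_all_wrong *= (1.0 - p)
    return 1.0 - p_all_wrong

print(round(combined_confidence([0.6]), 2))        # 0.6  - one sensor alone
print(round(combined_confidence([0.6, 0.6]), 2))   # 0.84 - two sensors agreeing
print(round(combined_confidence([0.6] * 5), 2))    # 0.99 - five sensors agreeing
```

Two mediocre sensors agreeing already beat either one alone; five agreeing approach certainty, which is the effect the simple two-sensor tests demonstrated.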

The NCCT became operational in 2007 after years of development in collaboration with the UK, according to DoD statements to Congress.

USAF Time-Critical Targeting Challenge - Key Technologies for DoD Net-Centric Computing - Computer Technology Associates - 2007.png
The pre-eminent application of net-centric targeting is the one that made the US targeted killing program possible: time-critical, or time-sensitive targeting.

The graphic above shows how it usually takes hours for military personnel to plan a strike.

They have to digest their battle plans for a start - pore over maps and work out what's where. Then they have to find their target. That means arranging for intelligence, surveillance and reconnaissance (ISR) sensors to hunt for it. They have to collate all their intelligence and analyse the data.

Then they have to calculate a fix, nominate targets to be attacked, prioritize among them, co-ordinate their operations, find suitable weapons platforms and get them to the target area, account for weather, choose the best route to the target, watch out for friendlies and, when the strike has been made, assess the damage.

Net-centric targeting promised to do all this in minutes by automating it.
Semantic SOA - Key Technologies for DoD Net-Centric Computing - Computer Technology Associates - 2007.png
Intelligence databases

Net-centric targeting relies on a process called data fusion, or semantic interoperability.

That means storing your data in ways that can always be cross-matched.

Not just military data. Net-centric targeting developers wrote civil databases into their plans too, such as immigration databases and feeds from civil intelligence agencies.
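
A minimal sketch of what that cross-matching requires in practice: each source keeps its own field names, so everything is first normalised to one shared schema. All field names and records below are invented for illustration.

```python
# Hypothetical "semantic interoperability": two differently-shaped
# sources are mapped to a common schema, after which any record can
# be cross-matched against any other. All names here are invented.

def from_immigration(rec):
    return {"name": rec["passenger_name"].upper(), "seen_at": rec["port_of_entry"]}

def from_humint(rec):
    return {"name": rec["subject"].upper(), "seen_at": rec["last_seen"]}

def cross_match(source_a, source_b):
    """Records in source_a whose normalised name also appears in source_b."""
    known = {r["name"] for r in source_b}
    return [r for r in source_a if r["name"] in known]

immigration = [from_immigration({"passenger_name": "J. Doe", "port_of_entry": "Djibouti"})]
humint = [from_humint({"subject": "j. doe", "last_seen": "Sanaa"})]
print(cross_match(immigration, humint))
# [{'name': 'J. DOE', 'seen_at': 'Djibouti'}]
```

The point of the common schema is that the match works without either source knowing anything about the other's internal format.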

Automated targeting

Combined with algorithms that watch for target signatures, this creates the means to spot targets on the fly.

And it creates the means to spot targets as small and as fleeting as people. And to kill them within minutes, as illustrated in the diagram above, published in 2007, the year the US Network-Centric Collaborative Targeting System (NCCT) went into operational use, by Computer Technology Associates (CTA), a defence and intelligence systems contractor that helped develop the system.

The example describes a target signature: an algorithm tells the targeting system that in the event of an emergency it should look out for a particular person, known to the Central Intelligence Agency (CIA) as "target ID 1454".

The targeting system keeps watch for them with its Blue Force and Red Force Tracking systems. The military uses these to trace the movements of those they've classified as goodies and baddies.

The targeting system keeps track of immigration and airport databases as well. In the example, somebody on the Red target list (the general hit list) has popped up in the immigration database.

It checks to see if they match against CIA records. They do. And they match against the CIA file with the same target ID as specified in the target algorithm: "target ID 1454". The targeting system sends geographical co-ordinates to people in green uniforms.
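
The sequence the CTA diagram describes can be sketched as a simple rule chain. Everything below - the lists, the records, the coordinates - is invented; only the target ID "1454" comes from the published example.

```python
# Hypothetical reconstruction of the CTA walkthrough: a watch rule for
# CIA "target ID 1454", a hit on the Red target list via an immigration
# feed, a cross-check against agency records, then a dispatched fix.

WATCH_RULE = {"agency": "CIA", "target_id": "1454"}
RED_LIST = {"1454", "2071"}                 # the general hit list
CIA_FILES = {"1454": {"status": "open"}}    # invented agency records

def on_immigration_hit(target_id, coordinates):
    """Return coordinates to forward, or None if the rules don't fire."""
    if target_id not in RED_LIST:
        return None                         # not on the hit list
    if target_id not in CIA_FILES:
        return None                         # no matching agency file
    if target_id != WATCH_RULE["target_id"]:
        return None                         # not the watched target
    return coordinates                      # send the fix onwards

print(on_immigration_hit("1454", (11.58, 43.15)))  # (11.58, 43.15)
print(on_immigration_hit("2071", (11.58, 43.15)))  # None
```

Each rule is an automated filter; only when all three fire do the coordinates go to the people in green uniforms.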

Time-sensitive targeting

This sort of computer vigilance, combined with networked intelligence, threw up new targeting possibilities.

The US started building common surveillance systems with its partners in the North Atlantic Treaty Organization (NATO). The more sources of intelligence they had, the more targets they could see.

NATO usually took days to plan a strike against even a fixed target. If it could do that in minutes, it could spot targets that were too elusive before.

Examples of Potential TSTs - Time Sensitive Targeting - Architecture Considerations - NATO - 2013.png
These Time-Sensitive Targets (TSTs) could be threats that emerged so quickly that they had to be attacked within minutes if they were to be stopped.

Or they could be "lucrative" targets that appeared in the surveillance net only fleetingly, and would escape if they weren't attacked quickly.

"TST gives friendly forces the option of striking targets minutes after they are identified," said this presentation by NATO's chief scientist in 2013.

CFBLNet Participants - CFBLNET 2012 Annual Report - Combined Federated Battle Laboratories Network - 2013.png
NATO targeting

The US formed a coalition to develop a web of NATO net-centric targeting systems.

It would get target intelligence on the fly from surveillance gathered by any number of NATO countries that happened to have forces, sensors or databases with something to add to the kill equation.

The Multi-Sensor Aerospace-Ground Joint ISR Interoperability Coalition (MAJIIC) worked on making NATO ISR sensors produce data in the same formats.
Coalition ISR Sensor Environment - MAJIIC -  NATO NC3A - 2006.png
MAJIIC aimed to make innumerable surveillance platforms compatible: electro-optical (EO), infra-red (IR), synthetic aperture radar (SAR - high resolution video or still images), moving target indicators (MTI), and Electronic Support Measures (ESM - electronic emissions).

Their aim was what US strategists call "dominant battlespace awareness" - having more eyes and ears feeding more situational awareness back into the network than anybody else.
NATO TST Tool - FAST - Flexible, Advanced C2 Services for NATO - Joint - Time Sensitive Targeting - 2013.png
Afghanistan strike

NATO's chief scientist gave an example of net-centric targeting using its own TST tool last year.

The screen shot demonstrates a NATO strike against armed opponents of its military invasion in Kabul, Afghanistan.

It shows a map of Kabul with the location of the targeted people, as displayed in its TST tool, called Flexible, Advanced C2 Services for NATO (Joint) Time Sensitive Targeting (FAST).

This is the view that would appear on the computer screen of an intelligence officer, perhaps at a desk in Djibouti, Bahrain, Stuttgart, or Tampa, Florida.

The intelligence officer has named his target "Terrorist Group Meeting" and given a track identification code "ZZ008" to distinguish it from other targets in the system.

The software supports internet chat between military personnel overseeing the operation. This is the sort of communications handled by SIPRNET over the DISN. The FAST tool handles multiple target tracks at the same time.

An intelligence chief and another Senior Intelligence Duty Officer (SIDO) are also in the system, pursuing tracks on targets "ZZ004" and "ZZ005".

"Any word on the Predators yet?" says a message from a chief of Intelligence, surveillance, target acquisition, and reconnaissance (Istar).

"ETA predators 5min," he is told.

Just a minute before, Air Traffic Control sent a message saying an aircraft had been launched against another track ID, "ZZ006".

Six-Phase Time-Sensitive Targeting Process - Time Sensitive Targeting - Architecture Considerations - NATO - 2013.png

Kill chain

Computerizing weapons targeting involved breaking it up into a series of steps.

It was systematized, as business functions like purchasing and manufacturing were when they were computerized: where human actions were classified into distinct processes like 'produce purchase order', 'send invoice', 'receive goods'.

Time-Sensitive Targeting is commonly known as the kill chain. This has six steps: find, fix, track, target, engage, and assess.
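
The decomposition can be sketched as a literal pipeline. The stage bodies below are empty placeholders, not real targeting logic; the track ID "ZZ008" is borrowed from the FAST screenshot described earlier.

```python
# The six-step kill chain rendered as a pipeline of named stages,
# mirroring how business processes were decomposed when computerised.

KILL_CHAIN = ("find", "fix", "track", "target", "engage", "assess")

def run_kill_chain(track_id):
    """Run a target track through each stage in order and log it."""
    log = []
    for stage in KILL_CHAIN:
        log.append(f"{stage}:{track_id}")   # a real system would act here
    return log

print(run_kill_chain("ZZ008"))
# ['find:ZZ008', 'fix:ZZ008', 'track:ZZ008', 'target:ZZ008', 'engage:ZZ008', 'assess:ZZ008']
```

Once the steps are explicit and ordered like this, each one can be automated, measured and handed between systems - which is what made compressing the chain from hours to minutes possible.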

CESMO Test TH05 - NATO Cooperative ESM Operations - NATO C3 Agency - 2007.png
UK gizmo

The UK developed a system that feeds NCCT with target data gleaned from conventional signals intelligence.

The system, called Cooperative Electronic Support Measures Operations (CESMO), has its target data merged with other intelligence in NCCT.

A NATO test of CESMO in 2005 produced this map, showing line-of-bearing (LOB) readings from ISR sensors.
SOA & XML Security Experiments - Cooperative ESM Operations - CESMO - Norwegian Defence Research Establishment - FFI - 2008.png

Each LOB is a single signals intelligence reading from a surveillance aircraft.

As expected, the test found a single reading was too unreliable to get a fix on an elusive target.

Even a single aircraft with two LOB fixes would have such a large "error ellipse" that it could not be used to target a weapon, said the NATO C3 Agency.

Error Ellipses for different permutations of Line-of-Bearing - NATO C3 Agency - 2007.png
A target leaked electronic emissions for less than 30 seconds in NATO test TH05.

The area of a poor target fix - called the error ellipse - was 11km by 600m.

This was clearly far too large to risk an attack.

But with five sensors on the lookout, they got a fix.
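
The geometry behind those fixes can be sketched as a least-squares line-of-bearing calculation. This is generic triangulation, not the classified CESMO algorithm; the sensor positions and bearings below are invented.

```python
import math

def fix_from_bearings(observations):
    """Least-squares intersection of lines of bearing.

    observations: list of ((x, y), bearing_deg) pairs - a sensor
    position and the bearing it measured, clockwise from north.
    Each bearing defines a line; with two or more non-parallel
    lines, the normal equations give the best-fit crossing point.
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x0, y0), bearing in observations:
        theta = math.radians(bearing)
        dx, dy = math.sin(theta), math.cos(theta)  # direction of the bearing
        nx, ny = -dy, dx                           # normal to that line
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        c = nx * x0 + ny * y0                      # line constraint: n . p = c
        b1 += nx * c; b2 += ny * c
    det = a11 * a22 - a12 * a12                    # near zero if bearings are parallel
    return ((a22 * b1 - a12 * b2) / det,
            (a11 * b2 - a12 * b1) / det)

# Two sensors and a target at (10, 10): the fix lands where the bearings cross.
x, y = fix_from_bearings([((0, 0), 45), ((20, 0), 315)])
print(round(x, 6), round(y, 6))  # 10.0 10.0
```

With only one line of bearing there is no crossing at all, and with two noisy ones the crossing is smeared into the large "error ellipse" the NATO paper describes; each additional sensor adds another constraint and shrinks the ellipse.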

"Most of the data pertaining to CESMO is classified," said a paper by the NATO C3 Agency in 2007.

But, it said: "It is possible to show how [CESMO] can geo-locate targets that cannot be found by stand-alone operating ELINT or ESM platforms."
Google-like DCGS - Key Technologies for DoD Net-Centric Computing - Computer Technology Associates - 2007.png

Target intelligence

The US military stores ISR data in its Distributed Common Ground System (DCGS).

This is commonly described as the imagery intelligence store queried by US defence and intelligence agencies alike when planning operations and forming target tracks and fixes.

Allied nations use it too. As do targeting systems. It gives them a common view of the battlefield and everything on it: common ground.

Common ground means the same surveillance from platforms such as drones, the same human intelligence, the same geo-location co-ordinates from target tracks, the same signals readings from CESMO, the same aerial photography and satellite images.

Drone operations

This screenshot purports to be taken from a DCGS tool in 2003 when the system was still in early development.
DCGS Tool Screenshot of operations over Croatia - Computer Technology Associates - 2007 - small.png
DCGS - Conops - Semantic SOA - Key Technologies for DoD Net-Centric Computing - Computer Technology Associates - 2007.png
The image is of Croatia, Montenegro and Bosnia-Herzegovina on 13 June 2003, the date Croatian defence minister Zeljka Antunovic welcomed the opening of NATO expansion talks among former Yugoslavian states and Baltic countries in a speech at the NATO Euro-Atlantic Partnership Council.

It shows the flight paths and surveillance nets of various aircraft including Global Hawk and Predator drones.

Time-Critical Targeting was part of the DCGS concept of operations (Conops), according to this 2007 presentation by net-centric developer Computer Technology Associates.

The image depicts ISR data stores being combined with civil and military intelligence databases to create time-sensitive target tracks and strikes.

DCGS - Crossbow Capability at Royal Air Force Marham - RAF Benson - 2011 - SCALED UP.png
UK drone intelligence

The UK has access to this data store as well.

Intelligence analysts at the Royal Air Force base in Marham, Norfolk, used DCGS imagery to direct UK operations in Afghanistan, said an RAF press release in 2011.

The RAF was building "real-time interoperability" with the DCGS, it said.

"Analysts receive feeds from the US Distributed Common Ground System (DCGS), which provides globally-networked Intelligence Surveillance and Reconnaissance (ISR) capabilities.

"This is the first time that the UK will have the capability to provide near real time imagery intelligence support to Afghanistan from the UK," it said.

Murky area

National intelligence agencies use the DCGS as well, according to some descriptions of the system. That includes the CIA, which is reported to operate some of the controversial drone strikes.

The last 10 years have seen persistent references to the Intelligence Community as an influence on, and contributor to, the development of the GIG, DCGS, net-centric systems and intelligence sharing. The likelihood of the Intelligence Community's dependence on the DISN cannot be ignored.

Anup Ghosh, the former chief scientist of the Defense Advanced Research Projects Agency (DARPA), said in a 2005 speech that intelligence agencies were part of the DoD's GIG vision. David Smith, a DISA consultant who worked on the DISN/GIG transformation wrote in 2006 that it was driven by and would serve both the DoD and Intelligence Community.

The US established a Unified Cross Domain Management Office (UCDMO) in 2006 to "address the needs of the DoD and the IC to share information and bridge disparate networks", director Marianne Bailey said in a 2008 paper.

DoD told Congress in 2006 that tests of the GIG at the Naval Research Laboratory (NRL) would in 2007 include "end-to-end testing with DoD, Intelligence Community, Allied and Coalition activities". The tests would incorporate JTRS, TSAT, Teleport, GIG Bandwidth Expansion, and Net-Centric Enterprise Services (NCES).

Rene Thaens of the NATO Communications and Information Agency said in a 2007 paper that signals intelligence sharing systems would get developed now that the Intelligence Community had discovered their benefits.

The US Under Secretary of Defence's Joint Defense Science Board / Intelligence Science Board said in 2008 that investments by both the Intelligence Community and the DoD had created the GIG network infrastructure. It said excellent progress had already been made in "aligning meta-data from various sources across the Department of Defense and the Intelligence Community".

DoD chief information officer (CIO) John Grimes formally committed in 2008 to ensure information and network situational data sharing with the Intelligence Community. The DoD and intelligence CIOs also formally agreed to recognize one another's network security accreditations. DoD told Congress in 2009 that it was co-operating with the intelligence agencies on the development of its net-centric systems.

Mitre Corporation, a company that did software engineering on the DCGS, helped develop NATO ISR data standards and worked with MAJIIC, said in a 2009 paper about Net-Centric Enterprise systems that US intelligence agencies used them too.

DoD harmonised its IT standards and architectural processes with federal and intelligence agencies, and coalition allies in 2010, it told Congress in 2011. The alignment was done under the Command Information Superiority Architecture (CISA) programme, the Secretary of Defense office formed to develop the GIG architecture and net-centric reference model.

The US Navy told Congress in 2011 it was developing a system to fuse biometric data it took from people on ships it boarded with Intelligence Community counter-terrorism databases. DISA implemented an Intelligence Community system in 2011 that exposed DoD data to users with appropriate security clearance, it said in its GIG Convergence Master Plan 2012. It told Congress in 2012 the DISN carried information for "the DoD Intelligence Community and other federal agencies".

USAF Major General Craig A. Franklin, vice director of Joint Staff, issued an order in 2012 specifying conditions for the Intelligence Community to connect to the GIG, and for IC systems to connect to "collateral DISN" systems. He charged the UCDMO with establishing "cross-domain" computer services between the DoD and Intelligence Community. The UCDMO simultaneously published a list of network services that would work across DoD and intelligence domains.

The National Geospatial Intelligence Agency (NGA) said in 2012 it had "aggressively" broken barriers to imagery intelligence data sharing between civil, defense, and intelligence agencies. The US Navy said in its 2013 Program Guide the next increment of its portion of the DCGS (DCGS-N) would "leverage" both DoD and Intelligence Community hardware and software infrastructures. It said upgrades on the Aries II aircraft, its premier manned ISR and targeting platform, would "enable continued alignment with the intelligence community".

Teresa Takai, DoD chief information officer, ordered in 2013 that all DoD systems would be made interoperable with the Intelligence Community. She committed formally to agree meta-data standards with the Intelligence Community. And she formally requested that agencies and government departments including the CIA, Treasury, Department of Justice, NASA, and Department of Transportation agree cyber security procedures for connecting to the SIPRNET by June 2014. The NGA said in its 2013 update to the National Imagery Transmission Format Standard (NITF/S) that developments had been driven in recent years by a need to share intelligence data between the DoD and Intelligence Community. The standard was developed in collaboration with the DoD, Intelligence Community, NATO, Allied Nations, technical bodies and the private sector.

"Intelligence Community" is an official designation of 17 agencies by the US Director of National Intelligence that includes the CIA, Federal Bureau of Investigations (FBI), Department of Homeland Security (DHS), Treasury, Drug Enforcement Administration (DEA), Departments of Energy and State, coast guard, NSA, and intelligence agencies associated with each of the US military forces.

The progress of their net-centric integration appears from public records to have been long, arduous, partial, reluctant, ongoing, yet undeniable.

[Image: DI2E SvcV-4 Services Functionality Description - Mission services excerpt, 2013]

Even if the CIA has been averse to conducting its drone operations directly over the GIG/DISN, it is unlikely the DoD network has not carried intelligence and other data essential to its missions in Yemen and elsewhere.

The CIA was directly associated with a more recent evolution of the DCGS.

Central Intelligence Agency

A more substantial computer framework for sharing data between defense and intelligence agencies and their international allies, called the Defense Intelligence Information Enterprise (DI2E), has subsumed DCGS.

At the heart of DI2E is the DCGS Integration Backbone (DIB), a set of data fusion services said in a 2012 Overview by the DCGS Multi-Execution Team Office at Hanscom Air Force Base, Massachusetts, to have delivered a system for the DoD and Intelligence Community to search, discover and retrieve its DCGS content. USAF characterised it as a cross-domain service.

DI2E delivered a plethora of cross-domain services for net-centric missions as part of the Department of Defense Architecture Framework in 2010, listed in the flesh-pink graphic above, which links to a sheet given to developers ahead of a May 2013 DoD/IC "plugfest and mashup" at George Mason University, Virginia.

Sensor and target planning are included in the list of Mission Services on the sheet, a collection of over 150 net-centric software services called the DI2E SvcV-4 Services Functionality Description.

They also include SIGINT pattern matching, target validation, entity activity patterns and identity disambiguation for human intelligence (HUMINT), and intelligence preparation of the battlefield.

[Image: DI2E Interrelationships - DI2E Summary - Under Secretary of Defense for Intelligence, 2013]

This is a defense intelligence initiative. That means it comes under the direct remit of the Defense Intelligence Agencies. But as always, it is described as for the benefit of both the DoD and the wider Intelligence Community.

The Under Secretary of Defense for Intelligence published a diagram of the stakeholders in DI2E in a presentation last year.

D2IE was owned by defense intelligence. But the CIA and other intelligence agencies used it.

The US military was meanwhile reported in April to have stopped sending drones over Yemen. The CIA was said to have continued, but from a formerly secret base in Saudi Arabia.


Even when drone attacks on Yemen were reportedly launched from Djibouti, the picture was murky enough for UK officials to dismiss a complaint by legal charity Reprieve that the UK connection made BT, its contractor, answerable for resulting civilian deaths.

The conflation of military commands around Yemen was complicated. It was hard to point at a drone strike and say who launched it, from where, with comms directed down what pipe. The US wouldn't say. BT had ignored the question.

[Image: Transfer of Africa operations from US Central Command to Africom, 2008 - Congressional Research Service, 2010]
[Image: CJTF-HOA - Combined Joint Task Force Horn of Africa - Operational Area and Areas of Interest - GAO, 2011]

US Central Command (Centcom), the military group that invaded Iraq, ran Lemonnier until October 2008, when it handed control to US Africom.

Centcom kept Yemen as an operational area. But its base in Bahrain was almost 1,000 miles away.

Africom kept Yemen as an "area of interest". Lemonnier was separated from Yemen by a finger of water just 20 miles across, called the Bab-el-Mandeb strait. Reports continued to cite Lemonnier as a launch site of lethal targeting drone missions.

US Africom would not tell Computer Weekly what drone missions launched from Lemonnier. Not even whether they did. Nor what mission support it gave Centcom. Nor whether it did. Nor whether Centcom had continued operating from Lemonnier after command passed to Africom. Nor whether Africom carried out missions in Yemen under Centcom's command.

US Africom spokesman Army Major Fred Harrell said a lot of assumptions were made about the drone strikes. But like the White House, he refused to clarify who, what, where, when.

But he did confirm that Centcom co-ordinated Yemen operations with Djibouti.

"Our area of responsibility borders that of Central Command and also US European Command," said Harrell.

"So it's safe to say that anything that occurs across what we call the seam between where our area of responsibility ends and where theirs starts, there's always co-ordination between combatant commands on what goes on.

"We do co-ordinate with our neighbour combatant commands, such as European Command and Central Command," he said.

This article has illustrated amply how such co-ordination is conducted over the DISN.

Earlier reports in Computer Weekly described how a 2012 DISN upgrade at Lemonnier coincided with the BT contract to extend the line from Croughton and a 2012 DISN upgrade in Stuttgart. And how an intelligence contractor was hiring analysts to work on targeting systems over the DISN from Stuttgart.

[Image: DISA Unified Video Dissemination Service - DoD - Unmanned Systems Roadmap 2013-2038]

Drone video feeds

Upgrades including the UK connection have made the US network wide enough to carry yet another development in drone targeting and intelligence: real-time video feeds.

DISA's Unified Video Dissemination Service (UVDS) takes live video streams from Predator and Reaper drones and transmits them via Teleports such as those at the DISN comms hubs in Naples, Landstuhl and Bahrain.

UAV video gets streamed via the Teleports and over the DISN, according to the graphic below, from the DoD 2013-2038 Unmanned Systems Roadmap.

The graphic illustrates how their imagery is thus stored in the DCGS, and in archives at the NSA. Another graphic, from a DISA presentation last year, illustrates how the whole system depends on the DISN.

[Image: Managing the Enterprise Infrastructure - Operating and Defending the DoD Information Networks - DISA, 2013]

It shows drones and surveillance aircraft associated with Camp Lemonnier, otherwise known as Headquarters of the Combined Joint Task Force-Horn of Africa (CJTF-HOA) under US Africom.

The drones feed their video streams via wideband satellite back to Lemonnier, as well as a nearby DISN trunk gateway - a Teleport.

DoD records occasionally state that its net-centric and DISN investments aimed to give simultaneous views of the battlespace to any personnel or commanders anywhere in the world.

This was for example one reason given for the DISN investment at Lemonnier. The idea was that it might help commanders at bases in different places like Bahrain and Djibouti, and commands with different headquarters in places like Stuttgart and Tampa, and perhaps even intelligence analysts in different domains, to co-ordinate their missions. Streaming drone video was a part of that.

Drone over IP

The DISN core, built with trunk lines like the one between Djibouti and the UK, provided the basis of this strategy, as of all the other net-centric services.

It would allow staff in different locations to use the same systems, to see the same intelligence, collaborate in the same operations.

[Image: DISN in Current, 2012 - UVDS Operational Architecture - DoD Unmanned Systems Roadmap 2013 to 2038]

The DISN Core fibre network is hence at the centre of this network diagram showing how the Unified Video Dissemination Service (UVDS) operated in 2012.

The diagram shows video feeds running from drones, over satellite links and finally via Teleports, over the DISN.

The Teleports at Lago Patria, Italy and Landstuhl, Germany are shown distributing live video feeds to the DISN.

The Italian Teleport, on the DISN between the UK and Djibouti, was made capable of live drone video comms in May 2012, the year BT was contracted to make the UK connection, the diagram notes.

Published in December 2013, in the DoD's 2013-2038 Unmanned Systems Roadmap, it shows how full-motion drone video is carried across the network to UVDS storage points, where it can be made available to users on SIPRNET.

The graphic has two distinct components: the part in green that gets the full motion video onto the network, and the part in red that makes the video available to military users and their systems. Both parts operate over the DISN.

DISN in this diagram links satcoms directed through Teleports with theatre communications for bases such as the one in Djibouti. It links those with the classified SIPRNET network that also runs over the DISN, and with UVDS systems operating from DISA's regional computing centres, called Defense Enterprise Computing Centers (DECCs).

Network infrastructure

The network components in the diagram above match both those described in the official notice that described BT's contract for the UK connection in 2012, and the US Navy's Congressional budget justifications that described the same DISN connection upgrade as an operational need for Djibouti.

The key components are the MSPP (MultiService Provisioning Platform - a device to connect the DISN line at bases such as Camp Lemonnier) and HAIPE (High Assurance Internet Protocol Encryptor) encryption devices.

These were devices specified in US Congressional justifications for building the DISN into the GIG.

[Image: Implementing the Global Information Grid - DoD, 2004]
[Image: New Global Information Grid-Bandwidth Expansion DISN Node - IA Newsletter, 2006]

MSPP devices form the major junctions of the Global Information Grid (GIG), as illustrated in this graphic from a 2004 presentation by Frank Criste, then director of Communications Programs for the US Office of the Secretary of Defense.

The diagram shows the GIG test environment most likely operated by NRL in 2004. It portrays components that would later be used to build the GIG in the real world.

The network infrastructure specified for BT's UK connection was also illustrated in this diagram from an article in the summer 2006 DoD Information Assurance Newsletter.

It illustrates the GIG-Bandwidth Expansion programme, a scheme to upgrade the GIG for net-centric warfare, and to carry the imagery intelligence spewed out by burgeoning numbers of drones.

The new DISN infrastructure would include OC-192 fibre-optic cables, ODXC and MSPP network devices, and "KG"-class high-speed High Assurance Internet Protocol Encryptor (HAIPE) devices, all devices specified in BT's contract for the UK-Djibouti connection, and all also matching US budget justifications for expenditure.

"With the expanded bandwidth provided by GIG-BE, DISA can address high-capacity applications (e.g., imaging, video streaming) and provide a higher degree of network security and integrity," said David Smith, the DISN programme manager who wrote the article.

"The GIG-BE program is the first of its kind to bring high-speed High Assurance Internet Protocol Encryptor (HAIPE) devices to a DoD network.

"The HAIPE devices, introduced because of the National Security Agency's anticipatory development, will greatly increase the ability to bring secure, net-centric capabilities to the Intelligence Community and DoD operations," he said.

BT has consistently said it could not be held responsible for what anybody did with the communications infrastructure it supplied.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system. It has not been specifically designed or adapted by BT for military purposes. BT has no knowledge of the reported US drone strikes and has no involvement in any such activity," said a spokesman for BT in response to questions earlier this year.

Profit lures London councils into IT sales venture

| No Comments
| More
[Image: Lupa Capitolina]

Two London councils have embarked on a scheme to sell IT services for profit after concluding their cost-saving backoffice merger could grab them a big slice of the £144bn local government market.

Giving their project the code-name Romulus, after the rapacious founder of the ancient City of Rome, the London Borough Councils of Havering and Newham drew up a plan to compete for back office business in the health, education, police and charity sectors as well.

Now they are looking at plans to spin Romulus off to compete in the private sector, an idea that has sent them goggle-eyed over the chance of making even more lucre, and bringing it back to the people of East London like sacks of pre-historic booty from raids on neighbouring tribes.

Havering and Newham adapted their public ethos to the profit motive with gusto, choosing for Romulus the business model that could make the most profit.

They had a few options on the table, including good old fashioned public service. But they stacked the odds in favour of what would bag them the biggest wad.

Their Full Business Case said in December they had ruled out simply outsourcing their backoffices because they wanted to keep the fees for themselves. Why pay an outsourcer a commission to do the work when you can do it yourself?

But that was not all. It was not just a do-it-yourself decision. They had not merely decided not to outsource. They had decided to go into the outsourcing game themselves. Now that government rule changes have allowed public bodies to start getting some of those outsourcing fees for themselves - well, why don't they just do that then?


The same profit motive also made them rule out merging their backoffices with other groups of councils that may have already done the hard work. Because that would mean paying commission to some other group. They wanted to be the ones earning the commission.

Government funding cuts left them no choice: find a way to earn profit from other public bodies or cut essential services to people in the nascent principality of Havering and Newham.

It is each man and woman for themselves in the meantime, as one public sector managing director told backoffice staff recently, after their attempt to compete like a private company failed. Staff had been through the mangle for years already. Their loyalty to the public ethos is what kept them going through the quasi-privatization, bizarrely. But now they're being out-competed, broken up, repackaged, rewarmed, sold off, palmed off, to be put through the mangle again to see if there's any more dye still to come out. At some point, perhaps in some generation down the line, they will find a job tenure secure enough to invest belief in their work again.

But someone else whose profession was downsized by technology recently told your correspondent the workplace had become a place of fear. People used to have fun and some degree of autonomy. There are now a fraction of the people doing the job. They are time-and-motioned to within an inch of their lives. And the management made it clear they are looking for any excuse to sack anyone because they want to cut some more.

Back in Havering and Newham, where Romulus launched as OneSource this month, the profit question is a matter of ongoing debate, as though they are not quite prepared to face it yet, or to stick their necks out.

The business case made it clear. The plan was to generate profits (the word it most often uses is "income" - the American word for profit - though Romulus often means sales by it, while it sometimes says "savings" when it means profits). But Romulus has not yet launched a vehicle to carry the profit.

Tony Huff, who wrote the Romulus business case as OneSource business services director, said the councils had not decided what they would do with the income they generated, or profits they made, or savings they made, or whatever you would call it.

"Contributions to overheads is what we would call it," said Huff.

This could come from persuading other councils to join their partnership, like they ruled out joining anybody else's. Or it could come from the sale of backoffice services to other councils, or to other public bodies, and so on.

Look at it like this though. You are in the business of selling something. Say you generate more money from your sales than it cost you to run your business. What do you call that? You call it profit, don't you? That's the dictionary definition. These councils have not even decided what to call it, let alone what to do with the money when they get it.
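The dictionary definition being leaned on above is a one-line sum. A minimal sketch, with purely illustrative figures that are not drawn from the OneSource business case:

```python
def surplus(sales_income: float, running_costs: float) -> float:
    """Whatever you choose to call it - income, savings, contributions
    to overheads - it is what is left of sales after costs."""
    return sales_income - running_costs

# A hypothetical venture selling £12m of backoffice services
# at a running cost of £10m is left holding £2m.
leftover = surplus(12_000_000, 10_000_000)
print(f"£{leftover:,.0f}")  # prints "£2,000,000"
```

The dictionary, for what it is worth, calls that figure profit.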

That though, said Huff, was an aim of the joint committee the councils set up to run the venture in the meantime - to decide what to do with all the money.

"We have been asked to go away and look at how we might deal with new business," he said.

The most likely option, according to the Business Case, is launching a private company. Romulus will have to do that if it pursues its aim of selling to other public bodies, and charities, and private companies. It's a legal requirement under the coalition government rules that have made it possible for councils to get in the outsourcing game. The business case said they could sell to other councils without any restriction. But they had to form a private company if they wanted to make profits by selling to anyone else.


If Romulus doesn't launch a private company, it is easy to imagine a time when some other once-public backoffice from some other group of councils, or some HE College or NHS Commissioning Support Unit or police service spin-off, has become so big and successful that Havering and Newham can't justify doing their own any more. So they are damned if they don't. And if they do, they could still crash and burn amidst the Titanic clash of backoffice spin-offs that will surely commence with much blood-letting as public sector bodies up and down the country try to get on the profit-making game as well.

For now, said Huff, their main driver is making savings for Havering and Newham. The business plan is to make the savings and develop the business model, then bring in sales.

Savings means consolidation, means staff cuts, means automation of council services using web apps.

Geoff Connell, OneSource CIO, told Computer Weekly Havering and Newham had saved £10m per annum by getting 50 per cent of their citizen contacts online. Most of the savings had come from job cuts of people who used to deliver the services in person. Connell had the saving on the tip of his tongue. He didn't have the number of redundancies to hand. Coalition estimates put the average at 78 per cent job cuts.

Redundancies will account for half OneSource's set up costs this year, three quarters next year, and £2.6m in total by April 2019, according to the Business Case. A quarter of the rest this year, and half of the rest next year will be the cost of developing Oracle software applications to run the back office services it has not yet designed: asset management, FOI, risk management and forecasting, and time costing.

The proprietary One Oracle platform, which Havering and Newham developed in conjunction with seven other councils, will form the heart of its sales business operation: back office processes such as finance and human resources. Connell said last week it also had sales of self-service apps on the cards as well, driven by a proprietary Microsoft system.


By choosing proprietary software systems over the open source ventures Newham and other councils had toyed with in the past, it has protected its commercial interests. That is, the choice of Oracle places Romulus's ventures in its own private interest, rather than the public interest. That should ultimately help clear up any confusion about terminology, and where this profit motive fits in the public ethos.

The public ethos resonates in the voices and publications of Romulus executives like a prideful vein of self-denial. Connell and Huff seem like the nicest young men who ever privatized a back-office. They have the citizens of Havering and Newham at heart. Maybe not the citizens of Barking. And maybe not the citizens of Bangladesh. But Havering and Newham are going to be all right.

Because Romulus has its eye on the prize.

"Local government is worth £144bn each year," said the business case. "This offers a great opportunity for this Programme in terms of business growth."

That's Romulus, son of Mars, the God of War, and of the daughter of a deposed king: outcast, raised by a she-wolf, murderer of his brother, rapacious warlord, snatcher of women, violent oppressor, founder of Rome. What a start in life.

Councils stage open source revival

| More
[Image: Carbon cycle]

Local councils behind an ambitious public open source software scheme that flourished briefly with boom-time investment under the last government are attempting to revive it under the cost-cutting coalition's digital strategy.

But their old rival Microsoft is making its local government come-back too, after a 10-year gestation with the London Borough of Newham under a deal that became the focus of bitter opposition between the proprietary and open source software camps.

Last time, with central government funding but only lacklustre policy support, Bristol City and Camden London Borough built an open source content management system that was propagated as far afield as India and Bremen.

This time, arms twisted by budget cuts, but with government policy paving the way and hindsight in their favour, they have widespread sympathy but face boggling competition.

Newham is about to turn the game upside down by launching a commercial venture to sell tailored Microsoft software to other councils for a profit.

Just as Newham's Microsoft partnership helped bust the last open source alliance in local government, it looks set to put the cat amongst the pigeons, or doves, again.

[Image: John Jackson, Camden CIO]

John Jackson, chief information officer of Camden London Borough Council, portrayed the latest open source scheme as a bold move when he announced it last week: an open systems alliance to bust the proprietary software ecosystems of companies like Microsoft.

"We've drawn a line. We've said it's going to be different. We said to suppliers, 'What you are delivering is rubbish'.

"I think it's about time we stood up, and about time we changed a very tired market place," he told the GovNet Open Source conference in London on Thursday.

"I'd like to see local authorities creating an alliance to work together to deliver things like common APIs (application programming interfaces), to deliver common web services, to deliver code sharing, to innovate and to drive systemic disruption.

"Camden and Bristol are going to be engaging in a very strategic partnership to work on promoting open systems, promoting open source, and helping other councils deliver an open systems vision into government."

He laid great emphasis on disruption. He wanted to disrupt proprietary software markets, which he likened to medieval fiefdoms that held unhealthy power over public bodies.

[Image: Google ecosystem]

Councils should band together, said the Newcastle history graduate, to create an Open Source Alliance, and write a "bill of rights" or "charter for change" to "break the medieval market".

Part of his solution was open source software. He said councils should put it at the centre of what they were doing, when they have kept it at the margins.

His pitch was essentially coalition government policy, re-purposed for local government: embrace open source to smash proprietary software ecosystems and markets.

But Jackson was also doing a plug for a local government procurement framework the Crown Commercial Service (CCS) put to tender nearly a fortnight ago, and which Camden, Newham and other London councils helped draft.

£300m motive

That was the £300m Local Authority Software Applications framework that will introduce its own modest market reforms when CCS - an arm of the Cabinet Office - completes its half-decade re-let in July.

Billed by Jackson as the first national procurement ever to specify open systems, the LASA framework contract notice did not quite fit the billing. It asked only that suppliers "assist our aims of accelerating development of open systems, data sharing and the interoperability of IT systems in government". It made no requirement of the official coalition policy preference: open source software.

Jackson was nevertheless adamant that councils could improve their lot by rallying around the framework. Its target market - local government line-of-business applications such as benefits, social care and libraries - is deeply, unhealthily proprietary.

Take for example Dudley Metropolitan District Council, whose information systems manager, Andrew Tromans, told Computer Weekly its own proprietary line-of-business applications had prevented it pursuing its open source ambitions.

Dudley had considered using MySQL, an open source alternative to Oracle's relational database. But all the council's line-of-business applications were based on Oracle. It had to buy Oracle anyway. So it seemed pointless implementing MySQL as well.

[Image: Gavin Beckett, Bristol City Council]

Similarly, Bristol and Solihull Councils, once among the staunchest open source advocates, both recently gave up decade-long attempts to replace Microsoft with open source desktops because their line-of-business applications used proprietary Microsoft formats.

Gavin Beckett, chief enterprise architect at Bristol City Council, told the conference how he implemented a proprietary Tibco enterprise service bus as well.


Jackson lightly goaded Steve Halliday, Solihull CIO, from his conference platform last week. He asked Halliday, who is also Socitm president, "What are you going to do to promote a new approach?".

Halliday had promoted the local government framework. Socitm had moreover adopted parts of the coalition's Government Digital Strategy, which had codified the government's preference for open source software.

But it dropped the bit that specified open source. Like the framework, Socitm adopted a less disruptive, loose definition of open systems: common APIs, compatible data.

[Image: Steve Halliday, Solihull CIO]

Halliday was not available for comment. But a Socitm source sought to discredit Jackson's Open Systems Alliance off the record: councils had tried and failed to do this before when the Labour government was pumping hundreds of millions of pounds into its e-government drive to put council services online; what chance did they have now in the age of austerity?

Take Newham, said the source: Newham had collaborated with other councils to build a CRM system, and distributed it through the Local Authority Software Consortium (LASC) - but that floundered. The same would happen to Camden's Open Systems Alliance.

The source was being naughty though. Newham's LA-CRM was never open source. It was a proprietary system that led Microsoft's push into big local government software. It had not failed at all. It had been in gestation and was about to be relaunched.

It had been the nemesis of local government open source. Socitm had been instrumental in it. And it was about to rise again.


Newham is now preparing an unprecedented plan to relaunch LA-CRM as a commercial venture: the public body will sell the Microsoft system to other councils for private profit.

It developed the next-generation product under a Microsoft contract that had been the bête noire of the open source movement in the middle of the last decade: its dreaded Microsoft memorandum of understanding (MOU).

Newham had been a leading player in local government open source before it formed the Microsoft partnership, partnering with Bristol and Camden to develop and implement Aplaws - an open source content management system that promised for a time an alliance of councils like the one being posited again now: a stronghold to nurture public open software. Infamously toying with ditching Microsoft entirely, Newham drew all eyes to itself and convened a fierce competition for its loyalty.

[Image: 2003 - Newham and Microsoft - Memorandum of Understanding]

Newham ended up signing a Microsoft MOU that - unbeknownst to everyone - laid the foundations for the profit-making, proprietary software venture Newham is about to launch, to sell Microsoft CRM software to other councils.

It created a stronghold for proprietary software development and forged an alliance of councils using Microsoft software. It broke the open source alliance and enticed other councils who might have joined it into Microsoft's ecosystem instead.

The MOU committed Newham to migrate its major business systems to Microsoft platforms and to form a joint sales venture with Microsoft.

Newham promptly migrated its Aplaws implementation onto Microsoft's first CRM platform.

"It is assumed that the customer works with Microsoft as the 'platform of choice' and is prepared to work towards migrating all relevant business and technical solutions onto a Microsoft Platform over a defined time from competitive technologies," said the 2003 MOU.

"It is expected that these solutions may be developed jointly and sold collaboratively with Newham and significantly change the landscape of the application market place," it said.

Public authorities weren't allowed to sell software and generate private profits in 2003. So Newham gave its Microsoft CRM software to Belfast City Council (yes, Belfast) so it could do the selling instead.

[Image: Geoff Connell, Newham CIO - Microsoft promo shot]

"We worked in conjunction with Belfast City Council because at the time they were able to trade in a way that English authorities weren't," Geoff Connell, Newham CIO, told Computer Weekly in a telephone call after the conference.

"So we gave them free use of the product and then they were the ones who ran around the country helping to install it for different authorities," said Connell, who had led LA-CRM development at the council when it signed the Microsoft MOU in 2003.

The 2003 MOU said Newham would help Microsoft consider "shaping the MS CRM as a viable cost effective alternative for LG [local government] clients". That they did.

Microsoft CRM subsequently evolved into Microsoft Dynamics CRM, a more substantial system within the .NET framework. Newham duly built its next-generation system on Microsoft Dynamics CRM.

The council obtained power to sell this system in 2012, when the coalition government passed a General Power of Competence into law, allowing councils to trade for profit like private companies.

[Image: Microsoft Dynamics GP]

Connell said Newham was considering hosting its CRM system in a Microsoft cloud data centre and then selling it to other councils as a Microsoft Dynamics software service. Customers would pay Microsoft for Dynamics and Newham for those specialised, local government line-of-business modules.

It would work like an appstore for people who bought Dynamics. They could pick and buy Newham modules as add-ons.

"We haven't taken it to market yet. But we intend to take it to market," said Connell.

"We will probably use a cloud offering. And then we will look to license the use of other specific code we've developed to do things like parking permits.

"The reason we wouldn't just give it away is the local authorities we work for have spent millions of pounds, invested in developing this capability.

"So it seems fair for the tax payers of our boroughs if others are going to use it, which we would want them to, that we get something back to recognise the sunk investment we've made," he said.

Well, sort of. Newham spent £5m on its entire customer service redesign, Connell later clarified. That included process redesign, staff cuts, website, master data management, systems integration and CRM.

It saved £10m-a-year, but mostly from staff cuts because citizens didn't need them when they used self-service apps. That is the part of the coalition digital strategy being pursued by all protagonists in this tale.

Seeding the market

Belfast had installed LA-CRM at 25 councils at its height, trading on kudos borrowed from Microsoft's rival open source movement.

It told customers last year it had ceased development. It would reassess its options in a couple of years, when it had consolidated its own various LA-CRM implementations and Northern Ireland had completed a round of local government mergers. It is not hard to imagine it buying the next-generation system from Newham.

Other things are also not quite as they seem. When it comes to sharing software code, Newham and Socitm have both been using the same language as Bristol and Camden.

Connell talks passionately about Newham sharing code. It has already employed code distributed by other councils, and has even shared code itself with Camden.

But he said the council would attempt to sell any code deemed valuable.

Sharing was for scraps.

Where they differ, apparently, is the extent they wish to collaborate, or compete with other councils.

"We develop an awful lot of code," said Jackson. "What's the point of developing lots of code in isolation? Why can't we share it? Let's re-use and develop web services and APIs."

With the proprietary camp using the same language, it becomes hard to spot the difference. They all say they want web services. They all say they want common APIs. Newham's Microsoft system has meanwhile already done what the coalition Digital Strategy ultimately strove to do, with all its talk of disrupting proprietary markets with open source: forced by a £106m budget cut, it has turned half its government services into web apps and made the staff who delivered them redundant.

Becket, who has the same aim for the same reasons, said the difference between what the Open Alliance and Socitm strove to do was the difference between words and action.

"What John's talking about is doing something - actually making that change," he said. "So if I sponsor a piece of work to develop a particular aspect of our portal, let's make that available to everybody. Let's just do it. It's actually the doing of it. That's what we are doing."

They are striving to do less than when the Office of the Deputy Prime Minister put up the money for Aplaws 15 years ago.

They have no plans for similarly ambitious software developments. Jackson's pitch was not as bold as developing an open source line-of-business application that would really upset the market. They don't have funding. They have cuts.

But they did envisage an alliance would create a common basis for LA computer systems and gradually erode the power of proprietary line-of-business vendors like Capita, Civica and Northgate. That would allow councils who wanted to share code to gradually build a shared resource for council functions - like the Microsoft/Newham alliance, only public.

The first step was to create a free market for software by agreeing common interfaces, getting enough councils collaborating to arm-twist vendors into doing things their way. That meant government taking a hand in defining the software ecosystem in which it existed, as opposed to an ecosystem a proprietary software vendor defined to maximize its own gains.
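The mechanism described above, agreeing an abstract interface first so that any implementation behind it becomes interchangeable, can be sketched in code. The following is a minimal, entirely hypothetical illustration in Python: `ParkingPermitService`, `PermitApplication` and `InMemoryPermitService` are invented names for the sake of the sketch (parking permits being one of the modules Newham mentioned), not any real council or vendor API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class PermitApplication:
    """Data a resident submits when applying for a parking permit."""
    resident_id: str
    vehicle_reg: str
    zone: str


class ParkingPermitService(ABC):
    """A hypothetical common interface councils could agree on.

    Any vendor's or council's implementation that honours this contract
    could be swapped in without changing the calling code - the
    'free market for software' the alliance envisaged.
    """

    @abstractmethod
    def apply(self, application: PermitApplication) -> str:
        """Submit an application; return a permit reference."""

    @abstractmethod
    def status(self, permit_ref: str) -> str:
        """Return the current status of an application."""


class InMemoryPermitService(ParkingPermitService):
    """One possible implementation, here a trivial in-memory one."""

    def __init__(self):
        self._permits = {}
        self._next = 0

    def apply(self, application: PermitApplication) -> str:
        self._next += 1
        ref = f"PP-{self._next:04d}"
        self._permits[ref] = (application, "pending")
        return ref

    def status(self, permit_ref: str) -> str:
        return self._permits[permit_ref][1]
```

The design point is that the abstract class, not the implementation, is the shared asset: councils agreeing the interface is what strips a proprietary vendor of lock-in, since competing back-ends become interchangeable.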

Ding ding

This view assumes the similar calls Newham and Socitm have made for common APIs and code sharing are compromised by their allegiance to proprietary ecosystems.

Yet Camden's Open "Systems" Alliance amounts to an admission that open source has been relegated to a mere middling ambition - not only as a government policy objective, but also among those public officials who have been its champions.

Newham has meanwhile begun talking to Camden about joining the alliance. It too has middling ambitions for sharing. It wants to share code. But it primarily wants to sell it. It just doesn't want to call the proceeds profit.

While their attitude is collegiate, their philosophical positions, once deeply opposed, are in danger of forming a tepid solution: a sort of harmony in which bacteria - the lowest common denominator - thrive, because there is no resistance.

Each seems in denial: Camden, that the Government Digital Strategy is a programme of privatization, especially in local government, where no open source edict holds; and Newham, that its path leads to a universal market in which councils sell each other software and call their surplus income 'profit'.

The digital strategy, as a Trojan Horse for local government privatization, will create the market. Newham will compete in it, with platform-dependent code that will help lock other authorities into a Microsoft ecosystem. Newham looks like Microsoft's Trojan Horse. Round two has just begun.

Telecoms contractor could be called to account for drone deaths

| 1 Comment
| More
Telecoms supplier BT could be asked to account for drone attacks in Yemen and Somalia after connecting a fibre-optic cable to a US military base conducting the strikes.

The revelation comes from a high-level review of a complaint that the £23m BT communications line supported drone missions that, according to estimates by the Bureau of Investigative Journalism, had accidentally killed between 426 and 1,005 civilians over the last decade in the course of strikes on suspected insurgents.

Officials at the UK Department for Business, Innovation and Skills threw the complaint out last October, saying there was no evidence to say whether the comms line supported the drone attacks or not.

A review of the decision has since raised the prospect that BT could be asked to gather evidence to answer the question itself.

The big question - whether the fibre-optic cable is infrastructure used in drone strikes critics say are illegal - remains unanswered.

Legal charity Reprieve used an international agreement on corporate ethics, called the OECD Guidelines for Multinational Enterprises, to complain that BT should never have taken the contract because it was obvious what the cable was for.

But the only evidence BIS had to go on was due diligence BT did to satisfy the OECD guidelines when it took the contract in 2012. BT's due diligence ignored the drone controversy. BIS said it therefore didn't know and couldn't say.

The review, published at the end of February, said companies shouldn't get away with glossing over controversies in their due diligence. They shouldn't turn a blind eye when they took on a customer with a bad reputation. They should ask awkward questions and address them specifically in their due diligence.

BT had done only general due diligence when it signed its contract with the US Defense Information Systems Agency on 26 September 2012.

Now the review, though indirect, has turned the spotlight back on BT.

BT refused to comment on the review's conclusion.

Wise monkey

It denied knowledge of drone strikes. It also tried to portray the fibre-optic line as a civilian cable not suited to military applications. The line runs between the US military intelligence communications hub at RAF Croughton, Northamptonshire, and Camp Lemonnier, the base in Djibouti, North-East Africa, that launches the drone strikes.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system.

"It has not been specifically designed or adapted by BT for military purposes, including drone strikes," said the statement.

"[It] could be used at the base for a wide range of day-to-day activities, such as general housekeeping/internet browsing, email, communications, stores ordering, data functions and voice communications," said a BT spokesman in an email. He refused to rule out the possibility that it might serve a military function.

Neither BT nor BIS would release the due diligence BT had done. As a system of self-regulation, the OECD rules left it to those best placed to ask and answer the awkward questions, and to allay public concerns that they conducted their business ethically. But it did not allow the public to see the deliberations. Only officials could see the due diligence. The due diligence failed. The officials rubber-stamped it even after the complaint.

Could try harder

BIS should clarify its position on how companies should address awkward questions in their due diligence, said the report, written by legal experts Jeremy Carver, a lawyer at Clifford Chance who advised Ulster Unionist Party leader David Trimble on the Northern Ireland peace process, Peter Astrella, head of corporate policy for UK Trade and Investment, and Daniel Leader, who has fought cases over rendition, torture and death of prisoners in Guantanamo Bay and Iraq.

BIS refused to say whether it would do this or not.

The report said general due diligence was not good enough when there was an obvious controversy - what it called a foreseeable, heightened risk that the contract would relate even indirectly to a human rights abuse.

Sheldon Leader, director of the Essex University Business and Human Rights Project and an advisor to the Law Society, said the OECD guidelines were clear on this point and companies were already required to address specific, foreseeable risks in their due diligence.

It would therefore be possible to bring a complaint against BT for not addressing the foreseeable human rights risks of the DISA contract when it did its due diligence in 2012. BT could meanwhile be held liable for civil damages if a specific link could be established between its comms line and drone attacks that have killed civilians.

The professor, an expert on the OECD rules and father of BIS review committee member Daniel, said BT's position that a comms supplier is not liable for what someone does with its services did not stand up when it came to potential human rights abuses.

"If I have a dangerous swimming pool, I don't intend anybody to misuse it, but I don't pay enough attention to the fact that someone could misuse it, then I'm responsible.

"The fact that BT put out a platform that is able to be misused is certainly something that could get the attention of the courts," he said.

Awkward questions

The complex context of the US base in Djibouti makes concerned, civil observers more reliant on those involved to clarify the ethical questions and either reconcile their consciences with the conflict or, as the OECD rules say should be done, use their commercial influence to "prevent or mitigate" wrongdoing.

The drone strikes had been a matter of public controversy, particularly in the US, in the year BT took the contract. BT admitted it was aware of the controversy. The big question it refused to face, and which raged while BT was bidding for the work, was whether the drone campaign was legal at all.

Critics said the strikes were illegal and constituted inhumane summary executions. These intelligence-led "targeted killings" of suspected terrorists, without trial, in areas outside official war zones had eliminated between about 2,800 and 4,400 people in three brawling states over the course of a decade, according to Bureau estimates.

The death statistics were drawn from a decade when the US War on Terror, in whose name they were first conducted, was refashioned into a more general mission to support fragile African and Asian governments against armed insurgents, with an emphasis on local military partnerships, medical intervention and construction projects. The drone attacks nevertheless had the trappings of war without being formally, legally-declared war. The US insisted it worked in collaboration with governments fighting violent uprisings, and the consequence of a ground-led counter-insurgency would have been many more casualties and a runaway escalation of violence. But its rules of engagement permitted strikes where local governments were unwilling or unable to co-operate. The White House press office would not say when it acted alone.

Those it executed were suspected to be the sort of terrorists who killed 74 people in Westgate shopping mall, Nairobi, in September. Critics said drone attacks would make things worse. The public outcry nevertheless grew loud enough last year for US President Barack Obama to appoint a committee of US judges to vet them.

Ongoing Bureau investigations reported between 17 and 26 civilians killed by US drone strikes in Yemen last year. The country has been fighting an al-Qaeda-led armed uprising with US support, but one derived from a long-standing North-South religious, economic and political division with colonial roots and a recent history of war.

Reported civilian casualties dropped to nil in Pakistan, where the number of strikes was cut right back. But military drone strikes and civilian deaths have continued in Afghanistan and Yemen, the UN reported last month, while the question of their legality has still not been settled under international law.
