Drone kill communications net illustrated

[Image: SIPRNet backbone - Europe - Secret Internet Protocol Router Network, 2004]
Computer Weekly can illustrate how a UK network connection forms part of a US weapons targeting system that has slaughtered civilians in anti-terrorist attacks gone wrong.

The illustrations add credibility to a legal challenge begun last month over a 2012 contract BT won to build the UK branch of the system - a fibre-optic line between RAF Croughton in Northamptonshire and Camp Lemonnier, a US military base in Djibouti, East Africa.

British officials had been slow to investigate the BT contract under human rights rules because, they said, there was no evidence to suggest the UK connection was associated with US drone strikes, let alone any that had gone wrong.

There is however clear evidence that the UK connection is part of a global intelligence and weapons targeting network that operates US drone missions like a hand operates a puppet.

The network was meant to make drone weapons targeting more accurate, and catch fewer innocent people in the crossfire.

But this "network-centric" targeting also became the means of a chilling new type of warfare called targeted killing: computer-driven, intelligence-led, extra-judicial assassinations of suspected terrorists like those who kidnapped school girls in Nigeria and massacred shoppers in Kenya.

The UK connection was part of this because under the targeted killing programme, the network is the weapon.

Designed to be utterly discriminate but in practice not completely accurate, it had, according to the Bureau of Investigative Journalism, accidentally killed hundreds of civilians in 13 years of drone strikes on insurgents in fractured states in the Middle East, Asia and North Africa.

The strikes have all but ceased, and the mistakes reportedly led the US to rein in the programme. But the role of the UK connection remains a burning question.

It will not only determine the outcome of a judicial review being sought in the UK by a Yemeni education official whose civilian brother and cousin, Ali and Salim al-Qawli, were killed by a drone strike on their car in the Yemeni village of Sinhan on 23 January 2013.

It will force the UK to face its part in the killing programme. And it will illuminate a frightening growth in the combined power of military and intelligence services: to use the power of domineering surveillance to feed systems of automated targeting and killing.

Network killing

The mechanism of net-centric warfare makes the idea that the UK connection has not facilitated US drone strikes absurd.

The network made targeted killing possible. The network was also the basis of the mechanism that drove the actual strike operations. It carried the intelligence that selected the targets. It ran the software that directed the operations. It incorporated the drones that carried out the strikes.

The drones do not exist as separate entities called in to finish the job. The drones are nodes on the network. They are a part of the network. The network is the weapon.

The US has been building its network up to drive the systems and weapons - and particularly the drones - that support its strategy of network-centric warfare. The UK connection is part of this strategy.

Drones rely on the network like trains rely on tracks, like puppets rely on strings. The network gives the drones their directions, distributes their surveillance, targets their weapons.

Network map

The blue map at the head of this article shows the fibre-optic core of this global network, the Defense Information Systems Network (DISN), as it stood in 2004.

[Image: US military command regions - GAO, Defense Headquarters, 2013]

Showing the European branch of the network, the blue map depicts RAF Croughton, in Northamptonshire, as a major junction of the DISN - then essential for carrying classified military communications for the Secret Internet Protocol Router Network (SIPRNET).

It shows how Croughton is connected to Landstuhl/Ramstein in Germany, the regional hub for the Stuttgart headquarters of US Africa Command (US Africom). It shows Croughton also links to Capodichino, the communications hub in Naples, Italy, where the US Navy has its European and African command centres.

Capodichino, like Landstuhl/Ramstein, connects on to Bahrain, the base for US Central Command (Centcom) in the Middle East.

The blue map pre-dates the UK connection to Djibouti, which BT was contracted to provide in October 2012.

The contract specified a high-bandwidth line between the UK and Capodichino (effectively an upgrade), and then an extension to Djibouti, where Camp Lemonnier was having bandwidth problems.

When the blue map was published in 2004, the US military was working intensely to turn the DISN into the global surveillance and weapons targeting system it is today.

Their efforts turned the DISN into the backbone of a more extensive Department of Defense network called the Global Information Grid (GIG).

Drone net

Scientists and engineers from places such as the Massachusetts Institute of Technology Lincoln Laboratory (MIT-LL) and the National Security Agency (NSA) modelled the GIG on the internet. Like the internet, it was a network of networks.

They joined the DISN with satellite and radio links to form a single, seamless network. The US Department of Defense then strove to plug every device into it - every vehicle, every system, every drone - to form one all-encompassing net.
[Image: High-level C4 infrastructure operational concept graphic - Department of Defense Unmanned Systems Roadmap 2013-2038]
The plan had both prosaic and transcendent aims. At the workaday level, US military and defence agencies didn't have enough bandwidth to support growing fleets of drones, let alone their emerging divisions of unmanned sea and land vehicles.

Drones needed the DISN to carry their control signals, as illustrated in the concept diagram above, from the US Department of Defense Unmanned Systems Roadmap 2013-2038.

The diagram illustrates how defence and intelligence agencies rely on the DISN as well, to gather data from drone sensors such as video and infra-red. The DISN carries drone data to systems such as the Distributed Common Ground System (DCGS), the common store of imagery intelligence for US military and intelligence agencies.

Satellite connection

The DISN/GIG became more essential to drones as demands for their dense imagery intelligence outgrew the satellite and terrestrial network's ability to deliver it.

[Image: GIG basis of intelligence over TSAT - Office of the Under Secretary of Defense, Integrating Sensor-Collected Intelligence, 2008]

The US Under Secretary of Defense used this diagram in 2008 to illustrate how military and intelligence agencies used the GIG to communicate via satellite with drones and deployed forces.

High-bandwidth satellite constellations were part of the plan. The one shown (TSAT - Transformational Satellite) was later supplanted by Wideband Global Satcom (WGS).

[Image: DISN basis of Teleport - Military Satellite Communications: Space-Based Communications for the Global Information Grid - Johns Hopkins University Applied Physics Laboratory, 2006]

Teleport

The DISN was connected to satellite constellations through antenna complexes called Teleports, illustrated in the diagram above from the Johns Hopkins University Applied Physics Laboratory.

The US subsequently built Teleports in eight locations: Bahrain; Wahiawa, Hawaii; Fort Buckner, Okinawa, Japan; Lago Patria, Italy; Landstuhl/Ramstein, Germany; Guam; Camp Roberts, California; and Northwest, Virginia.


[Image: DISN basis of JTRS and TSAT - Office of the Under Secretary of Defense, Integrating Sensor-Collected Intelligence, 2008]

Internet radio

Deployed forces were also connected to the GIG, through a software-defined radio system that communicates using the internet protocol.

The Joint Tactical Radio System (JTRS) was developed by the Massachusetts Institute of Technology Lincoln Laboratory (MIT-LL), a wholly military-funded university laboratory that in 2004 was responsible for developing major components of the GIG, as well as surveillance and weapons targeting technologies that would run over it.

[Image: Predator operating in deployed mode - DoD Unmanned Aircraft Systems Roadmap 2005-2030, 2005]

Predator

The Predator strike drone relied on the GIG/DISN to target its weapons, as illustrated in the US Department of Defense Unmanned Aircraft Systems Roadmap 2005-2030.

There were two ways of controlling a Predator: from a local station or one far away. Both relied on the GIG/DISN.
[Image: Predator operating in remote split operations - DoD Unmanned Aircraft Systems Roadmap 2005-2030, 2005]
When the controlling was done by a remote pilot housed with deployed forces at a Forward Operating Location (FOL) in the theatre of operations, the drone relied on the DISN to "reach back" to core military computer systems essential to its mission.

The most essential computer system was the Distributed Common Ground System (DCGS).

The DISN and DCGS were also essential in the other primary Predator control mode.

In a Remote Split Operation, the Predator would be launched from a "line-of-sight" control station near to its mission. But control would pass over to a remote pilot back in a fixed military base, such as Nellis Air Force Base in Nevada.

The drone's control signals would be routed over the DISN to a Predator Primary Satellite Link (PPSL). Having pre-dated development of the GIG, the Predator used proprietary network technology and outmoded asynchronous transfer mode (ATM) communications. These were handled by the DISN Asynchronous Transfer Mode System (DATMS).

But the old Predator comms systems were a hindrance to the GIG strategy. They were not internet-enabled. That meant they couldn't be assimilated into the GIG.

The aim of the GIG was for every sensor, every weapon, every comms system and every software program to operate using the internet protocol. Any military resource would then be available for control or observation, for attack or just for intelligence, to anyone with access to a GIG terminal anywhere in the world, in real-time.

The difference between drones communicating over DATMS and drones communicating over the internet-enabled GIG/DISN was like the difference between talking on a walkie-talkie and running apps on a smartphone.

[Image: UA progression from circuit-based to net-centric comms - Department of Defense Unmanned Aircraft Systems Roadmap 2005-2030]

DoD had a 10-year plan to get around the problem by gradually upgrading its drone comms infrastructure. The first step would connect drones to the GIG by turning their satellite links into GIG gateways - work due to be complete around about now. The drones would act as if they were an integral part of the network.

The drones would ultimately become internet-enabled themselves. They would communicate as internet nodes. Their on-board internet routers would use any spare bandwidth to route other people's GIG traffic. They would become part of the network.

By the time the Department of Defense Unmanned Systems Integrated Roadmap FY2013-2038 was published in December, US drones were operating in the way illustrated in stage 2 in the diagram above.

Drones were using network gateways to get their command instructions over the DISN, it said. The DISN likewise disseminated the surveillance data they picked up on their missions.

[Image: GIG communications infrastructure - GIG Architectural Vision, DoD CIO, 2007]

The power of the GIG plan would follow when every drone, soldier, satellite, ship, truck, gun and so on was sharing its intelligence and surveillance sensor data over it, as illustrated in the concept diagram above, from the Department of Defense's 2007 GIG Architectural Vision.

They were all part of the GIG. They all communicated using the same internet protocol. They were all integral to the network.

[Image: Network-centric enterprise data and software building blocks for the GIG - DoD Net-Centric Services Strategy, 2007]

Building blocks

A common communications protocol laid the foundation for common data formats and common software interfaces.

This was the transcendent aim of the GIG. It allowed military assets to be available on the network as building blocks. The GIG would be greater than the sum of its parts.

This would theoretically make every chunk of intelligence data, every surveillance camera, every weapon and every software system available as a building block on the GIG.
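To make the building-block idea concrete, here is a minimal sketch, in Python, of what "every asset as a software service" might look like. The names (Observation, SensorService, survey) are invented for illustration; they are not real GIG interfaces.

```python
# Hypothetical sketch of the GIG "building block" idea: every asset
# exposes the same minimal interface, so any consumer can combine them.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Observation:
    lat: float
    lon: float
    kind: str        # e.g. "video", "sigint", "moving-target"
    timestamp: float


class SensorService(Protocol):
    """The common interface a networked asset would expose."""
    def query(self, lat: float, lon: float, radius_km: float) -> list[Observation]:
        ...


def survey(services: list[SensorService], lat: float, lon: float) -> list[Observation]:
    """Pool observations from any mix of assets through one interface."""
    observations: list[Observation] = []
    for svc in services:
        observations.extend(svc.query(lat, lon, radius_km=5.0))
    return observations
```

The point of the common interface is that the consumer does not care whether an "asset" is a drone camera, a database or a satellite - exactly the interchangeability the GIG's designers were after.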
[Image: Global net-centric targeting - Key Technologies for DoD Net-Centric Computing, Computer Technology Associates, 2007]
Net targeting

That was the basis of net-centric warfare - making everything available as a software service on the military internet.

Its most characteristic application was net-centric targeting.

That involved combining different surveillance sensors and intelligence databases on the fly, to get an automated fix on a target.
[Image: NCCT process example - C2ISR for Air Combat Command, US Air Combat Command, 2006]


The power of net-centric targeting became apparent in simple tests that combined just two airborne surveillance sensors.

Each sensor had limited ability even to spot a target on its own, let alone get a fix, according to this graphic from a 2006 presentation by Colonel Tom Wozniak of US Air Combat Command.
[Image: NCCT process example - network-centric sensing - C2ISR for Air Combat Command, 2006]
The US Network-Centric Collaborative Targeting System (NCCT) takes sensor readings that each have only a middling probability of fixing a target, and combines them to create high-probability fixes where they match.
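The arithmetic behind that combination is simple to sketch. Assuming the sensors detect independently - a toy model, not NCCT's actual, classified method - the chance that at least one sensor sees the target climbs quickly as sensors are added:

```python
# Toy model of multi-sensor detection fusion (not NCCT's real algorithm):
# with independent sensors, the probability that at least one detects
# the target is one minus the probability that all of them miss.
def combined_detection(probabilities):
    miss = 1.0
    for p in probabilities:
        miss *= 1.0 - p          # chance that every sensor misses
    return 1.0 - miss            # chance that at least one detects

print(combined_detection([0.5]))       # one middling sensor: 0.5
print(combined_detection([0.5, 0.5]))  # two combined: 0.75
print(combined_detection([0.5] * 5))   # five combined: ~0.97
```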

The NCCT became operational in 2007 after years of development in collaboration with the UK, according to DoD statements to Congress.



[Image: USAF time-critical targeting challenge - Key Technologies for DoD Net-Centric Computing, Computer Technology Associates, 2007]

The pre-eminent application of net-centric targeting is the one that made the US targeted killing programme possible: time-critical, or time-sensitive, targeting.

The graphic above shows how it usually takes hours for military personnel to plan a strike.

They have to digest their battle plans for a start - pore over maps and work out what's where. Then they have to find their target. That means arranging for intelligence, surveillance and reconnaissance (ISR) sensors to hunt for it. They have to collate all their intelligence and analyse the data.

Then they have to calculate a fix, nominate targets to be attacked, prioritize among them, co-ordinate their operations, find suitable weapons platforms and get them to the target area, account for weather, choose the best route to the target, watch out for friendlies and, when the strike has been made, assess the damage.

Net-centric targeting promised to do all this in minutes by automating it.
[Image: Semantic SOA - Key Technologies for DoD Net-Centric Computing, Computer Technology Associates, 2007]
Intelligence databases

Net-centric targeting relies on a process called data fusion, or semantic interoperability.

That means storing your data in ways that can always be cross-matched.

Not just military data. Net-centric targeting developers wrote civil databases into their plans too, such as immigration databases and feeds from civil intelligence agencies.
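What that looks like in code is mundane: records from differently-shaped stores are normalised to one schema, then joined on a shared identifier. A minimal sketch, with hypothetical field names and sources:

```python
# Illustrative sketch of "semantic interoperability": heterogeneous
# records normalised into one schema so any two stores can be joined.
# Field names and sources are hypothetical.
def normalise_immigration(rec):
    return {"person_id": rec["passport_no"], "seen_at": rec["port_of_entry"],
            "source": "immigration"}

def normalise_sigint(rec):
    return {"person_id": rec["subject_id"], "seen_at": rec["intercept_site"],
            "source": "sigint"}

def cross_match(store_a, store_b):
    """Join two normalised stores on the shared identifier."""
    index = {rec["person_id"]: rec for rec in store_a}
    return [(index[rec["person_id"]], rec)
            for rec in store_b if rec["person_id"] in index]
```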

Automated targeting

Combined with algorithms that watch for target signatures, this creates the means to spot targets on the fly.

And it creates the means to spot targets as small and as fleeting as people - and to kill them within minutes, as illustrated in the diagram above. The diagram was published in 2007, the year the US Network-Centric Collaborative Targeting System (NCCT) went into operational use, by Computer Technology Associates (CTA), a defence and intelligence systems contractor that helped develop the system.

The example describes a target signature: an algorithm tells the targeting system that in the event of an emergency it should look out for a particular person, known to the Central Intelligence Agency (CIA) as "target ID 1454".

The targeting system keeps watch for them with its Blue Force and Red Force Tracking systems. The military uses these to trace the movements of those they've classified as goodies and baddies.

The targeting system keeps track of immigration and airport databases as well. In the example, somebody on the Red target list (the general hit list) has popped up in the immigration database.

It checks to see if they match against CIA records. They do. And they match against the CIA file with the same target ID as specified in the target algorithm: "target ID 1454". The targeting system sends geographical co-ordinates to people in green uniforms.
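The logic of that walkthrough can be reconstructed in a few lines. Only "target ID 1454" comes from the published example; every function and field name below is invented for illustration.

```python
# Reconstruction of the CTA example's watch-rule logic. Illustrative
# only: the data layouts and functions are hypothetical.
WATCH_RULE = {"target_id": "1454"}   # set in advance by the tasking algorithm

def on_immigration_hit(entry, red_list, cia_records):
    """Fires whenever a new record appears in the immigration feed."""
    if entry["person_id"] not in red_list:
        return None                   # not on the general hit list
    record = cia_records.get(entry["person_id"])
    if record and record["target_id"] == WATCH_RULE["target_id"]:
        # CIA file matches the watched target: pass coordinates along
        return {"target_id": record["target_id"],
                "coords": entry["last_known_coords"]}
    return None
```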

Time-sensitive targeting

This sort of computer vigilance, combined with networked intelligence, threw up new targeting possibilities.

The US started building common surveillance systems with its partners in the North Atlantic Treaty Organization (NATO). The more sources of intelligence they had, the more targets they could see.

NATO usually took days to plan a strike against even a fixed target. If it could do that in minutes, it could spot targets that had been too elusive before.

[Image: Examples of potential TSTs - Time Sensitive Targeting: Architecture Considerations, NATO, 2013]

These Time-Sensitive Targets (TSTs) could be threats that emerged so quickly that they had to be attacked within minutes if they were to be stopped.

Or they could be "lucrative" targets that appeared in the surveillance net only fleetingly, and would escape if they weren't attacked quickly.

"TST gives friendly forces the option of striking targets minutes after they are identified," said this presentation by NATO's chief scientist in 2013.

[Image: CFBLNet participants - CFBLNet 2012 Annual Report, Combined Federated Battle Laboratories Network, 2013]

NATO targeting

The US formed a coalition to develop a web of NATO net-centric targeting systems.

It would get target intelligence on the fly from surveillance gathered by any number of NATO countries that happened to have forces, sensors or databases with something to add to the kill equation.

The Multi-Sensor Aerospace-Ground Joint ISR Interoperability Coalition (MAJIIC) worked on making NATO ISR sensors produce data in the same formats.
[Image: Coalition ISR sensor environment - MAJIIC, NATO NC3A, 2006]
MAJIIC aimed to make innumerable surveillance platforms compatible: electro-optical (EO), infra-red (IR), synthetic aperture radar (SAR - high resolution video or still images), moving target indicators (MTI), and Electronic Support Measures (ESM - electronic emissions).

Their aim was what US strategists call "dominant battlespace awareness" - having more eyes and ears feeding more situational awareness back into the network than anybody else.
[Image: NATO TST tool - FAST, Flexible Advanced C2 Services for NATO Joint Time Sensitive Targeting, 2013]
Afghanistan strike

NATO's chief scientist gave an example last year of net-centric targeting using NATO's own TST tool.

The screen shot demonstrates a NATO strike against armed opponents of its military invasion in Kabul, Afghanistan.

It shows a map of Kabul with the location of the targeted people, as displayed in its TST tool, called Flexible, Advanced C2 Services for NATO (Joint) Time Sensitive Targeting (FAST).

This is the view that would appear on the computer screen of an intelligence officer, perhaps at a desk in Djibouti, Bahrain, Stuttgart, or Tampa, Florida.

The intelligence officer has named his target "Terrorist Group Meeting" and given a track identification code "ZZ008" to distinguish it from other targets in the system.

The software supports internet chat between military personnel overseeing the operation. This is the sort of communications handled by SIPRNET over the DISN. The FAST tool handles multiple target tracks at the same time.

An intelligence chief and another Senior Intelligence Duty Officer (SIDO) are also in the system, pursuing tracks on targets "ZZ004" and "ZZ005".

"Any word on the Predators yet?" says a message from a chief of Intelligence, surveillance, target acquisition, and reconnaissance (Istar).

"ETA predators 5min," he is told.

Just a minute before, Air Traffic Control sent a message saying an aircraft had been launched against another track ID, "ZZ006".


[Image: Six-phase time-sensitive targeting process - Time Sensitive Targeting: Architecture Considerations, NATO, 2013]

Kill chain

Computerizing weapons targeting involved breaking it up into a series of steps.

It was systematized, as business functions like purchasing and manufacturing were when they were computerized: where human actions were classified into distinct processes like 'produce purchase order', 'send invoice', 'receive goods'.

Time-Sensitive Targeting is commonly known as the kill chain. This has six steps: find, fix, track, target, engage, and assess.
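Written as code, the chain is just a pipeline in which each step consumes the output of the one before. The step names are doctrine; the handler mechanism below is a sketch of the systematisation, not any real targeting system.

```python
# The six-step "kill chain" as an explicit pipeline. The step names are
# NATO/DoD doctrine; the handlers are illustrative placeholders.
from enum import Enum

class KillChain(Enum):
    FIND = 1     # detect a potential target
    FIX = 2      # establish a precise location
    TRACK = 3    # maintain custody of the target
    TARGET = 4   # validate, prioritise, match a weapon
    ENGAGE = 5   # execute the strike
    ASSESS = 6   # evaluate the result (battle damage assessment)

def run_chain(track, handlers):
    """Automation turns each step into a computable function - which is
    how hours of staff work shrink to minutes."""
    for step in KillChain:
        track = handlers[step](track)
    return track
```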

[Image: CESMO test TH05 - NATO Cooperative ESM Operations, NATO C3 Agency, 2007]

UK gizmo

The UK developed a system that feeds NCCT with target data gleaned from conventional signals intelligence.

Called Cooperative Electronic Support Measures Operations (CESMO), its target data is merged with other intelligence in NCCT.

A NATO test of CESMO in 2005 produced this map, showing line-of-bearing (LOB) readings from ISR sensors.
SOA & XML Security Experiments - Cooperative ESM Operations - CESMO - Norwegian Defence Research Establishment - FFI - 2008.png

Each LOB is a single signals intelligence reading from a surveillance aircraft.

As expected, the test found a single reading was too unreliable to get a fix on an elusive target.

Even a single aircraft with two LOB fixes would have such a large "error ellipse" that it could not be used to target a weapon, said the NATO C3 Agency.

[Image: Error ellipses for different permutations of line-of-bearing - NATO C3 Agency, 2007]

A target leaked electronic emissions for less than 30 seconds in NATO test TH05.

The area of a poor target fix - called the error ellipse - was 11km by 600m.

This was clearly far too large to risk an attack.

But with five sensors on the lookout, they got a fix.
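The geometry is standard triangulation: each bearing constrains the emitter to a line, and a least-squares intersection of several lines yields a fix whose error region shrinks as sensors are added. This sketch shows the textbook method, not CESMO's classified implementation, on a flat-earth grid:

```python
# Generic least-squares triangulation from lines of bearing (LOB).
# Coordinates are flat-earth (east, north) in km; bearings are degrees
# clockwise from north. Not CESMO's actual, classified implementation.
import math
import numpy as np

def fix_from_lobs(sensors, bearings_deg):
    """Intersect several LOBs in the least-squares sense. A point (x, y)
    on a bearing theta from sensor (sx, sy) satisfies
    cos(theta)*(x - sx) - sin(theta)*(y - sy) = 0."""
    rows, rhs = [], []
    for (sx, sy), theta_deg in zip(sensors, bearings_deg):
        t = math.radians(theta_deg)
        rows.append([math.cos(t), -math.sin(t)])
        rhs.append(math.cos(t) * sx - math.sin(t) * sy)
    fix, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return fix  # estimated (east, north) of the emitter

# Three spread-out sensors whose bearings agree pin the emitter to (5, 5).
print(fix_from_lobs([(0, 0), (10, 0), (5, -8)], [45.0, 315.0, 0.0]))
```

With only two nearly-parallel bearings the system is ill-conditioned - the 11km-by-600m ellipse in code form; bearings from well-separated sensors make the fix sharply determined.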

"Most of the data pertaining to CESMO is classified," said a paper by the NATO C3 Agency in 2007.

But, it said: "It is possible to show how [CESMO] can geo-locate targets that cannot be found by stand-alone operating ELINT or ESM platforms."
[Image: Google-like DCGS - Key Technologies for DoD Net-Centric Computing, Computer Technology Associates, 2007]




Target intelligence

The US military stores ISR data in its Distributed Common Ground System (DCGS).

This is commonly described as the imagery intelligence store queried by US defence and intelligence agencies alike when planning operations and forming target tracks and fixes.

Allied nations use it too. As do targeting systems. It gives them a common view of the battlefield and everything on it: common ground.

Common ground means the same surveillance from platforms such as drones, the same human intelligence, the same geo-location co-ordinates from target tracks, the same signals readings from CESMO, the same aerial photography and satellite images.

Drone operations

This screenshot purports to be taken from a DCGS tool in 2003 when the system was still in early development.
[Image: DCGS tool screenshot of operations over Croatia - Computer Technology Associates, 2007]
[Image: DCGS Conops - Semantic SOA - Key Technologies for DoD Net-Centric Computing, Computer Technology Associates, 2007]
The image is of Croatia, Montenegro and Bosnia-Herzegovina on 13 June 2003, the date Croatian defence minister H.E. Zeljka Antunovic welcomed the opening of NATO expansion talks among former Yugoslavian states and Baltic countries at the NATO Euro-Atlantic Partnership Council.

It shows the flight paths and surveillance nets of various aircraft including Global Hawk and Predator drones.

Time-Critical Targeting was part of the DCGS concept of operations (Conops), according to this 2007 presentation by net-centric developer Computer Technology Associates.

The image depicts ISR data stores being combined with civil and military intelligence databases to create time-sensitive target tracks and strikes.

[Image: DCGS Crossbow capability at Royal Air Force Marham - RAF Benson, 2011]

UK drone intelligence

The UK has access to this DISN data store as well.

Intelligence analysts at the Royal Air Force base in Marham, Norfolk, used DCGS imagery to direct UK operations in Afghanistan, said an RAF press release in 2011.

The RAF was building "real-time interoperability" with the DCGS, it said.

"Analysts receive feeds from the US Distributed Common Ground System (DCGS), which provides globally-networked Intelligence Surveillance and Reconnaissance (ISR) capabilities.

"This is the first time that the UK will have the capability to provide near real time imagery intelligence support to Afghanistan from the UK," it said.

Murky area

National intelligence agencies use the DCGS as well, according to some descriptions of the system. That includes the CIA, which is reported to operate some of the controversial drone strikes.

The last 10 years have seen persistent references to the Intelligence Community as an influence on, and contributor to, the development of the GIG, DCGS, net-centric systems and intelligence sharing. The likelihood of the Intelligence Community's dependence on the DISN cannot be ignored.

Anup Ghosh, the former chief scientist of the Defense Advanced Research Projects Agency (DARPA), said in a 2005 speech that intelligence agencies were part of the DoD's GIG vision. David Smith, a DISA consultant who worked on the DISN/GIG transformation, wrote in 2006 that it was driven by, and would serve, both the DoD and Intelligence Community.

The US established a Unified Cross Domain Management Office (UCDMO) in 2006 to "address the needs of the DoD and the IC to share information and bridge disparate networks", director Marianne Bailey said in a 2008 paper.

DoD told Congress in 2006 that tests of the GIG at the Naval Research Laboratory (NRL) would in 2007 include "end-to-end testing with DoD, Intelligence Community, Allied and Coalition activities". The tests would incorporate JTRS, TSAT, Teleport, GIG Bandwidth Expansion, and Net-Centric Enterprise Services (NCES).

Rene Thaens of the NATO Communications and Information Agency said in a 2007 paper that signals intelligence sharing systems would get developed now that the Intelligence Community had discovered their benefits.

The US Under Secretary of Defence's Joint Defense Science Board / Intelligence Science Board said in 2008 that investments by both the Intelligence Community and the DoD had created the GIG network infrastructure. It said excellent progress had already been made in "aligning meta-data from various sources across the Department of Defense and the Intelligence Community".

DoD chief information officer (CIO) John Grimes formally committed in 2008 to ensure information and network situational data sharing with the Intelligence Community. The DoD and intelligence CIOs also formally agreed to recognize one another's network security accreditations. DoD told Congress in 2009 that it was co-operating with the intelligence agencies on the development of its net-centric systems.

Mitre Corporation - a company that did software engineering on the DCGS, helped develop NATO ISR data standards and worked on MAJIIC - said in a 2009 paper about net-centric enterprise systems that US intelligence agencies used them too.

DoD harmonised its IT standards and architectural processes with federal and intelligence agencies, and coalition allies in 2010, it told Congress in 2011. The alignment was done under the Command Information Superiority Architecture (CISA) programme, the Secretary of Defense office formed to develop the GIG architecture and net-centric reference model.

The US Navy told Congress in 2011 it was developing a system to fuse biometric data it took from people on ships it boarded with Intelligence Community counter-terrorism databases. DISA implemented an Intelligence Community system in 2011 that exposed DoD data to users with appropriate security clearance, it said in its GIG Convergence Master Plan 2012. It told Congress in 2012 the DISN carried information for "the DoD Intelligence Community and other federal agencies".

USAF Major General Craig A. Franklin, vice director of Joint Staff, issued an order in 2012 specifying conditions for the Intelligence Community to connect to the GIG, and for IC systems to connect to "collateral DISN" systems. He charged the UCDMO with establishing "cross-domain" computer services between the DoD and Intelligence Community. The UCDMO simultaneously published a list of network services that would work across DoD and intelligence domains.

The National Geospatial Intelligence Agency (NGA) said in 2012 it had "aggressively" broken barriers to imagery intelligence data sharing between civil, defense, and intelligence agencies. The US Navy said in its 2013 Program Guide the next increment of its portion of the DCGS (DCGS-N) would "leverage" both DoD and Intelligence Community hardware and software infrastructures. It said upgrades on the Aries II aircraft, its premier manned ISR and targeting platform, would "enable continued alignment with the intelligence community".

Teresa Takai, DoD chief information officer, ordered in 2013 that all DoD systems would be made interoperable with the Intelligence Community. She committed formally to agree meta-data standards with the Intelligence Community CIO. And she formally requested that agencies and government departments including the CIA, Treasury, Department of Justice, NASA, and Department of Transportation agree cyber security procedures for connecting to the SIPRNET by June 2014. The NGA said in its 2013 update to the National Imagery Transmission Format Standard (NITFS) that developments had been driven in recent years by a need to share intelligence data between the DoD and Intelligence Community. The standard was developed in collaboration with the DoD, Intelligence Community, NATO, allied nations, technical bodies and the private sector.

"Intelligence Community" is an official designation of 17 agencies by the US Director of National Intelligence that includes the CIA, Federal Bureau of Investigations (FBI), Department of Homeland Security (DHS), Treasury, Drug Enforcement Administration (DEA), Departments of Energy and State, coast guard, NSA, and intelligence agencies associated with each of the US military forces.

The progress of their net-centric integration appears from public records to have been long, arduous, partial, reluctant, ongoing, yet undeniable.

[Image: DI2E SvcV-4 Services Functionality Description - mission services excerpt, 2013]

Even if the CIA has been averse to conducting its drone operations directly over the GIG/DISN, it is unlikely the DoD network has not carried intelligence and other data essential to its missions in Yemen and elsewhere.

The CIA was directly associated with a more recent evolution of the DCGS.

Central Intelligence Agency

A more substantial computer framework for sharing data between defense and intelligence agencies and their international allies, called the Defense Intelligence Information Enterprise (DI2E), has subsumed DCGS.

At the heart of DI2E is the DCGS Integration Backbone (DIB), a set of data fusion services said - in a 2012 overview by the DCGS Multi-Execution Team Office at Hanscom Air Force Base, Massachusetts - to have delivered a system for the DoD and Intelligence Community to search, discover and retrieve DCGS content. USAF characterised it as a cross-domain service.

DI2E delivered a plethora of cross-domain services for net-centric missions as part of the Department of Defense Architecture Framework in 2010, listed in the flesh-pink graphic above, which links to a sheet given to developers who attended a May 2013 DoD/IC "plugfest and mashup" at George Mason University, Virginia.

Sensor and target planning are included in the list of Mission Services on the sheet, a collection of over 150 net-centric software services called the DI2E SvcV-4 Services Functionality Description.

They also include SIGINT pattern matching, target validation, entity activity patterns and identity disambiguation for human intelligence (HUMINT), and intelligence preparation of the battlefield.

[Image: DI2E interrelationships - DI2E Summary, Under Secretary of Defense for Intelligence, 2013]

This is a defense intelligence initiative. That means it comes under the direct remit of the defense intelligence agencies. But as always, it is described as being for the benefit of both the DoD and the wider Intelligence Community.

The Under Secretary of Defense for Intelligence published a diagram of the stakeholders in DI2E in a presentation last year.

DI2E was owned by defense intelligence. But the CIA and other intelligence agencies used it.

The US military was meanwhile reported in April to have stopped sending drones over Yemen. The CIA was said to have continued, but from a formerly secret base in Saudi Arabia.

Yemen

Even when drone attacks on Yemen were reportedly launched from Djibouti, the picture was murky enough for UK officials to dismiss a complaint by legal charity Reprieve that the UK connection made BT, its contractor, answerable for resulting civilian deaths.

The tangle of military commands around Yemen was complicated. It was hard to point at a drone strike and say who launched it, from where, with comms directed down what pipe. The US wouldn't say. BT had ignored the question.

[Image: Transfer of Africa operations from US Central Command to Africom, 2008 - Congressional Research Service, 2010]
[Image: Combined Joint Task Force-Horn of Africa (CJTF-HOA) operational area and areas of interest - GAO, 2011]

US Central Command (Centcom), the military group that invaded Iraq, ran Lemonnier until October 2008, when it handed control to US Africom.

Centcom kept Yemen as an operational area. But its base in Bahrain was almost 1,000 miles away.

Africom kept Yemen as an "area of interest". Lemonnier was separated from Yemen by a finger of water just 20 miles across, called the Bab-el-Mandeb strait. Reports continued to cite Lemonnier as a launch site of lethal drone targeting missions.

US Africom would not tell Computer Weekly what drone missions launched from Lemonnier. Not even whether they did. Nor what mission support it gave Centcom. Nor whether it did. Nor whether Centcom had continued operating from Lemonnier after command passed to Africom. Nor whether Africom carried out missions in Yemen under Centcom's command.

US Africom spokesman Army Major Fred Harrell said a lot of assumptions were made about the drone strikes. But like the White House, he refused to clarify who, what, where, when.

But he did confirm that Centcom co-ordinated Yemen operations with Djibouti.

"Our area of responsibility borders that of Central Command and also US European Command," said Harrell.

"So it's safe to say that anything that occurs across what we call the seam between where our area of responsibility ends and where theirs starts, there's always co-ordination between combatant commands on what goes on.

"We do co-ordinate with our neighbour combatant commands, such as European Command and Central Command," he said.

This article has illustrated amply how such co-ordination is conducted over the DISN.

Earlier reports in Computer Weekly described how a 2012 DISN upgrade at Lemonnier coincided with the BT contract to extend the line from Croughton and a 2012 DISN upgrade in Stuttgart. And how an intelligence contractor was hiring analysts to work on targeting systems over the DISN from Stuttgart.

[Image: DISA Unified Video Dissemination Service - DoD Unmanned Systems Roadmap 2013-2038]

Drone video feeds

Upgrades including the UK connection have made the US network wide enough to support yet another development in drone targeting and intelligence: real-time video feeds.

DISA's Unified Video Dissemination Service (UVDS) takes live video streams from Predator and Reaper drones and transmits them via Teleports such as those at the DISN comms hubs in Naples, Landstuhl and Bahrain.

UAV video gets streamed via the Teleports and over the DISN, according to the graphic below, from the DoD 2013-2038 Unmanned Systems Roadmap.

The graphic illustrates how drone imagery is thus stored in the DCGS, and in archives at the NSA. A second graphic, from a DISA presentation last year, illustrates how the whole system depends on the DISN.

[Image: Managing the enterprise infrastructure - Operating and Defending the DoD Information Networks, DISA, 2013]

It shows drones and surveillance aircraft associated with Camp Lemonnier, otherwise known as headquarters of the Combined Joint Task Force-Horn of Africa (CJTF-HOA) under US Africom.

The drones feed their video streams via wideband satellite back to Lemonnier, as well as to a nearby DISN trunk gateway - a Teleport.

DoD records occasionally state that its net-centric and DISN investments aimed to give simultaneous views of the battlespace to any personnel or commanders anywhere in the world.

This was for example one reason given for the DISN investment at Lemonnier. The idea was that it might help commanders at bases in different places like Bahrain and Djibouti, and commands with different headquarters in places like Stuttgart and Tampa, and perhaps even intelligence analysts in different domains, to co-ordinate their missions. Streaming drone video was a part of that.

Drone over IP

The DISN core, built with trunk lines like the one between Djibouti and the UK, provided the basis of this strategy, as of all the other net-centric services.

It would allow staff in different locations to use the same systems, see the same intelligence and collaborate in the same operations.

[Image: DISN in 2012 - UVDS operational architecture - DoD Unmanned Systems Roadmap 2013-2038]

The DISN Core fibre network is hence at the centre of this network diagram showing how the Unified Video Dissemination Service (UVDS) operated in 2012.

The diagram shows video feeds running from drones, over satellite links and finally via Teleports, over the DISN.

The Teleports at Lago Patria, Italy and Landstuhl, Germany are shown distributing live video feeds to the DISN.

The Italian Teleport, on the DISN between the UK and Djibouti, was made capable of live drone video comms in May 2012, the year BT was contracted to make the UK connection, the diagram notes.

Published in December 2013, in the DoD's 2013-2038 Unmanned Systems Roadmap, the diagram shows how full-motion drone video is carried across the network to UVDS storage points, where it can be made available to users on SIPRNET.

The graphic has two distinct components: the part in green that gets the full motion video onto the network, and the part in red that makes the video available to military users and their systems. Both parts operate over the DISN.

DISN in this diagram links satcoms directed through Teleports with theatre communications for bases such as the one in Djibouti. It links those with the classified SIPRNET network that also runs over the DISN, and with UVDS systems operating from DISA's regional computing centres, called Defense Enterprise Computing Centers (DECCs).

Network infrastructure

The network components in the diagram above match both those described in the official notice that described BT's contract for the UK connection in 2012, and the US Navy's Congressional budget justifications that described the same DISN connection upgrade as an operational need for Djibouti.

The key components are the MSPP (MultiService Provisioning Platform - a device to connect the DISN line at bases such as Camp Lemonnier) and HAIPE (High Assurance Internet Protocol Encryptor) encryption devices.

These were devices specified in US Congressional justifications for building the DISN into the GIG.

[Image: Implementing the Global Information Grid - DoD, 2004]
[Image: New Global Information Grid-Bandwidth Expansion DISN node - IA Newsletter, 2006]

MSPP devices form the major junctions of the Global Information Grid (GIG), as illustrated in this graphic from a 2004 presentation by Frank Criste, then director of communications programs for the US Office of the Secretary of Defense.

The diagram shows the GIG test environment most likely operated by NRL in 2004. It portrays components that would later be used to build the GIG in the real world.

The network infrastructure specified for BT's UK connection was also illustrated in this diagram from an article in the summer 2006 DoD Information Assurance Newsletter.

It illustrates the GIG-Bandwidth Expansion programme, a scheme to upgrade the GIG for net-centric warfare, and to carry the imagery intelligence spewed out by burgeoning numbers of drones.

The new DISN infrastructure would include OC-192 fibre-optic cables, ODXC and MSPP network devices, and "KG"-class high-speed High Assurance Internet Protocol Encryptor (HAIPE) devices, all devices specified in BT's contract for the UK-Djibouti connection, and all also matching US budget justifications for expenditure.

"With the expanded bandwidth provided by GIG-BE, DISA can address high-capacity applications (e.g., imaging, video streaming) and provide a higher degree of network security and integrity," said David Smith, the DISN programme manager who wrote the article.

"The GIG-BE program is the first of its kind to bring high-speed High Assurance Internet Protocol Encryptor (HAIPE) devices to a DoD network.

"The HAIPE devices, introduced because of the National Security Agency's anticipatory development, will greatly increase the ability to bring secure, net-centric capabilities to the Intelligence Community and DoD operations," he said.

BT has consistently said it could not be held responsible for what anybody did with the communications infrastructure it supplied.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system.  It has not been specifically designed or adapted by BT for military purposes. BT has no knowledge of the reported US drone strikes and has no involvement in any such activity," said a spokesman for BT in response to questions earlier this year.

Profit lures London councils into IT sales venture

[Image: Lupa Capitolina]

Two London councils have embarked on a scheme to sell IT services for profit after concluding their cost-saving backoffice merger could grab them a big slice of the £144bn local government market.

Giving their project the code-name Romulus, after the rapacious founder of the ancient City of Rome, the London Borough Councils of Havering and Newham drew up a plan to compete for back office business in the health, education, police and charity sectors as well.

Now they are looking at plans to spin Romulus off to compete in the private sector, an idea that has sent them goggle-eyed over the chance of making even more lucre, and bringing it back to the people of East London like sacks of pre-historic booty from raids on neighbouring tribes.

Havering and Newham adapted their public ethos to the profit motive with gusto, choosing for Romulus the business model that could make the most profit.

They had a few options on the table, including good old fashioned public service. But they stacked the odds in favour of what would bag them the biggest wad.

Their Full Business Case said in December they had ruled out simply outsourcing their backoffices because they wanted to keep the fees for themselves. Why pay an outsourcer a commission to do the work when you can do it yourself?

But that was not all. It was not just a do-it-yourself decision. They had not merely decided not to outsource. They had decided to go into the outsourcing game themselves. Now that government rule changes have allowed public bodies to start getting some of those outsourcing fees for themselves, well, why don't they just do that then?

Mine


The same profit motive also made them rule out merging their backoffices with other groups of councils that may already have done the hard work. Because that would mean paying commission to some other group. They wanted to be the ones earning the commission.

Government funding cuts left them no choice: find a way to earn profit from other public bodies or cut essential services to people in the nascent principality of Havering and Newham.

It is each man and woman for himself in the meantime, as one public sector managing director told backoffice staff recently, after their attempt to compete like a private company failed. Staff had been through the mangle for years already. Their loyalty to the public ethos is what kept them going through the quasi-privatization, bizarrely. But now they're being out-competed, broken up, repackaged, rewarmed, sold off, palmed off - to be put through the mangle again to see if there's anything more still to come out. At some point, perhaps in some generation down the line, they will find a job tenure secure enough to invest belief in their work again.

Someone else whose profession was recently downsized by technology told your correspondent the workplace had become a place of fear. People used to have fun and some degree of autonomy. Now a fraction of the people do the job. They are time-and-motioned to within an inch of their lives. And the management has made it clear it is looking for any excuse to sack anyone, because it wants to cut some more.

Back in Havering and Newham, where Romulus launched as OneSource this month, the profit question is a matter of ongoing debate, as though they are not quite prepared to face it yet, or to stick their necks out.

The business case made it clear. The plan was to generate profits (the word it often uses is "income" - the American word for profit - though Romulus uses it to mean sales, while it sometimes says "savings" when it means profits). But Romulus has not yet launched a vehicle to carry the profit.

Tony Huff, who wrote the Romulus business case as OneSource business services director, said the councils had not decided what they would do with the income they generated, or profits they made, or savings they made, or whatever you would call it.

"Contributions to overheads is what we would call it," said Huff.

This could come from persuading other councils to join their partnership, like they ruled out joining anybody else's. Or it could come from the sale of backoffice services to other councils, or to other public bodies, and so on.

Look at it like this though. You are in the business of selling something. Say you generate more money from your sales than it cost you to run your business. What do you call that? You call it profit, don't you? That's the dictionary definition. These councils have not even decided what to call it, let alone what to do with the money when they get it.

That though, said Huff, was an aim of the joint committee the councils set up to run the venture in the meantime - to decide what to do with all the money.

"We have been asked to go away and look at how we might deal with new business," he said.

The most likely option, according to the Business Case, is launching a private company. Romulus will have to do that if it pursues its aim of selling to other public bodies, and charities, and private companies. It's a legal requirement under the coalition government rules that have made it possible for councils to get in the outsourcing game. The business case said they could sell to other councils without any restriction. But they had to form a private company if they wanted to make profits by selling to anyone else.

Mine

If Romulus doesn't launch a private company, it is easy to imagine a time when some other once-public backoffice from some other group of councils, or some HE College or NHS Commissioning Support Unit or police service spin-off, has become so big and successful that Havering and Newham can't justify doing their own any more. So they are damned if they don't. And if they do, they could still crash and burn amidst the Titanic clash of backoffice spin-offs that will surely commence with much blood-letting as public sector bodies up and down the country try to get on the profit-making game as well.

For now, said Huff, their main driver is making savings for Havering and Newham. The business plan is to make the savings and develop the business model, then bring in sales.

Savings means consolidation, means staff cuts, means automation of council services using web apps.

Geoff Connell, OneSource CIO, told Computer Weekly Havering and Newham had saved £10m per annum by getting 50 per cent of their citizen contacts online. Most of the savings had come from cutting the jobs of people who used to deliver the services in person. Connell had the saving on the tip of his tongue. He didn't have the number of redundancies to hand. Coalition estimates put such job cuts at an average of 78 per cent.

Redundancies will account for half of OneSource's set-up costs this year, three quarters next year, and £2.6m in total by April 2019, according to the Business Case. A quarter of the rest this year, and half of the rest next year, will be the cost of developing Oracle software applications to run the back office services it has not yet designed: asset management, FOI, risk management and forecasting, and time costing.

The proprietary One Oracle platform, which Havering and Newham developed in conjunction with seven other councils, will form the heart of its sales business operation: back office processes such as finance and human resources. Connell said last week it also had sales of self-service apps on the cards, driven by a proprietary Microsoft system.

Mine

By choosing proprietary software systems over the open source ventures Newham and other councils had toyed with in the past, it has protected its commercial interests. That is, the act of choosing Oracle puts Romulus' ventures in the service of its own private interests, rather than the public interest. That should ultimately help clear up any confusion about terminology, and about where this profit motive fits in the public ethos.

The public ethos resonates in the voices and publications of Romulus executives like a prideful vein of self-denial. Connell and Huff seem like the nicest young men who ever privatized a back-office. They have the citizens of Havering and Newham at heart. Maybe not the citizens of Barking. And maybe not the citizens of Bangladesh. But Havering and Newham are going to be all right.

Because Romulus has its eye on the prize.

"Local government is worth £144bn each year," said the business case. "This offers a great opportunity for this Programme in terms of business growth."

That's Romulus: son of Mars, the god of war, and of the daughter of a deposed king; outcast, raised by a she-wolf, murderer of his brother, rapacious warlord, snatcher of women, violent oppressor, founder of Rome. What a start in life.

Councils stage open source revival

[Image: Carbon cycle]

Local councils behind an ambitious public open source software scheme that flourished briefly with boom-time investment under the last government are attempting to revive it under the cost-cutting coalition's digital strategy.

But their old rival Microsoft is making its local government come-back too, after a 10-year gestation with the London Borough of Newham under a deal that became the focus of bitter opposition between the proprietary and open source software camps.

Last time, with central government funding but only lacklustre policy support, Bristol City and Camden London Borough built an open source content management system that was propagated as far afield as India and Bremen.

This time, arms twisted by budget cuts, but with government policy paving the way and hindsight in their favour, they have widespread sympathy but face boggling competition.

Newham is about to turn the game upside down by launching a commercial venture to sell tailored Microsoft software to other councils for a profit.

Just as Newham's Microsoft partnership helped bust the last open source alliance in local government, it looks set to put the cat amongst the pigeons, or doves, again.

[Image: John Jackson, Camden CIO]

John Jackson, chief information officer of Camden London Borough Council, portrayed the latest open source scheme as a bold move when he announced it last week: an open systems alliance to bust the proprietary software ecosystems of companies like Microsoft.

"We've drawn a line. We've said its going to be different. We said to suppliers, 'What you are delivering is rubbish'.

"I think its about time we stood up, and about time we changed a very tired market place," he told the GovNet Open Source conference in London on Thursday.

"I'd like to see local authorities creating an alliance to work together to deliver things like common APIs (application programming interfaces), to deliver common web services, to deliver code sharing, to innovate and to drive systemic disruption.

"Camden and Bristol are going to be engaging in a very strategic partnership to work on promoting open systems, promoting open source, and helping other councils deliver an open systems vision into government."

He laid great emphasis on disruption. He wanted to disrupt proprietary software markets, which he likened to medieval fiefdoms that held unhealthy power over public bodies.

[Image: Google ecosystem]

Councils should band together, said the Newcastle history graduate, to create an Open Source Alliance, and write a "bill of rights" or "charter for change" to "break the medieval market".

Part of his solution was open source software. He said councils should put it at the centre of what they were doing, when they have kept it at the margins.

His pitch was essentially coalition government policy, re-purposed for local government: embrace open source to smash proprietary software ecosystems and markets.

But Jackson was also doing a plug for a local government procurement framework the Crown Commercial Service (CCS) put to tender nearly a fortnight ago, and which Camden, Newham and other London councils helped draft.

£300m motive

That was the £300m Local Authority Software Applications framework that will introduce its own modest market reforms when CCS - an arm of the Cabinet Office - completes its half-decade re-let in July.

Billed by Jackson as the first national procurement ever to specify open systems, the LASA framework contract notice did not quite fit the billing. It asked only that suppliers "assist our aims of accelerating development of open systems, data sharing and the interoperability of IT systems in government". It made no requirement of the official coalition policy preference: open source software.

Jackson was nevertheless adamant that councils could improve their lot by rallying around the framework. Its target market - local government line-of-business applications such as benefits, social care and libraries - is deeply, unhealthily proprietary.

Take for example Dudley Metropolitan District Council, whose information systems manager, Andrew Tromans, told Computer Weekly its own proprietary line-of-business applications had prevented it pursuing its open source ambitions.

Dudley had considered using MySQL, an open source alternative to Oracle's relational database. But all the council's line-of-business applications were based on Oracle. It had to buy Oracle anyway. So it seemed pointless implementing MySQL as well.

[Image: Gavin Beckett, Bristol City Council]

Similarly, Bristol and Solihull Councils, once among the staunchest open source advocates, both recently gave up decade-long attempts to replace Microsoft with open source desktops because their line-of-business applications used proprietary Microsoft formats.

Gavin Beckett, chief enterprise architect at Bristol City Council, told the conference how he implemented a proprietary Tibco enterprise service bus as well.

Misdirection

Jackson lightly goaded Steve Halliday, Solihull CIO, from his conference platform last week. He asked Halliday, who is also Socitm president, "What are you going to do to promote a new approach?".

Halliday had promoted the local government framework. Socitm had moreover adopted parts of the coalition's Government Digital Strategy, which had codified the government's preference for open source software.

But it dropped the bit that specified open source. Like the framework, Socitm adopted a less disruptive, loose definition of open systems: common APIs, compatible data.

Halliday was not available for comment. But a Socitm source sought to discredit Jackson's Open Systems Alliance off the record: councils had tried and failed to do this before when the Labour government was pumping hundreds of millions of pounds into its e-government drive to put council services online; what chance did they have now in the age of austerity?

Take Newham, said the source: Newham had collaborated with other councils to build a CRM system, and distributed it through the Local Authority Software Consortium (LASC) - but that foundered. The same would happen to Camden's Open Systems Alliance.

The source was being naughty though. Newham's LA-CRM was never open source. It was a proprietary system that led Microsoft's push into big local government software. It had not failed at all. It had been in gestation and was about to be relaunched.

It had been the nemesis of local government open source. Socitm had been instrumental in it. And it was about to rise again.

Metamorphosis

Newham is now preparing an unprecedented plan to relaunch LA-CRM as a commercial venture: the public body will sell the Microsoft system to other councils for private profit.

It developed the next-generation product under a Microsoft contract that had been the bête noire of the open source movement in the middle of the last decade: its dreaded Microsoft memorandum of understanding (MOU).

Newham had been a leading player in local government open source before it formed the Microsoft partnership, partnering with Bristol and Camden to develop and implement Aplaws - an open source content management system that promised for a time an alliance of councils like the one being posited again now: a stronghold to nurture public open software. Newham drew all eyes when it infamously toyed with ditching Microsoft entirely, convening a fierce competition for its loyalty.

Newham ended up signing a Microsoft MOU that - unbeknownst to everyone - laid the foundations for the profit-making, proprietary software venture Newham is about to launch, to sell Microsoft CRM software to other councils.

It created a stronghold for proprietary software development and forged an alliance of councils using Microsoft software. It broke the open source alliance and enticed other councils who might have joined it into Microsoft's ecosystem instead.

The MOU committed Newham to migrate its major business systems to Microsoft platforms and to form a joint sales venture with Microsoft.

Newham promptly migrated its Aplaws implementation onto Microsoft's first CRM platform.

"It is assumed that the customer works with Microsoft as the 'platform of choice' and is prepared to work towards migrating all relevant business and technical solutions onto a Microsoft Platform over a defined time from competitive technologies," said the 2003 MOU.

"It is expected that these solutions may be developed jointly and sold collaboratively with Newham and significantly change the landscape of the application market place," it said.

Public authorities weren't allowed to sell software and generate private profits in 2003. So Newham gave its Microsoft CRM software to Belfast City Council (yes, Belfast) so it could do the selling instead.

"We worked in conjunction with Belfast City Council because at the time they were able to trade in a way that English authorities weren't," Geoff Connell, Newham CIO, told Computer Weekly in a telephone call after the conference.

"So we gave them free use of the product and then they were the ones who ran around the country helping to install it for different authorities," said Connell, who had led LA-CRM development at the council when it signed the Microsoft MOU in 2003.

The 2003 MOU said Newham would help Microsoft consider "shaping the MS CRM as a viable cost effective alternative for LG [local government] clients". That they did.

Microsoft CRM subsequently evolved into Microsoft Dynamics CRM, a more substantial system within the .NET framework. Newham duly built its next-generation system on Microsoft Dynamics CRM.

The council obtained power to sell this system in 2012, when the coalition government passed a General Power of Competence into law, allowing councils to trade for profit like private companies.

Connell said Newham was considering hosting its CRM system in a Microsoft cloud data centre and then selling it to other councils as a Microsoft Dynamics software service. Customers would pay Microsoft for Dynamics and Newham for those specialised, local government line-of-business modules.

It would work like an app store for people who bought Dynamics. They could pick and buy Newham modules as add-ons.

"We haven't taken it to market yet. But we intend to take it to market," said Connell.

"We will probably use a cloud offering. And then we will look to license the use of other specific code we've developed to do things like parking permits.

"The reason we wouldn't just give it away is the local authorities we work for have spent millions of pounds, invested in developing this capability.

"So it seems fair for the tax payers of our boroughs if others are going to use it, which we would want them to, that we get something back to recognise the sunk investment we've made," he said.

Well, sort of. Newham spent £5m on its entire customer service redesign, Connell later clarified. That included process redesign, staff cuts, website, master data management, systems integration and CRM.

It saved £10m a year, but mostly from staff cuts, because citizens no longer needed those staff once they could use self-service apps. That is the part of the coalition digital strategy being pursued by all protagonists in this tale.

Seeding the market

Belfast had installed LA-CRM at 25 councils at its height, trading on kudos borrowed from Microsoft's rival open source movement.

It told customers last year it had ceased development. It would reassess its options in a couple of years, when it had consolidated its own various LA-CRM implementations and Northern Ireland had completed a round of local government mergers. It is not hard to imagine it buying the next-generation system from Newham.

Other things are also not quite as they seem. When it comes to sharing software code, Newham and Socitm have both been using the same language as Bristol and Camden.

Connell talks passionately about Newham sharing code. It has already employed code distributed by other councils, and has even shared code itself with Camden.

But he said the council would attempt to sell any code deemed valuable.

Sharing was for scraps.

Where they differ, apparently, is the extent to which they wish to collaborate with, or compete against, other councils.

"We develop an awful lot of code," said Jackson. "What's the point of developing lots of code in isolation? Why can't we share it? Let's re-use and develop web services and APIs."

With the proprietary camp using the same language, it becomes hard to spot the difference. They all say they want web services. They all say they want common APIs. Newham's Microsoft system has meanwhile already done what the coalition Digital Strategy ultimately strove to do, with all its talk of disrupting proprietary markets with open source: forced by a £106m budget cut, it has turned half its government services into web apps and made the staff who deliver them redundant.

Beckett, who has the same aim for the same reasons, said the difference between what the Open Systems Alliance and Socitm strove to do was the difference between words and action.

"What John's talking about is doing something - actually making that change," he said. "So if I sponsor a piece of work to develop a particular aspect of our portal, let's make that available to everybody. Let's just do it. Its actually the doing of it. That's what we are doing."

They are striving to do less than when the Office of the Deputy Prime Minister put up the money for Aplaws 15 years ago.

They have no plans for similarly ambitious software developments. Jackson's pitch was not as bold as developing an open source line-of-business application that would really upset the market. They don't have funding. They have cuts.

But they did envisage an alliance would create a common basis for LA computer systems and gradually erode the power of proprietary line-of-business vendors like Capita, Civica and Northgate. That would allow councils who wanted to share code to gradually build a shared resource for council functions - like the Microsoft/Newham alliance, only public.

The first step was to create a free market for software by agreeing common interfaces, getting enough councils collaborating to arm-twist vendors into doing things their way. That meant government taking a hand in defining the software ecosystem in which it existed, as opposed to an ecosystem a proprietary software vendor defined to maximize its own gains.
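To make the idea concrete, here is a minimal sketch, in Python, of what an agreed common interface might look like. Everything in it is hypothetical - the service, the record fields and the method names are invented for illustration, not drawn from anything the alliance or CCS has published - but it shows the principle: the councils own the interface, and any vendor or shared council codebase must implement it.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class ParkingPermit:
    # Hypothetical shared record shape agreed by the councils -
    # not a published standard. Every implementation must use it.
    permit_id: str
    holder_name: str
    vehicle_reg: str
    zone: str
    valid_from: date
    valid_to: date


class ParkingPermitService(ABC):
    """Illustrative common interface an alliance of councils might agree.

    Suppliers compete on the implementation behind it; the interface
    itself belongs to the councils, not to any one vendor.
    """

    @abstractmethod
    def issue(self, holder_name: str, vehicle_reg: str,
              zone: str, months: int) -> ParkingPermit:
        """Issue a permit valid for the given number of months."""

    @abstractmethod
    def lookup(self, permit_id: str) -> Optional[ParkingPermit]:
        """Return the permit, or None if it does not exist."""

    @abstractmethod
    def revoke(self, permit_id: str) -> bool:
        """Cancel a permit; return True if one was cancelled."""
```

A council that had agreed such an interface could swap one supplier's implementation for another's - or for shared, council-written code like Newham's parking-permit modules - without rewriting the systems that call it. That is the arm-twisting, made executable.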

Ding ding

This view assumes the similar calls Newham and Socitm have made for common APIs and code sharing are compromised by their allegiance to proprietary ecosystems.

Yet Camden's Open "Systems" Alliance amounts to an admission that open source has been relegated to a mere middling ambition - not only as a government policy objective, but also among those public officials who have been its champions.

Newham has meanwhile begun talking to Camden about joining the alliance. It too has middling ambitions for sharing. It wants to share code. But it primarily wants to sell it. It just doesn't want to call the proceeds profit.

While their attitude is collegiate, their philosophical positions, once deeply opposed, are in danger of forming a tepid solution: a sort of harmony where bacteria - the lowest common denominator - thrive, because there is no resistance.

Each seems in denial: Camden that the Government Digital Strategy is a programme of privatization, especially in local government, where no open source edict holds; and Newham that its path leads to a universal market, where councils sell each other software and call their surplus income 'profit'.

The digital strategy, as a Trojan Horse for local government privatization, will create the market. Newham will compete in it, with platform-dependent code that will help lock other authorities into a Microsoft ecosystem. Newham looks like Microsoft's Trojan Horse. Round two has just begun.

Telecoms contractor could be called to account for drone deaths

| No Comments
| More
Telecoms supplier BT could be asked to account for drone attacks in Yemen and Somalia after connecting a fibre-optic cable to a US military base conducting the strikes.

The revelation comes from a high-level review of a complaint that the £23m BT communications line supported drone missions that had accidentally killed between 426 and 1005 civilians in the last decade in the course of strikes on suspected insurgents, according to estimates of the Bureau of Investigative Journalism.

Officials at the UK Department for Business, Innovation and Skills threw the complaint out last October, saying there was no evidence to say whether the comms line supported the drone attacks or not.

A review of the decision has since raised the prospect that BT could be asked to gather evidence to answer the question itself.

The big question - whether the fibre-optic cable is infrastructure used in drone strikes critics say are illegal - remains unanswered.

Legal charity Reprieve used an international agreement on corporate ethics, called the OECD Guidelines for Multinational Enterprises, to complain that BT should never have taken the contract because it was obvious what the cable was for.

But the only evidence BIS had to go on was due diligence BT did to satisfy the OECD guidelines when it took the contract in 2012. BT's due diligence ignored the drone controversy. BIS said it therefore didn't know and couldn't say.

The review, published at the end of February, said companies shouldn't get away with glossing over controversies in their due diligence. They shouldn't turn a blind eye when they took on a customer with a bad reputation. They should ask awkward questions and address them specifically in their due diligence.

BT had done only general due diligence when it took out its contract with the US Defense Information Systems Agency on 26 September 2012.

Now the review, though indirect, has turned the spotlight back on BT.

BT refused to comment on the review's conclusion.

Wise monkey

It denied knowledge of drone strikes. It also tried to portray the fibre-optic line, which it laid between the US military intelligence communications hub at RAF Croughton, Northamptonshire, and Camp Lemonnier, the base in Djibouti, North-East Africa, that launches the drone strikes, as a general-purpose civilian cable rather than dedicated military infrastructure.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system.

"It has not been specifically designed or adapted by BT for military purposes, including drone strikes," said the statement.

"[It] could be used at the base for a wide range of day-to-day activities, such as general housekeeping/internet browsing, email, communications, stores ordering, data functions and voice communications," said a BT spokesman in an email. He refused to rule out the possibility that it might serve a military function.

Neither BT nor BIS would release the due diligence BT had done. As a system of self-regulation, the OECD rules left it to those best placed to ask and answer the awkward questions, and to allay public concerns that they conducted their business ethically. But it did not allow the public to see the deliberations. Only officials could see the due diligence. The due diligence failed. The officials rubber-stamped it even after the complaint.

Could try harder

BIS should clarify its position on how companies should address awkward questions in their due diligence, said the report, written by legal experts Jeremy Carver, a lawyer at Clifford Chance who advised Ulster Unionist Party leader David Trimble on the Northern Ireland peace process, Peter Astrella, head of corporate policy for UK Trade and Investment, and Daniel Leader, who has fought cases over rendition, torture and death of prisoners in Guantanamo Bay and Iraq.

BIS refused to say whether it would do this or not.

The report said general due diligence was not good enough when there was an obvious controversy - what it called a foreseeable, heightened risk that the contract would relate even indirectly to a human rights abuse.

Sheldon Leader, director of the Essex University Business and Human Rights Project and an advisor to the Law Society, said the OECD guidelines were clear on this point and companies were already required to address specific, foreseeable risks in their due diligence.

It would therefore be possible to bring a complaint against BT for not addressing the foreseeable human rights risks of the DISA contract when it did its due diligence in 2012. BT could meanwhile be held liable for civil damages if a specific link could be established between its comms line and drone attacks that have killed civilians.

The professor, an expert on the OECD rules and father of BIS review committee member Daniel, said BT's position that a comms supplier is not liable for what someone does with its services did not stand up when it came to potential human rights abuses.

"If I have a dangerous swimming pool, I don't intend anybody to misuse it, but I don't pay enough attention to the fact that someone could misuse it, then I'm responsible.

"The fact that BT put out a platform that is able to be misused is certainly something that could get the attention of the courts," he said.

Awkward questions

The complex context of the US base in Djibouti makes concerned civil observers more reliant on those involved to clarify the ethical questions and either reconcile their consciences with the conflict or, as the OECD rules say should be done, use their commercial influence to "prevent or mitigate" wrongdoing.

The drone strikes had been a matter of public controversy, particularly in the US, in the year BT took the contract. BT admitted it was aware of the controversy. The big question it refused to face, and which raged while BT was bidding for the work, was whether the strikes were legal at all.

Critics said the strikes were illegal and constituted inhumane summary executions. These intelligence-led "targeted killings" of suspected terrorists, without trial, in areas outside official war zones had eliminated between about 2,800 and 4,400 people in three brawling states over the course of a decade, according to Bureau estimates.

The death statistics were drawn from a decade when the US War on Terror, in whose name the strikes were first conducted, was refashioned into a more general mission to support fragile African and Asian governments against armed insurgents, with an emphasis on local military partnerships, medical intervention and construction projects. The drone attacks nevertheless had the trappings of war without being formal, legally declared war. The US insisted it worked in collaboration with governments fighting violent uprisings, and that the consequence of a ground-led counter-insurgency would have been many more casualties and a runaway escalation of violence. But its rules of engagement permitted strikes where local governments were unwilling or unable to co-operate. The White House press office would not say when it acted alone.

Those it executed were suspected to be the sort of terrorists who killed 74 people in Westgate shopping mall, Nairobi, in September. Critics said drone attacks would make things worse. The public outcry nevertheless grew loud enough last year for US President Barack Obama to appoint a committee of US judges to vet them.

Ongoing Bureau investigations reported between 17 and 26 civilians killed by US drone strikes in Yemen last year. The country has been fighting an al-Qaeda-led armed uprising with US support, but one derived from a long-standing North-South religious, economic and political division with colonial roots and a recent history of war.

Reported civilian casualties dropped to nil in Pakistan, where the number of strikes was cut right back. But military drone strikes and civilian deaths have been ongoing in Afghanistan and Yemen, the UN reported last month, while the question of their legality has still not been settled under international law.

Legal straits forced NHS delay on care.data

| 2 Comments
| More
A legal pinch forced NHS England to delay its "care.data" system, Computer Weekly can reveal.

The NHS body portrayed the delay as a sensible - even generous - response to concerns raised by GP doctor groups, who said patients should be better informed about care.data before it was switched on. There was nothing about any legal obligation. This was merely the decent thing to do.

Care.data's delay even had none of the usual software gremlins. It wasn't another IT bodge. The scheme is esteemed by what appears to be the entire medical establishment, including those GP groups that raised the alarm.

So there was nothing but courtesy that stopped NHS England cranking care.data up as scheduled in April.

Nothing but privacy law, that is.

Or what remained of relevant privacy law after the government cut it back in 2012 to set care.data up. There wasn't much left. But it was enough to ensure care.data would breach the law if NHS England cranked it up in April, when it was due to start collecting patient records from 34,000 GP practices and combining them with hospital records into a medical research database at the Health and Social Care Information Centre.

The bit of the law the coalition government revoked would have given people power to stop it extracting their GP records. It effectively claimed power to requisition the data - a sort of data tax. The remaining bit of data protection said GPs would have to tell you before they handed your records over - a sort of courtesy. If they didn't tell you, you could take them to court.

That meant GPs would fall foul of the law if they started sending medical records to the HSCIC before anyone knew what it was all about. With April fast approaching, few people do. GPs would have been liable if care.data went ahead as planned.

Warning

The Information Commissioner warned NHS England about this on 12 February when Dawn Monaghan, its public sector liaison, met with data chiefs at NHS England and the HSCIC.

"We raised criticisms early this month," said a spokesman at the ICO.

"NHS England said they would look at those concerns and come back with a solution. The solution seems to be that they've introduced a six month delay," he said.

The Royal College of General Practitioners and British Medical Association were meanwhile calling foul on NHS England. BMA GP committee chair Chaand Nagpaul and RCGP Honorary Secretary Professor Nigel Mathers said the government needed to do more to inform patients about care.data.

What they didn't say was that patients didn't know because GPs had neglected to tell them.

Both GP bodies thought it their responsibility last year. Both lobbied their GP members to tell patients about care.data. But by October the ICO had stepped in and NHS England had agreed to take responsibility for persuading patients to sanction care.data. Its campaign was lacklustre. And now GPs say it's not their responsibility.

That's what an RCGP spokeswoman insisted. It wasn't GPs' responsibility, "Because it's an NHS England initiative".
 
"GPs need to be able to inform patients about it, and obviously they have a role as data collectors. But it's an NHS England initiative," she said.

Nothing to do with us

A BMA spokesman said GPs did their bit last year by putting up posters they'd been sent. This was as far as their responsibility went.

"They've been given materials by NHS England. It's NHS England who are responsible for the communication campaign. Not GPs. And not the BMA."

"It's not GPs' responsibility to write to all their patients," he said.

And why not?

"Because they've not been asked to do that."

Do they need to be asked to do it?

"Writing to all of your patients costs money. And it's obviously a time consuming thing to do. So we would need to look into the details of how that's done. But the fact is it's up to NHS England to make sure it's properly resourced."

It always comes down to money with GPs?

"GPs have been under huge workload pressures. They've had reductions in their funding over many years..."

They don't get paid enough already, do they?

"... So if they are going to do extra things, they need extra funding for it. It's a government initiative".

The spokesman insisted GPs had not taken an 'us and them' position against the NHS. That though is how it seemed.

'Us and them'

The government's 2012 Health and Social Care Act legally bound GPs to feed patient records into care.data. This has helped critics of the programme portray it as an imposition on GPs.

Yet both GP bodies back the initiative. The BMA supports care.data even though it still opposes the 2012 Act. The RCGP said it had no issue with the scheme but NHS England's lack of communication. The BMA said the same.

Professor Mathers' dramatic letter made six "demands", not about problems he saw in the scheme, but about problems patients thought they saw in the scheme and that had not been addressed in NHS England communications.

GPs also stand to gain from care.data as much as patients, "the NHS", and anyone else who cares about health, which is just about everyone except for psychopaths (that's prejudice), the most tragically indifferent (that's experience) and national socialists.

GPs may moreover be best placed to communicate with patients. A GP letter has more credence than a leaflet from NHS England. GPs are legally bound to submit the data. They are legally bound to tell patients about it.

Yet they complain to NHS England their patients don't know enough about it, nearly a year after they were charged with telling them.

Computer Weekly put it to NHS England data chief Tim Kelsey that NHS England had failed to sell the idea not to patients but to GPs.

"The BMA and RCGP sent materials out to GPs six months ago," he said. "I think there are GPs who feel they haven't had adequate information, even though we sent out information. This is just one of those things where there's no-one to blame. To be honest, there's no reason that outweighs the importance of there being a proper, informed public debate. That's what we prioritised and that's the decision."

Ignorance

Yet many GPs are themselves so confused about the programme that they don't even have the answers to awkward questions being raised by their patients.

A survey by Pulse magazine found last month 41 per cent of 400 GPs were planning to withhold their patient records from care.data.

They were worried about the same things patients were worried about: that care.data was giving patient data to third parties and private companies. They had, like most people, not had time to digest the implications of a society where people's most sensitive biological markers and parameters were hoisted up on a database like a cadaver put before a lecture hall in a Victorian medical college.

GPs were in fact being responsible data owners in doing more than simply telling their patients to hand their records over. They were not responsible enough, though, as they gawped at the approaching April launch, to have found the answers to their questions. GPs were in fact about as fearful in their ignorance of care.data as their patients. A BBC survey found last week that about the same proportion of patients were ignorant of care.data as their GPs - 45 per cent. This is the decentralised NHS the coalition government's reforms sought to create.

Yet it is actually devilishly hard to learn the answers to the sorts of questions GPs and patients would have about this scheme. That is, the details: who specifically will get your medical records, and what will they do with them, under what conditions.

The ICO said its warning to NHS England this month concerned a leaflet the NHS body sent to patients. The leaflet was promotional, placatory: look how great our scheme is. It had a link to a website for details. But the details were too vague to satisfy the ICO, too vague to satisfy what remains of the law.

Who, what, wher.. heh?

The ICO produced its own cartoon leaflet with more information about the contentious areas. But it also raised more questions than it answered.

Everyone's medical records would be made public after being scrubbed of identifying attributes, it said.

But some other medical data that might identify you would be shared with unnamed organisations for purposes that were not specified. Other medical data most definitely about you would be given to other unnamed organisations with some unelaborated legal basis to get it. The leaflet linked to a website that itself also neglected to give such details.

NHS England has meanwhile agreed with GPs to step up its general awareness campaign. More people will consequently understand vaguely that their medical intimates have been requisitioned for the greater cause of medical science. It is a worrying precedent only partially placated by NHS England's late offer of an opt-out for patients who would rather not donate their medical intimates at all.

The 2012 Act gave government power to simply take what it wanted. It appointed the Health Secretary as some sort of data Sheriff of Nottingham. The data protection law it revoked was a right to refuse. The NHS subsequently re-introduced it as a mere courtesy with provisos - what it now calls an opt-out.

Kelsey, deputy data sheriff, is attempting to tell people about this courtesy in his care.data publicity campaign. But without legal protection, with so much uncertainty about the details, with DNA science rekindling interest in eugenics, with powerful drugs and insurance companies slavering over patient data, with organisations like the US National Security Agency and Britain's own security agencies hoovering up as much human intelligence as they can, with predictive and pre-emptive crime science on the horizon, with the world-wide economy in trouble, civil war fermenting in Europe and nationalism on the rise, it may be a faint courtesy. Medical science needs no more data than people are willing to give. Data science is a statistical science. It doesn't need total surveillance. But if it does get it - if the do-good banner of medical science is used to justify the state's requisition of all medical records - the rest of our data protections will fall like dominoes.

Google chief's Tory transparency speech in full

| No Comments
| More
Just for the record.

That speech... this is the only transcript of it anywhere on the internet.

That speech Google chairman Eric Schmidt used to sprinkle internet magic dust over prime minister David Cameron's election campaign in 2006; that magic Cameron put on like Elvis Presley or Arthur Daley donning a coat, with a grab of the lapels and a roll of the shoulders and a promise that you! yes you! would get some of its magic too! if you just vote for me and my pals Eric and George. Yessiree, you got it coming your way too, aint no mistake.

Steeped in irony - some would say. Seeping with hypocrisy, you might otherwise see it.

Schmidt's Tory transparency speech invoked human values to tell us what the internet and Cameron were all about: no more of this business where politicians and businessmen hide inconvenient truths so they can tell you one thing while doing another. Here's our cards on the table and that's where they'll stay. Then it disappeared off the net, with Google having not found it convenient for publication, and Cameron's Conservatives deleting their history because it made an inconvenient fit with their forthcoming election rebranding. Brylcreem all round boys. But ditch the spangly coats.

Schmidt and Cameron didn't realise till afterwards the internet had a delete button.

Dubbed 'Politicians and Lie Detectors'...

This is that speech.
    <George Osborne: ... [intro cut off] ...
                                ... "Chief executive of Google!">

    <applause>


Thank you very much [sir|George].

   
    <Osborne: self-appraising laugh>

    <ongoing applause>



Wow.


    <applause>


What a... what a wonderful welcome.



    <applause>


...what a...


    <applause>


...thank you...


    <applause>


Ahhm



    <applause>



You are too kind, you're too kind.

That was a wonderful welcome.

Ahhm, ah, I came here because I wanted to talk about how the world is changing, and the role all of us are going to play in all of the things that are about to happen.

I want to talk about basic values - basic human values. The- about freedom of speech. About individual freedom and how individuals will operate in this new world.

And about the, the strength of the free market, and how fundamental that is to the things that all of us, care about.

But even more importantly, the responsibilities of corporations and the individuals that use these amazing new phenomena that is the internet and the companies that are being [sporned] about it.

I want to represent to you first and foremost that I am an inherent optimist about human nature. And that it is that optimism that I think drives a lot of what is going on around you. That optimism is fundamental to how we should view what's about to happen.

Now the internet as we all know has changed everything.

It's broken down all sorts of barriers - democratizing access to human knowledge - you can just type a few words into your favourite search engine and know everything.

    <laughter>


Ahhm. It's written the rules of production and distribution. The globalization that has so wonderfully occurred involving the United Kingdom in the last 20 years, is directly related to this access to the information, to the role of English as the language, and to the role that Britain in particular has played in all of that.

Now all of a sudden, instead of just dealing with record companies and television companies and so forth, you too can be a producer - of music and of television and of movies and so forth - and get them everywhere. We don't know if it's any good or not, but you can do it!

    <brief laughter>


And we'll see whether people will watch what you are doing.

The internet is much more than just a technology, it's a way of us really organizing our lives. So my position is first and foremost, don't bet against it.

And I see over and over again, folks - making decisions where they do not reflect, the fact that the internet is all pervasive now in everything that we do. They try to hold information back, rather than recognizing that the inter-the information-the internet is about making information generally available.

Betting against the, the, the net is a bad bet because you are fundamentally betting against human nature.

And human nature wants to communicate. They want to be informed. They want to be, have fun. They want to know what other people are doing.

This has a lot of implications for how we build computers - how underlying technologies are changing - which is a very interesting subject, but for a different kind of an audience.

    <murmur>


Rest assured that the technologists and the physicists are building massive super-computers that are behind all of these services that you use. And these massive supercomputers carry all of this information in these amazing ways and gets that information to you so quickly. And not just text, but email and voice and video and movies and this sort of thing. We call them clouds. It's as if all that information is in some cloud out there, and off you can get to it.

Search turns out to be... the unifying principle because there's too much information and the only way you can keep it organized is to search for it. Interesting conclusion, after all these years.

So search is the inevitable outcome - drives all of these companies, all of these new ways - you can search your emails, you can search your personal files, you can find all that stuff you wrote last year and forgot about it. This happens every day all the time.

Google itself is built around what I call wow moments. All of a sudden you realize the world has changed. For me, it turns out that, I've always wanted to climb, Mount Everest. Ahhm, I know this is a shock.

    <laughter>


And... there's no scenario where I am going to make it to the top of Mount Everest. But you can take Google Earth. You can start at the bottom and you can climb Mount Everest in the comfort of your office, and go all the way right to the top. And I thought, isn't technology wonderful - I've made it!

    <laughter>

    <applause>


The...

    <applause>


The-there are more serious stories. Er, we get er, we got a letter when I started my job of the form, ahhm, 'It does matter that Google is fast.

And, the letter continues.

'I had these symptoms, and I wrote them into Google, and I pressed the button and it said, "You are having a heart attack dial emergency services!"'

    <laughter>


And he did. The ambulance came and said if you had not done that immediately, you'd be dead. So we tell our employees that this is why it's important Google be fast! People might die.

    <laughter>

    <light applause>

Ahhm. True story.

We get sorts of stories about people who er - and very, very serious stories - about people whose children were abducted - right - Where they found the child in the town next door after many years.

The power of information - and the families were re-unified and all of that - the power of information - is so profound and so liberating, I've come to respect it even more than I always have.

So we value, 'More information means more freedom'. We value the individual responsibility that comes with that - market-making, market driven, not regulated, and very much based on the rule of law.

Go directly to the source! And figure stuff out for yourself. If you're tired of somebody else telling you what you should think, read it for yourself. Read it and decide. It makes sense. It's a nice principle. And it puts you in charge.

We run Google in this bizarre way which we call seventy-twenty-ten. Seventy percent of our resources are applied to our core business, 20 per cent on adjacent businesses and 10 per cent on new and innovative things that nobody could possibly have ever thought of, that well up from our creative employees.

I was thinking from this conference, that Mr. Cameron and the leadership could say 70 per cent work on our core activities, 20 per cent work on our adjacent activities and 10 per cent of you are in charge of inventing completely new ideas - half of which are wacko and half of them are brilliant!

    <laughter>

    <applause>


Right?

    <applause>


So, it...


    <applause>

    <muted heckling>


This, this model, this model of creativity represented by the incredibly smart people in this room, is a good way to run a company. It may or may not be [chortle] a good way to run a political party! [humble intake of breath]

    <isolated laughter>


As people move and as life online today continues, more and more power goes into the hands of end users, and there's more and more competition. Some interesting statistics, ahm - you probably know what a gigabyte is. A petabyte is a lot bigger and an exabyte is a lot of petabytes. Five exabytes of data, ah, of data, were produced in 2002. If you wanted to watch that on television, our estimate is it would take you forty thousand, seven hundred years.

    <murmur>


We have an information explosion in the world that is unparalleled. And with the rise of China and with the rise of India, this will only continue. We have thirty five million blogs doubling every six months. The average blog has exactly one reader - the blogger!

    <laughter>

Okay.

    <Applause>

So.

    <Applause>


It's fine. There's an awful lot of successful blogs and there's an awful lot that are not!

    <isolated laughter>


These paradoxes - the paradoxes of the-of the web, like being lone-lone-lonely in a city - the internet is ah-both a great unifier and a polarizer. The truth emerges much faster on the internet, but falses also emerge more quickly too. More globalization, more tribalism. Mor-the internet makes you mor-mor-more cosmopolitan, but also more hate groups.

These are the paradoxes that you as our political leaders - in the globe - will have to deal with. So, so when we think about it and I'l-I'l-I'll try and put this in context because you all are are are leaders of-of-of-of a great country.

The role of governments has really got to be thought about now. This is a quote from a US paper: "The internet is in, in, in its relative infancy is like a child exercising new freedom, primarily through challenges to orderly systems - old retailing models, old media, old privacy rights, old libel standards, even old notions of parental control. Some of the pushed - notably governments with statutory power - are going to push back."

Governments are struggling with what to do with all of this. There are privacy concerns that are very real. The core message that I have for you is that information is power and that we in Google and others are pushing that information into the hands of end users.

Markets that are more efficient - markets with more information - serve everyone. And that we succeed, and I think we all succeed, when speech is the most free and the information is most available.

So in this case, we have to work - we as a company - we need to work with you to make sure that governments do the right thing - that they set the rules of the game, that they operate fairly, that they make sure that people have access, that all of the sorts of things that you could imagine in a good regulatory paradigm for this innovation.

When I think about it - and I'll put this together and give you a sense of the future - the impact of users of information has just begun.

One of my rules - the first rule that I propose for the internet - is people have a lot to say, and they are going to say it on the internet. Here we have emails, coming up, along the way of our conference.

Experts - you know, we talked about y'all go to college and you university, ah t-to have this tremendous amount of learning that you do. In the future, experts will be - fast learners, fast searchers, because it changes so quickly there will be people who can learn it as quickly as it evolves. The new expert will be someone who knows now something of a lifetime of lear-learning.

What does the world look like? In five years. Or ten. Maybe lets pick five years.

Let me tell you some products that I'd like us to build. Serendipity - I'm typing, this page tells me, what I should have been typing!

    <laughter>


Right.

    <laughter>


Anybody have this problem?

    <laughter>


No, that's wrong, Eric, as usual! This is the correct citation.

We're build-we're building and on the way to having simultaneous translations in all languages. Language barriers are a huge barr-a huge issue around peace and prosperity around the world.

Truth prediction... I hesitate to discuss this in a political climate!

    <laughter>


What probability should we think this statement is truthful? Our computers can make a pretty good guess.

Markets of information and user feedback, and also a political market for political information.

Prediction markets - in the mathematical sense of prediction markets - of information and what will happen.

New modes of search - take a picture on your phone - everybody has their phone, has everybody taken a picture? We'll figure out who it is, index it and send it to them!

Or not.
    <laughter>

Paper lengthening - another one of my favourites! - please make my paper 10 per cent longer.

    <faint laughter>

And you can do it again.

Right? So you lengthen the paper as you submit it to college.
    <silence>

Mobile phones everywhere - pioneered here.

Google and an iPod - all of the world's information, literally in a device this big - so you always know whether something is true or not.

But I think perhaps the most important thing that will happen...

    <muted cough>


...is that the internet - and I think this is probably a shared value of everyone in the room - the, the internet can be and I hope will be a revolutionary force in refresh- in repressive societies, driving both literacy and expression to new heights, and to new prosperity.

Ah, it's, it's great to be alive, it's great to be here - thank you very much for inviting me...

    <applause>

...and taking time.

    <extended applause>


[self-appraising laughter, to George Osborne] Thank you.
    <Osborne: Thank you very much! That was a great speech!>
   
    <Osborne: Thank you very much!>

[self-appraising laughter, to George Osborne] That's wonderful. Thank you!

Linux users still obstructed from Parliamentary debates

| No Comments
| More
Parliament is still treating Linux users as though they aren't citizens. Its website, for them, is like the door of an exclusive Soho gentleman's club.

If you aint got Microsoft, you aint getting in - though we might give you a second chance if you go home and change that boho suit.

Video broadcasts of Parliamentary proceedings are designed to be watched by people with Microsoft software.

The Parliamentary ICT Office had been trying to solve the problem as part of its attempt to turn parliamentary video recordings into a public asset anyone could incorporate into their own websites. But it still gives precedence to people using Microsoft software, after more than seven years.

If a Microsoft user pitches up to Parliamentlive.tv without the required Microsoft Silverlight media player installed on their computer, they get a page with a button that will install it conveniently. Linux users just get fobbed off.

Parliament's error page tells Linux users they must go elsewhere to install alternative software before they can watch its broadcasts.

It says Linux users should go to its help pages, where it says they should install a program called Moonlight that might be unstable. This is enough really to raise questions about Parliament's decision to run a public forum on Microsoft software. But the place it says Linux users should go is a page that says: "There is currently no text in this page."

Aside from being titteringly untrue, the software it is not possible to install from this page has for years been difficult, if not impossible, to install for any Linux user who doesn't happen to be a computer expert, or at least have a lot more patience than Parliament expects of anyone prepared to pay a premium for Microsoft's monopoly software platform.

One day, says Parliamentlive.tv, Linux users might be able to get an up-to-date stable version of  software they can use to watch its proceedings. If ever in the future parliament's contacts with the electorate through internet video were to converge with teleconferencing technology, a la Star Trek, anyone with Microsoft software would have a democratic advantage. Public debates would be full of Microsoft twerps. Linux users would be left outside in the straw.

Technical support

The link to the promised stable version seems fruitless anyway. It takes you to another page, on the website of a Linux technology called Mono, that says in language comprehensible only to computer experts, you can't install Moonlight there at all, but you should be able to get it elsewhere using standard Linux software repositories:

"The Mono project does not provide binary downloads for Moonlight; however, packages for most Linux distributions can be found online, either in official distro package repositories, or provided by popular third-party repositories."

Aside from being unhelpful gobbledygook, this statement also appears to be untrue. The Moonlight player appears not to be available even to those users confident enough under the bonnet of their computer to search software repositories.

The public debate our Linux user had gone to view at the Parliament website has meanwhile passed, and there may be nobody other than Microsoft users who witnessed it.

A search for help on this matter goes round in circles. Documentation for the popular Ubuntu Linux operating software (dated 2010) says users should install Moonlight by going to the Mono website. But that directs users to the same page-to-nowhere we've already been to.

Elsewhere you might see mention of a Moonlight plug-in for Firefox. But the Firefox plug-in library contains no mention of it.

This might be a familiar pattern for any potty-trained Linux user who has attempted to watch a parliamentary proceeding at any time in the last seven years. Why does Parliament still treat Linux users like second class citizens?

Gnarled finger of blame

It might be said the blame lies with Linux for having a software ecosystem often only navigable by computer experts. But standard Linux software has long installed at the click of a button. And the problem usually only arises when you try to play media formats created by companies like Microsoft, which have long used their formats as a form of extortion, to make sure you have to buy their software platform in order to play them.

Microsoft-platform formats like Ape, and Microsoft's own formats like Windows Media, effectively obstruct anyone who doesn't pay up to Microsoft.

The blame therefore lies with Parliament for not choosing to broadcast its proceedings in a standard of communication that is open to anyone.

What on earth is Microsoft trying to do in domineering protocols of communication anyway? What sort of tyrannical pursuit is that?

Silverlight is what Kim Jong-un would do if North Korea was a software corporation.

Nearly anywhere else Linux users go to watch video online, they don't have to bat an eyelid - the video just plays. Even the BBC's own live video broadcast from Parliament plays instantly, without a hitch.

The BBC's live parliament broadcast only shows one channel though. And yesterday, when your correspondent had a job to do, that was not showing the committee debate that happened to be high on the public ICT agenda.

BBC

That was, ironically, a Public Accounts Committee hearing about the failings of the BBC's own ambitious attempt to digitize its television programming. The BBC was showing instead a debate about the Ministry of Defence.

Nothing's changed really, since Parliament's Microsoft supplier, Two Four, and the Parliamentary Broadcast Unit first started digitizing their recordings in Microsoft's cul-de-sac media format, Windows Media Video.

Linux users are still treated like second class citizens. Parliament still effectively forces people to use Microsoft software.

And MPs on the committee are still mithering about the cost of pioneering software developments like the BBC's Digital Media Initiative. Yet they can't even get the simplest software right themselves.

To be fair to the committee, Parliament's own undemocratic video broadcasts were themselves pioneering in their time. Parliament attempted to do something that hadn't been done before. It did it quite well. But it made some decisions that in hindsight it really got wrong. That's the risk and inevitability: some bits of a pioneering software project simply won't work out.

Expensive mistake

It does look now that there may be a way round this Microsoft video problem for Linux users. It is not obvious, there's no mention of it in the help pages, and there's no button you can press on Parliament's website, but it may be that a Linux user searching for a solution to this problem might come across something called the Mono libraries. And if they have the requisite computer skills they might install these libraries and they might - might - find that if this does not give them software to play the British Parliament's Microsoft(tm) broadcasts, it might at least give them the means to install Moonlight if they can find it elsewhere.

That's what Parliamentlive.tv says to Linux users, thanks to Two Four and Microsoft. If you run Microsoft Windows, it says, click here to install Silverlight. If you run Linux, you're on your own.

That raises a funny question for Parliament's own failed attempt at pioneering video broadcasts, aside from why on earth it ever chose a proprietary Microsoft software format in the first place. That is, what exactly has failed? What software has Parliament got to scrap in order to rectify its mistake?

Parliament can't ban Linux. It can't decree that everyone uses Microsoft - though Microsoft's longstanding monopoly and its supporters in places like Parliament effectively force people to use Microsoft. It can't simply turn off its video broadcasts. It might be more democratic to turn them off. But there's no going back.

Isn't it therefore simply about time Parliament scrapped Microsoft Silverlight, wrote off its years of investment in Microsoft as an expensive mistake, and then held a meeting of the Public Accounts Committee, in a Linux-compatible format, about how it came to waste all that money in the first place?

Datacasts to put powerful under scrutiny

| No Comments
| More
Transparency campaigners have launched a bid to turn the speeches of powerful people into data the public can easily scrutinize.

In the wake of the Conservative Party deleting its public speeches and election pledges, the initiative may put pressure on political parties not only to publish their speeches but to make them machine readable and primed for mining by data scientists.

Campaign group MySociety yesterday unveiled a system to capture public statements and keep them in the public domain, and to bring the past speeches of powerful people back into the sunlight.

MySociety founder Tom Steinberg cited politicians, business executives and law courts as particular targets for a campaign that proposes to make them more accountable to "normal people".

It plans a SayIt system to monitor parliamentary proceedings in South Africa and has established a partnership with Chilean transparency group Fundación Ciudadano Inteligente.

But it launched SayIt in the UK first - with a call for people to capture public meetings and statements made at local councils, where most of what is said goes unrecorded.

Steinberg told Computer Weekly the system could also address the problem raised by the Conservative Party's deleted speeches archive.

"The whole business of transcripts - whether its parliament or celebrity interviews - needs bringing up to speed. It's woefully old-fashioned.

"We want people all over the world to find out when powerful people talk about things that matter to them. If someone is talking about your road in a local government meeting today, you will never know. It's impossible to say ping me a tweet if someone in a council meeting talks about my child's school or whatever," said Steinberg.

Councils historically simply didn't publish transcripts, he said, for reasons of cost and politics. The meeting minutes they published instead were "deracinated" and lacked humanity. MySociety was attempting to develop a system that would cut the cost of producing transcripts and of turning them into a structured, machine-readable form. It is exploring the use of speech-to-text software.

Speech data format

It has also adopted an open data standard called Akoma Ntoso (a West African phrase meaning "linked hearts"), and called on others - including the UK parliament - to adopt it too. It has envisaged a world in which all public statements anywhere could be automatically cross-referenced by person, say, or topic, because the transcripts all used the same open format.

"It's frankly unlikely that any two transcripts anywhere in the world have the same data structure today because each transcription company, and each court and each parliament will just open Word and start typing," said Steinberg.

"Whereas structured data is what makes each Tweet a beautifully-structured thing. It's what makes Facebook work. And it's what makes everything on the internet work. But as of today transcripts are not structured at all," he said.

The UK Parliament already publishes Hansard transcripts of its daily proceedings, which MySociety turns into a structured format for publication on its TheyWorkForYou website. But data discrepancies require manual intervention on a daily basis, said Steinberg.

It converted transcripts of the UK Leveson Inquiry and the trial for war crimes of Charles Taylor, former president of Liberia, West Africa, to publicise the launch of SayIt.

It took two weeks for a programmer to convert about 400,000 trial transcripts because they contained so many inconsistencies in naming conventions and other data elements.

Officials

Shela Husain, deputy director of accountability and transparency at the Department for Communities and Local Government, said it directed councils to publish minutes of public meetings in its Code of Recommended Practice on Transparency. But it would not ask any more of them.

"We certainly won't be stipulating that they publish transcripts of meetings," said Husain. She insisted local authorities should be open, transparent and accountable. But transcripts would incur costs and the question of producing them had never come up.

"It's up to local authorities," she said. "We wouldn't say they shouldn't do it. We just wouldn't stipulate that they should."

Councillor Nigel Murphy, cabinet member for digital at Manchester City Council, said: "It's something we would look into because it's a way we could engage and be more accountable. Transcripts can help people with disabilities. But there's obviously a cost involved."

Datacasting

Steinberg said he hoped journalists would use the system to datacast transcripts of interviews and conferences. He said they could share the labour of keeping tabs on powerful people.

One of those powerful people - Google chairman Eric Schmidt - had helped the Conservative Party election campaign by backing its claims that the internet would make people like him and them more accountable. But Computer Weekly showed recently how they had dropped the ball when his and the party's transparency speeches were purged from all but the most obscure corners of the net.

Google speech not just missing but purged

| No Comments
| More
01 DEC 2006 - Stop Brown's NHS Cuts - banner from Conservative website - campbanner-nhyespetition-2006.jpg

Here is an irony upon an irony upon an irony. You heard how Google chairman Eric Schmidt's political transparency speech was not to be found on the internet. You heard too how he used it to endorse the political transparency pledge prime minister David Cameron made while seeking election.

Now get this: the Conservative party pulled the speech from its website just weeks before Cameron announced Schmidt as an advisor in February 2009. How's that for transparency?

It was deleted amidst an earlier purge of speeches the Conservative Party did after Cameron told voters they could use the internet to hold it to account. This preceded the purge it did in October last year - when the Conservative Party deleted its "Speeches Archive" and records of other public statements going back 10 years.

You heard it here first: it did the earlier purge when it overhauled its website in Autumn 2008, after its political transparency schtick was well established. This deleted at least its video archive of speeches from the 2006 Conservative Party Conference. It effectively deleted the only copy of Schmidt's political transparency speech that existed anywhere on the internet.

Asked why it had deleted the speech, a Conservative Party spokesman said it had never existed at all. The Conservatives had therefore not deleted it. Because it was never there.

"The Schmidt speech was never on our website," he said. "We had a policy that only elected representative's speeches would go on our website."

Google, too, claimed the speech never existed.

But prominent data transparency expert Owen Boswarva found it in an obscure corner of the internet early this week.

He found it by navigating through a copy of the Conservative website, recorded by the UK Web Archive in 2007. The Archive withholds its records from search engines. So the Schmidt speech does exist on the internet, but only in the way Lord Lucan exists on the planet. You certainly won't find him on Google.

Gone

UK Archive records confirmed that its bot clocked Schmidt's political transparency speech on the Conservative website numerous times in 2008. But the speech was gone on its last visit that year, on 9 December. Two months later, on 9 February, Cameron announced Schmidt's appointment as a Conservative policy advisor.

Schmidt not being a member of parliament had nothing to do with the deletion either. The Conservative Party published speeches other non-members gave at the 2006 conference. It published a transcript of a speech Zoya Phan, human rights activist, gave on the same day as Schmidt's. It published a transcript and video of a speech Senator John McCain gave two days before Schmidt.

All these speeches were deleted in the 2008 purge, along with chancellor George Osborne's "Economic Policy for a New Generation" (that's now gone), and Cameron's "The Best is Yet to Come" (gone). And all the others, like McCain's and Phan's. All gone.

That conference was the big launch of Cameron's bid to be PM. The Conservative Party billed it as "A New Direction". It was nevertheless magnanimous enough then to keep an archive of old speeches and conferences. That is the sort of thing the new Conservatives did - even if it meant giving punters an archive of old, negative campaigns. Sunlight simply doesn't discriminate.

Regardless of the reasons why it went on to delete those conference speeches in 2008, the act suggests the Conservative Party was not as resolute in carrying out its pledge of political transparency as the now prime minister said.

It did subsequently establish a brilliant archive of news and speeches, going back more than 10 years. But the 2008 purge and other investigations show it was incomplete. This may have only been tardiness. But tardiness may have given cover to nudge-nudge deletions. Or the holes may have been wholly curated.

The Conservative Party's new, new direction does not appear to include political transparency, which may explain why it deleted its entire archive once and for all in October. Conservative Party HQ said it would make more important information more visible. It would not say who will deliver Cameron's pledge of political transparency if not himself.

Explanation

A spokesman would only say: "Our website makes it quick and easy to access the most important information we provide. Our long-term plan to secure a better future for Britain and our children and the difficult decisions we've taken to clear up Labour's mess.

"Archived content is available for anyone to view on the internet."

... if you happen to know precisely where it is or can spare hours to look for it.

The party's U-turn is inexplicable. It seemed irreversible in Google's heyday, when Cameron stepped into the light and declared it would edify us all. But politics may have decided it is inconvenient to treat voters as adults. There is too much to be gained by treating them like idiots. Edifying party archives will be replaced by soundbite video-clips, bright lights, and big buttons you can stab your fingers at. Ug.

There's no need for past speeches when politics is led not by debate but marketing. Even speeches like the one given by Google's billionaire chairman, which the Conservative Party used to make its transparency schtick look credible in the eyes of voters. You would think a party that stood for transparency would make sure it kept Schmidt's speech in a prominent place, since he gave it for the sake of a party that would do business with him when it won power.

Before it won power, and after his transparency speech was deleted, Schmidt helped formulate Conservative Party policy as a member of Cameron's Economic Recovery Committee.

After it won power, the Treasury appointed Schmidt to its Business Advisory Group, and while Google was left paying very little tax, the search giant gave its backing to at least one controversial political initiative championed by the PM. And still Schmidt's transparency speech remains all but deleted history.

Google fails Tory lie detector test

| No Comments
| More
To sum up 2013, take this untold tale concerning prime minister David Cameron and Google chairman Eric Schmidt.

It is a droll sequel to the Conservative Party's misadventure in transparency, when it deleted its election speeches and sought to purge them from the internet.

The setting - the theme for 2013 - is disillusion. Cameron and his Conservative compadres had offered the electorate transparency and accountability. But their plan would in effect decrease it. They meant to apply it only to the public sector. And then only to help private companies get a grab-hold on public markets. Similarly, they promised to make people with power more accountable. But they only meant public sector people - not the private companies they sought to bring in to do public work. And not themselves. So transparency and accountability would decrease under the Conservative coalition. Because it was shrinking the bits it made transparent, and growing private business, whose business remained private.

Cameron got kudos for this plan from Google's billionaire chairman Eric Schmidt. But they promised something more inspiring than accountability merely for the sake of markets. Theirs was the internet politics. You will know its credo if you have been determined enough to root out those now deleted Conservative election speeches: information liberation equals people power.

Cameron and chancellor George Osborne had spelled it out at Google conferences. Schmidt underlined the point in a keynote at the 2006 Conservative Party conference.

And how the Conservatives chortled when Schmidt indulged them with that speech, titled "Politicians and Lie Detectors". In five years' time, he said, the internet would be so clever it would tell you when politicians were lying.

But Schmidt got it wrong. Not simply because seven years later the Conservatives deleted the speeches you would need to read to see if they had been lying or not. But because Schmidt's 'Lie Detectors' speech can't be found on the internet either.

Computer Weekly spent hours looking for it. It wasn't there. Google never published it at all. It was determinedly withheld.

This is important because while Schmidt's backing lent the Conservative transparency pitch a vital air of credibility, his corporation has continued to be a substantial partner in the Conservative-led coalition government.

After Cameron and Osborne appointed Schmidt as an advisor, they put £50m and considerable political gusto into the Google-backed "Tech City" - a shoal of web and design startups located around the London headquarters of various US tech giants, billed as the UK's answer to Silicon Valley. They put it there instead of somewhere like Bletchley and Milton Keynes, which sit at the centre of the Oxford-Cambridge Arc, an extension of the famous Silicon Fen cluster of high-tech software and biotech startups that was once earmarked as the UK's answer to Silicon Valley.

Google meanwhile paid 0.09 per cent tax on its UK earnings. Public accounts committee chair Margaret Hodge said in April this was a conflict of interest that warranted striking Schmidt off the Business Advisory Group where he sits as government advisor. Schmidt stayed. He has meanwhile been putting his weight behind the prime minister's personal political initiatives.

It is not possible to know what advice Schmidt has given the prime minister. The Business Advisory Group meets in secret. It is convened by the chancellor and chaired by the prime minister. Yet the Treasury says it won't publish the group's meeting agendas and minutes because transparency might discourage members from giving frank advice.

There is another reason why the whereabouts of Schmidt's 'Lie Detectors' speech is important. That is the same reason the deleted Tory speeches are important: they are people power pledges people can't read.

Google may not have deleted Schmidt's lie detector speech like the Tories did theirs. But that is because it never published it in the first place.

It was unusual for either a Tory conference speech or a Google speech not to be published. And though it may be just one speech, it was the moment Google gave chorus to the party that went on to treat it so favourably in power. In the reckoning of transparency and accountability in the internet age, it's as rum as they come.

Now if you were the sort of person this transparency and accountability was supposed to benefit, you might reasonably look to read the speech that sealed the Google / Conservative alliance. You would expect to find it by simply rat-a-tat-tatting for it in Google.

Slough of despond

So you would have done a Google search on something like >>'eric schmidt' speech transcript 'conservative party' conference<<. And you might then have wasted a lot of time running up dead-end roads before you would accept it wasn't there.

Google's YouTube website, for example, has a tantalising 1:57 clip of Schmidt's speech. It doesn't have the whole speech. But there are a couple of articles The Guardian newspaper wrote about it at the time. They feature a 2:50 version of the same clip you already saw on YouTube. But they don't have the whole speech.

The Guardian links to a George Osborne speech that followed Schmidt's keynote. But that is one of those the Conservatives deleted. If Osborne's speech was once there, might Schmidt's have been found in the same place?

So you would look the Conservative website up on the Internet Archive, and trudge the records for the page where Schmidt's speech might have been published on 3 October 2006. And you would see the Conservative Party published three speeches from their conference on that day. But not Schmidt's.

You have to be bloody determined to get even that far. The old Javascript buttons don't work in the Archive. You can't just go to the 'speeches' section on the Tory site like you could in 2006. You have to try and find an archive snapshot taken when the speech was available as a fresh html link on a Conservative home page. It's not easily done. But if you do get to an old homepage in the Archive, you may find there a link from an old side-panel to another page that itself has a side section where another link will take you to the speeches the Conservatives published on the day Schmidt made his at their conference. You'll see it's not there.

But you would see here that it was once possible to get, say, Arnold Schwarzenegger's 2007 conference speech via a Javascript drop-down on the Conservative website. The drop-downs are still there. The links just don't work. Schwarzenegger's still there. Schmidt was never there.

Having come this far, you might trudge further through the Archive to see if Schmidt featured in any Conservative Party press releases at the time. But the idiosyncrasies of the archived web mean you can't actually reach any notices nearer than a week after the conference. The press release archive is blocked by an out-of-date press registration form. The public news section didn't mention Schmidt in October 2006.
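If you would rather not trudge by hand, the Internet Archive does expose a public CDX search endpoint that lists the snapshots it holds for a URL. A rough sketch in Python - the endpoint and parameter names follow the Archive's public documentation, but treat the details as indicative rather than gospel:

import json
import urllib.request

# List the Archive's 2006-2008 captures of the Conservative speeches pages.
api = ("http://web.archive.org/cdx/search/cdx"
       "?url=conservatives.com/News/Speeches/&matchType=prefix"
       "&from=2006&to=2008&output=json&limit=20")

with urllib.request.urlopen(api) as response:
    rows = json.load(response)

# The first row is a header (urlkey, timestamp, original, ...); the rest
# are captures, each timestamp a YYYYMMDDhhmmss date.
for row in rows[1:]:
    print(row[1], row[2])

It shortens the trudge, but it cannot conjure up a page that was never published.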

Likewise, you might try different search terms or engines. You might chance upon the odd contemporary report. You might find a tiny bit more of Schmidt's speech.

"The internet.. will expose them to online 'truth predictor' tests and affect the outcome of general elections," one reports Schmidt saying. "Imagine being able to check instantly whether or not statements made by politicians were correct."

Yes, just imagine. You won't find the transparency speech though. There's a blog by Tom Watson, a Labour MP, about Schmidt's appointment as Cameron's economic advisor in February 2009. Wikipedia links to anthologies of Schmidt speeches: back to The Guardian with no luck; over to The New York Times to no end.

Google itself has an archive of press statements going back to 2001. But you can't get to Schmidt's statements on its US website if you are logged in to Google from the UK. You have to log out and pretend to be someone else to check on Google's accountability pledge. It uses your login to control what you see on the internet.

But Google must have published it somewhere. Publishing speeches was simply the sort of thing Google did in those days. The Archive records that Google made 21 press releases in October 2006. But none for Schmidt.

Bingo

Ah but look. The 2006 archive has a link to a Google "Media Room". Ignoring wrong turns, and old media releases now erased from the archive, there's a link to "Talks & Presentations".

Google did publish various Schmidt speeches and interviews in 2006. But not his Conservative election speech. Its transparency seemed unambiguous. But it was selective. This was not the internet politics - the transparent society.

That's what it's like for plebs using Google to hold politicians to account in 2013, seven years after Schmidt's 'Lie Detectors' speech.

Yet journalists have one more avenue of enquiry not normally available to other plebs: the press office. It doesn't usually lead anywhere. But it can take less time to get a negative than other routes.

Two Google press spokespeople looked into the matter after Computer Weekly made enquiries. Each referred it upstairs and looked themselves. They said it did not exist. They said the only available relic was that clip on The Guardian and YouTube websites.

"Google do not keep a record of executives' speeches," said one of the Google spokesmen after looking into it over a number of days last week.






Universal Credit security fears put DWP job cuts off till next parliament

| No Comments
| More
Devastating public sector job cuts will be put off until half-way through the next parliament after the Department for Work and Pensions obstructed the government's "digital" reform of its Universal Credit benefits system.

The DWP had put an epoch-making jobs cull on the cards when it tore up its plans for Universal Credit last autumn and signed it up to the coalition Digital Strategy - a Cabinet Office plan to cut up to 80 per cent of jobs by making government services "digital", meaning to automate them and administer them through mobile apps and web browsers.

The plan would have done away with benefits processing staff, and begun a cull of staff at jobs centres as the state-run job-matching service was replaced with web apps - fulfilling a Conservative election pledge to replace "bureaucratic" public services with private sector competition.

But the department's 90,000 staff - the largest body of staff in the UK public sector - were given a stay of execution last week when it said it was not ready to go through with the digital transformation after all.

The DWP was sticking to Plan A, it said in a statement, which was to build Universal Credit on top of its existing "legacy" software systems and to bolt web apps on at a later date.

Iain Duncan Smith, work and pensions secretary, told the BBC Radio 4 Today programme that the "digital" aspects of Universal Credit had been delayed over fears that the security system was not ready.

"The reason why I put the red team report in a year-and-a-half ago [was] because I was concerned that the relationship between the security and the online aspects wasn't going to work," said the minister this morning.

"Outside [advisers] agreed with me. What we actually did was, we reset the programme so we do the volumes later in the rollout.

"I think that's fair because the lesson we learned from the rollout of the last government, for example, of tax credits, where they put huge volumes through very early on and the whole system crashed, costing £13bn in fraud. that's a lesson I was very certain we needed to learn and not repeat," Duncan Smith said.

Asked to clarify the minister's statement, a DWP spokesman said his transcript of the radio interview was missing the part where Duncan Smith referred to security and benefits fraud. The minister would not normally refer to security in this context, he said. He had no idea what security concerns had delayed Universal Credit.

The DWP had nevertheless agreed to make Universal Credit a test bed for the Cabinet Office Identity Assurance Scheme (IDA) - an online, market-led alternative to the last Labour government's Identity Card Scheme on which the Conservative plan to automate government would hinge.

The National Audit Office said in September that the Cabinet Office's "digital-by-default" plan to have all benefits transactions processed online had, along with IDA, been two of the three things that went wrong for Universal Credit.

The DWP had long taken precautions against the risk that a fully-digital benefits processing system would be a bonanza for fraudsters.

Those fears have now led to delays in the roll-out of the Universal Credit system because, the work and pensions secretary effectively told Today, the Cabinet Office IDA scheme is not ready. Whatever the whys and the wherefores, the delay will ensure the next government will be forced to face the prospect of computers replacing jobs at the largest single employer in the UK public sector.

Christmas comes early for the Open Document Faithful (ODF)

| No Comments
| More
Jingle Bells. The UK government has spruced its open document policy up for Christmas.

The Cabinet Office began a public consultation on open document formats this week, three and a half years after it came to power promising they would be one of the first things it delivered.

The consultation might signify the government has renewed its commitment to the policy. It had struggled so much since the coalition's first failed attempt to introduce it in 2011 that it seemed it would never deliver at all.

The Cabinet Office Open Standards Board issued a "challenge" for public comment on a proposal this week that government documents be published in a format that anyone can read.

"Citizens, businesses and government officials need to access government documents," said the challenge.

"[They] must not have costs imposed upon them, or be excluded, by the format in which government documents are provided," it said.

It said people should not be forced to buy special software just so they could read government documents. Government, in other words, must publish documents in formats that people can read without condition.

Cabinet Office minister Francis Maude said in a written press statement on Wednesday that open formats would make government communications more efficient.

Linda Humphries, policy lead at the Cabinet Office, said in a blog post that government officials were "frustrated" at being "locked-in" to buying a particular vendor's software because it was the only software that read a particular format.

She said at least one business had complained to her about being forced to buy particular software just to read communications issued by a government department. She neglected to name the firm, the department, the software or the format.

The coalition has been reluctant to name the software firms that pose a problem. But its policy is well understood to be an antidote to proprietary software fiefdoms controlled by Microsoft and Oracle. Opposition from these companies, along with Apple, the Business Software Alliance and international standards bodies, forced the Cabinet Office to retract its policy months after it was introduced in 2011. It has already held two public consultations on open standards and formats since it pulled the policy, in an attempt to strengthen it against feared legal action from proprietary opponents.

Other governments have made more rapid and bold declarations for open standards in the face of the same intense opposition from proprietary vendors. Portugal wrote open standards into law last November.

The Microsoft monopoly that inspired the policy meanwhile seems as strong as ever.

Bristol City Council, the coalition mascot for open source and open standards in the public sector, abandoned this year a decade-long effort to use alternatives to Microsoft software. It was forced to abandon the effort under a government that promised to deliver open source and open standards across the whole public sector. The heart of government policy was always the open document format (.odf) alternative to Microsoft's dominant .doc format.
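It is worth spelling out just how open the alternative is. An OpenDocument file is an ordinary ZIP archive of XML, so a few lines of standard-library Python can read one without any vendor's software. A sketch, with a hypothetical file name:

import xml.etree.ElementTree as ET
import zipfile

# An OpenDocument file is a ZIP of XML; "report.odt" is a hypothetical name.
with zipfile.ZipFile("report.odt") as odt:
    content = odt.read("content.xml")

# Print every scrap of text in the document body - no licence required.
for node in ET.fromstring(content).iter():
    if node.text and node.text.strip():
        print(node.text.strip())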

Gavin Beckett, enterprise architect and pioneer of Bristol's open source strategy, told a conference in April the council had been struggling alone against the tide. The council did not have the power to change the world on its own. Everyone used Microsoft formats because everyone used Microsoft formats - including all the other major software suppliers to government. So Bristol had to bite the bullet and buy Microsoft.

The European Commission is meanwhile coming to the latest break point in contracts that have made Microsoft the sole supplier of desktop office and operating software for more than 20 years. The Commission had been aspiring to find an open format alternative to Microsoft standards even when it signed the first contract to buy Microsoft Office in 1992.

An optimist might say the Cabinet Office's latest consultation has come in the season of hope and new beginnings. A sceptic might say, 'tis the season of magical fantasy.

Law change put UK internet archive beyond reach of plebs

| No Comments
| More
220px-John_F._Kennedy,_White_House_color_photo_portrait.jpg

As experts pointed out after prime minister David Cameron's Internet Summit at Downing Street last Monday, you've got to be a fairly feeble child abuser to be typing taboo words into Google - especially after the NSA scandal.

Search engine giants Google and Microsoft had offered to filter paedo words from their search results. Some experts didn't credit them because horrid criminals do not usually advertise themselves to search engines in the first place. Stuff you can find easily on the internet is controlled by people who want you to see it.

One thing that got lost amidst the hullabaloo about Cameron's paedo Summit (and other moral matters stoked by his office last week) was his response to the revelation he had deleted (literally deleted) his old claim that the internet would make politicians like him more transparent and accountable; and that his party had taken steps to ensure Conservative election pledges became one of those things you couldn't find on the net so easily.

One reason Cameron's response got lost amidst the hullabaloo was because he neglected to make one. His office said it was a party political matter. It was not Number 10's business, said a spokeswoman.

Cameron had made the promise of transparent politics the primary characteristic of his opposition leadership in 2006. It defined his election campaign. He even repeated the pledge when fresh in office. If his government was not going to concern itself with carrying out that pledge, then who would?

Number 10 said the coalition government had done lots of things to fulfil Cameron's transparency pledge. It had published government spending data. It had published other government data. It had (like the last government) published speeches by sitting government ministers. It had done lots of things. These things were however all about government transparency.

David Cameron from Conservative Website - 12 OCT 2007.jpg

Cameron always pitched that as only half the story. He promised to deliver the other half as well: political transparency.

Patter

Take for example the PM's "Podcast on a train", made on 29 May 2010, shortly after he became prime minister.

"It's not just about efficiency and saving money," said Cameron then.

"Transparency can help us to re-build trust in our politics.

"One of the reasons people don't trust politicians is because they think we've always got something to hide.

"Well by the time we've finished there will be far fewer hiding places," said the eager young PM.
He said he was planning a "transparency revolution".

It was the same patter he had before the election: political transparency and government transparency. That was what David Cameron stood for.
 
It wasn't just a passing fad. It wasn't just sales patter to steal power. He really meant it. (Didn't he?). There on that train, fresh from his election win, Cameron could have made his people's podcast about anything. He used it to utter those words that have been repeated often since he deleted all his transparency speeches last month - words that are almost never referenced because the reference is hard to find.

"I want our government to be one of the most open and transparent in the world," he said.

He went one step further even than that.

"Its going to make a big difference," he said.

"People will be the masters. Politicians the servants. And that's the way it should be."

There was some alarm when he first deleted his transparency speeches. It subsided too soon. The Conservative Party had not just deleted them from its website. It imposed a block that would have removed almost any trace of them from the net - by issuing a computer instruction to have them removed from the major internet repositories. It responded to the outrage by removing the block. The outrage subsided.

But even now, the speeches are not readily available elsewhere.

Not in the way Cameron, champion of the plebs, meant. Not in the way he meant when he talked about making politicians accountable to the people.

When Cameron used to talk about political transparency it was actually true. You could stick something like "Cameron transparency speech" into Google and the whole lot of them would come up in one proudly convenient batch of links to the Conservative Party website.

The Conservative website itself was astonishing. You could select a Conservative MP from a drop-down list and see any speech they had given in any year you chose. There, it was true. You could use the internet to hold Cameron to account like he said you could. There were no smarmy obstacles or small-print letdowns. Cameron stood for decency. Like English cricket. Like cucumber sandwiches, elderflower cordial and ice-cubes tinkling in late afternoon sunshine and jolly jolly tittering good humour.

But no, the speeches are not readily available anymore. Not in the way Cameron meant.

Tradition

The Internet Archive - the first stop for things people have deleted off the internet - cannot be searched in a way that really satisfies Cameron's pledge. You have to know what you are looking for. If you don't know the exact URL - and who does? - you have to know the website where the information you want used to be when it was available, and then you have to navigate laboriously through a calendar of copies the archive took before the speeches were deleted to get just the right spot where you can find a single speech. A lot of the navigation pages are simply inaccessible. Some of the content cannot be found. It's like driving to the commons across a mud flat in a Reliant Robin. This was not what Cameron meant when he said he would make politicians accountable to the people.

There are other places where Cameron's information revolution doesn't quite deliver either.

Bodleian Library Doors - 220px-Bodleian20040124CopyrightKaihsuTai.jpg

There's the Oxford University Bodleian Library, which published a misleading note about the deleting affair, in apparent defence of the prime minister, against the people.

"As disappointing as it may seem to many conspiracy theorists," it said, "these speeches are all safely in the care of the Conservative Party Archive (CPA) at the Bodleian Library in Oxford, and still freely available to anyone who wishes to consult them."

It then published one of the missing speeches in a series of 16 .jpg images, which stretched down the page in mocking parody of the accountability its words described. (.jpgs, really). If you saw this article and felt placated you were a fool.

If you scrolled - and scrolled, and scrolled - down beyond the .jpgs you would have read that Cameron's deleted speeches had indeed been donated to the Bodleian Library.

But you could only read them if you actually visited the library in Oxford. What are the chances you would get a day pass if you turned up in your mud-spattered Reliant Robin and said, "Excuse me, I'm a pleb. Please can I read the prime minister's speeches?".

The Bodleian assured readers of its blog that it had been working for two years on a system that would "soon" present the deleted speeches to people on the internet.

"Ultimately, when funds allow, the intention is to enable free-text searching of all the Party's vast number of speeches, online, and free-of-charge," it said.

This must be what the PM called, on launching his bid for government in 2006, the "democratisation of information". What are the chances the Bodleian will have it finished before the next election? And what are the chances it will be searchable by plebs using Google?

Take the British Library, for example - an institution your correspondent looks up to with a sort of plebeian awe (like Jude the Obscure). The British Library had taken copies of the Conservative speeches and put them on a curated collection of websites called the Web Archive. But it blocked access to that archive by search engines using a robots.txt exclusion, in the same way the Conservative Party obscured the prime minister's speeches from the internet after it deleted them in October.

User-agent: *
Allow: /wayback/archive/20100921095059/http://www.gtce.org.uk/
Disallow: /wayback/
Disallow: /wayback2/
Disallow: /waybacktg/
Disallow: /ukwa/search/
Disallow: /aadda-discovery/

http://www.webarchive.org.uk/robots.txt

The Library's exclusion would prevent anyone finding the prime minister's speeches unless they went specifically to its domain to look for them.

They would have to know where to look. The same is even true of the Internet Archive, which may have a permissive robots policy ("Please crawl our files," it says), and even has a boggy collection of the prime minister's speeches, but they still did not come up in Google and other search engines polled during ongoing research conducted for this story since September.
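For the technically curious, Python's standard library shows in a few lines how a crawler reads such an exclusion. A minimal sketch, run against an abridged copy of the rules quoted above:

from urllib.robotparser import RobotFileParser

# The Library's published rules, abridged from the robots.txt above.
rules = """\
User-agent: *
Allow: /wayback/archive/20100921095059/http://www.gtce.org.uk/
Disallow: /wayback/
Disallow: /ukwa/search/
"""

robots = RobotFileParser()
robots.parse(rules.splitlines())

# A well-behaved bot obeying these rules skips the whole archive...
print(robots.can_fetch("*", "http://www.webarchive.org.uk/wayback/archive/anything"))
# ...except the single page the Library explicitly allowed.
print(robots.can_fetch("*", "http://www.webarchive.org.uk/wayback/archive/20100921095059/http://www.gtce.org.uk/"))

The first check comes back False and the second True: everything under /wayback/ is off-limits to search engines except the one page the Library chose to let through.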

This was not the vision David Cameron conjured when he flattered Google on the campaign trail for its "truly amazing" democratization of information.

He didn't say: "You won't be able to find my speeches pledging transparent politics when you look them up in Google in a few years time. But you might find a few of them if you scratch around long enough. With any luck you won't have any time left to hold me to them because it was just a load of sales patter anyway."

The British Library said in a statement it had just finished its first crawl of the entire UK web domain after it got legal powers in April (under the Legal Deposit Libraries (Non-Print Works) Regulations 2013). But plebs were again given seriously limited access to this information.

The 2013 regulations said the UK Web Archive would be available only to people who visited a "Legal Deposit" like the British Library in person and sat at one of its terminals. Any pleb with ideas above his station knows not just anyone can get a reader's pass to the British Library.

Bill Woodrow - Sitting on History - British Library Sculpture.jpg

Likewise, the list of other Legal Deposit Libraries reads like a who's who of gated cloisters: the Bodleian Libraries of the University of Oxford, Cambridge University Library, The National Library of Scotland, The Library of Trinity College, Dublin, and The National Library of Wales.

Can I av wonerem reader pass fings?

Get this though.

Cameron's 2013 internet legislation said that not only would you have to visit some posh library to read the UK web archive, but it could only be viewed by one person at a time! If you were sat at terminal A in the Bodleian Library or whatever and you wanted to look up that other speech Cameron gave at a Google conference when he was crawling to be prime minister, and someone else was already reading it on terminal B, you would have to wait till they'd finished reading it first.

That other speech, by the way, was the one where Cameron said:

"Technological advance - supported by a liberal regulatory regime - has transformed the amount of information that's available, the number of people who can get hold of it, and the ease with which they can do so."
He did not say, "When I come to power I will make sure the internet's historic record is locked up in some inaccessible library vault where it will be as inaccessible to plebs as information ever was before."

He said his transparency ticket was "noble" and he sort of likened himself to John F. Kennedy.

Cameron reckoned the dead president's "Ask not what your country can do for you - ask what you can do for your country" skit really meant something, now that plebs had information, had knowledge, had power: "So a new generation of politicians can help make that noble dream a reality."

Conservatives erase Internet history

| 12 Comments
| More
The Conservative Party has attempted to erase a 10-year backlog of speeches from the internet, including pledges for a new kind of transparent politics the prime minister and chancellor made when they were campaigning for election.

Prime minister David Cameron and chancellor George Osborne campaigned on a promise to democratise information held by those in power, so people could hold them to account. They wanted to use the internet to transform politics.

But the Conservative Party has removed the archive from its public facing website, erasing records of speeches and press releases going back to the year 2000 and up until it was elected in May 2010.

It also struck the record of their past speeches off internet search engines including Google, which had been a role model for Cameron and Osborne's "open source politics".

And it erased the official record of their speeches from the Internet Archive, the public record of the net - with an effect as alarming as sending Men in Black to strip history books from a public library and burn them in the car park.

Sometime after 5 October, when Computer Weekly last took a snapshot of a Conservative speech from the Internet Archive, the Tory speech and news archive was eradicated.

Conservatives posted a robot blocker on their website, which told search engines and the Internet Archive they were no longer permitted to keep a record of the Conservative Party web archive.

The Internet Archive was unavailable for comment. But a fortnight after Computer Weekly started asking its San Francisco HQ for an explanation, the Conservative speeches have begun reappearing on its site.

CW had asked the Internet Archive to explain how the historic record of the lead party in the coalition that holds power in the UK could simply be erased.

The Conservative Party's robot blocker forced the Internet Archive to remove the entire record of speeches and news it had collected, in 1,158 snapshots it took of the Conservative website since 8 May 1999.

The Conservative bot blocker listed all the pages barred for public consumption thus (excerpt):

Disallow: /News/News_stories/2000/
Disallow: /News/News_stories/2001/
Disallow: /News/News_stories/2002/
Disallow: /News/News_stories/2003/
Disallow: /News/News_stories/2004/
Disallow: /News/News_stories/2005/
Disallow: /News/News_stories/2006/
Disallow: /News/News_stories/2007/
Disallow: /News/News_stories/2008/
Disallow: /News/News_stories/2009/
Disallow: /News/News_stories/2010/01/
Disallow: /News/News_stories/2010/02/
Disallow: /News/News_stories/2010/03/
Disallow: /News/News_stories/2010/04/
Disallow: /News/News_stories/2010/05/
Disallow: /News/Speeches/2000/
Disallow: /News/Speeches/2001/
Disallow: /News/Speeches/2002/
Disallow: /News/Speeches/2003/
Disallow: /News/Speeches/2004/
Disallow: /News/Speeches/2005/
Disallow: /News/Speeches/2006/
Disallow: /News/Speeches/2007/
Disallow: /News/Speeches/2008/
Disallow: /News/Speeches/2009/
Disallow: /News/Speeches/2010/01/
Disallow: /News/Speeches/2010/02/
Disallow: /News/Speeches/2010/03/
Disallow: /News/Speeches/2010/04/
Disallow: /News/Speeches/2010/05/
Disallow: /News/Articles/2000/
Disallow: /News/Articles/2001/
Disallow: /News/Articles/2002/
Disallow: /News/Articles/2003/
Disallow: /News/Articles/2004/
Disallow: /News/Articles/2005/
Disallow: /News/Articles/2006/
Disallow: /News/Articles/2007/
Disallow: /News/Articles/2008/
Disallow: /News/Articles/2009/
Disallow: /News/Articles/2010/01/
Disallow: /News/Articles/2010/02/
Disallow: /News/Articles/2010/03/
Disallow: /News/Articles/2010/04/
Disallow: /News/Articles/2010/05/
For pages at these addresses, the Internet Archive reported: "Page cannot be crawled or displayed due to robots.txt".

An administrator at the Internet Archive HQ in San Francisco said its guidance for lawyers explained the mechanism: if a website like Conservatives.com put up a robot blocker, the pages it blocked would simply be erased from the record as a matter of etiquette.
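You can watch that etiquette at work through the Archive's public availability API, which reports the closest snapshot it is prepared to serve for a page. A sketch in Python, assuming the documented endpoint and response shape:

import json
import urllib.parse
import urllib.request

page = "http://www.conservatives.com/News/Speeches/2008/"
api = ("https://archive.org/wayback/available?url="
       + urllib.parse.quote(page, safe=""))

with urllib.request.urlopen(api) as response:
    closest = json.load(response).get("archived_snapshots", {}).get("closest")

# While a robots.txt block stands the Archive serves nothing at all for a
# blocked page; once the block is lifted, the closest snapshot reappears.
print(closest["url"] if closest else "no snapshot served")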

The erasure had the effect of hiding Conservative speeches in a secretive corner of the internet like those that shelter the military, secret services, gangsters and paedophiles.

The Conservative Party HQ was unavailable for comment. A spokesman said he had referred the matter to a "website guy", who was out of the office.

It wasn't always going to be like this.

Such as when the prime minister first floated his groovy idea that the democratisation of information would transform politics, at the Google Zeitgeist Europe Conference, on 22 May 2006.

"You've begun the process of democratising the world's information," he told the Googlers. "Democratising is the right word to use because by making more information available to more people, you're giving them more power.

"Above all, the power for anyone to hold to account those who in the past might have had a monopoly of power - whether it's government, big business, or the traditional media," said Cameron, who was then campaigning for power as leader of the Conservative opposition.

Cameron was going to make sure the information revolution would hold people like prime ministers to account, he said in another speech on 11 October 2007, at the Google Zeitgeist Conference in San Francisco.

"It's clear to me that political leaders will have to learn to let go," he said then. "Let go of the information that we've guarded so jealously."

Transparency would make public officials accountable to the people, said Cameron then. He was riding at the front of the wave that would wash us into a new world, and a new age.

Likewise the chancellor, who on delivering his landmark "Open Source Politics" speech at the Royal Society of Arts on 8 March 2007, declared his ambition was "to recast the political settlement for the digital age".

"We need to harness the Internet to help us become more accountable, more transparent and more accessible - and so bridge the gap between government and governed," said Osborne.

"The democratization of access to information... is eroding traditional power and informational imbalances.

"No longer is there an asymmetry of information between the individual and the state, or between the layperson and the expert," said the Chancellor when he was campaigning for election.

If the Conservative Party had moved its speeches and news archive to a more convenient location, it had managed to do so in a way that hid it from the search engines. It might before long end up at Oxford University's Bodleian Library, which keeps the official Conservative Party archive of really old stuff like speeches from the days before the internet.

The robot blocker - a robots.txt file - tells software bots run by sites like Google and the Internet Archive to bog off. The bots grab web pages for the benefit of plebs like those Cameron and Osborne claimed to be speaking for in those years before they were elected. The bots were what made the democratization of information possible. It was bots that inspired Cameron and Osborne. It was bots that were going to free us from serfdom in the way they said we would be. Without the bots you just had pockets of power and privilege for those in the know. Without the bots you just had the same old concentration of wealth and power there had always been, since long before the Internet Archive started taking snapshots of the Conservative website in 1999.

Halloween Special
Open source government: smelt in England

| 2 Comments
| More
macbeth_5_lg.gif

Slag is what's cast off when you smelt metal. Excrement is what an intestine chucks out after it has sucked the life out of some juicy morsel. The Government Digital Service is a zinc-alloy digestive tract eating its way through Britain's government administration.

All there will be - aside from the discards - when it has done eating, is what the coalition government really meant when it said it wanted open source software in government: an automaton as efficient as a perpetual motion engine and as cold as steel - shiny, oiled and ready to be sold off.

Just how close GDS gets to licking the plate will be entirely dependent on how willing government departments are for it to subsume their services. They have been ordered to give up their ken but the bell has not yet tolled.

If the coalition sees its ambition through there won't be much open source in the UK public sector because there won't be much of a public sector.

This was presented to government departments, remember, as a plan for open source software and open standards. The idea was sort of homely: the public sector would use public software, so its systems would be like a sort of sandpit that everyone could have a play in. That was the idea of open source, wasn't it?

Cabinet Office assembled a group of open source all-stars called the Public Sector Group and charged them with overseeing strategy implementation. They were supposed to cajole systems integrators into supplying open source software, and persuade public bodies to procure it. But nobody would listen, they kept saying when they convened in their PSG meetings.

It all ground to a halt about a year ago. Cabinet Office and OSS campaigners had been ranging all over the shop for the first couple of years the coalition was in office. Open source was the thing to be. Now everyone is on sabbatical.

Tariq Rashid, who led the Cabinet Office open source strategy as its senior responsible owner, has left and apparently gone into hiding. Qamar Yunus, author of the ICT strategy and architect of the IT spending moratorium that made its implementation possible, has gone back to finish a mathematics PhD at Cambridge. Mark Elkins, who as head of the British Computer Society's Open Source Specialist Group led the open source road show round the government departments, has gone to finish his PhD at Southampton. Gerry Gavigan, former Treasury official who led the public campaign as chairman of the Open Source Consortium, has laid down his gavel, and his chairmanship. Graham Taylor, who heavied things along in the back as head of the Oracle-backed PSG sponsor Open Forum Europe, has gone to spend more time in Brussels. Basil Cousins, UK chair of Open Forum Europe and PSG secretary, has gone into retirement.

Most of the dissolution happened this time last year after government CTO Liam "Gnasher" Maxwell told the PSG the coalition's ICT strategy was being scrapped.

"The current Government IT Strategy is being refreshed in line with the Digital Strategy," Maxwell told the PSG on 19 September last year. Then they all cleared off. The Digital Strategy was unannounced at that point. The PSG had one more meeting after it was announced, in November, with some more of the same talk there had always been and the same apparent lack of progress.

But that didn't really matter in the long run. There had been this Digital by Default Strategy that web entrepreneur Martha Lane Fox fronted in 2010. That had sent GDS off, scoffing up as many of the public sector's 820 independent public websites as it could get its chops round. It has been foisting their carcasses up on Gov.UK, the single web domain from which GDS asserts its dominion.

Digital by Default got thrown in the forge with the open source and open standards policy, and the "Skunkworks" cell that had just been turned into GDS, and the Gov.uk website got thrown in there too, and some hubble bubble toil and trouble, and some cheap smoke and slime effects courtesy of HM public relations department. And - cackle, cackle - what erupted was a white hot alloy they called the Digital Strategy, that burned like a Halloween phoenix with its given orders: that it must consume each and every major "legacy" government computer system, gnash it up, sack its back-office staff, and reassemble it with Frankenstein staples and masking tape into a web mobile app with some whizzy Java buttons.

The Digital Strategy is in effect the mechanism by which the coalition will privatize the UK's entire public administration. It has been taken up by local government too, after central government squeezed budgets there. The impetus is the same there as for central government: spending moratorium, budget cuts, and the transformation of public services into automated web apps.

GDS is cranking up like a Hadron Collider with a missing wing nut, like, it wasnae meant to turn into a black hole but I cannae do anythin cap'n, she's started sucking in the entire public sector.

The Cabinet Office says now it never developed the metrics it promised would hold it to account for its promise to deliver OSS to the public sector: records of what software was being used where. It appears no longer to have an open source SRO. It is reluctant to answer questions. Its companion strategy for open standards was laid up and then morphed into an open data initiative.

The Digital Strategy says departments should make these apps open source, when there isn't a good reason to do otherwise. It seems at this early stage that there may more often than not be a good reason otherwise. But that is irrelevant anyway. Much of anything of substance is going into the cloud. Much of anything else will be in the hands of the market, which is inclined to use open source only as an unscrupulous market stall trader sticks the nice fruit on top of his foetid punnets to create a fool's eyeful at the end of the day. There's the coalition ICT strategy for you.

There are signs now the PSG might try and get together for a comeback tour. OFE convenes tomorrow, All Hallows Day, under a new chairman - Peter Dawes-Huish, head of Bristol-based Linux-IT. Top of its agenda is whether the old gang should get together again - whether the PSG should reform - those that haven't cleared off already. Mind how you go now.






Welfare state infrastructure eyed for scrap

| No Comments
| More
ICL System 25 1980s VME system.png

The Department for Work and Pensions has refused to answer allegations that it is attempting to scrap the core computer systems underpinning Britain's welfare state.

It was due to conclude a 100-day, x-ray review of the computer systems being combined under the troubled Universal Credit benefits reform programme on 30 September, in collaboration with the Cabinet Office. But it has said the review is ongoing.

A source at a DWP IT contractor said the department had started a pilot to see if it could replace its core benefits systems with something more in keeping with the Cabinet Office's disruptive ICT policy.

It was a Cabinet Office 'rip and replace' strategy that would put at high risk of failure the £70bn in social security payments delivered by the seasoned IT systems being subsumed into Universal Credit.

It would also be a radical change in the Universal Credit strategy agreed by Cabinet Office, DWP, industry and advisers in 2010/11. The consensus had been to keep existing systems and construct Universal Credit on top of them. The consensus had been, if it ain't broke don't fix it.

The supplier source said DWP was trying to determine if it could scrap the old systems after all.

"DWP got eight SMEs to see which ones can get it off the old systems the fastest," said the source. "GDS (the Cabinet Office Government Digital Service) are trying to seed better processes in the DWP - the DWP pilot is to get off these old mainframes."

Stonewall

A DWP spokesman said rip-and-replace was one of the options being considered as part of the redesign of Universal Credit with the Cabinet Office. He later retracted his statement.

He said the DWP/Cabinet Office review of Universal Credit - due to conclude nearly two weeks ago - was still ongoing. Its outcome would be reported sometime this autumn.

He had said he must consult with the Cabinet Office communications department before making any statements about Universal Credit. He later retracted his statement.

The Cabinet Office has for more than a month ignored repeated requests from Computer Weekly for details of its review of Universal Credit with DWP. A spokesman has claimed ignorance of any terms of reference for the engagement or indeed, yesterday, of any 100-day review.

Cabinet Office doors.jpg

The National Audit Office said last month the 100-day review began in May when Cabinet Office delivered a blueprint to remodel Universal Credit with 99 detailed recommendations - the product of an earlier "reset" of Universal Credit conducted by the Cabinet Office.

Sauce

The source pitched the pilot as a last-ditch attempt to save Universal Credit from problems caused by its legacy systems.

He said ministers were afraid to make the pilot public because it would look like Universal Credit was a failure: so much so, that DWP employed companies including legacy system consultants BluePhoenix on the pilot without following proper procurement rules, so it could keep the project secret. BluePhoenix was unavailable for comment.

The idea was that DWP had spent half its budget to date building Universal Credit. Yet it had to try and find the money to rip and replace all its old systems. The race was to get them replaced quickly, lest the department have to turn round when Universal Credit was complete in 2017 and admit it still had all these old systems, as though they were an embarrassment.

NAO - SEP 2013 - Managing the Risks of Legacy IT graphic.png

An NAO report on legacy systems had last month given the Universal Credit review team the justification it needed to scrap the old systems, said the source.

Spice

But the NAO report was misleading.

While the report made a show of condemning legacy systems, it actually reinforced the 2011 consensus that stable, dependable systems should not be replaced with high-risk rebuilds.

Most of the key arguments it made against "legacy" systems were clearly misleading, subtly couched, or unsustainable. The few conclusions that were simply convincing were those it found against scrapping existing systems.

Its key graphic gave the first-glance impression that the risks of keeping legacy systems were overwhelming, when it actually showed the risks of replacing them were at least as great.

In short, it undermined the Cabinet Office's case for replacing existing systems. It was as though the NAO had got one set of findings but attempted to write a report that said the opposite.

Those legacy systems it examined - including national pension systems at the DWP - were satisfactory. It would be "impossible" for Cabinet Office to make a "robust case for change" because it had not done a full analysis of legacy performance, efficiency and costs, said the NAO.

Warning: change

Meanwhile, it warned reformers: "Major change that involves underlying ICT will create a new set of risks which will increase as the degree of system change increases."

The case study it used to justify many of its claims against legacy systems was moreover not a legacy system at all. It was an administration system called PROMOD the Office of Fair Trading installed in 2007. Legacy systems are usually understood as those using old, unfashionable technology, like the major systems used by the big departments of state: written in Cobol and run in a mainframe operating environment like Fujitsu/ICL's VME.

The NAO included the recent PROMOD because it had defined "legacy" so broadly for the purposes of its report that it could encompass almost any implemented computer system: those superseded by new technologies, or those used in an organisation where business needs had changed.

"PROMOD is an illustration of how relatively new systems can develop the characteristics of legacy IT," said the NAO, prejudicially.

But the characteristics of legacy IT are widely considered to be reliability and stability. The characteristics of PROMOD were software glitches and high costs. PROMOD in fact had the characteristics of a new system, which is what it was.

The NAO's idea was that PROMOD was like a legacy system because statutory uncertainty had prevented the OFT from making improvements. The myth is that legacy systems of themselves prevent improvements.

The real legacy

Elsewhere, the NAO cited cases where legacy systems had been modified with great success - cases of great significance to Universal Credit.

DWP's major 2005 modification of legacy pension systems led to a 30 per cent cut in its cost of payments to pensioners.

This was a customer account management system (built in the vein of pre-reset Universal Credit), "that draws together customer information from multiple legacy ICT systems to simplify the processing of pension cases."

Conversely, the strongest example the NAO had of a new system was a cause not of celebration but concern.

The NHS had cut costs of processing prescriptions 47 per cent by introducing a new system, called Capability Improvement Programme (CIP). But its rate of error had increased. With the system processing 0.6m prescriptions every month, even a small margin of error could have serious repercussions for the sick, the elderly, and the very young. The NHS still keeps the legacy system going, though CIP was switched on in 2007.

The NAO had apparently cited PROMOD and other examples to draw conclusions against legacy systems, when they actually illustrated the case against hasty change.

Cobol

Industry makes a similar case for cautious change. The old Cobol systems are still often the workhorses of computing. Companies that have sunk decades of man-hours and money into complex Cobol systems that handle millions of transactions dependably are not always in such a rush to get rid of them. Yet Cabinet Office policy is to replace all major government systems with "digital" web and mobile services by 2017.

"If you look at the major airlines, all the ticketing works at the back-end but the parts that go wrong are always the new applications," said the director of one legacy systems supplier who asked not to be named. "Web applications are far more fragile than these applications are meant to be."

This is the conventional view of Cobol - that it can be depended on to deliver £70bn in social security benefits payments to 8m households efficiently and without a hitch.

"The question is if we modernize a system will it be as robust and reliable as our existing systems," said Derek Britton, director of solutions marketing at Micro Focus, a supplier of Cobol tools said to have been deployed in the Universal Credit revamp.

Seasoned Cobol systems typically promise 99.99999 per cent reliability, said Britton. "It's hard getting that on a new system. Cobol just runs and runs. It's the other parts of the jigsaw that aren't as reliable - that's a major consideration of modernisation."
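For a sense of scale, a back-of-envelope sum (ours, not Micro Focus's) shows what seven nines leaves room for:

```python
# "Seven nines" availability permits only seconds of downtime a year.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60        # 31,536,000 seconds
availability = 0.9999999                     # 99.99999 per cent
downtime = SECONDS_PER_YEAR * (1 - availability)
print(f"{downtime:.1f} seconds of downtime a year")   # ~3.2 seconds
```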

"Legacy IT" myth

Cobol experts likewise rebutted the myth that informed the Cabinet Office ICT strategy, and had been parroted by the NAO: that a legacy system is an albatross around the neck of anyone wanting to do contemporary things with their IT.

It was standard practice for big organisations to modernize their workhorse Cobol back-end systems by 'wrapping' web services around them. This was the industry and government consensus for Universal Credit in 2011.
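The pattern is simple to sketch. Below is a minimal illustration - nobody's production code - in which a thin web layer translates HTTP requests into calls on the untouched back-end; the GETCLAIM transaction and the marshalling stub are hypothetical stand-ins for whatever gateway a real site would use:

```python
# A minimal sketch of "wrapping": the Cobol back-end keeps handling the
# transaction; a thin web layer translates HTTP into the legacy interface.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def call_legacy(transaction: str, record_key: str) -> dict:
    """Stand-in for a call into the mainframe TP monitor. A real wrapper
    would marshal the request into the fixed-width record format the
    Cobol program expects and parse its response."""
    return {"transaction": transaction, "key": record_key, "status": "OK"}

class ClaimHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /claim/AB123456 -> hypothetical GETCLAIM transaction
        key = self.path.rsplit("/", 1)[-1]
        body = json.dumps(call_legacy("GETCLAIM", key)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ClaimHandler).serve_forever()
```

The core system stays untouched; only the translation layer is new - which is the point the wrapping consensus rests on.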

The same strategy was adopted by the Ministry of Defence, which also depends heavily on VME Cobol for its core systems. It was still the consensus in banking, where the immense volume and critical nature of transactions were most comparable to the core systems of the big departments of government.

"No bank we have ever spoken to is looking to remove or replace their core systems. They are looking to protect their core systems. They are looking to put more wrappers on the same back-end," said Britton.

Beverley Brookes, vice president of marketing at EZSource, another Cobol tools vendor, said concerns for security, regulatory compliance and reliability had led banks to keep their Cobol systems going.

Experts said banks and utilities had implemented web front ends and nifty developments like client banking without having to rip out their core Cobol systems.

Old school

Harold Cloutt, a consultant in VME Cobol systems since the 1970s, said it was feasible to build web interfaces on seasoned government systems. Even self-service Web benefits were possible.

"If it aint broke, don't fix it," he said.

The Cabinet Office on the other hand decreed last November that the seven major departments of state must complete an "end to end" transformation of three major transaction systems each by 2015 "as a first priority". It aimed to replace all major government transaction systems by 2017.

Where banks had clung to their Cobol systems as the bedrock of a reliable service, Cabinet Office had ordered the opposite. At least, it ordered them scrapped. It must be assumed that it has other priorities than the stable delivery of social security.

What it seeks to do with digital services (web shops and mobile apps) was always deemed possible with the legacy systems it has - and has proven possible in other industries. The seasoned Cobol systems that run the welfare state embody the investment, intelligence and construction work of generations. Their work underpins the social security system. If the Cabinet Office razed it to the ground it might create greater possibilities for change, but the trade-off in risk, cost and disruption to the department of social welfare may be too great. It has not made the case for a blank sheet.

When Cabinet Office introduced its Digital Strategy last November, however, it said it had exempted Universal Credit from the 2015 deadline for a full rip-and-replace.

"We recognise Universal Credit means a major transformation programme is already under way," it said. "As such, Cabinet Office will take a flexible approach as to any further commitments to redesigning services prior to March 2015."

Now word is the legacy systems are a hindrance. But it is not clear how or why. A lack of transparency may hide poor decisions. Legacy systems, suffering under the myth of waste and failure perpetuated by the Cabinet Office, may become scapegoats for Cabinet Office failure to implement its reformist policies: barriers to total change. Perceived hindrance like that apparently supported by the NAO's legacy report may create a false case for seasoned systems to be scrapped where there was no case to be made.

Battle lines

Cabinet Office has said its reforms are about cost, competition, innovation and IT failure. But up to 80 per cent of the £1.7bn that Cabinet Office minister Francis Maude says his Digital by Default strategy will cut from central government costs will come from jobs replaced by automated web services.

It looks more like a strategy to decommission the big departments of state. It has at least pledged that their automated replacements will be made in open source software: they will initially at least be a form of licensed public property.

The Digital Strategy was however held up by the Cabinet Office's failure to deliver its answer to the last government's failed Identity Scheme. This may have been one of the main contributing factors to Universal Credit's supposed development problems. Its adoption of Digital by Default was dependent on having a sure way of transacting government business online: identifying social security claimants, for example.

Chris Pennell, principal analyst at Kable Research, said DWP had given up an aim to administer 80 per cent of claimants online and was ushering them back into job centres.

The Cobol crowd see this as a battle between the open systems fanatics and history. It looks also like a political battle. The departments of state seem like the last bastions of public ownership. Their Cobol systems embody their business processes.

Those systems are said to be black boxes people do not understand and are afraid to touch. That is the basis for rip-and-replace: that these systems are too big and complex for anyone to change.

Out of fashion

Cloutt said it could be difficult to change business processes embedded in these old Cobol systems. The primary reason was that the people who built them, and had the skills to operate in mainframe environments like VME, were now, increasingly, reaching retirement.

The myth, by the way, is that DWP's core benefits systems are run on creaky old mainframes. It goes with the fusty image of a public sector that is not as dynamic and sharp-suited as private sector hipsters.

Yet VME mainframe systems are not run on mainframes any more. They are run on Intel PC servers.

VME nowadays runs as a virtual system on a standard Linux or Windows server, said Cloutt. It emulated the old ICL mainframe hardware, to give the VME system something to put its roots into. But you still needed VME skills to operate it.

"It shouldn't be difficult to change the application to change business processes," said Cloutt. "But you have to get the skills to do it. It's not fashionable to write Cobol applications for VME any more."

Fujitsu's solution was to redevelop the VME operating system. It told customers it would bring VME in-house in 2020. It would no longer support the emulated VME environments.

It would instead emulate VME itself. Where before the emulation was to convince VME it was running on a mainframe when it was in fact an Intel server, the emulation would now convince the Cobol application it was running on VME when it was just talking to an interface, he said.

The older emulation was a big undertaking: a fully-fledged operating system environment is a lot to develop, maintain and administer, and it had to be adapted to changes in its underlying hardware platform. The newer emulation would run on anything. It was a greater level of abstraction and would require fewer outmoded skills to maintain it.
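A conceptual sketch of the two approaches - the class names are ours, not Fujitsu's actual interfaces - shows where the layer moves:

```python
# Conceptual sketch only: two ways of keeping an old application alive.

class IclHardwareEmulator:
    """Old approach: pretend to be ICL mainframe hardware, so an
    unmodified VME operating system (and the Cobol application above
    it) runs on an Intel server."""
    def execute(self, machine_instruction: str) -> str:
        return f"host CPU runs translated instruction: {machine_instruction}"

class VmeApiShim:
    """Newer approach: pretend to be VME itself. The application's
    calls to the operating system are answered by a thin interface
    layer, with no full VME environment underneath."""
    def system_call(self, request: str) -> str:
        return f"host OS services request directly: {request}"

# The Cobol application is unchanged either way; only the layer
# beneath it moves up the stack.
print(IclHardwareEmulator().execute("LOAD R1, CLAIMS"))
print(VmeApiShim().system_call("OPEN FILE CLAIMS"))
```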

Long live VME

The result, said Cloutt, would be better for customers like DWP. It would be cheaper to support. Fujitsu would operate it like a cloud computing service - over the wires. Customers would not have to find dwindling skills to maintain it.

"We are all getting older and retiring," said Cloutt, who was president of Axis - the VME User Group - until it wrapped up in January after 38 years. (He is talking to the National Museum of Computing about receiving the user group's archive).

"People who host VME will not be looking after VME. [Fujitsu's 2020 change] reduces your support commitment. You can't afford to rewrite the application. It's not trivial to change these big applications, but if it's Cobol it's portable. That's where Fujitsu is coming from," he said.

Yet Fujitsu's 2020 deadline is being touted as another reason why DWP should redevelop its core Cobol applications. It plays into the Cabinet Office legacy bogeyman - that government should not be locked into a sole supplier. It is being said that DWP must after all hurry to redevelop the very core applications the earlier consensus said to leave well alone.

The Fujitsu deadline seems instead to ease the pressure. The skills pressures are not as great as they were either, says Brookes, though they did reach crisis point.

"If its mainframe, its retired. In the 90s companies were not training mainframe people because analysts and publications like Computer Weekly were saying, 'Mainframe is dead, long live open'. But not given the number of mainframes we see around," said Brookes.

Mangle < Aunt Mable

Big organisations outsourced their skills offshore in recent years, she said. But they were bringing them back in-house or building their own offshore teams. They were training again too.

Political uncertainty has nevertheless put upon Universal Credit the same sort of risks the NAO had conflated with "legacy" computing. As at the OFT, all development has been frozen for months while the DWP waits for permission from the Cabinet Office to continue. The apparent failures of the OFT's 2007 system were attributed to a similar freeze.

These failures contrasted with the DWP's trusty legacy pension system. As a percentage of the full cost of service, said the NAO, the pensions legacy system cost 2 per cent. PROMOD was costing 7 per cent of the licensing service it supported at the OFT.

There must be another explanation for the whisper of skulduggery you may hear about the rewrite of DWP Cobol apps being rushed through under the guise of a Universal Credit rescue: crossed wires, perhaps - Chinese whispers.

Anonymous.png"They are putting the benefits systems through a mangle, so they end up with something being run on a different platform," said the source.

"It gets you off the platform so you don't run into a Fujitsu brick wall in 2020. They are putting them through an automated mangle to turn it into Java code. But if someone needs to make changes you will still be screwed. Having someone mash it around makes it no more understandable. It's still completely inscrutable code."

The solution, apparently, is incoming Cabinet Office procurement reforms. That's the same thing people in the open source lobby are saying to watch in place of policy delivery in their neck of the woods.

Tools

Either way, Cobol tools vendors are quids in. Simon Jennings, vice president of sales at EZSource, said it had just delivered tools to Fujitsu to get Cobol applications off VME and into the cloud. It did so on 30 September, the same day the DWP's 100-day review of Universal Credit was due to conclude. Micro Focus was unable to confirm that its tools are being used in the Universal Credit mash-up.

Those tools do application analysis: cutting the time it would normally take to understand an application - prising the lid of the black box open. They could help Cabinet Office justify a 2015 rip-and-replace on Universal Credit. They would just as easily justify stopping a high-risk rip-and-replace of the nation's social security infrastructure - by making it possible to shine a light in the black box and allay any fears about what lives therein. It would help DWP do what the Cobol crowd say is popular in banking: a spot of pruning, lopping off some bad branches, replacing some bits, keeping others, but essentially keeping the core Cobol applications.

Anything else may be reckless. Even fancy-pants Java programmers turn out black-box software when they don't write documentation, say the Cobol crowd in not so many words. That's the problem. Not Cobol - a famously straightforward language, long-suffering for being unfashionable.

"A typical story we have," says Brookes, "is front office systems in Java or .NET, then ERP - Oracle or SAP common services - but then you will still see the mainframe doing a lot of the business-critical, highly customised heavy lifting. Banks are not wanting to get rid of it, they are trying to upgrade it and improve what they've got."

The Cabinet Office blueprint for redesigned government systems is otherwise compelling: agile methods, open source software, public code, and open standards for universally compatible systems. It has the Zeitgeist. It has long been inevitable. It may be universal in a few years. But the Cobol crowd are sceptical.

"It isn't a new phenomenon," says Britton. "It's a tale you hear - 'The bleeding edge sets the exception'. It's the utopian tomorrow of the universal adaptor that can plug into anything else."

And then there are the computer cock-ups that helped catapult the coalition to power with its current code for change. Those were over-ambitious modernisations that went wrong. The US military went between $6bn and $7bn over budget on attempts to replace Cobol systems with ERP systems. ERP was last decade's corporate computing fad. Last year, exasperated with ERP, the US Department of Defense adopted the latest fads - a programme for change similar to the one the Cabinet Office is in such a hurry over.

UK debuts first open standards with fanfare of kazoos

UK open standards policy was going to revolutionise computing - it was even going to revolutionise society. It has instead been subverted itself, by an irony of heroic proportions.

Open standards - the keystone of the coalition government's "post-bureaucratic" reforms - the acid that would eat away the digital cuffs that held British society in hock to corporate power and "vested interests" - has itself been eaten by the very bureaucracy it set out to disrupt.

Britain's legendary bureaucratic machine consumed the coalition government's open standards policy like a boa constrictor would consume a child: one languorous gulp at a time.

Last Friday, three and a half years after the coalition promised revolution, the bureaucratic boa regurgitated a remnant of that policy to a fanfare of kazoos.

This came two and a half years after it first published its strident policy, two years after it revoked it, a year after it restated it again but weakly, and six months after it gave the policy to a committee. That committee last week issued a statement that approximated the obvious. The post-bureaucratic age has surely arrived.

The Cabinet Office Open Standards Board declared loosely for two data standards that were already loosely de facto standards.

It said British government would henceforward conduct all its digital business in UTF-8 - a character set: a system of codes a computer can recognise as letters (so data in spreadsheets and whatnot can be produced by one computer and processed by another).
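The practical point of agreeing a character set is modest but real: two systems that both speak UTF-8 can turn the same bytes back into the same letters, pound signs and accents included. A quick demonstration in Python:

```python
# UTF-8: letters to an agreed sequence of byte codes and back again.
text = "£ due to Müller"
data = text.encode("utf-8")           # letters -> bytes
print(data)                           # b'\xc2\xa3 due to M\xc3\xbcller'
print(data.decode("utf-8") == text)   # True on any UTF-8 system
```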

And it agreed in principle the idea that there should be a single method of describing data items across government.

UTF-8 is already becoming the de facto character standard, having been sanctioned by the Internet Engineering Task Force and World Wide Web Consortium (W3C) - industry bodies with more clout, apparently, than HM Government's Cabinet Office. The Cabinet Office's chosen method of describing data items was likewise already being implemented across government.

It was as though the Cabinet Office had hopped on the back of the number 12 bus to Trafalgar Square, yanked the bell-cord and declared: "Trafalgar!".

Open Data people were glad to hear it. They are on their way to Trafalgar already. There is more than one way to get there. The computing consensus for serious applications appears to be the number 12 bus. It is nevertheless feared the route might be criss-crossed with wayward public techies stumbling into cul-de-sacs and dead-end roads if there wasn't a spot of British brass-button leadership (leading from the back). The open data movement wants government to shepherd the strays. But it might have wanted more shepherding than it got. The wider computing industry got even less than that. And it had been promised a lot more.

That was then

Coalition ICT policy had promised to shake the software market up first - before it dealt with open data standards.

Open data was uncontentious. The software market on the other hand was powerful and corrupt. The coalition's open standards thing was about breaking the power large companies had over government. It was going to do this by using open standards in place of standards set by these powerful companies. It promised liberation.  

So it was going to declare open standards for file formats like documents. These mundane schema were the fulcrum by which Microsoft gained the monopoly power that made it, for example, the single largest supplier of software to local government. Open standards policy - remember - was about saving people from being "locked in" to buying licences from powerful software companies.

It was tied up with Conservative rhetoric about the last government being in hock to big computer companies who claimed monopoly rents even while they cocked up big computer projects.

"Tony Blair tried to do this," said Liam Maxwell, Cabinet Office CTO, in a 2010 paper for the Network for a Post-Bureaucratic Age, a Cameronian think tank where the Conservative IT reformists came together.

"But the ambition was never matched by delivery - even a powerful premier found it difficult to continually push against the stasis of Whitehall.

"Open Source and Open Standards ... they were squashed by a lazy establishment," said Maxwell's NPBA paper.

Within 9 months Maxwell had taken the very same baton up at the Cabinet Office. His orders, set by the coalition ICT Strategy:

"The first wave of compulsory open standards will determine, through open consultation, the relevant open standard for all government documents."

This is now

Two-and-a-half years on, that policy has been squashed too. It is not just a U-turn. It is a momentous let-down.

Open document formats - and the resolve to tackle the multi-billion-dollar software giants that oppose them - have been relegated to a point of further business so far down the Cabinet Office agenda that, at its current rate of progress, it will take many years to fulfil. There's no drive left in the policy. It's got flat tyres. It's going nowhere.

A spokesman for the Cabinet Office said today: "The ICT Strategy said document formats would be in the first tranche of compulsory open standards.

"However, we have chosen to focus on less contentious standards challenges first, to test the process and the Standards Hub," he said.

Maxwell's effort on open source and open standards went the same way as the last government's. Only he and his government made more fuss about doing it. We are going to make things better, they said.

But Cabinet Office faced intense opposition from powerful software suppliers - most notably Apple and Microsoft. It faced opposition from international standards bodies. It faced opposition in Whitehall, from the Department for Business, Innovation and Skills. The threat of legal action from software suppliers sent the Cabinet Office scurrying for help from public consultations: twice. It led to this snail-speed, pigeon-chested open standards board that represents all that is left of the coalition rallying cry for open standards and open source. Its open source initiative has run out of steam as well.

When last year Maxwell finally slipped the overdue standards policy out, his office insisted it would apply to the likes of Microsoft and Oracle. But that was only in theory.

The small print said any new open standards would have to be approved first through yet more public consultation and a hierarchy called the Standards Hub on top of which sat the Open Standards Board, the bureaucratic body that as head of the bureaucratic boa took a languorous year to form and make its first approximate statement of the obvious.

Other governments and public authorities around the world have opted for open formats without such a fruitless and prolonged rigmarole. The Cabinet Office has effectively given up on software standards. It has been cowed by the industry forces it puffed itself up to oppose.

That be that

It has been left with open data, the field where its endorsement of standards last week was essentially, prosaically, unimpressively but quite respectably bureaucratic.

The Open Data movement - introduced to government in the UK by Sir Tim Berners-Lee before the coalition came to power in 2010 - is developing and is largely in agreement over its own technical standards. The government has endorsed them only as far as it can be sure it is endorsing the direction of travel.

Sir Tim Berners-Lee's idea was that data linked by the sort of standards loosely approved by the Open Standards Board last week will subvert the document as the dominant form for storing, transporting and searching for data.

It may not depose the document completely - or not perhaps for a long time to come. But open data is one way at least where the post-bureaucratic banner raised by prime minister David Cameron and his IT reformers in 2010 will subvert state power. But it is a way that was already under way, largely determined by forces beyond the government's control, and the coalition has done little more to help than was already being done.

It is an area not yet asphyxiated by greedy corporations, as closed standards like Microsoft's proprietary document format had asphyxiated software. But when they make their move - as is their way - could you count on this government to fight for the open standards principles it still claims to champion, and stand for the common good against them?

Doofus alarm over £10bn NHS IT bill

Don't stop the clocks. The NHS might have spent £10bn on its controversial IT programme. But that might be a decent price.

It did sound bad, when the Public Accounts Committee report on NHS IT summed it up this week.

The NHS National Programme for IT will have cost £10bn by the end of life of its systems in 2024, according to a National Audit Office review that informed the committee report - and a lot of it remained undelivered. The NHS had derived just £2.7bn financial benefit out of £6.4bn spent on NPfIT between late 2003 and March 2011.

These numbers loomed in the media like a dreaded borak - like the abomination NHS IT is perceived to be. No matter that the NAO review - written under instruction from the committee - had to make do with flimsy numbers. Just saying the words £10bn and NHS IT is enough to incite hatred and get headlines, perhaps the other way around.

Richard Bacon.png"Cost of abandoned NHS IT system hits £10bn," said the papers.

The coalition government shelved the scheme two years ago but it's still wasting money!

"The taxpayer is continuing to pay the price for the ill-fated National Programme for IT in the NHS," said Richard Bacon, who has led the scandal over NHS software as the Public Accounts Committee's glitchfinder general.

But this latest condemnation of NHS IT was tired and grubby.

Glitch

It was true that the Department of Health calculated the NPfIT bill would be £9.8bn by 2024. But that was only half the story.

The number came from a cost/benefit calculation. The other half of the story - the benefit - was £10.7bn by 2024, more than the cost. Bacon would have been more justified to say the taxpayer continued to reap the benefits from NPfIT.

It even seemed like a decent price, £10bn - even after all the software glitches that have beset the department, and the Übermenschen day-rates enjoyed by private sector IT consultants.

Chris Yapp, an IT consultant, has estimated that programme-related IT cost the NHS about £1bn-a-year when NPfIT was conceived in 2002. Think of inflation since then, and how organisations' needs for computing have inflated too. Now look at the NAO numbers again.

The department estimated NPfIT cost was £7.3bn between late 2003 and March 2012 - roughly £900m a year. NPfIT's lifetime cost, to 2024, works out at roughly £500m a year.
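The run-rates are simple division over the headline figures - a back-of-envelope check:

```python
# NPfIT annual run-rates implied by the headline figures above.
cost_to_march_2012 = 7.3e9         # £7.3bn, late 2003 to March 2012
print(cost_to_march_2012 / 8.5)    # ~£860m a year ("roughly £900m")

lifetime_cost = 9.8e9              # £9.8bn estimated by 2024
print(lifetime_cost / 21)          # ~£470m a year ("roughly £500m")
```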

Prank

The cost/benefits calculations were meaningless anyway. They were an accounting tautology. They said that by the programme's end of life, when it had delivered its systems - when it had achieved its aims - it would have realised all its benefits.

The committee didn't need the NAO to tell it that. But it did need the NAO to say it was difficult to measure the financial benefit at all.

Financial benefit - the financial return on investment - the ROI: the auditor viewed the numbers with "very considerable uncertainty".

Since when do you measure the return of your investment in a health system in pounds and pence? The department never set an ROI target for its NPfIT investment, said the NAO. NHS organisations never sought to measure it. An apparently sizeable proportion of the 4,715 NHS organisations involved in the programme nevertheless pulled together to do the calculations.

They did time and motion studies to try and measure the ROI. They did staff surveys. They struggled. The resulting data was inconsistent and unreliable. The ROI was immeasurable. The NAO said some of the investments had no measurable financial benefit at all. Infrastructure costs, it said, had no measurable benefit.

It's not that anyone would claim there was no benefit in laying computer infrastructure. You just couldn't measure it in pounds and pence, apparently.

"For some elements only qualitative benefits were identified," said the NAO.

Still, they declared NPfIT had delivered ROI of £2.7bn between 2003 and 2011, against £6.4bn costs.

Overlooked

Those benefits that were immeasurable may have been of more interest - they were invaluable. Yet the NAO neglected to break the cost and benefit figures down to show what derived from different parts of the programme.

So it was impossible to verify what seemed the obvious flaw in the exercise. The costs and benefits could not be meaningfully compared because they measured different things. The costs comprised mostly things that had been delivered but had no measurable benefit. The benefits meanwhile comprised mostly a forecast of things not yet delivered.

While infrastructure and similar costs accounted for most of the £6.4bn cost to date, in other words, they had no impact on ROI because their benefits were immeasurable. The cost/benefit ratio was immeasurable.

The NAO used words to convey how much more there was to the programme than a flimsy ROI sum. It had already delivered most of a national network, medical image archiving and despatch, email, the Choose and Book appointments system, and what is known as the spine.

Blurb by BT, supplier of this backbone, reckons it has 899,000 users and processes 150m transactions a month: recording births, deaths and GP registrations, processing 675,000 prescriptions a day, making 400,000 security authentications and booking 39,000 hospital appointments, and administering NHS business intelligence and management reporting, processing two billion queries a month on what it says is one of the world's largest data warehouses. It also administers Payment by Results - an NHS accounting system - processing £35bn of transactions a year.

So this is what the NHS got for £0.9bn a year between 2003 and 2011. What was the benefit?

Half-cocked

Someone might have told the public accounts committee there was little benefit in a benefits assessment that couldn't measure the benefits.

The NAO said something along those lines in the report the committee instructed it to write on the matter.

"There is very considerable uncertainty around the benefits figures reported in the benefits statement," it said.

Most of its uncertainty derived from those systems that remained undelivered: local patient records systems, the national summary patient record system and electronic prescriptions.

Unlike infrastructure, these systems would cost least but deliver most measurable ROI. That much could be deduced from the NAO numbers, where the absence of these components could be seen: in a small gap between present total and final estimated costs, and a large gap in present and final estimated ROI.

The NHS had yet to realise 98 per cent of benefits for these systems, said the NAO, because they had not been delivered.

It was one of those statistics that is so audacious in its absurdity that it transcends comprehension, leaving befuddled, yobbish alarm. They aint only aint delivered the systems but they aint got 98 per cent of the benefits of wot they aint delivered neever. The statistic is more damning than the fact.

The opportunity cost of these undelivered systems - the real tragedy of NHS IT - may be so great it seems immeasurable: ten years waiting on BT and Computer Sciences Corporation (CSC - primarily a defence contractor) to deliver the next generation of health IT software when everyone could have been getting on without them.

"Have you been to a hospital lately," Tola Sargeant, research director at TechMarketView asked Computer Weekly this week.

"It's scary. They're still pushing around trolleys of paper records."

Political accountancy

The opportunity cost/benefit can be measured. And it may be a more fruitful line of enquiry than the one that gives us the politically potent but fairly meaningless £10bn, 20-year cost.

It would require a functional measure of benefits derived from a public service computer system.

Take the undelivered NHS programme software. CSC (the defence contractor) had implemented interim systems where hospitals needed them, while they waited for Lorenzo, its next-generation health software that was overdue in 2005. The NHS has paid CSC £995m since 2003 for 2,665 interim systems, delivered to 179 NHS trusts and 1,800 GP surgeries.

While the NHS was denied 98 per cent of the benefit promised by the unfinished Lorenzo, it benefited instead from alternative implementations of 87 patient administration systems, 128 child health systems, 38 systems for operating theatres, 29 for accident & emergency units, 130 for community practices, 80 for prisons and six ambulance systems.

The NAO said its survey did factor the £benefit of interim systems into its calculations. Those were the systems the NHS used for a decade while waiting for NPfIT - systems NPfIT suppliers delivered in lieu of the finished goods.

These may have measured poorly against the final system: the fully-specified finished goods. The final £benefit measures only what NPfIT promises to deliver in 2024 against what has been done. It has not measured what has been done against what is presently feasible, the NAO told Computer Weekly.

Are there systems anywhere in the world, in other words, that could be installed now to give the NHS what is promised for 2024? Or is 2024 more a measure of what the health software industry can do than what the NPfIT contract can deliver?

A creditable assessment of an IT programme would take all these things into account. Not in the way an accountant or a lawyer might compare a sheet of functional requirements drawn up in 2002 to a list of software functions delivered in 2013. But by a measure of functional benefits of one set of alternatives to another.

So now Lorenzo and the rest of the undelivered software is being delivered, just how good is it, and how much progress could the NHS, and the health software industry more generally have made without it? Where is the rest of the industry, functionally? What was the benefit? Not the £benefit. The benefit.

()Benefit

The accounts committee and the public auditor it instructs may not be geared up for this sort of assessment.

Even what they do never gets to the point of why, for example, it costs a hospital about £0.9m-a-year for the next generation Cerner patient software being delivered by BT.

It is paint-by-numbers accountancy - cretinous number crunching: you said it would cost this much and take this long, but it's taken a bit longer and cost a bit more - (it's an outrage!, roar the glitchfinders).

Auditor and committee never get to the point because they aren't allowed to look at the numbers: suppliers like BT and CSC insist it's "commercial in confidence".

So they clod-hop over the big ticket numbers, making big political stink bombs out of them, while the unanswered questions about invoice line items hang in the air, providing cover for political opportunists.

We have for example been encouraged to assume £0.9m is an outrageous sum for a BT patient system. It probably is. But what is it really worth? A functional £cost/()benefit break-down of a system might provide a more meaningful measure of its worth. An effort/benefit might be more meaningful still.

Other ways of measuring benefit - such as cost per user and cost per transaction - might be combined with a weighting of security demands, a measure of change, a consideration of pioneering efforts, a rate of software re-use, time and resource, to give a more useful assessment of cost/benefit.
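Nobody has published such a measure, so any formula is speculative. But as a sketch of the idea - with illustrative factors and weights, not an established methodology:

```python
# Speculative sketch: a composite cost/benefit score. The factors and
# weights are illustrative only.

def composite_score(cost_per_user: float, cost_per_transaction: float,
                    security_weight: float, change_weight: float,
                    pioneering_weight: float, reuse_rate: float) -> float:
    """Lower unit costs and higher re-use score better; heavier
    security, change and pioneering demands earn an allowance."""
    unit_cost = cost_per_user + cost_per_transaction
    allowance = 1 + security_weight + change_weight + pioneering_weight
    return allowance * (1 + reuse_rate) / unit_cost

# Two systems with the same headline unit costs rank differently
# once the demands placed on them are weighed in.
print(composite_score(12.0, 0.05, 0.5, 0.2, 0.3, 0.4))  # demanding system
print(composite_score(12.0, 0.05, 0.0, 0.0, 0.0, 0.0))  # plain system
```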

The debate about public IT has been led by accountants. Yet the accountants have not even attempted to address the most fundamental cost-benefit-function questions in public computing, such as whether open source development would deliver a better - more capable - batch of functions per penny than farming the work out to a product supplier like BT or CSC.

Half-light

We are led to believe that the NHS is stuck in the 50s thanks to NPfIT - that its software evolution has been stunted and what it will get will be backwards.

But Lorenzo is revered as the next generation in health software. They call it the SAP of healthcare. Its ambitions were audacious. Sargeant says it had never been done before. It may represent the evolutionary heights of an industry, not just all that the NHS could do in 10 years.

It is safe to assume the health software industry progressed in disregard of any desire of the legal or accounting professions for the NHS to put its computing needs in aspic - in contract - in 2002.

The computing industry did not hold its own progress in stasis all the while. It did not stay stuck in the half-lit days before NPfIT, before CSC bought iSoft and before System C - another important supplier of next generation hospital patient systems to the NHS - was bought by McKesson, the $122bn US medical supplies distributor (and another large US defense contractor); the days before health software became a global business.

It is tempting now to say the coalition vision for NHS IT is itself in stasis: stuck in a romantic, pre-NPfIT rural English idyll of bicycle baskets and antique tea shops: a hollow promise to win votes.

The coalition evoked this vision by seeking to cut CSC's contract scope after it came to power in 2010.

CSC had been contracted £3.1bn to deliver 220 systems to hospitals and other NHS trusts. After (ongoing) renegotiations with the coalition government, its obligation has been cut to just 22 hospitals and £2.2bn.

This has been cause for more bovine outrage: its allotted systems were cut nine tenths, but its pay by only a third - surely no more evidence was required to show CSC continues to fleece the NHS, and the NHS continues to pour money down the drain?

Yet most of the £2.2bn is for work already done. Three quarters of it was for non-Lorenzo work, including the £995m for interim systems at 179 trusts and 1,800 GP surgeries. The other 19 trusts are presumably not in need.

Aroma

The deal has nevertheless created the possibility that the sort of world the coalition envisaged for NHS IT might still evolve out of NPfIT: where the fabled hospital basement industry of do-it-yourself clinical software and small, innovative suppliers might unite to produce an open source software ecosystem that is the only viable alternative to systems developed by the global corporations, by US defense contractors.

It was something like the vision outgoing NHS chief executive David Nicholson relayed wistfully before the Public Accounts Committee in June.

David Nicholson.png"I was running a hospital in 1998," he said, referring to the days before NPfIT. "We had an extensive new system... built up over many years.

"It was a basic patient administration system. But many clinicians created their own systems and connected it to the base system. So over the years we built up this extraordinary conglomeration of systems.

"The National Programme for IT offered to take it all away and replace it with something standard, which will probably be less than you have now. People go, 'Why on earth would we do that?'. The idea of ruthless standardisation - that you can enforce a set of things on the NHS - has proved illusory."

This romantic vision was the crucible of the coalition IT strategy for the NHS. Now the department has put up £500m to fund post-NPfIT systems development, it promises that this is what we will get: not a big wad of cash going to CSC, but a big wad of cash funding local innovation in NHS IT.

It seems, however, the result will be more akin to NPfIT than anything else we might have been told we may hope or imagine.

The department promised £500m over five years for patient and prescription systems on the condition that trusts matched it: £1bn in all, equal to what a department spokeswoman said it cut from the NHS contract with CSC.

Worked out on the back of an envelope, at a run-rate of £1m per hospital per year over the same five years, that £1bn would cover about as many hospitals as have been cut from the CSC deal: 200.
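The envelope sum, spelled out:

```python
# £500m from the department, matched by trusts, over five years.
fund = 500e6 * 2                     # £1bn in all
per_hospital_per_year = 1e6          # the assumed £1m run-rate
years = 5
print(fund / (per_hospital_per_year * years))   # 200.0 hospitals
```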

Renewal

In cost benefit terms the coalition may have achieved nothing. In terms of time to delivery it may make little difference now whether trusts implement Lorenzo or not. What will matter to the NHS is the functional benefit of its computing choices, and the £cost/()benefit not only of contracts but of software rights, ecosystems and development models.

It looks now like the coalition may have thrown away an opportunity to build an open source health software ecosystem - one that empowers through unity of purpose and endeavour, and by distributing the means of production, levels it into a collaborative, national venture - a bit like the NHS, only concerned with producing health software for the health system.

This is what the coalition promised for NHS IT. In terms of final ()benefit, it means pooling all efforts by all producers of all systems into the same family of software products. So instead of different suppliers re-£-inventing the wheel decade after decade (a situation described by industry veteran Philip Virgo in the first comment on this article), you get them making enhancements to what was already done. Collaboration trumps competition.

The coalition is instead delivering a market democracy, where software companies have the power, the rights, the spoils and, as healthcare is automated, the future. Without any concerted effort to the contrary from the department to nurture the alternative the coalition evoked when it wanted to get elected, many hospitals will simply contract with BT, CSC and McKesson.

The result will be something very similar to the National Programme for IT: expenditure on systems produced by the major software suppliers, perhaps with some exceptions, at about the same price in about the same time.

The committee smartly dismissed the coalition's broader claim that it was "dismantling" NPfIT. Aside from renegotiating CSC's contract, it had merely assigned each strand of the other 90 per cent of the £10bn programme to independent directors. Remember this the next time you hear the health secretary scoring political points over the National Programme for IT.


Universal Credit failures put coalition ICT strategy in purdah

As self-appointed guardian against government IT bodges, the Cabinet Office played a role in deciding Universal Credit could meet its over-ambitious aims. It was not only wrong. It also shares the blame for bringing Universal Credit to its knees.

While Cabinet Office minister Francis Maude and his senior officials were overseeing plans for Universal Credit in 2011, they were simultaneously trying to force their own IT-led political reforms on the Department for Work and Pensions.

The extra work was too much. These initiatives became the most significant points of failure in Universal Credit.

It is incredible the Cabinet Office allowed Universal Credit to go ahead at all on its over-ambitious terms: £2.4bn and two years until full roll-out.

Universal Credit would not only re-engineer the complex administration of £70bn social security payments to 8m households, merging six benefits systems across two government departments and local authorities throughout the country.

It would also rely on councils up and down the country making their own systems and processes compatible. It would depend upon HM Revenue & Customs completing its own income tax system reforms of unprecedented ambition, Real-Time Information. And HMRC would in turn depend on employers, banks and payroll software suppliers reforming their computer systems and processes as well. Universal Credit would have to navigate this exponentially explosive collection of risk factors. This was a spaghetti junction of high-stakes computer gambits from which Universal Credit would have to pull a benefits payment that was reliable and secure.

Impossible

And that was just the half of it. Apparently despite its job as guardian against IT bodge, Cabinet Office required DWP to simultaneously implement its Digital by Default strategy - a budget-cutting scheme to replace comparatively expensive call-centre and walk-in government services with self-service web apps. This was ambitious enough on its own: root and branch process re-engineering and painful organisational culling of people and premises.

On top of that, Cabinet Office also required DWP to pioneer implementation of its answer to the last government's ill-fated Identity Card Scheme, called Identity Assurance (IDA). Just like that. And then, like a big red cherry atop a teetering tower of meringues and squirty cream, the Cabinet Office mandated DWP to replace its usual systems development methodology with a new wheeze, called agile; pioneer new billion-pound contracts to accommodate the new ways; recast relations with its key suppliers; and invite a gaggle of unknown SMEs onto choice parts of the job.

DWP proposed doing all this while undergoing a massive reorganisation, losing experienced people needed to implement the benefits reform.

It was always possible that DWP might have been able to do all this in two years. All things are possible. But anyone with the plans before them might have more reasonably concluded it was reckless.

Maude had the plans before him. He and Ian Watmore, his permanent secretary at the Cabinet Office, had helped DWP formulate the project plan. They ensured it accorded with the Cabinet Office strategy for preventing IT bodge.

Reckless

Maude was personally familiar with even the technical decisions behind Universal Credit: not just what should be built but how it should be built. When DWP was in a quandary about how complex it should make Universal Credit in 2011, the Major Projects Authority - the branch of Maude's Office concerned with preventing IT bodge - concluded that it should reach for the skies. This would be one more cherry upon the cherry atop the already teetering tower of jelly project plans that Maude's department had signed off for Universal Credit.

The MPA gateway review of Universal Credit - the report that gave it the official green light - said DWP brass were unsure whether they should build a system to do just what Universal Credit needed it to do, or whether they should build a system that could be re-applied in any given circumstances - a sort of benefits system for all styles and seasons.

MPA advised that they should follow the latter option. This may not have seemed wise to an impartial observer. But its reason for doing this was political. The Cabinet Office was then laying the latter option down as part of its ICT Strategy.

"Given the Coalition Government's desire to see re-use built into IT systems from the outset, it would be prudent to consider opportunities for this now," it urged.

It might have been prudent from some perspectives. From the theoretical perspective of the best way to build an IT system, for example, it was prudent. This was the sort of prudence the Cabinet Office wrote into its blue-sky ICT Strategy - an inspiring work of political wonder. But it may not have been prudent to lump these new-fangled ways on a project team that already had just two years to build the impossible.

Oversight

It was, said the MPA, a question of scope: "What was definitely within the Universal Credit boundary, what could be paid by the Universal Credit platform at a future date, and what was definitely out of scope."

The MPA helped DWP design a roadmap to codify the answers to these sensible questions. This Integrated Assurance and Approval Plan (IAAP) would "ensure the correct internal and external assurance," said the MPA Starting Gate Review of Universal Credit.

Having approved all the plans - what was doable, what was not, what was wise, what was bonkers - MPA passed them for approval to the DWP, and to the Treasury, which would convene a meeting of the Major Projects Review Group (MPRG) Panel for a last, reassuring stamp of approval.

Representatives of the Cabinet Office, Treasury, HMRC and the Department of Communities and Local Government (representing local authorities) then became responsible for routine governance on the Universal Credit Programme Board. Watmore had a place. Iain Duncan Smith, the secretary of state for Work and Pensions, was chair.

Ian Watmore.jpg"Cabinet Office and Treasury ... ensure that what is going on is appropriate. The role the Cabinet Office can play in those situations is to ensure and quality-assure that what people are doing is the right thing," Watmore told the Public Administration Committee last year.

Agile

Even all this careful attention could not in the circumstances have seemed to the Cabinet Office and Treasury enough to prevent Universal Credit becoming another IT bodge. But they had a magic ingredient.

Sanctioned by the Cabinet Office, DWP went into Universal Credit with the belief that agile systems development would make everything possible.

"By adopting an agile development approach, it has reduced the time it will take to deliver Universal Credit by almost half," the Institute for Government put it in its influential report, System Error, that put wind in the sails of Cabinet Office strategy in 2011.

Mark O'Neil, director of innovation and delivery at the Government Digital Service, the branch of the Cabinet Office leading the agile initiative, was charged with helping Universal Credit directors get their heads round his favoured systems ideology.

They saw agile had a lot going for it. But it is hard to credit that these new ideas blinded them to sense. Even Superman could not rescue a cat from a tree made of Kryptonite.

Cabinet Office made DWP deputy chief information officer Kenny Robertson senior responsible owner of the cross-government agile stream of its ICT Strategy, as if he didn't already have enough to do.

It charged him with formulating a plan for agile development to be replicated across government departments, and pulling together a team of agile suppliers to carry it out. And launching a centre of excellence. And a contract vehicle.

Within months of Robertson completing his tenure as agile SRO in summer 2012, agile was kicked off Universal Credit.

Treachery

The Cabinet Office appears now to be trying to save agile by blaming DWP for mishandling it.

"The Cabinet Office does not consider that the Department has at any point...appropriately adopted an agile approach to managing the Universal Credit programme," said the NAO report on Universal Credit last week.

Yet the Cabinet Office frequently cited DWP's agile work to flatter its own agile policy when promoting its ICT Strategy.

Agile software development happening at DWP Warrington.png"Universal Credit programme is one of the first 'Digital by Default' services, using an Agile approach to reduce delivery risk and improve business outcomes," boasted the Cabinet Office's Strategic Implementation Plan in October 2011.

"Agile ... the best practice that seems to be out there at the moment," Watmore told the Public Administration Select Committee in April 2011. "I went up to see the Universal Credit programme in Warrington the other day, and that is precisely what they are doing."

The MPA was also well aware that DWP had used a variation of agile methods, or had been applying them irreligiously. It said so in the Gateway Review with which it approved what DWP was doing.

Steve Dover, head of major programmes at DWP until last October, even spelled this out at an agile industry meeting with O'Neil in 2011.

"It's a brilliant, brilliant methodology. [But] it's not always applicable to everything. There's some heavy legacy stuff [where] we implement changes on a 6-month basis. I probably wouldn't apply agile to that. Because that works. I'm not going to break it. It supports the current business," he said.

Übermenschen

This year, when the Cabinet Office took emergency control of Universal Credit and sent in its agile A-Team, it left the legacy systems to the DWP. Would Cabinet Office's crack team save the day? Or would they just build some websites and take all the glory? Neither department will say what they are doing in there.

When last week the NAO reported that the development had been bodged to date, it said the reason was DWP's struggle to get a detailed design. Then without actually spelling it out, it recounted the numerous Cabinet Office initiatives that had put spanners in the works for Universal Credit.

DWP dropped the Cabinet Office's Digital by Default strategy in May, having worked on it for two years. Its attempt to implement the Cabinet Office-designed IDA system also failed.

"Cabinet Office decided in December 2011 that the proposed solution was too expensive and unfit for cross-government purposes," said the NAO report.

"The department continued developing its own solution but there were delays in securing funding and finalising the tender for IDA providers," it said.

DWP couldn't have implemented Cabinet Office's Digital by Default strategy if it hadn't worked out a way to implement Cabinet Office's IDA scheme as well. It couldn't do online benefits administration for 8m households if their transactions weren't adequately protected. It ploughed on with Universal Credit though. Cabinet Office continued referring to DWP in public as the pioneer of its IDA scheme.

CESG, the government computer security service that reports to the Cabinet Office, rejected these further efforts a year later, in January this year. Where was their help before now?

The Cabinet Office had meanwhile rejected DWP's IT infrastructure plan. This would have been the foundation of the finished system. Cabinet Office said you can't have a foundation until you know what the finished system would look like.

Rubble

What is left of the Universal Credit plan now is what there was of it before the Cabinet Office piled its other radically ambitious IT reform programmes - digital by default, IDA, agile - on DWP to breaking point.

How much time, effort and money had DWP wasted trying to shoehorn the Cabinet Office initiatives into Universal Credit?

When the country was suffering its worst recession for generations, and poverty had forced more than half a million people to get their food from charities; when the department of social security should have had its full attention on helping them up, it was instead tying itself in knots over a hare-brained IT project, a hare-brained reform of the benefits system and a clutch of hare-brained initiatives from the coalition ICT Strategy.

The NAO failed to ask the obvious questions in its account of this shambles last week. The important questions: how much responsibility does Cabinet Office carry for Universal Credit's failures? How had Cabinet Office's oversight of Universal Credit failed to forfend all its problems?

How could the Cabinet Office ever have been credited as an authority on sensible IT programme management when it was trying to pursue its own political objectives as well? Should it now be broken up, with its IT governance and procurement functions separated out where they cannot be confused by politics and do any more harm?

Cabinet Office's broader political objectives will emerge best from this mess. It wanted DWP's benefits system - and by extension the DWP - broken up. Conservative think-tank Policy Exchange has begun pushing for benefits administration to be devolved fully to the regions. The DWP's days as one of the great departments of state may be numbered.

Universal Credit will be held up as another example of how big government IT projects always go wrong. Yet this one went wrong under direction from a department that forged its reputation by mercilessly barracking the IT failures of the last lot. The coalition government has lost its right to say any more about Labour's ill-fated National Programme for IT in the NHS. It has made all the same mistakes as the last lot.

In October, when DWP is due to begin full roll-out of Universal Credit, it will instead be extending a rump of what it planned for a pilot, from one site to just a few more sites. Its molecular cohesion has been shaken as though by sustained shock therapy. It may be too shaken to suffer the stories that will now inevitably and relentlessly recount its failures, and the failures of its kind, over the years it may take Universal Credit to catch up to where it was meant to be today.

Agile chaos consumes Universal Credit

Matador harms bull with banderillas.png
It was the core philosophy of the coalition government's ICT strategy: creative chaos. But instead of fixing government IT it has let it run rampant, according to the National Audit Office.

The coalition came to power claiming government IT projects failed because they were managed too bureaucratically. Yet its £12bn Universal Credit scheme has gone awry through negligence.

After hounding the last government out of power with a barrage of often overblown stories about IT failures, the ConDem government must now take its turn in the stocks for running a big government IT project late, probably over budget and - here's a new one - with little idea of what it's doing or where it's going.

At least the Department for Work and Pensions, which has spent £300m developing Universal Credit since 2010, knew roughly how it was going to get there: by using the radical "agile" approach to software development.

The Cabinet Office and Institute for Government foisted agile on DWP as a way to prevent Universal Credit becoming another IT bodge.

But agile was a poor fit, and had never been applied to an immense infrastructure project like Universal Credit. The agile community has since disowned it, saying agile was never suited to projects tied by large contracts to fixed terms with large suppliers.

The National Audit Office report into Universal Credit today tells how DWP's management of the programme - under Cabinet Office guidance - subsequently took on all the trappings of a classic IT bodge.

Yet it also demonstrates how parliament and politics refuse to accept that software projects naturally involve change; and that when big government allows change in a big project, it can have big consequences. Whether conceit or misunderstanding, this has been one of the main plot lines in the story of the IT bodge. It has been misused for political purposes and has brought down both senior administrators and major projects, like tired bulls bayoneted with the spikes of a matador.

Condemned

The NAO nevertheless published its report today under a simple remit: to ensure that when a government department says it wants £2.4bn for a large project, it doesn't end up paying double because the programme managers were up the pub and the suppliers were on the take.

On the face of the NAO report, Universal Credit has not gone well at all. DWP suppliers Accenture, BT, HP and IBM managed to produce a pilot of the system. But, said the NAO, it was deficient. The programme has achieved a fraction of what it planned to achieve by now. Its October deadline to begin the national roll-out of a full system will now mark the extension of its pilot to a few more sites.

The NAO said part of the problem was that DWP never knew quite how Universal Credit should work. It never produced a detailed technical blueprint. Poor planning meant it couldn't measure progress reliably. It couldn't be completely sure of what the system should do until draft regulations were approved in June 2012. Even now it is not entirely sure.

"The department does not yet have an agreed plan for national roll-out and has been unclear about how far it will build on [pilot] systems or replace them," said the NAO report, Universal Credit: early progress.

The project was poorly managed by a procession of executives and poorly overseen by a sequence of management boards, said the NAO. The DWP wrote off £34m of software development. The NAO expects it to write off more soon. It went back to the drawing board and, the NAO said, may have to cut back its ambitions for the social security programme the Universal Credit system was meant to deliver: help for people who need it the most.

On top of all that it had trouble managing the major suppliers it employed to build the system. It didn't really know what they should be doing, so it couldn't really tell if they were doing it properly. There was confusion about who was doing what and how it all fitted together. DWP had "inadequate financial control" over its suppliers. Their payments were poorly accounted for.

Agile reassessment

Curiously, the DWP said Universal Credit was still on track to be completed on time and on budget in 2017.

Its remarkable confidence apparently belied the findings of its auditor. But from another perspective, Universal Credit may have been an extraordinary success for a project that had so many odds stacked against it. These have not been clearly set out by the NAO.

The NAO report has also fallen into the same old trap of describing the natural flux of a software project as unwanted flux in a major infrastructure project.

The problem with government IT is not so much that it's always running over time and budget because it won't stay still. It is that parliament and politics insist on treating it as though it should stay still.

Or that when a government says it will build a system that does x, y and z in so many years, it will be crucified if it delivers only half the work in its allotted time.

Read roughly, the NAO's account of Universal Credit is what you might expect of a large-scale agile project. Agile was designed to respect the reality of software development: that it is not like other infrastructure - it does in fact change as you go along, because it is an attempt to create a living mould from the intricacies of human activity and the terms and conditions of social intercourse. It's not quite like building a high-speed train line, even when you do account for nimbies forcing route alterations.

Some aspects of Universal Credit may have been commendably agile in the circumstances. DWP started work on it before all the parameters were set because there were things it could be getting on with. It did not fix a final design because it was still learning how the system should work. Its October pilot extension includes as much as it was able to do in the given time and circumstances.

Big mistake

Its circumstances were complicated by taking on a handful of major Cabinet Office ICT reforms when it was already unusually ambitious. This piled on extra strain.

The risk with a big project run under big contracts - the sort of risk that is the professional concern of the NAO - is that change costs big money. The NAO was therefore greatly concerned that the DWP contracts were neither tight enough nor enforced strictly enough. It was a question the Cabinet Office never addressed publicly when it pushed for agile government IT: how do you ensure the creative chaos doesn't turn into an eat-as-much-as-you-like buffet for premium-priced IT suppliers?

Universal Credit was going to provide the answers. It was going to be the test-bed for Cabinet Office's agile policy. Its lessons were going to be used elsewhere in government. But Universal Credit has dispensed with agile and neither department will say anything about it.

Francis Maude, the Cabinet Office minister who introduced the agile reforms, hinted at what the coalition thought the answer was when he trailed his ICT policy before the Public Administration Select Committee in 2011.

The problem with government IT, he said, was "the projects have tended to be too big".

"That is partly a function of us being a very centralised country, so a lot of the programmes for which Government IT projects are needed are big national programmes," he said.

The solution was to break big projects up, he said, and distribute their functions. Yet his own Government Digital Service - the special forces for the coalition ICT strategy - has been using agile methods to centralise the government's distributed web estate and put it onto one big platform.

Maude's policy vision was of distributed teams of software developers whose achievements combined to make more than the sum of their parts. What he has done instead is centralise to cut costs - and people. Many government functions are simultaneously being replaced by computers in the coalition's Digital by Default initiative. The result will be small government, and big society.

Maude's policy has meanwhile done nothing to eradicate the need for the big departments of state like the DWP to operate big transactional systems like the one that administers social security.

Neither has the state of Universal Credit proven he was right about big government. If DWP had broken the project up and handed it out to many teams of small, agile suppliers it might have made less progress and cost more money than it has with a few large suppliers on large contracts.

There may have been other improvements, though. It might for a start have made the mistakes and alterations in Universal Credit as indiscernible as the goals.
