<part three>
Tax system edit fagged the Universal Credit

HM Revenue & Customs' Real Time Information, its radical reform of the tax system, was always going to be difficult.

It was like an audacious expedition into wild territory: very unlikely to go to plan. This wouldn't have mattered if the coalition government hadn't given it an improbable deadline of October 2013. Even that wouldn't have mattered, but for Universal Credit.

The coalition pinned the Department for Work and Pensions' reputation on Universal Credit - an equally audacious expedition - then gave it the same improbable deadline. But Universal Credit wouldn't work if RTI wasn't finished first. It needed RTI data to drive its own engine. It was bound to stall.

Using the stock analogy of audacious public enterprises, it was like NASA planning for the first crewed Mars mission to catapult off the first moon lander. But space was surely simple compared to the jungle territory that social engineering projects like RTI and Universal Credit attempt to tame: between them, every business, every employee, all the unemployed, the lion's share of public bodies, and the electronic banking system. Is that everything?

HMRC didn't do terribly with RTI, as it happened. It got RTI mostly done against the odds.

But HMRC had not done what it said would be required by October 2013 to drive Universal Credit. The deal was that RTI would by then be feeding Universal Credit with live data from every employer in the country. DWP, meanwhile, did heroically badly against stiffer odds.

As it happened, Universal Credit wasn't ready itself in October 2013. If it had been, it would have succeeded only if it could have accommodated RTI's teething problems.

Now 16 months past deadline, HMRC is still struggling to ensure RTI's data is accurate, and still getting employers connected.

Simon Parsons, chair of the British Computer Society's payroll group, told Computer Weekly that about 1 to 2 per cent of employers were still having problems with RTI.

"That could effect millions of people," said Parsons, who is also a director of HR and payroll outsourcer Ceridian.

"HMRC would say numbers are low but it's a sizeable group and when it occurs it causes quite a lot of difficulties because the initial contact with the employer is via a debt collection agency. The challenge we have is HMRC never seem to feed back to say what the error is when an employer raises an issue.

"We are positive about RTI but there are some aspects HMRC have pushed into without understanding it," he said.

Reported problems included HMRC duplicating people in its PAYE database because of a variety of niggly little ways in which it hadn't quite got its system working properly. On top of that, HMRC had been forced to set up an emergency system to work around an unforeseen problem it had verifying what wages people were paid through the BACS electronic banking system. It had not been able to assimilate many small employers - especially those whose cash-in-hand employees may be most vulnerable to mistakes in the social security system.

Such teething problems were "bound to happen," said Parsons. Complications were a reality of life. The art of building a computer system was to get stuck in to flush the complications out, and then to adapt the system to accommodate them.

Jungle

HMRC was still learning how to do that. It wasn't a big deal, but for the political pressure on Universal Credit. Even in July last year, just months before DWP was due to launch Universal Credit, HMRC's Annual Report and Accounts said it had only just begun testing links between RTI and its own central database of people's tax records, the National Insurance and PAYE Service (NPS), which had been a source of data quality problems since it was implemented in 2009. HMRC was still in an ongoing struggle to root out mismatches between its own records and those held by employers and pension funds. It recorded a formal risk that it wouldn't manage. It also had software bugs the NAO feared might stop it updating NPS with data from employers.

This wouldn't have been a problem if the coalition hadn't insisted it would have Universal Credit rolling out with the full backing of RTI in October 2013, or if DWP could have worked out a way to accommodate HMRC's teething problems.

It wouldn't matter to HMRC either way. This became apparent in November when the National Audit Office revealed an indefinite delay to a major part of Universal Credit being delivered by HMRC. That was Tax Credits - one of the primary justifications for doing Universal Credit in the first place.

A big sell for Universal Credit was that it would make fewer social security payments by mistake. This had been a political problem primarily for Tax Credits, which was overpaying people about £2bn-a-year because of mistakes in its administration by HMRC.

Tax Credits

Overpayments happened because HMRC's system calculated Tax Credits annually. It supplemented the wages of people in poorly-paid jobs, families with children and so on. But if people's wages went up or kids left home they might carry on getting the old credit until the system was updated. This became a crisis when HMRC started trying to claw back the difference at the end of each tax year.
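
The annual-versus-real-time difference can be sketched with a toy calculation. This is an illustration only: the taper, threshold and amounts below are invented, not the actual Tax Credits rules.

```python
# Toy model of annual Tax Credit awards versus a real-time feed.
# The taper, threshold and amounts are invented for illustration;
# they are not the real Tax Credits rules.

def monthly_award(annual_income):
    full_award, threshold, taper = 300.0, 15_000.0, 0.41
    reduction = taper * max(0.0, annual_income - threshold) / 12
    return max(0.0, full_award - reduction)

# A claimant gets a pay rise in month five of the tax year.
incomes = [15_000] * 4 + [24_000] * 8

# Annual system: the award is fixed from last year's income and
# only reconciled after the year ends.
paid = 12 * monthly_award(15_000)

# Real-time system: the award tracks income month by month.
owed = sum(monthly_award(i) for i in incomes)

overpayment = paid - owed
print(f"overpaid, clawed back at year end: £{overpayment:,.2f}")
```

Under the annual system this invented claimant is overpaid £2,400 and faces a year-end clawback; a real-time feed would have reduced the award from the month the pay rise landed.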

DWP would resolve this by merging Tax Credits into Universal Credit. It would update them in real time with data about people's incomes from HMRC's RTI, thus doing away with the annual round of miscalculations. Tax Credit overpayments therefore became a primary justification for Universal Credit.

HMRC was due to phase Tax Credits out as DWP phased Universal Credit in, through to the end of 2017. Now that neither was quite ready, it would take till some time "beyond 2019", said the NAO in November.

This wouldn't matter to HMRC because it had begun using RTI to update Tax Credits itself. And it was claiming the benefit, without Universal Credit.

Chancellor George Osborne said in his Autumn statement, on 3 December, that Tax Credit overpayments would be reduced "in-year" from April 2015. 'In-year' is Tax Credits parlance for real-time or near-as-dammit. He was able to credit this because HMRC was already feeding RTI into Tax Credits.

HMRC was even geared up to accommodate RTI's data problems. HMRC already had an organisation that administered income tax and credits. The income tax system would still operate in an emergency without RTI, HMRC said in 2013 after the Public Accounts Committee raised concerns about it having no disaster recovery system. HMRC had axed RTI's back-up plan to cut costs. HMRC would be okay. It was just other departments whose own systems could not tolerate an interruption by RTI that it might leave in the lurch.

HMRC would presumably also continue working in those cases where RTI was delivering duff data, while its bailiffs roughed up problem employers and its engineers tried to render reality more accurately in their software. Universal Credit's delays gave them two more years to get things ship-shape.

Coincidentally though, Universal Credit did do one other thing for HMRC. Its own very public farce distracted everyone while HMRC built in RTI an extraordinary state power to keep vigil over people's pay packets, pulling corporations, the banks and the state into one audacious, totalitarian tax collecting system. That would have made a computer controversy more momentous than the trials and tribulations of some social security system that sought merely to improve people's economic welfare, or even some NHS system that sought to improve their health.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - -

* UPDATE 2015.02.26.

HMRC has ignored Computer Weekly's questions about the rate of error in RTI for nearly a week. It has ignored other questions about RTI for nearly a month.

But after this article was published, Ruth Owen, HMRC director general of personal tax, sent a letter to the editor, via her press department.

Owen's letter:

Dear Sir
 
I refer to your article: Tax system edit fagged the Universal Credit.
 
We have made it repeatedly clear that it's untrue to suggest that RTI had any negative impact on the implementation of Universal Credit. RTI has been rolled out since 2013 and was delivered to its original timescale. It supported the biggest change to the operation of PAYE in 70 years and underpinned the government's Universal Credit programme. 99 per cent of all PAYE schemes are being reported in real time.
 
The feed of payroll data from HMRC systems to DWP's Universal Credit system began operation on time and without a hitch, contrary to your article's suggestion that it was delayed and suffered fundamental problems.
 
Ruth Owen
HMRC Director-General, Personal Tax


The article in question - the one above - said not that RTI's troubles caused Universal Credit's ongoing delays, but that they were significant enough to ask whether they would have undermined Universal Credit if it had been ready to launch on time.

Owen's statement is partially true, though misleading. RTI's problems are recorded in HMRC's own annual reports. More importantly, note that a two per cent rate of error in "PAYE schemes" (employers, roughly speaking) would create a significantly greater number of errors than two per cent of employees. Similarly, 99 per cent of employers connected equates to much less than 99 per cent of employees' records. Nor does it say that 99 per cent of employers are connected without error.
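
The scheme-versus-individual arithmetic can be sketched with invented numbers. The size distribution below is hypothetical, chosen only to show how an error rate of 2 per cent of PAYE schemes can touch millions of employees; none of it is HMRC data.

```python
# Hypothetical PAYE scheme size distribution: (schemes, employees each).
# Figures invented for illustration; the totals only roughly echo the
# ~2m employers and ~60m individual records mentioned in RTI reporting.
bands = [
    (1_500_000, 2),     # micro employers
    (400_000, 25),      # small employers
    (90_000, 200),      # medium employers
    (10_000, 3_000),    # large employers
]

total_schemes = sum(n for n, _ in bands)
total_employees = sum(n * size for n, size in bands)

problem_schemes = 0.02 * total_schemes   # the "1 to 2 per cent"

# Even if nine in ten problem schemes are tiny, the remaining tenth
# (assume an average of 500 employees each) dominates the headcount.
affected = 0.9 * problem_schemes * 2 + 0.1 * problem_schemes * 500

print(f"{total_schemes:,} schemes covering {total_employees:,} employees")
print(f"2% of schemes = {problem_schemes:,.0f}; employees affected ~ {affected:,.0f}")
```

On these invented figures, a 2 per cent scheme error rate touches over two million employees - more than 3 per cent of all records - which is why "millions of people" and "99 per cent of schemes" need not be contradictory claims.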

These are the pertinent questions when considering how reliable RTI's real-time data would be for a system that, like Universal Credit, would need a reliable real-time data feed.

They also happen to be the questions HMRC won't answer. They will be revisited on these pages, resources permitting.

<part two>
Universal Credit farce hid tax system bodge

The Department for Work and Pensions had enough trouble building the coalition government's flagship Universal Credit system without other government departments adding to its woes.

But it depended on data from another improbably ambitious computer system being developed by HM Revenue & Customs, called Real Time Information. RTI went awry just as Universal Credit was turning into a farce. Who would have thought the taxman had comic timing?

Each computer system was the engineering for a radical socio-economic reform. Problems ought therefore to have been expected. But problems in the sense of problem solving, not in the sense of crises great enough for political scandal. What ought to have been expected was engineering challenges. What was expected was crises, since the coalition parties had created an association between computer-led social engineering challenges and institutional rot.

Astonishingly then, the coalition government told parliament in 2011 it would build these two systems in two and a half years. More astonishingly, the coalition's guardians of engineering at the Cabinet Office Major Projects Authority (MPA) approved the idea.

These projects were improbable only because the coalition had committed to do them in two and a half years. It set the conditions for its own failure by the terms it had set itself: the measure of a major computer project's success being whether it was done according to the time and budget estimated at its start. Clearly it would never be done as quickly and simply as it claimed when it sought permission to do it, nor should it have been expected to be.

Universal Credit might on its own have been as unchallenging as MPA said it was in 2011. Its core work involved rationalising existing benefits systems. Anybody could have imagined it done simply. But the foundation of Universal Credit was RTI. And RTI looked unlikely, even then. If it wasn't finished in time, Universal Credit would have to wait.

Iffy data

RTI's biggest challenge was iffy data. It had to match the employee and pension records of every employer and pension fund in the country with its own database so they could talk in real time. But their data was inconsistent. And UKplc was running software that wasn't up to the job. UK businesses had to update their software so HMRC could update its software so DWP could update its software so the coalition's Universal Credit benefits reform would work.

HMRC's data problem had been so critical while MPs were debating in 2011 whether to approve the coalition's two-year Universal Credit project that HMRC had to devise a work-around solution: in software terms, sticky tape and staples good enough to tap income records of 62m people from Britain's employers and pension funds.

HMRC announced its "interim solution" just two months after MPA gave Universal Credit its unreserved thumbs up in its initial "Starting Gate Review" of the project in March 2011. Universal Credit was possible in the time given, it said; which it was, if you ignored the ramifications of its dependence on RTI; which it did.

MPA's then chief David Pitchford was asked to take control of Universal Credit when it was in crisis two years later, and kept control for less than four months before leaving government altogether.

RTI was going to solve its own data problems by forcing suppliers to address them: their errors would be exposed when they connected to RTI. The problem was that industry couldn't do it in the time given. HMRC's work-around would therefore be critical, which is how the National Audit Office described the challenge in July 2011. RTI had to have good data flows from every employer and pension fund in the country by October 2013 or Universal Credit wouldn't work. The interim solution would tide RTI over while Universal Credit was established. Its final solution would be ready within three months of go-live. But even the interim solution looked problematic.

Two months after parliament finally approved Universal Credit on 8 March 2012, HMRC said it would need its interim solution to operate till 2016. Universal Credit itself had just then been thrown into the start of its own crisis, with the coalition putting its designs under root-and-branch review just three months after parliament approved them.

Not-in-real-time

Only then did it emerge that the interim solution would probably not suffice for Universal Credit, which is what MPs in the All-Party Parliamentary Taxation Group said in July 2012, having been told by a senior DWP source. HMRC's "policy driven timetable" would meanwhile break RTI, they said. The only way to accommodate it was either to delay its roll-out, or deliver it on-time but operating not-in-real-time.

Neither would suffice for Universal Credit. But it seemed HMRC planned for its interim solution to process employer records not in real time but monthly even before MPs approved Universal Credit. Coalition ministers meanwhile kept up the pretence that all was on schedule.

It ought to have been expected too that HMRC's pilot of RTI didn't quite go according to plan in 2012. Only a quarter of the 250,000 employers needed for the 2012 pilot had taken part. It ought to have been expected too that when its pilot shone a light down the cracks where the iffy data came from it saw spaghetti trains of horrible little bugs crawling round. They came out over the 2013 New Year in a series of parliamentary questions between former Treasury secretary Stephen Timms...

... and current Treasury secretary David Gauke.

HMRC was due to start its roll-out in three months. By March it was telling small employers they could update RTI monthly at first. They wouldn't have to report in realtime till October, the end of HMRC's live roll-out period and the final deadline for RTI to be ready for Universal Credit. Three months later it said small employers would be allowed to report monthly until April 2014. Then it became 2016, when the interim solution was due to expire.

Other problems emerged. HMRC aimed to resolve them with a second RTI software release in April 2014, six months after the final deadline. With just three months to go, HMRC was trying to finalise software designs while late-running employers were still connecting to RTI for the first time.

That might just cause more problems, comptroller and auditor general Amyas Morse wrote in HMRC's July 2013 annual report. More problems did indeed follow.

<part one>
Universal Credit farce hid tax system gamble

The farcical procession of crises that beset the coalition government's Universal Credit programme since its inception five years ago obscured a more fundamental reason why it was flagging.

It was critically dependent on another computer system being developed simultaneously by a different government department - a system of momentous proportions and unrealised controversy. Yet while this other one floundered, Universal Credit copped the blame.

The other computer system was Real Time Information (RTI), which HM Revenue & Customs has used to establish live links between its tax databases and the accounting systems of every employer in the country.

Universal Credit was so dependent on RTI that it simply would not work without it. It was to give people social security payments that adjusted to their individual circumstances. It would top up people's wages so they could take low paid jobs with little security without living in poverty. But it could only do that if it could track changes in their income as it happened - in real time. That's what RTI would do.

RTI had fundamental problems from the start. These became public in the summer of 2011, while parliament was scrutinizing the coalition proposal to build Universal Credit in two and a half years.

Yet the Major Projects Authority, which the coalition government set up at the Cabinet Office to fulfil its election promise to stop big IT projects going wrong, and apparently to distinguish it from its incompetent opponents, neglected to mention this when it was called to make a sober assessment of Universal Credit's chances of success.

The day before MPs started debating whether to build Universal Credit on 9 March, the MPA gave the project an unreserved thumbs up in a report supposed to assess its prospects. Universal Credit was critically dependent on RTI, it did admit. But it neglected to mention the problems.

HMRC's biggest problem with RTI was Universal Credit's October 2013 deadline, which it was forced to adopt as its own. It meant rushing RTI. But it was physically impossible to rush it. It involved reforming the entire income tax system - every person's pay packet, every accounting software supplier, every business, every pension fund and the whole electronic banking system: the very skeleton of the economy. It was quite unlikely to be done by October 2013.

Three months after the MPA said how impressed it was with all this, HMRC's Annual Report and Accounts suggested this might be more difficult than the MPA had let on.

Challenge

HMRC was still trying to resolve data quality problems derived from a replacement income tax computer system it implemented in 2009. Those problems would be "critical" for Universal Credit, Amyas Morse, head of the National Audit Office, wrote in HMRC's June 2011 report.

RTI needed data feeds from employers to get realtime updates about people's incomes. That would only work if employers' payroll records would square off against RTI and HMRC's income tax database. That was only possible if the data quality was good - if names and codes were consistent, for example. Only then could HMRC fulfil its side of the bargain, which was to get every employer in the country connected to RTI by October 2013. And only then could RTI feed Universal Credit the realtime earnings data it needed to calculate people's social security payments.
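
The reconciliation problem can be sketched in a few lines. This is a hypothetical illustration of the general record-matching difficulty, not HMRC's actual matching rules: two records for the same person fail to join because the identifying fields are formatted differently.

```python
# Matching an employer's payroll record against a central record on
# National Insurance number and name. Hypothetical data and rules.

def norm(record):
    # Normalise obvious formatting differences before matching.
    ni = record["ni"].replace(" ", "").upper()
    name = " ".join(record["name"].split()).title()
    return (ni, name)

central = {norm(r) for r in [
    {"ni": "QQ123456C", "name": "Jane Smith"},
]}

payroll = {"ni": "qq 12 34 56 c", "name": "SMITH, JANE"}  # same person

key = norm(payroll)
if key in central:
    print("matched:", key)
else:
    # A naive loader creates a second, duplicate record here - the
    # kind of duplication reported in HMRC's PAYE database.
    print("no match, duplicate created:", key)
```

Simple normalisation fixes spacing and case, but the surname-first convention still defeats the join; without agreed data standards on both sides, every such mismatch becomes a duplicate or a rejected submission.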

This was going to be a challenge, said the auditor.

Employers would nevertheless be mandated to join RTI by October 2013, in stages from 2012 when HMRC planned to start piloting the system. Small and medium employers would go first.

"All employers will be under RTI from October 2013," he said.

But it soon emerged that employers wouldn't meet the deadline. In June 2012, three months after parliament finally approved Universal Credit, Morse wrote in HMRC's June 2012 Annual Report and Accounts that payroll firms didn't think businesses and banks had enough time to adapt their own systems to submit pay data to RTI.

The Universal Credit farce was just starting to get into full swing as the true extent of HMRC's problems emerged. The farce was largely due to other things which, despite all the attention paid to Universal Credit by press and parliament, were overlooked. More to follow on that. Suffice to say that one of the reasons Universal Credit wouldn't have been able to roll out as planned in October 2013 was that HMRC's RTI wasn't ready.

HMRC denies this is so. It insists that RTI was "substantially completed" in October 2013. And that was good enough for Universal Credit which, HMRC did not add, was then itself cranking up so slowly that HMRC's problems would not have been any cause of alarm, assuming they were not actually a cause of delay. Still, HMRC insists (see its statement in response to questions below*) that RTI is now working at almost full capacity. So this is all immaterial anyway. More to follow on that too.

*HMRC statement

"It is completely untrue to suggest that RTI has had any impact on the implementation of Universal Credit. RTI has been fully rolled out since 2013 and was delivered to its original timescale. It was always planned to implement RTI in staged phases. It was piloted in 2012 and the main roll out started in April 2013 and was substantially completed in October 2013. By May 2013 over 1 million employers were reporting in real time. Over 99% of schemes with employees and pensioners are reporting details in real time, representing around 99% of all individuals within PAYE."


Coalition computer programme creates circulatory problem for fascist idyll

Britain has got a circulation problem. It is only just getting a circulatory system, and it already has a circulation problem. Blame both on the coalition government's computerization of the state.

You might see less cause for blame than praise when you learn what sort of computerized administration the coalition has been building. But the diagnosis is pain: painful decisions, painful truths. It has been building a digital remedy, but that may itself be a cause for concern: it suggests the organismic harmony of a fascist utopia. So blame, surely. Blame.

The computer systems our government has been putting in place at HM Revenue & Customs and the Department for Work and Pensions in particular are turning the government administration into the nerve centre of an organismic nation state. But drop for a moment the fascist associations from this analogy of nation state as organism that is currently in vogue, so we can diagnose the pain clearly.

Those computer systems - HMRC's Real Time Information system and DWP's Universal Credit - are bringing the nation's employers, banks, exchequer and social security system into harmonious combination.

Almost entirely automated, it is coming to resemble a circulatory system: the nation's blood its money, its nerves its data, the parts made greater in their whole by the computer systems that unite them. Complexity theorists call it emergence. Puritans called it God. Fascists called it glory. The coalition government calls it efficiency.

You might think this nothing new. The nation does circulate money and data already, but only as well as it circulates venereal diseases or gossip. With HMRC's Real Time Information System cranking up, it's about to get ship-shape, shiny boots and step-in-time. RTI is getting links to the payroll systems of companies and the payment systems of banks, so it can get their data and money in real-time.

The circulatory flow is depicted in the concept diagram for Universal Credit. It shows how money and data will flow from employers, banks, HMRC and DWP, and back round into people's wage packets. The idea is that government regulates the flow to sustain the working poor in the lower limbs of the national organism. So HMRC's nerve system gets updates about individual people's income, day-to-day, from their employers' payroll systems. It gets updates about who gets paid what similarly from the banks. The systems extract the money itself from people's pay. The money flows to the exchequer. The data and some of the money then flow round to the Department for Work and Pensions. DWP checks which employers don't pay their staff properly. And then its Universal Credit system, the beating heart of the national weal, is supposed to make up the difference. When it's all up and running, tuning the economy will just be a matter of tweaking the algorithm that controls the flow of money between rich and poor, roughly speaking.

Beast

So when Boss Hog, say, pays himself so much that he doesn't have enough left to pay a living wage to his skivvies, DWP is supposed to correct his wrong by snatching money out of his pay packet and tucking it in theirs. That's the shape computer-age social harmony is supposed to be. And how glorious this is, surely. But this beast's got a serious circulation problem.

The circulation has got clogged somewhere between the point at which HMRC's algorithm was tuned with rules that determine how much money it should snatch out of Boss Hog's wage packet and the point where DWP's Universal Credit was tuned with rules that determine how much money should be tucked back into the pay packets of his skivvies. The system's initial settings are not getting money and data to flow round to the working poor.

The clot wot caused the blockage is otherwise known as parliament. A diagnosis will show it is a classic case of arteriosclerotic vascular disease. It is in other words a build up of fatty deposits that have stemmed the flow of blood to the lower limbs. The beast's muscles are so starved of blood that manual labour induces pain. Its arteries are so clogged that it is in serious risk of a heart attack.

The problem can be corrected easily enough. Parliament just needs to tweak the algorithm so the flow of blood resumes and the country can start moving easily again.

That would be no end to the pain though. It wouldn't be an end even if the state didn't take this power to micro-manage people's finances, their comms, perhaps their medical records, their genetic constitutions, and use it to pile privileges on those whom fate has already advantaged and vice versa, creating a final solution to all social ills by putting each in their allotted place, thus leading the meritocracy to its techno-bureaucratic conclusion, it <parp> .. excuse me ..

It will creep up while you are unaware. Even if things didn't turn out so bad that the technocratic state had socially sorted you into some job you were deemed physiologically, psychologically, algorithmically suited to do, you will at least, thanks to the circulatory flow of blood and money or money and nerve or whatever it is HMRC has established with its realtime information, be apportioned some wage deemed appropriate for a person of your humble position and the health of the nation, and befitting the pride and the charity of your superiors, and their representatives in parliament, both of whom will sit ever so much more snugly in their high chairs for all the justification the algorithm gives them. Because they won't tune the economy for the sake of equality. They will tune it for efficiency at the bottom and luxury at the top. For the sake of the nation.

What will hurt the most is the realisation that your subservience has been entered into the national economic equation as one of its constants: yours and many other whole lives of menial work and meagre rewards at last liberated from the false hope that they might find some kind of meritocratic salvation through striving. Losers in the game of snakes and ladders will accept their place in the organismic hierarchy. Fools will pick up bad dice and roll them again. It will be like everyone worked for John Lewis. Give a little curtsey, stand in line with your hat in your hands, get a penny for your labours: thank you, Mr Lewis.

Gov dodged scrutiny of Universal Credit

The damage coalition cuts and bodge have done to Universal Credit was omitted from the National Audit Office assessment of the scheme in November.

The full story emerged a week after the auditor published its report. Had the NAO been able to put all the cards on the table, it may have discredited the coalition as the country entered its few months of mudslinging before the general election in May.

The ambitious Universal Credit welfare scheme was in a worse state than the Department for Work and Pensions had admitted when the auditor did its assessment. But this may have been caused more by government cuts in welfare to people in low paid jobs than its well-publicised IT problems.

The NAO assessment was incomplete because DWP had not disclosed the latest estimates the NAO needed to determine if the programme's ongoing development was financially viable.

Those numbers would be drawn from the 2014 Universal Credit business case, which the Treasury signed off in September.

"The department did not fully revise the Autumn 2014 figures," said the NAO report on 26 November.

The shape of those figures became apparent just a week later when chancellor George Osborne made his Autumn Statement, on 3 December.

He unveiled numbers the public auditor had not been allowed to publish: further cuts to the amount of income support Universal Credit would deliver.

This and three other revisions the chancellor and the DWP had made to their flagship scheme's business plan since 2011 had diminished its potency drastically.

The chancellor's cuts had been a primary factor in the NAO's assessment of the programme's diminishing success.

But there was worse to come. The scheme had also been undermined by problems the DWP and Cabinet Office (overseen by the Treasury) had with the IT system.

Those problems postponed the time when people could start receiving Universal Credit. The auditor's second primary measure of the scheme's success was the total amount of such payments administered in the first twelve years since work and pensions secretary Iain Duncan Smith commenced construction on Universal Credit in 2010.

By these two measures, Universal Credit's potency had diminished by two-thirds since 2010 - a £24bn fall. It would deliver just £11.8bn of help in its first twelve years, by the most recent available numbers. That was less than the cost to build it.

The NAO said Universal Credit's potency would shrink another £2.3bn if it was delayed even by another six months.

Lo and behold, while the chancellor was delivering his statement on 3 December, the Office for Budget Responsibility issued a forecast that Universal Credit's development would indeed slip another six months.

With ongoing IT problems, the NAO had raised the possibility that it might even be delayed another year. Along with the chancellor's further cuts, the original £35.8bn advantage in doing Universal Credit was diminishing to almost nothing.

More cuts than IT

The NAO and Institute for Fiscal Studies have now hinged their final judgement on Universal Credit's success on how the implementation overcomes its IT problems.

But the NAO based its judgement on misleading calculations. A further delay would obviously lessen the amount of social security payments it administered in the twelve years that followed the day Iain Duncan Smith announced his intention to do it. That's how the NAO predicted Universal Credit would lose £2.3bn of its value if it was delayed another six months. It reckoned Universal Credit was worth less while it was unfinished.

By the same reckoning, if Smith had put his announcement off for a single day in 2010, perhaps to savour his moment in destiny, it would have cost Universal Credit £12m in benefits not delivered: not delivered that is, until they were delivered, a day later than originally planned.
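As a sketch of that arithmetic, the figures quoted above can be checked in a few lines of Python. The straight-line proration of the £2.3bn six-month loss across the days is an assumption; the article's £12m-a-day figure presumably comes from the same kind of calculation:

```python
# Back-of-envelope check of the NAO-style reckoning quoted above.
# All figures come from the article; the daily figure assumes the
# £2.3bn six-month loss is spread evenly across the days.

original_value_bn = 35.8   # original projected advantage of the scheme, £bn
fall_bn = 24.0             # fall in value since 2010, £bn

remaining_bn = original_value_bn - fall_bn
print(f"Remaining value: £{remaining_bn:.1f}bn")   # £11.8bn, as reported

six_month_loss_bn = 2.3    # NAO: value lost by a six-month delay
per_day_m = six_month_loss_bn * 1000 / (365 / 2)
print(f"Per day of delay: £{per_day_m:.1f}m")      # roughly the £12m quoted
```

The daily figure comes out at about £12.6m on these round numbers, close to the £12m the reckoning implies.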

It judged the scheme by a measure that could be meaningful only once it was finished.

The final reckoning wouldn't be whether the system was finished. It would be how much social security it delivered, how well, and with what difference to society and the poor.

The coalition's own welfare cuts undermined Universal Credit more than its IT delays.

More damage

Yet it will cause alarm when Universal Credit's further IT-related delays emerge officially, as they will undoubtedly continue to do after the election when the almighty bodge the coalition made of its development bears fruit. Politicians of any creed would exploit this to damage their opponents at the expense of the programme. Even the Conservatives who conceived the scheme would benefit from its collapse, and the damage it would do to big government and welfare, both of which they have vowed to diminish. The purpose of the venture, then long-forgotten, would sink under the waves with the system itself, and probably its captain.






Cuts and computers undermined Universal Credit

| No Comments
| More
Universal Credit payments - reclaimed overpayments.png
Cuts and computer problems are responsible for Universal Credit's failure to "make work pay" as work and pensions secretary Iain Duncan Smith said it would when he sought parliamentary approval for the scheme in 2011.

That's not to say it's not worth the continued effort to get it right. But the chart above shows how weakly the coalition government's revised plans for Universal Credit will help the poor. Their worsening circumstances were arguably this government's greatest responsibility and greatest failure.

The chart shows how the government has diminished Universal Credit's power to make work pay with repeated revisions to its plans since 2011.

Universal Credit is set to take more money from the poor than it gives them after the Department for Work and Pensions' most recent revisions to the scheme's business plan.

Parliament agreed the scheme in 2011 on the understanding that it would drastically increase the amount of income support it distributed to people in low paid jobs.

The coalition's 2011 Universal Credit business plan promised to increase welfare by £38.5bn over the first twelve years from its inception, according to DWP estimates reported by the National Audit Office in November.

It would simultaneously save the government £15.5bn by reducing welfare payments made in error.

The balance would be a boost to the poor of almost £20bn.

But the government cut the amount of welfare payments that would go through Universal Credit by two-thirds in successive revisions of the scheme's business plan since 2011, according to the Institute for Fiscal Studies. It cut Universal Credit three times after 2011.

It had cut Universal Credit so much by its last revision that it would actually claim more back in overpayments than it paid out in support of people on low incomes.

Universal Credit would make the poor worse off by £2.2bn under the 2013 business case that followed the Cabinet Office's "reset" of the programme that year.

Universal Credit Caseload Projections - NAO - November 2014.png
But coalition cuts only explained part of Universal Credit's impotency.

The amount of benefit it would administer had also fallen because the coalition had made such a bodge of its implementation, IFS senior research economist David Phillips told Computer Weekly. DWP was cranking the Universal Credit system up so slowly that it would be years yet before many people were subscribed to it. So the amount of money it would administer would be much lower until much later.

The NAO had assessed Universal Credit's success according to how its numbers stacked up during the first 12 years of its life, starting from its inception in 2010. By the DWP's latest revision, it would still be loading eligible people onto the system after 2020, 10 years from its inception. The end-date wasn't even in sight.

By mid-2017 - almost two-thirds into the NAO's assessment period - only half the eligible people would have been loaded onto the system.

The degree to which Universal Credit's diminishing value was due to its bodged implementation was obscured in the NAO's November report on that very implementation.

How much of it was due to cuts and how much to bodge? The NAO didn't break the numbers out. Its assessment of Universal Credit's implementation papered over the bodge.

The real reckoning of Universal Credit won't be a measure of how much it got done while it was still being implemented. It will be what it achieves for the poor when it is finally done. But even the NAO's flawed assessment of its implementation was incomplete, because the DWP withheld the recent estimates from its 2014 business plan for Universal Credit. Even since then there have been more cuts and predictions of more delay.

Universal Credit bodged its bodge

| No Comments
| More
Iain Duncan Smith looked like a fall guy when he started work on Universal Credit. He stumbled forward to take the stage for his grand social project, with a little shove from his superiors. He looked so earnest, with his army officer poise. They waved him on: yeah you go for it, Iain.

Yet Universal Credit was just the sort of IT project his superiors had long since pilloried for always being bodged, wasting billions in tax... you know the words by now.

Leaving aside the fact that Universal Credit will give government more power to redress inequality than any disheartened socialist toiling under the post-Thatcher yoke might dream possible.

There is still this knee-jerk IT bodge story, which for now is all Universal Credit is to most people. That story is a matter of great hypocrisy.

The prime minister and his most grandiose lieutenants had promised they wouldn't do big IT projects like Universal Credit. They wanted to break the big government computer systems up, and government as well.

They effectively won power by doing so much jeering at the last, Labour government's ambitious, computer-powered social reforms that their faces got stuck in rictus.

And they wanted to cut social security payments to the poor - to make it harder for people to get help from the state, not easier.

Yet when Smith, work and pensions secretary, said he wanted to build a huge computer system to improve the way his huge government department gave a huge amount of money in welfare payments to the poor, they said yeah you go on Iain. They said they would make an exception to their rule that there would be no more big state IT.

Not only that, they said they would do this perilously ambitious project in just two years and for just £2.4bn, as though they had banged their heads and forgotten everything they had said before about big IT bodges. The surest way to create an IT bodge was to overestimate how cheaply and quickly you could put a system together, and then to make a big political parade out of it, leaving yourself no choice but to drive all your resources recklessly into it. So that's what they did.

They were letting Smith attempt to build a grand, national benefits system of unfathomable proportions against a very specific, goofy budget and breakneck schedule. Just like the last lot.

Ta

Francis Maude, Cabinet Office minister responsible for ensuring the government didn't do any more IT bodges, then lumped Smith's Department for Work and Pensions with three other ambitious IT reforms at the same time.

The biggest told-you-so story in politics consequently became that Universal Credit was as big an IT bodge as ever.

But when you've got an organisation like the DWP, with 90,000 staff making £70bn of payments to 7m households under a variety of regimes and through a national network of 700 job centres, 10 call centres, and throughout local government as well, your IT systems are going to be big and complex; and you are going to build them only after trial and error, unexpected pitfalls and climatic changes. Other civil engineering projects would come close to the same volatility only if they did their construction out of Tetris.

Universal Credit's troubles did seem to prove the Tory case against big state organisations. It got knees jerking all round. But it wasn't quite so.

Its trouble was that the Cabinet Office's 'independent' Major Projects Authority approved its grossly unrealistic schedule. And that the extra requirements the Cabinet Office piled on the programme's shoulders quadrupled its chances of failure. It made unprecedented demands of the contracting model, the design methodology, the transaction model, its security precautions, the complex rules by which it administers benefits to people, and the organisation itself. This all caused the DWP so much trouble that the Cabinet Office, with its MPA hat on, had to freeze the programme. In the two years of chaos that ensued, with Cabinet Office ripping up DWP's plans and removing the extra burdens it had earlier imposed, the public perception was another big state IT bodge. The programme consequently went two years over schedule, as if to prove the case.

When Universal Credit was at last resumed in November, Cabinet Office had forced DWP to alter some of its big state designs. Its revised solution was more in keeping with its own favoured model of public IT: something more amenable to reform; and particularly amenable to its own favoured reform, which was to automate government functions and make their staff redundant.

Cabinet Office usurpation of DWP's project was partial, however, according to the NAO report on the affair in December, because its own designs were unproven at the high levels of scale and dependability demanded by the social security system.

DWP had been reluctant to move too quickly to Cabinet Office's radical tune. It had the nation's £70bn social security resting on well-rooted, dependable computer systems. The Cabinet Office's own alternative designs turned out to be premature. It was not possible to say whether they would be reliable enough to run the social security system even in 2019, said the NAO in November.

The transformation of DWP from an organisation of 90,000 people into a Cabinet Office web app has consequently slowed. But not much.

The transformation was already inevitable to some degree, as people transact more of their lives online. The question has been how soon: whether it would be accelerated for the sake of a Tory strategy to break up government and cut social security.

DWP now expects up to 74 per cent of people will process their social security claims online under Universal Credit in five years' time.

At least when the DWP back office is laid off over coming years they will be able to fall back on the Universal Credit system that replaced them.

* CORRECTION: This article originally said Cabinet Office strove to process 100 per cent of Universal Credit claims online. This is not necessarily so, though its strategy was "Digital by Default". Sarky outgoing comment also added.

Domesday computer to make equality automatic

| No Comments
| More
АК-47.jpg
Fire a few Kalashnikov rounds into the air over a crowded square when Universal Credit is completed in 2019. Perhaps rattle a few volleys from the windows of cars on roundabouts. At the very least, throw some confetti.

Because the Universal Credit social security system will give government power to deliver equality at the press of a button. The only question is whether it or indeed any government will use it.

The Credit will be the final piece in a jigsaw of computer systems that will give government a totalitarian view of people's incomes, and extraordinary powers to act on its intelligence by levying tax and paying supplements.

Its intelligence gathering system is a Domesday Book for the 21st Century: the Real Time Information system, built by Her Majesty's Revenue and Customs almost without anyone noticing, and just coming online.

It happened with a surprising lack of public alarm, after all the fuss about ID Cards. But Universal Credit has commanded everyone's attention with another knee-jerk IT bodge story. HMRC meanwhile built links with employers' payroll systems, and the wider electronic banking system, to get live data about everyone's earnings.

This would be creepy but for Universal Credit. By establishing its own data feed from HMRC's realtime earnings intelligence, the Department for Work and Pensions has gained the power to correct pay inequalities in realtime as well.

What used to be a system of social security has consequently become a system of social justice.

Get your own

David Cameron - Lewinnick Lodge - Newquay - 2 MAY 2010 - Conservatives promotional picture.jpg
It just depends what you call social justice. Prime minister David Cameron campaigned in the last election with the idea that fairness was a middle class luxury: what's mine is mine, and if you want some too, well fair on you if you can get your own.

While the UK has become the most unequal country in Europe, his ministers strive to cut the amount of money government redistributes from the rich to the poor on whose backs they stand. Universal Credit has been justified as a way to make sure the state paid the poor no more than their allotted handful of alms. Commentators have been more concerned with clawing back "overpayments" made by the approximated system of Tax Credits that preceded Universal Credit, than with increasing the paltry amount of redistribution that the system already manages to perform.

Universal Credit was presented to the public however as a system to administer social justice. It was supposed to encourage people to take grossly low paid jobs they would otherwise shun by topping up their wages with social security payments.

The point of Universal Credit was to "make work pay", as its creator Iain Duncan Smith pitched it in 2010.

Without Universal Credit, the tax system worked against the working poor. A family living on social support and a part-time wage could earn £7,500 more if their wage earner went full-time on the minimum wage. But the government would claim £7,000 of that in tax. That's what they call 94 per cent 'marginal taxation', for someone who doubles their hours on poverty wages.

BBC bruiser Andrew Neil interviews Iain Duncan Smith about Universal Credit - 15 December 2014.png
But as BBC bruiser Andrew Neil put it to Smith in December, if that same poor family got Universal Credit, 83 per cent of their extra income would still be consumed by tax. If on the other hand they were wealthy, and their wages went above £150,000, the government would tax only 45 per cent of the extra. So much for Cameronian fairness.
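Those headline rates can be reproduced with simple arithmetic. A minimal sketch in Python, using only the figures quoted in these articles; real benefit tapers and tax thresholds interact in more complicated ways than this:

```python
# Marginal-rate comparison using the articles' headline figures.
# The family's figures are as quoted; the 83% and 45% rates come from
# the interview and the tax schedule, not from a calculation here.

extra_earned = 7_500   # extra gross pay from going full-time on minimum wage
clawed_back = 7_000    # amount the article says government claims back

marginal_rate = clawed_back / extra_earned
print(f"Pre-UC marginal rate: {marginal_rate:.1%}")  # 93.3%, the quoted ~94%

uc_taper = 0.83          # effective rate under Universal Credit
additional_rate = 0.45   # income tax rate on earnings above £150,000
print(f"Poor family keeps {1 - uc_taper:.0%} of each extra pound under UC")
print(f"Top earner keeps {1 - additional_rate:.0%} of each extra pound")
```

On these figures a family on poverty wages keeps 17p in each extra pound, while a top earner keeps 55p.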

If Smith wanted to "make work pay" he might have simply made employers pay people a proper wage in the first place.

He had instead created a benefits system that subsidised employers who paid low wages, he admitted to Neil.

But the point of Universal Credit was that succeeding governments could tweak the Universal Credit algorithm to increase the amount of money it redistributed to the working poor, Smith said in not so many words. Once the system was set up, social justice was just a matter of calibration.

The problem it claimed to address wasn't merely caused by companies not paying staff enough money. It was company directors not paying staff enough while paying themselves too much. And it was Übermenschen such as IT professionals demanding so much money that many of them have long since enjoyed a work regime that involved doing six months' work and six months on holiday, or investing their surplus in property that they then rent out at exploitation rates to the people whose wages they've hogged, who they then employ to clean their cars and wipe their toilet seats.

Universal Credit's own problems were exacerbated by an inability to recruit enough IT Übermenschen to build it, and an inability to pay enough money to retain them. It presumably capitulated and did for them what the social security system has failed to do for those beneath them: topped up their wages.

When the system is up and running, it will hopefully do for the IT Übermenschen what they programmed it to do: get their over-inflated wages back in tax and give to people more in need of it.

Magna Carta

Universal Credit Concept Viability Diagram - DWP - 2009.png
The basic concept diagram of Universal Credit illustrated how it would happen: an almost virtuous circle of data, money and labour.

HMRC would get data from banks and employers about the money people were earning, and DWP would feed back fair rebates to those being rewarded but meanly for their labours.

But the flow of money had not been automated between HMRC and DWP. The Treasury and parliament dammed the flow and, under the current arrangement, would channel just enough on to DWP for its Universal Credit to let a trickle back into the pay packets of the working poor.

Any government could open the taps. But they would not likely be under much pressure to do it, because the circulation of data was broken too. Employers knew what everyone was earning in their domain. The banks knew all they knew. All their incomes data went to HMRC. It and DWP would know what everyone was earning. But that's where the circulation of incomes data would stop. Only employees themselves would be kept in the dark.

This has created a quandary in this 800th anniversary year of Magna Carta, the celebrated tax reduction the Medieval Barons muscled out of King John.

king-john-signing-the-magna-carta-reluctantly.jpg
As HMRC cranks its totalitarian tax collecting system up this year amidst the commemorations of Magna Carta, it might seem appropriate to smash it up*.

For while the Carta is celebrated for empowering common people, it was the barons wot got the best deal. What's trickled down since has evaporated as soon as replenished. HMRC's Domesday computer now gives common people the means to muscle a better deal out of the barons, in these days of zero hours and obscene inequality. You might say Universal Credit was their Robin Hood. All they need now is a government prepared to use it. That would surely follow if the government took its defining computer-age policy to its logical conclusion - its promise of open data and transparency for the sake of democracy and liberty: so free our incomes data then, so we can see that justice is done.

* That's what the Conservatives said they would do with big government computer systems anyway, and yet here they've built another one.

UK drone net got torture-grade CIA comms

| No Comments
| More
Punishment_of_the_Paddle_1912.jpg
A computer network that the US Central Intelligence Agency began using a decade ago to conduct the kidnap and torture of terrorist suspects has become an integral part of the system now operating drone strikes in the Middle East and Africa.

The means to send 'above top secret' intelligence communications around the globe without exposure empowered the CIA's Rendition and Detention Program to snatch and interrogate suspects in the US 'war on terror'. The same network system became the principal mechanism behind the intelligence-led "targeted killing" of suspected enemies using drone strikes today.

The technological link between the two sinister programmes, signposted in passing detail by the US Senate Select Committee on Intelligence Study of the Central Intelligence Agency's Detention and Interrogation Program last week, further confirms that a US military network routed via the UK carried intelligence vital to the US targeted killing programme. It also presents evidence that may sway officials deciding whether contractor British Telecommunications Plc should be held to account for building a part of the network used to transmit drone targeting intelligence since 2012.

In both programmes, secure global comms gave the CIA unprecedented, computer-driven power to collate, combine and analyse information about individual suspected people, and to pursue their subjection or assassination in other countries rapidly, with dreadfully focused vigilance.

Now that the US has been pulled up for torturing the 'wrong' people, for assassinating the 'wrong' people, and for straying beyond the scope of international law, its newfound intelligence powers have been exposed as grotesque.

CIA_illegal_flights.png
Information dominance

The Senate Committee Study and other investigations of the US' misuse of power have focused on its effect. Its mechanism and its means, however - its source - remain unquestioned. That is the information dominance the US has striven for since embarking at the turn of the millennium on its ambitious strategy to create an intelligence-led, computer-driven, globally-networked war machine focused on pin-point actions against individual people.

The Committee report nevertheless gave a peek at the source of this terrible power - a thread to be unravelled.

Describing how the contractors who had masterminded the CIA's deranged interrogation programme set up a private company to run it once the operation had become well established in 2005, the report noted that the CIA had given the company (called "Company Y" in the heavily redacted, previously classified report - later identified as Mitchell, Jessen & Associates) access to its 'above top secret' computer network, so they could use its intelligence sources in their work tracking and snatching terrorism suspects, 'rendering' them to one of a network of secret prison bases in countries desperate or weak enough to permit them, and 'interrogating' them.

SCIF - Sensitive Compartmented Information Facility.jpg
"The CIA also certified Company Y's office in [REDACTED] as a Secure (sic) Compartmented Information Facility (SCIF)," said the Senate report, "and provided Company Y access to CIA internal computer networks at its facility."

It was really only a passing reference. But getting access to Sensitive Compartmented Information (SCI) was tough enough that government agencies still paid for it out of their capital budgets.

They had to build specially secured buildings and rooms called Sensitive Compartmented Information Facilities (SCIF), just so CIA-grade intelligence could be handled, and even discussed.

This had been the case since 1999, when a formal order from the office of the CIA director established SCI as the designation for data relating to CIA intelligence sources, and set out the precautions that would secure its transmission over US military and intelligence networks.

It effectively extended the CIA's hush-hush, clean room secrecy over the network and to wherever its intelligence went, so the information could move freely among those permitted to know: software, network pipes, computer facilities, people, would all be locked down.

Intelligence network

Inevitably, such CIA-grade intelligence found its way onto the Global Information Grid (GIG), a US military and intelligence network that has over the course of the 'war on terror' become the data-fuelled engine of US operations, especially drone strikes. Likewise the Defense Information Systems Network (DISN) - the global network of high-capacity comms cables that formed the backbone of the GIG.

This was not a simple undertaking. The National Security Agency, the US' network intelligence centre, extended the CIA's secure, compartmented realm over the DISN/GIG by starting a programme to build architecture good enough to carry CIA-grade intelligence.

KG-340 - NSA Certified.png
The NSA's crypto-modernization programme guided US military contractors in their production of devices such as the KG-340, a high-capacity encryption device that has become one of the principal building blocks of the GIG.

KG-340.png
As a 'Type 1' encryption device, the KG-340 was certified by the NSA to carry any data up to the level of Top Secret / Sensitive Compartmented Information (TS/SCI).

That meant it would securely transmit government and military information classified by the usual trio of designations - Confidential, Secret and Top Secret - using encryption algorithms developed under the NSA's Commercial COMSEC Evaluation Program. But it would also encrypt data to the level required to transmit data within the CIA's compartmented realm.

The result was that the DISN and the GIG would incorporate CIA-grade intelligence into their operations, allowing it to be combined with data from other sources in systems such as those the US used to pick, track and attack targets like people on its terrorism suspect list.

Off-the-shelf spying

This was not done lightly. No less than the Director of National Intelligence, an office created in 2005 as the over-arching head of intelligence agencies including the CIA and NSA, dictated how SCI would be handled by the DISN/GIG.

As it was put in a definition of SCI agreed in 2010 by a committee of military and intelligence agencies: "Classified information concerning or derived from intelligence sources, methods or analytical processes, which is required to be handled within formal access control systems established by the Director of National Intelligence".

On being established, the DNI brokered an agreement between the CIA, its sister agencies and the Department of Defense to collaborate on network security. Their subsequent work established the means for assuring the transmission of SCI across the GIG.

The NSA ensured the devices that would do this did so in conjunction with established network technology: devices such as the KG-340, which would turn a standard, high-capacity network into one capable of sending sensitive compartmented, CIA-grade intelligence. Supplied by industry, they would rely on proven technology. The KG-340 was designed by SafeNet, a long-standing military networking pioneer. It was recently bought by Raytheon, one of the larger US weapons manufacturers. It was designed to be a standard "off-the-shelf" network component that would work with other standard, off-the-shelf network components.

That was where BT came in. The US Defense Information Systems Agency (DISA) contracted the telco to build a high-capacity DISN trunk line between the UK and a US military base in Djibouti, East Africa. As part of the DISN/GIG, DISA set out in its contract specification to BT that it would cap the line either end with KG-340 encryptors. The BT line would thus carry CIA-grade intelligence as well as other Top Secret information for military operations such as drone strikes.

As DISA itself said of the DISN in its 2015 budget statement to Congress: "The DISN provides secure voice, video, and data services over a global fibre-optic network that is supplemented by circuitry obtained from the commercial sector.

"DISN subscription services are described as follows: compartmented information communications services for the DoD Intelligence Community and other federal agencies."

BT has tried to portray this network as an infrastructure comprised of unexceptional features, built for banal purposes. It has done so in an effort to discourage UK officials looking into the question of whether the British telco ought to be called to account, under international rules for corporate social responsibility, for its part in the DISN, after US intelligence-led drone strikes became an international human rights scandal.

Officials have spent 18 months deciding what to do because, they have said, there has been a lack of evidence that the BT network was anything more than BT said it was: a trivial network connection of no significance and of no interest even to its own corporate ethics board. The DISN, however, was built to be the foundation of all US military operations.

Restraint_chair_used_for_enteral_feeding_at_Guantanamo.jpg

NSA encryption no smoking gun, says drone net contractor

| 1 Comment
| More
KG-340 - NSA Certified.png
An NSA encryption box that secures the US military's global drone network has become the focus for UK officials deciding whether a telecoms contractor should be pulled up under international rules on corporate ethics.

British Telecommunications Plc, which faces a formal investigation of its contract to supply part of the US military network, told UK officials they should disregard the NSA encryptor and drop the case.

It had been cited in evidence by legal charity Reprieve, in a bid to make BT meet an obligation to assess whether it was responsible for human rights atrocities after supplying part of the network the US has used to conduct a calamitous drone assassination programme against suspected armed opponents of its military offensives in the Middle East.

BT told officials the NSA encryptor - called a KG-340 - was the only part of Reprieve's case that had not already been thrown out as insubstantial, and it was irrelevant anyway.

Miles Jobling - BT Solicitor.jpg
"BT did not provide or install any KG-340 encryption device," said a legal brief BT solicitor Miles Jobling sent to UK officials at the Department for Business, Innovation and Skills.

"The KG-340 encryption devices were to be provided by and managed by the US government," it said.

Officials deciding whether to let BT off an international obligation to look into human rights atrocities have hinged their judgement on whether it looked likely on the face of it that BT's work for the US military was related to drone strikes.

They threw the case out once already, saying the evidence was thin. But Reprieve made them look again after a Computer Weekly investigation showed BT had provided a major trunk of the global military network the US used to conduct drone operations.

KG-340.png
The KG-340 encryptor had been just one small part of the extraordinary "network-centric warfare" system the US has been building for the last 15 years, which culminated in its "targeted killing" programme: which pulled disparate intelligence sources together on a network to remotely track and kill people on its suspect list.

Its design overseen by the US National Security Agency (NSA), the KG-340 was built to transmit classified intelligence and military communications down this network's high-speed backbone - and at "ultra-low" latency, so it could transmit critical combat mission communications without delay.

The NSA certified the KG-340 as a "Type 1" encryptor under its Commercial COMSEC Evaluation Program (CCEP), clearing it to transmit comms classified "Top Secret". It had been a primary building block of the Global Information Grid (GIG) - the network system that drove US net-centric warfare - and of the high-grade backbone that underpinned the whole thing: the Defense Information Systems Network (DISN).

The US military's contract notice for BT's part of that backbone said it would be capped with KG-340 encryptors, removing any doubt about its purpose.

BT told officials: "The KG-340 is an off-the-shelf product that simply protects the integrity of secure communications."

"Mere knowledge that the US government uses KG-340 cannot of itself impose any burden on BT," it said.

The British telco had insisted it should be able to remain ignorant of what its customers did with its services, in defiance of international rules on corporate social responsibility.

The KG-340 passed to defense contractor Raytheon in 2012, when it bought the government division of SafeNet, the device's original manufacturer and the firm that had developed it under the NSA's Commercial COMSEC Evaluation Program.

Ignorance is defence for drone death net corp

Computer Weekly quizzed BT on a plea of ignorance it sent to officials trying to decide whether the telco should be held to account for a network the US military used to conduct a controversial programme to assassinate suspected terrorists with drone strikes in the Middle East and Africa.

It had told UK officials they shouldn't investigate its contract under international rules for corporate social responsibility because, as a telecommunications firm, it expected to be able to turn a blind eye to what its customers did with its services.

It pleaded two other sorts of ignorance as well, in a statement it has issued repeatedly in the last year to escape an obligation to investigate human rights abuses in its supply chain.

It first pleaded the ignorance of the chemical weapons widget manufacturer: like the weather-worn businessman on an industrial estate outside Dover who happens to make the only sort of widget that can control the release of anthrax from a warhead, but usually sells them to companies that make mobile disco smoke machines, and turns a blind eye when an order comes in from the bursar of a Syrian military laboratory.

BT likewise claimed it had no responsibility for what its customers did with what it sold them, and so said UK officials should throw out a complaint by legal charity Reprieve that it had neglected its obligations under an international agreement called the OECD Guidelines for Multinational Enterprises.

"BT can categorically state that the communications system mentioned in Reprieve's complaint is a general purpose fibre-optic system. It has not been specifically designed or adapted by BT for military purposes," it told officials on 8 October.

Evasive manoeuvres

The system in question was a high-grade comms line it supplied as part of the US military's Defense Information Systems Network (DISN), the global fibre-optic backbone of US intelligence and military operations.

The telco seemed to be trying to imply the system it supplied would not be suitable for military purposes. And it could therefore never have had any reason to suspect its equipment might be used in military operations, let alone drone assassinations. And so it shouldn't have any obligation to look into US human rights abuses under OECD rules.

But saying it wasn't specifically designed for military purposes was not the same as saying it couldn't be used for military purposes. Oddly, BT did not claim its system would not be used or could not be used to operate US drone strikes.

Computer Weekly put this to BT in February, after it had persuaded UK officials to bury the complaint. It made the same arguments then as it sent to officials last month, after a CW investigation showed BT's DISN connection did operate drone strikes, and was the major trunk line for communications between its hub at RAF Croughton in Northamptonshire and Camp Lemonnier, a combat base in Djibouti, on the Horn of Africa.

A BT spokesman replied in writing that the system was used by the military, but not necessarily for military operations.

"Camp Lemonnier is a United States Naval Expeditionary Base. So the fibre optic connection will be used by the military," he said.

But he added: "The fibre connection could be used at the base for a wide range of day-to-day activities, such as general housekeeping/internet browsing, email, communications, stores ordering, data functions and voice communications."

Unanswered questions

It was misleading to say the connection could be used for managing the naffy internet. That was handled by a local firm under another contract. The BT line also happened to be part of a military network so powerful that to use it for domestics would be like using a flame thrower to toast your bread. BT wanted to create the impression domestics were as likely as drones. But they were most unlikely.

BT's rebuttal had also left the original question unanswered. So CW put it again: would such a BT connection have to be specially adapted for military purposes? Would it be suitable only for laundry dockets if it wasn't specially adapted? What sort of line would you need for drone operations anyway?

BT pleaded ignorance again: "BT has no knowledge of the reported US drone strikes and has no involvement in any such activity," it said.

But it couldn't say its connection services hadn't been used in drone operations. This was because it formed the very foundation of US drone operations. But if it could claim it didn't know this, it might be able to persuade officials that it shouldn't be held to account for it under the OECD rules on corporate social responsibility (CSR).

Corporate social responsibility

When UK officials reported on the affair in February, it emerged that BT had done a cursory assessment of the risk of human rights abuses in its DISN contract.

But it hadn't addressed the drone question specifically. So it hadn't officially addressed the specific human rights controversy that had been raging over the US drone assassinations.

That didn't mean it didn't know. And it might have been true that BT did not operate the drones itself. And it might not have been contracted to supply in direct support of drone programmes (so it had "no direct involvement"). But its network infrastructure was still the vital component of drone operations. This was a matter of public record for people in defence comms circles.

So CW put it to the telco: the reason BT cannot say is because it doesn't know - and it doesn't know because it hasn't asked. But it did know about the drone strikes, because everyone knew about them. It just hadn't acknowledged them officially.

"Yes, you're absolutely right," said the spokesman.

"To be exact: 'BT has no knowledge, other than from general press reporting, of US drone strikes and has no involvement in any such activity.'"

Yet OECD rules said companies should assess human rights risks precisely when they didn't know. That was the whole point of corporate social responsibility. CW put this to BT: surely it had breached the rules by trying to ignore its DISN contract's association with the controversial drone programme?

Its spokesman wrote: "I've double checked and there's nothing more to add."

There was one more thing it said, and has said repeatedly since February, and again since Reprieve resubmitted its complaint with evidence that its line was part of the drone network.

"BT is glad that UK NCP has assessed Reprieve's complaint and rejected it," it said.

It could not refute the allegation because it was true. But it would persist in trying to discredit any attempt to make the allegation stick under OECD rules. It thus hoped to go on ignoring its connection to the US drone programme.

The NCP rejected the first complaint because, without BT's due diligence, there wasn't any evidence. Now that CW had shown the evidence had been there all along if you knew where to look, BT was trying to imply officials had already decreed that the allegation was not true.

BT wrapped all these statements of ignorance up into a single statement it has issued repeatedly - to CW, to other publications, and to the UK representative itself last month: a rebuttal made of pettifoggery; legal obfuscation in place of corporate social responsibility.

Drone net contractor claims telecoms opt-out

Telecoms contractor BT pleaded ignorance of controversial US drone operations to avoid accounting for its work on them under international rules of corporate social responsibility.

The telco has issued the same statement repeatedly since legal charity Reprieve complained under OECD rules last July about its contract to supply a major trunk line for the US global military network.

It sent the statement most recently to UK officials, who are about to decide whether BT should be held to account for the contract under soft laws on corporate ethics established by the Organisation for Economic Co-operation and Development.

BT told them it should be excused from the rule that companies determinedly avoid their work contributing to human rights abuses.

The US has operated a drone assassination programme from its network, killing suspected insurgents without trial and slaughtering civilians in attacks gone wrong. But the telco said it had reasons to be ignorant of the programme and reasons to remain ignorant of it under OECD rules.

BT pleaded a special sort of ignorance in a legal document it sent to UK officials, a copy of which was obtained by Computer Weekly and published exclusively here.

Censorship

It claimed the same principle of ignorance telcos use to resist government attempts to censor people's communications on the internet. Telcos see it as their duty not to pry into people's communications.

"BT cannot monitor or control the content that is carried on the system," it told officials at the UK Department of Business, Innovation and Skills.

"BT cannot properly enquire into or know what customers do with the equipment it provides to them," it said, arguing that telco ethics did not permit it to know whether the US used its network to launch controversial drone strikes.

World Wide Web founder Sir Tim Berners-Lee and others have made the same argument against state surveillance of people's communications. Their principle is that electronic comms are like the postal service: postal couriers wouldn't open people's envelopes to judge whether a message was worthy of being passed on. Telecoms carriers likewise must not pry into people's digital comms.

Ignorance

BT turned this into an argument for not asking questions about iffy business activities. It wasn't to know about drone strikes over its network, it argued, because it wasn't allowed to know.

BT might have justifiably refused to snoop on a customer's comms. But its argument looked disingenuous. Even if this sort of intrusion into personal affairs was forbidden, it wouldn't stop a comms company making ethical judgements on commercial contracts. Telcos might ordinarily refuse to supply commercial customers if, say, their credit ratings looked too risky.

Whether telcos discriminate against people when providing general communications lines to the general population has been an issue of civil liberties. Comms firms nevertheless routinely sell advertising and tailor their services according to algorithmic analysis of people's messages. And people have had their phone cut off for lesser things than drone death. Like not paying their bill on time. Or making hooky music and software downloads.

Similarly, if a known despot asked BT to install a line between his control room and his death squad, the telco might reasonably ask questions. This was just the sort of situation where OECD rules normally require companies to ask questions, instead of just judging business opportunities according to their own financial interests.

As BT chairman Sir Michael Rake put it in the firm's annual statement on corporate social responsibility in May: there could be "no compromise between financial results and social returns". There's more to business than simply making money, he said.

The OECD had clarified this recently, for example, for companies extracting minerals from war zones in central Africa. They were meant to take special care over due diligence on things that looked iffy.

BT answers drone death complaint

You would think British Telecom might have been terribly grateful when legal charity Reprieve pointed out that the US military was using one of its network systems to assassinate suspected insurgents in the Middle East, without trial and using drone strikes that had been going tragically wide of their targets.

It must have been deeply distressing to BT, a vocal advocate of human rights and corporate social responsibility. On publishing its annual statement of corporate principles in May, Sir Michael Rake said its very purpose was "to use the power of communications to make a better world". As a signatory to the UN Global Compact, it had pledged to make sure it was "not complicit in human rights abuses".

So you would think the telco might have overflowed with gratitude when Reprieve pointed out its error - when it pointed to the horror being delivered down its fibre-optic cables. It would act urgently, surely.

There was no doubt about what it should do. It was set out in an international agreement on corporate social responsibility (CSR) - also supported by BT - that the UK and other members of the Organisation for Economic Co-operation and Development signed in 1976, and updated with stronger human rights rules in 2011.

BT would have to do a full, honest and faithful assessment of the situation. And if it found that its activities had contributed to human rights atrocities, it would have to do something about it. It would stop them from happening. Failing that, it would withdraw its support. And failing that, it would have produced from its assessment a reasoned justification for carrying on as before, stating why what seemed like an atrocity was in fact perfectly necessary because, it might say, it was necessary to commit atrocities to create a "better world".

Nope

But BT didn't do this. Instead, it said it didn't believe the allegation was true. It didn't actually know whether it was true either. It didn't know if the high-grade comms line it had been contracted to supply between the US military communications hub at RAF Croughton, in Northamptonshire, and the Camp Lemonnier drone combat base in Djibouti, on the Horn of Africa, formed the operational backbone of a drone assassination programme that had been a raging international controversy.

Even BT's military comms experts had apparently not known about it. So it said it couldn't be held to account for it. So much for corporate social responsibility.

The allegation was true though. Computer Weekly exposed this in weeks-long investigations between March and June this year.

You would think BT might be overcome with gratitude on learning this. US drones had mistakenly killed hundreds of people. They had also killed thousands of suspected insurgents in Pakistan, Yemen and Somalia, all without trial. Campaigners said it was illegal. The US said it helped fragile states fight terrorists. Others said it stoked the fire. With BT's help, the US was using this terrible power to pick people out of the crowd and execute them from on high. The United Nations, picking over charred remains, said these targeted killings threatened world peace. It called for the restoration of international human rights law.

So you would think BT might look into this, in respect of its own CSR rules if nothing else. Instead, it told officials to bury the complaint. But it didn't refute the allegation. It just tried to discredit it.

Its legal department made an official rebuttal under OECD rules in October. Published here exclusively, it ran to eight pages without refuting the allegation. It pleaded ignorance of the allegation in defence of it. So much for corporate social responsibility.

Iffy

You would have expected otherwise, because the whole point of the OECD rules - the OECD Guidelines for Multinational Enterprises - was that when things looked iffy, responsible companies did a formal assessment (called due diligence, in the parlance). BT has said repeatedly that it fully supported these rules.

But BT didn't take an official look at the drone controversy when it took the contract. It didn't even do the due diligence after Reprieve said there was cause for it. It thereby preserved its ignorance. It then pleaded ignorance as its defence: how could it be expected to answer allegations nobody could say were true?

That was enough to bury the complaint at first. The UK National Contact Point for the OECD Guidelines, the OECD's representative at the UK Department for Business, Innovation and Skills (BIS), rejected it in February. It wasn't prepared to use the OECD rules firmly. It said this had nothing to do with BT CEO Lord Livingston becoming minister of trade and industry at BIS. The rules worked by mediation, not prosecution. They were gentlemen's rules (soft law, as it's known). But perhaps not old boys' rules. And anyway, the UK NCP had been moved from trade and industry to consumer affairs, under a different minister, Jo Swinson. Perhaps consumer affairs would be more in keeping with its light touch.

Either way the UK representative was caught in a catch-22 of its own making: it wasn't prepared to spend any time investigating the allegation until it looked substantial; but it depended on BT's due diligence for substantiality; and BT hadn't done due diligence on it. So much for the OECD Guidelines on corporate social responsibility.

Law to leave wiggle room for public data profiteers

Forthcoming data laws will put only a loose cap on public bodies seeking to sell public information for profit, according to the head of a UK consultation on their scope.

Yet while the law change will also boost the amount of public information being sold for profit, the UK insists it will become harder for the majority of public bodies to get away with it.

Officially, by boosting the amount of public data that must be made freely available to the public, the data rule will fulfil a promise made by the coalition government's biggest honchos: prime minister David Cameron, chancellor George Osborne, and Cabinet Office minister Francis Maude.

Alternatively, a severe compromise weakened the coalition government's defining technical policy so much that the data experts behind it have cried foul and increased their calls for a total ban on public data profiteering.

These differences will be settled when the National Archives publishes draft regulations in December, just months before a general election that will force information-age electors to consider how well the coalition has looked after the public assets as it transfers them to the digital realm.

The government had in fact been using its open computing policy to drive a private road through the public sector. But its open data policy held that public assets should stay in public hands. That was until it turned that policy into law, and its former backers in the open data community reacted by saying it would actually help public data profiteers.

Creep (mission)

It extended the digital profit permit to public libraries, museums and archives - allowing them to operate online like commercial bodies, charging fees to digital visitors and paying some of the proceeds to people who owned cultural assets. Benefactors of the National Maritime Museum, for example, might get a constant trickle of money for old rope by copyrighting its likeness and displaying it online. The museum would take a cut. Other public bodies could similarly get permits to sell public data for profit under the new rules.

The UK extended the profit motive into these areas by pushing its partners in the European Union. It insisted on this after the European Commission sought to implement the total profit ban that Cameron's open data backers wanted.

But the UK and other EU countries would have to define their own conditions for public bodies to profit from data. The resulting European Directive on Public Sector Information left a blank space for countries to fill in the details.

The model for this was the partially-privatized public bodies created by previous Conservative governments in the UK. These "Trading Funds" were under orders to trade their public assets to cover their costs, and to make a profit as well - a "reasonable return on investment".

Now libraries and so on could do that sort of profiteering too. Most public data would still be prohibited for trade, though, because taxpayers had already paid for it. It was only when costs went beyond convention that public funding would fall short. Such as when libraries found they could no longer afford to stock long-held periodicals because their publishers had moved them online and started charging data subscriptions. Other public bodies would be permitted to trade data too, perhaps if their costs went up similarly and central government refused to cover them.

The blank in the EU directive would let countries decide how public data profiteers calculated their data fees. And how much profit made a "reasonable return on investment".

National Archives put that crucial question to public consultation. But the resulting regulations will dodge the question of profit.

The open data lobby reckoned the blank bit of the law would create more room for public bodies to profit from data. They wanted it prohibited. But they couldn't even find out how much profit the UK was going to permit public bodies to make.

Mouth (horse's)

Howard Davies, who has led the consultation as National Archives strategy manager, told Computer Weekly the regulations would leave the answer blank so the government could change the rules easily.

"I am sure what we are going to be looking at is that there will be a form of words that directs public sector bodies that are needing to make a charge to some externally referenced piece of guidance," said Davies. "I don't see that it's going to be chapter and verse in the regulation."

Before now a Treasury document called Managing Public Money had put a loose cap on the profits of Trading Funds.

It said government services competing with private sector suppliers should try to earn an equivalent rate of profit to their competitors. That would be about five or ten per cent. But they might earn up to 15 per cent if they were in a risky business.
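The rate-of-return cap described above can be sketched as simple arithmetic. This is purely an illustration of the passage, assuming the cap applies as a percentage markup on a body's cost base; the function and figures are my reading of the quoted guidance, not anything from Managing Public Money or the draft regulations.

```python
# Illustrative sketch of a cost-plus charge with a capped rate of return,
# loosely following the figures quoted above (about 10 per cent, or up to
# 15 per cent for risky businesses). The real rules are not this precise.

def max_annual_charge(cost_base: int, risky: bool = False) -> int:
    """Most a trading body could charge per year: its cost base plus
    a capped 'reasonable return' (10% ordinarily, 15% if risky)."""
    cap_pct = 15 if risky else 10
    return cost_base * (100 + cap_pct) // 100

# A body spending £1m a year collecting and managing a data set:
print(max_annual_charge(1_000_000))               # 1100000
print(max_annual_charge(1_000_000, risky=True))   # 1150000
```

On this reading, the "blank" the regulations will leave is precisely the value of `cap_pct`: left to external guidance rather than written into law.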

Davies said the regulations would probably not even refer to Managing Public Money. They would just call upon external guidance.

The existing profit cap therefore looked looser than before. But Davies insisted so few public bodies would be permitted to charge profit that the rules would rarely apply.

"If you go outside the Trading Funds, the exceptions are going to be few and far between," he said.

"This is all public sector information. [Public bodies] are already funded to collect, hold and manage that information. So the cases where you need to cover costs and make an additional charge is going to be so rare I can't think of an example," said Davies.

Shop (private)

The amount of public data to be sold for profit had nevertheless been increased dramatically. And it had been open-ended.

This was dire in the open data lobby's view: it played into the hands of Trading Funds like Ordnance Survey, which makes money selling geographical data from the national mapping database. OS had, according to Ellen Broad, policy lead at the Open Data Institute, been particularly opposed to a profit prohibition.

As though to prove their point, OS didn't bother responding to the consultation, not even to contribute to the question most pertinent to its health - whether it should continue charging profit. A spokesman for OS said it "hadn't felt the need to respond on this occasion".

The law change secured its future broadly. But it has been exploring further steps toward privatization. Its spokesman said it was talking to the Department for Business, Innovation and Skills (BIS - its parent) about releasing its assets as open data after all.

Portrayed by open data advocates as an obstruction to progress and prosperity, OS nevertheless employed 1,027 people last year. The case against it and other public European data bodies has long been that many small private companies are better than one big public one. That was the premise for the partial privatization of the Trading Funds in prior decades as well.

Meanwhile, half of Ordnance Survey's £144m sales came from other government bodies last year. And two thirds of its profits - £32m - went back to where they came from, in a dividend payment to BIS.

Geo data geared for stage-2 privatization

The UK's public mapping data has been put on the road to privatization after the Ordnance Survey opened talks to become a govco.

The controversial plan contradicts official claims that the UK is committed to turning its most valuable public data sets into open data. Such data is already being sold for profit by quasi-public bodies - called Trading Funds - set up under the privatization programmes of consecutive Conservative governments over the last 40 years. The govco plan continues that effort.

Ordnance Survey slipped out that it was in talks to become a govco in the summer, when a public furore over the government's other govco plans was reaching full boil.

Sir Rob Margetts CBE, non-executive chairman of Ordnance Survey, said in its annual accounts on 18 June it was looking at govco plans because it wanted more flexibility.

His office was unable to say what that meant. Other Trading Funds have sought govco status to shed the generous terms and conditions they must give their civil service staff. This would give them flexibility to undergo "digitization", which involves replacing costly manual processes with computer automation. The coalition's Digital Transformation programme aims to automate 80 per cent of operational civil service jobs. Trading Funds seeking govco status have meanwhile claimed a need for flexibility to do private sector mergers without civil service constraints. Employment laws usually force private companies acquiring public assets to give affected staff the same generous terms as they get in the civil service.

An Ordnance Survey spokesman said one advantage of the deal would be that its staff would no longer be civil servants. He was unable to elaborate. But he insisted it would remain wholly government owned.

A spokeswoman for the Department for Business, Innovation and Skills, which acts as shareholder of the partially-privatized Trading Funds, refused to discuss the plans.

She said BIS couldn't talk about the reason for the talks until the talks were concluded. They had not yet concluded. Ordnance Survey said in June it was aiming for a decision by 30 September.

That was before the govco furore forced BIS to put its other plans on ice.

Business secretary Vince Cable tore up similar plans for the Land Registry after their leak to The Guardian newspaper sparked staff protests, and a parliamentary committee criticized his department's earlier privatization of Royal Mail.

The Royal Mail deal, with the national address database at its heart, created such a stink about squandered data assets that the Advisory Panel on Public Sector Information (APPSI) stepped in with guidelines to stop any more public data being sold off carelessly.

APPSI chairman David Rhind - formerly chief executive officer of Ordnance Survey - said it was "entirely proper" for government to monetize its data assets.

He said no more public data assets should be sold off so carelessly. They should be leased to the private sector for up to 10 years at a time instead.

But private investors might make a claim for ownership when they were required to keep leased public data sets up to date. So he imagined the public might not always keep ownership of its data after all.

Rhind explained his position on privatization with reference to Ordnance Survey's current search for a CEO to lead it into an emerging global market for mapping data.

The new CEO would lead Ordnance Survey through a time of transition by commercialising data for global markets, said the job advertisement on the Sunday Times website, on the day its annual report was published in June.

Global ambitions

Markets for geospatial data services were booming, said its annual report's statement on the govco proposal.

"Becoming a GovCo would enable Ordnance Survey to become more responsive and flexible, keeping pace with the rapidly changing markets," it said.

Ordnance Survey made about £45m last year selling data to energy and property firms with an obvious need to map their assets, said its annual report. It also started selling data to wealthy insurance and banking firms. They had been feeding its geographical data into risk and market intelligence systems.

This contrasted with Companies House, another Trading Fund that declared in July it would make all its data open. Its announcement followed Cable's veto of the Land Registry privatization. But the companies registrar's own accounts exhibited a database with little proven monetary value and minimal growth potential.

The political backdrop of the affair, however, was complicated. Cable was reported to have stopped the Land Registry privatization against the wishes of Conservative Party ministers. He told his Liberal Democrat Party conference on 6 October that his Tory coalition partners were obsessed with cuts because they detested public services. His department was responsible for the Royal Mail data privatization and the Companies House data release.

Ordnance Survey's spokesman said this week: "We are working with BIS on reviewing the Trading Fund business model. We are looking at being a govco. It's a project that's still being explored."

But he insisted the govco would remain wholly public. Only its data would be sold. He talked up its open data releases. But its minimal releases were all intended to introduce potential customers to its premium-priced data.


UK needs "mature debate" on open data, says senior official

The UK's open data programme looks like it has hit a wall after a senior official intervened to call time on its release of the nation's most valuable public data assets.

As the campaign bugle call for prime minister David Cameron and chancellor George Osborne, open data was at the heart of their plans to downsize government. But their open data programme came up hard against those quasi-public bodies permitted to sell the UK's most valuable public data assets for profit - bodies called Trading Funds.

That changed in the summer when Companies House, the Trading Fund that manages the public financial statements of private companies, said it would release all such statements as open data by June 2015. The public have had to pay a nominal fee to search its records.

But while it looked like the open data programme was at last achieving its radical ambition, the Companies House announcement was not the breakthrough it seemed. And now the Companies registrar has said that might be as far as it goes.

"For too long the debate has been an all-or-nothing, black-or-white situation," Tim Moss, chief executive of Companies House, told Computer Weekly.

"I've heard people say, open data: good, charge-for: bad. The world doesn't work that way.

"We need to have a mature debate on open data, as to where it works and where it doesn't. I don't think we always get that," he said.

Moss was responding to the open data advocates around the prime minister, David Cameron, who had protested that the UK had neglected to use an update of regulations to impose open data conditions on all public data.

For Moss, the debate was not a matter of whether the public should be able to access public data without condition. It was up to Trading Funds to decide whether it was in their own interests to do it.

"The open data model absolutely works for us," he said. "And that's why we've built it into our strategy.

"But the business of other Trading Funds is different and has to be looked at differently," he said.

Catastrophe

On the face of it, open data looked devastating for Companies House. It made £64m last year, and employed 967 people.

Formerly public bodies, Trading Funds such as Companies House were set up to charge fees for public services with commercial potential. Created by previous Conservative governments, they were partial privatizations of public services not rich enough to be sold off. The fees were meant to cover their costs. Open data would cut their income and invalidate their reason for existing as Trading Funds. They have been reluctant to sanction this.

The coalition government's "digital" reforms have meanwhile been replacing manually-intensive government functions with automated processes. That has cut the cost of managing and distributing public information drastically. It made their open data programme feasible. If public data could be distributed free of charge for the public good, then why charge for it?

That question has been answered in part by the government's moves to privatize its most valuable public data assets.

The Trading Funds have meanwhile been undergoing their own digital transformations. But their cost-savings have been boosting their bottom lines. Other Trading Funds such as the £144m turnover Ordnance Survey have so far refused to invest their cost savings in open data. Though they have made token releases, their livelihoods still depend on selling public data.

Phantasm

So open data sounded radical for Companies House - meaning either renationalisation or dissolution. Moss, too, told Computer Weekly it was a radical move.

But what would it mean for the £64m Companies House and its 967 people? What had Companies House forecast would be the consequences if it started giving its assets away?

Moss said he didn't have the numbers to hand.

But if he did, they would have revealed another surprise: open data will have minimal impact on Companies House. It won't be hollowed out by open data.

More than 80 per cent of Companies House's income came from company registration fees last year. The law obliges companies to pay to file their financial documents at Companies House.

Its public search fee had been token. It raised £15m from search fees last year. But its search function made a loss. It used to run separate computer systems to receive company filings and provide public access. Now that it has merged them, companies will make their filings as data and the same system will give public access at almost zero extra cost. The process will be automated or self-service, in keeping with the coalition government's digital strategy.

Digitization might cut its costs. But open data won't undercut its own business. The considerable public interest in opening access to its archives was already reflected in its token fees. Digital transformation removed any need to charge by removing the cost of providing the service. Making its data open was a trivial matter because it did not concern its core income-generating business as registrar.

But digitization has not removed all incentive to charge for public data assets where they have greater commercial value. Contrast Companies House with Ordnance Survey, which has made limited open data releases to entice customers to buy access to its full database. Ordnance Survey is a data business. Its public data assets have immense commercial value. They are consequently less likely to be turned into open data.

Francis Maude, the Cabinet Office minister responsible for the UK's open data policy, said in July that Companies House proved the government was committed to turning data "of most value to citizens and businesses" into open data.

It rather demonstrated how the government was committed to turning only that data of least value to private investors into open data.

UK open data "failed opportunity" says founding academic

The UK has sent the world's open data movement up a wrong turn, fear its greatest advocates, by easing restrictions on profits that can be earned from the sale of public data.

Their fears were roused by a reboot of regulations that will make it easier for public bodies to sell data. This was contrary to the campaign promises of the coalition government's Conservative leadership, who said they would make public bodies publish their databases as open data. Public data was to be put in public hands for the public good.

The new rules, to be drafted in December, will permit those quasi-public bodies who hold the most valuable public data to continue selling it for profit.

Open data advocates linked to prime minister David Cameron's initiative called an alarm after the National Archives put the new rules out to routine consultation in the summer. One key part of the new rules was how much profit people should be permitted to make from public data. The Archives left this part of its proposals blank.

The government has meanwhile been selling choice public data assets off, eyeing others for long-term lease to entrepreneurs, and putting others at the heart of international expansion plans for those quasi-public data bodies, which the government has subjected to a systematic programme of privatisation since it came to power.

Ellen Broad, policy lead at the Open Data Institute (ODI), said in a submission to the UK consultation earlier this month that the new rules were a "backwards step" for the government because they went contrary to its open data policy.

"What the UK is proposing would make it easier to charge more," she told Computer Weekly.

Her submission was backed by some of the key people behind the government's open data policy: Rufus Pollock, the Cambridge academic and president of the Open Knowledge Foundation whose work on open data was headlined in the Conservative Party's 2010 election manifesto; Mark Taylor, the open source software pioneer who helped write that manifesto; and Tim Berners-Lee and Nigel Shadbolt, world-wide web pioneers whom the prime minister, David Cameron, honoured by establishing the ODI and the Queen honoured with knighthoods.

Pollock, Berners-Lee and Shadbolt had also formed the Public Sector Transparency Board, inaugurated by Cameron in June 2010 to oversee the transition of public data assets into open data, freely available for public use.

Dismay

Now five years on, the rebooted open data regulations were a "failed opportunity" to bed that policy down, Pollock told Computer Weekly.

The open data movement's founders were dismayed because the UK had been an exemplar for open data to the rest of the world. But the new rules had tarnished its example and obstructed their efforts before their work was done.

The new rules were set by an update to the European Union's Public Sector Information Directive, an agreement to replicate the UK's open data policy across the rest of Europe. They were implemented by Neelie Kroes, European Commission vice president who was already pushing the open computing agenda in Brussels when Cameron was using it to campaign for office in 2008. The regulations will be the culmination of all their work.

But Pollock suspected the UK had watered the EU agreement down.

"There was a failed opportunity," he said. "My sense was the government didn't take it. The EU wanted a stronger line than was in the directive. That was toned down."

That stronger line would have been what Cameron's open data team wanted: a strict prohibition on profit from public data.

The EU agreement contained a provision to accommodate a unique aspect of the UK data landscape: those quasi-public bodies entrusted with looking after the most valuable public data. Called Trading Funds, they include organisations such as Companies House, Ordnance Survey, the Land Registry and the MET Office.

The government permits the Trading Funds to look after valuable UK data on condition that they charge access fees to cover their costs. These costs were higher in the pre-digital age, when collecting, managing and distributing public information were manual processes.

Zero

But Cameron's open data policy relied on digitization - the automated collection and distribution of public data sets. This would cut the cost of distributing public data - called its "marginal cost" - to as good as zero.

The government had always allowed the Trading Funds to charge a small profit on top of the cost of distributing their data. Now that the marginal cost of public data was almost zero, the open data movement had pushed for all charges to be removed: there was no longer any reason for making a profit on something that didn't even need to be sold, especially when it was a public asset.

But that may not happen. Pollock suspected the UK was torn between short-termism and open data. Should it generate tax income by selling public data? Or should it stick to its guns and treat public data like public investment, to be injected into the economy as a long-term growth stimulant - pump-priming for the digital age?

Pollock said governments were failing to see the bigger picture. He suspected HM Treasury was more interested in tax revenue than open data.

Marginal

The Treasury had, however, led implementation of the coalition's open data policy. The Cabinet Office open data team worked from the Treasury after the coalition came to power, and the Cabinet Office was staffed by officials put in place by the previous Labour government. Chancellor George Osborne had campaigned with Cameron for open data. The new rules are being implemented by the National Archives.

Howard Davies, who as standards manager at National Archives helped write the UK consultation, said: "We are not wanting to see a situation where excessive charges are made. That's just not what government policy is about at all."

"Where there are going to be charges, they will be limited to marginal cost of reproduction, provision and dissemination of documents. That's for the vast majority of public sector information," said Davies.

Profit could be made from public data "only in those exceptional cases, where bodies have to generate revenue to cover part of their own costs", he said.

Those exceptional cases would be the Trading Funds, he said, and other cases "where a public body is required to cover its costs".

Southwest One earns breather on £50m IBM debt

Somerset's controversial Southwest One outsource is trying to square £50m of unpayable debt with parent IBM after the County Council cut its contract.

But while talks dragged on last year, the joint venture made its first operating profit, scraping £70,000 thanks largely to a legal settlement and service cancellation fees paid by Somerset, according to 2013 financial results it published this week.

It scraped a pre-tax profit with just four years left to run of its 10-year outsource contract with Somerset, Taunton Deane Borough Council and Avon and Somerset Police Constabulary, all of which took minority shares in the venture with IBM in late 2007.

IBM kept it afloat with guaranteed loans after years of losses and an admission this week that it would reap only a trickle of profits from its final stretch.

Led by a Somerset County Council that became hostile to Southwest One after Conservatives took it over in 2009, the venture's public partners renegotiated its contract to bring some of their services back in-house. The last of those changes won't show up in its accounts till this time next year, when its profit will be balanced precariously again.

Their change of heart left Southwest One with £48.8m losses to be lumped on IBM in 2018 if there wasn't also a change in fortunes.

"The directors are in discussions with IBM," said Southwest One chairman Derek Pretty in the company's 2013 report this week.

"There are insufficient cash flows to be generated in the remainder of the contract to settle this loan balance.

"IBM does not feel there is any immediate need to restructure this debt," he said, with talks about a final loss having already gone on for more than a year.

IBM was holding out for a late comeback. Pretty said they had already begun talking with their public partners about extending the contract beyond 2018.

Southwest One had meanwhile not let go of its original ambition, to consolidate the backoffices of other public authorities in the region.

"I am very hopeful that there will also be opportunities for profit and service improvement to be found as the company continues to engage in both our clients' change programmes and perhaps become involved in wider public sector restructuring initiatives," said Pretty, a former Kwik Save finance director.

But he gave only vague details on what amounted, after all the bluster from Somerset Council, to a meagre renegotiation.

Mouth and trousers

The central government's Universal Credit scheme had forced Southwest One to hand Revenue & Benefits back to Taunton Deane, said the report. That was a crown jewel. And it gave back Design and Print services, and advisory staff for finance and human resources. It handed Property and Facilities Management services back to Avon Constabulary, perhaps ahead of a cost-cutting consolidation of police buildings.

It didn't amount to much though.

Somerset had taken mostly scraps back in-house as well. But not before it had been forced to square its differences with its joint venture subsidiary.

Southwest One wrote off £4.4m of invoices that had gone unpaid since 2010. The council's campaign against Southwest One had also involved neglecting to implement the backoffice savings schemes that were the venture's raison d'etre, damning it in public, and contesting its contract.

But Somerset was forced to pay Southwest One £5.8m compensation for the trouble. The council's initiative appeared to have achieved little but to damage Southwest One.

The 2013 results neglected to recount what services Somerset had actually taken back. But they didn't amount to much either.

The County Council said in 2012 its cull had taken pensions admin, health and safety, finance and HR advisory, and some accounting, business development and staff training back in-house.

But Southwest One would retain the juicy backoffice stuff that formed the bulk of its purpose: accounts payable, accounts receivable, recruitment, HR admin and payroll. The Somerset documents read by Computer Weekly said nothing about the other mainstay of its Southwest One venture: its procurement office, which still processed £26m last year, though it was down 15 per cent.

"The contract renegotiation last March returned to the Council a number of the more strategic functions originally placed with Southwest One," said a February review of the renegotiation overseen by Somerset cuts supremo and Southwest One nemesis (and board member, till last year) David Huxtable.

Somerset retained other bulky services as well, including IT.

Somerset's negative obsession with Southwest One didn't make any sense unless interpreted as a Conservative Council's determination to discredit a deal set up by its close Liberal Democrat rivals, or the Conservative coalition government's determination to undo big outsourcing contracts and sell off their public services. Pretty said Southwest One had already fulfilled Avon Constabulary's 10-year target for procurement savings. Somerset had handicapped its own procurement savings when it began its campaign against Southwest One in 2010.

Somerset slams Southwest One again

Somerset's Conservative Council has reprised its campaign against Southwest One, its own joint-venture that had obstructed its efforts to privatize council services.

Its Audit Committee published a critical report on Southwest One this week, digging up a long list of old complaints about the venture, which the Conservative Council's Liberal Democrat predecessors set up with IBM and neighbouring public authorities in 2007.

But the committee report neglected to mention how Somerset had itself undermined Southwest One in an effort to break it up.

Though it was chaired by David Huxtable, the councillor who led Somerset's cuts programme and who has represented Somerset on the Southwest One board since March 2010, the report neglected to mention how the council's Conservative leadership put a freeze on Southwest One after the coalition government came to power in 2010, preventing the venture from delivering the savings promised when the LibDems set up the deal. Somerset Conservatives had already deposed the LibDems in 2009 with a campaign promise to fix Southwest One, which was then still setting up and already delivering promised savings. The council had been trying to renegotiate the contract so it did not obstruct its plan to privatize council services. The council's squeeze undermined Southwest One's commercial performance. The council then hailed this as justification of its opposition to the venture.

This week's report - "Update on Lessons Learnt from the South West One Contract" - neglected to mention any of that. It also neglected to give any but the most vague description of any lessons that had been learned.

It instead gave a long list of teething problems Southwest One faced in its first years of operation, and a brief list of problems participating public bodies encountered while learning to work in a joint venture together, and a few complaints about how the Conservative council had been unable to tear up the 10-year, £198m LibDem contract after it took over in 2009.

Huxtable is not a member of Somerset's Audit Committee, but nevertheless leads its scrutiny of council finances, which he manages as Cabinet member for Resources. He told Computer Weekly this was not a conflict of interest. Cllr Sam Crabb, opposition lead of the Audit Committee, who is said to be the Somerset LibDems' expert on Southwest One, was not at the meeting that produced the critical report. He said he had been away on business.

Jane Lock, LibDem opposition council leader, stood in for Crabb. She told Computer Weekly she was not familiar with the subject. She agreed the report seemed short of lessons learned. But she was familiar with Conservative opposition to Southwest One: "They wanted rid of it," she said.

The report's more recent complaints concerned the Conservative council's inability to get rid of it. And it complained that Somerset had not had the power to publish Southwest One documents and data against the wishes of its partners. This had hampered transparency, a mechanism of the coalition government's programme to break up such contracts and replace the public services that rely on them with private providers.

Huxtable told Computer Weekly in 2012 how his council had been trying to cut the contract since 2009 so it could cut council services.

"In the harsh political world it was a contract set up by the Liberal Democrats and we've spent the last three years trying to renegotiate it to get more flexibility," he said.

"Every part of our organisation has a part of Southwest One appended to it, whether it be buying financial services, property advice or whatever. But local authorities can't afford to do all this stuff they used to do.

"So if we for instance shut down a department, if we privatized school meals - the Southwest One overhead cannot be removed, so we are still paying a Southwest One overhead on something we haven't done for almost two years.

"Going forward, pretty much everything we want to change now, we are almost precluded from doing because we will be carrying this overhead forever - unless we can renegotiate parts of this contract.

"We are reviewing all our services. I think we've got 170 services. Some of those could be moved into a trust or not-for profit. 65 per cent of our organisation is already run by somebody other than us. We are not atypical of a local authority. WS Atkins do all our highways. Somerset Care do all our homes. Southwest One was just another contract to run all the back-office services. Unfortunately, it was far too complex and nobody envisaged the day when money in local authorities would actually physically go down," Huxtable told Computer Weekly in 2012.

Yet published accounts suggested Somerset could have made the financial savings it sought without cutting council services, if only it had not frozen relations with Southwest One and begun trying to close its services.

Southwest One had in 2010 already made £48m back-office savings Somerset would recoup over the 10-year life of the contract. Southwest One had found another £59m the Conservative council then refused to approve, trying instead to get out of the contract and cut £40m of council services under pressure from central government. Somerset stopped approving Southwest One savings schemes when it started trying to renegotiate the contract.

Huxtable told Computer Weekly today he did not recognise this account.

"My understanding is different. I don't think we froze savings," he said.

"We asked for more cashable savings, and that put us into conflict with Southwest One. My understanding was that Southwest One were in some instances claiming to have made savings when they were in fact made completely independently from them. That's when we started falling out with each other.

"I think what we all realised in local government post-2009 was the days of County Council budgets constantly growing was over. And in fact they started to go into sharp reverse. So we started to ask for more savings, cashable savings. And having to reduce the amount of money we gave them which, as you know, put us into conflict over their contract," he said.

Somerset settled out of court with IBM last March, after the IT company challenged the council's refusal to recognise savings it had made and pay it an agreed share.

The Audit Committee published its critical report on Thursday, the day of a council by-election the Conservatives narrowly won over the LibDems. It appeared the day before in trade reports sympathetic to the Conservative critique of Southwest One.

Huxtable produced the Audit report at the suggestion of Grant Thornton, the district auditor whose own account of Southwest One was unusually sympathetic to the Conservative account of its failure.

Microsoft gets flak over "rubbish" UK data

Data experts and government officials have fingered Microsoft's popular Excel spreadsheet as a source of gremlins troubling the UK government's reform programme.

Four years after the coalition government began its attempt to create the most open and transparent democracy in the world, technical problems persist.

With only half a year until the end of its term, the coalition may leave government with its flagship transparency reform botched.

It sought simply to publish public records on the internet as open data: a form people could scrutinize easily using computers. It would create an army of "armchair auditors" who would hold public bodies to account.

But the scheme has languished since its launch four years ago: irregularities still plague the data, making it difficult for all but computer experts to scrutinize it.

Traced to its source, the bug leads to the same problem that upset other coalition transparency reforms.

That was vested interests. As it happens, prime minister David Cameron's plan for government was all about dismantling vested interests. His reforms were humbled by the same vested interests they sought to undo.

The gremlins that infested his government's data came from a forsaken backwater of computing called character encoding (apparently overlooked in the coalition government's plans). The Cabinet Office admitted character encoding was still a problem for the most crucial part of its transparency reforms after the issue was exposed in Computer Weekly last week.

Even the World Wide Web Consortium (W3C), the high temple of the global computing movement that inspired Cameron's transparency reforms, has struggled with encoding.

In a nutshell

The encoding problem was thus: the government had no standard way to encode data - no standard way to take the letters and numbers people read on screens and represent them in codes computers could handle. So its attempts to publish public spending were flawed by the incompatibility of the data it released.
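To make the problem concrete, here is a minimal Python sketch (the figures are illustrative, not drawn from any actual release): the pound sign, which appears in nearly every spending record, becomes different bytes under UTF-8 and Windows-1252, and a program guessing the wrong encoding reads garbage.

```python
# One string on screen, two different byte sequences on disk,
# depending on which encoding the exporting software chose.
text = "£64m"

utf8_bytes = text.encode("utf-8")      # pound sign becomes two bytes: C2 A3
cp1252_bytes = text.encode("cp1252")   # pound sign becomes one byte: A3

print(utf8_bytes)    # b'\xc2\xa364m'
print(cp1252_bytes)  # b'\xa364m'

# A consumer assuming the wrong encoding sees "mojibake" garbage:
print(cp1252_bytes.decode("utf-8", errors="replace"))  # '\ufffd64m'
print(utf8_bytes.decode("cp1252"))                     # 'Â£64m'
```

Multiply that one stray byte across a quarter of a million spending records per department and the scale of the cleanup problem becomes clear.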

The Department for Work and Pensions, for example, publishes around a quarter of a million spending records every year.

The whole point of doing this was to help private companies compete over public services and contracts; and to help patriotic citizens terrorize public bodies with awkward questions about the minutiae of their budgetary records.

The entire initiative would be futile if the public could not easily draw meaning from that data. But the data was being released in batches that were incompatible with one another.

One of the reasons for this shambles was encoding. It was illustrated by a similar problem W3C had with video formats. Software vendors cornered the digital video market with "proprietary" video codecs. That is, they claimed property rights over the codes that represented to a computer the images people see. Market forces gave their codecs power. That power forced people to use their video encodings rather than anybody else's. And that undermined the principle of the web that communication would be uninhibited by vested interests. That was where the coalition government was coming from with its own policy to deliver open data. It had to be uninhibited.

Cameron and W3C wanted it to be like pen and paper. Imagine if it was the other way around. Imagine a world that had "proprietary" pens.

You would sit down to write in lament of vested interests only to find your Bic pen, say, would only write on Bic paper. So instead of writing poetry you'd be shaking out your piggy bank or breaking rocks to get money to buy Bic's proprietary paper.

The Conservative Party formulated a computer-led reform plan that would tolerate no proprietary claims over the vehicles of digital communication.

In reality, their plan presumed to defy global commercial forces: companies like Adobe and Microsoft, with a global base of customers already tied into using their proprietary formats. A laissez faire legislator like Cameron would move these vested interests as he might move a sand dune with his hands, and some blowing and coaxing.

But that was a bit of a side-show really. It got more attention than it deserved because office documents were something everyone could relate to.

Cameron's liberation policy was concerned primarily with the only thing his government had rights over itself: its own data.

Labour <=> Conservative

The idea, as presented by the prime minister: transparent budgets and open data would make government more efficient and accountable. Costs would be cut. Plebs would be empowered.

That was actually how Gordon Brown, the last prime minister, put it just before he got voted out of office in 2010.

You could swap his name with Cameron's and (largely) not tell the difference in what they said.

You could trace Brown's data liberty schpiel to the same wellspring as Cameron's: national nerd hero and world-wide web founder Sir Tim Berners Lee.

Both prime ministers took up the cause Berners Lee had dedicated his life to: the common basis of communicating, sharing and combining data that was the foundation of the world-wide web. Out of respect for him, they made this a cause of national pride and the basis of reform.

Hence prime ministers Brown and Cameron planned for government data to be "linked", in the way Berners Lee had been urging it should be.

That meant it had to be possible to take any bucket of data and combine it with any other, and to arrange the lot in any way your fancy chose. That meant the data had to be good quality. It had to be comprehensible to a computer.

Between them, Brown and Cameron set Berners Lee up with a £10m office called the Open Data Institute, to aid UK policy implementation of his ideas. A sort of colonial office of the W3C (of which Berners Lee is founding director), it was going to make sure UK data lived up to the reform schpiel.

Cameron kicked it all off in 2010 by publishing public spending records as open data. Four years later, that data is effectively incomprehensible. The ODI is still trying to make it linkable. Britain's aspirations to be the most open and transparent government in the world, the world leader in open data, the most efficient, open and responsive government in the world, are therefore still work in progress.

Some of the most prominent government data experts confirmed what the data itself had already said about its own poor quality.

"Rubbish"

UK spending data was "horrendous", Jeni Tennison, technical director of Berners Lee's Open Data Institute, told Computer Weekly.

"It's ridiculous," she said.

Even when computer experts tried to link this data they had to jump through such hoops that it was "shocking", said Tennison, who got an OBE for her work last year and sits on the Cabinet Office Open Standards Board and Open Data Panel.

"It isn't like we are in a state where the data is basically okay and it just takes a bit of effort to put it together. We are talking about a state where it's basically rubbish," she said.

Companies that set themselves up to do innovative things with UK spending data had to spend 80 per cent of their time simply tidying it up so they could even start to work with it.

UK spending data was rubbish because it had incompatible encoding. Staff were largely powerless to do anything about it, because their software was at fault.

Microsoft's Excel spreadsheet has got most of the blame for this.

The problem, according to Tennison and other experts, and just about any forum that addresses the subject online, was Microsoft's atrocious handling of UTF-8, the character encoding widely favoured as the lingua franca of open data.

UTF-8 became the encoding of choice for the UK government, as for the world wide web. But most of government was using Microsoft software, whose UTF-8 incompatibilities have long been condemned by experts. The problem was inherent to both Microsoft Windows and its applications, most notably Excel. Users could circumvent it by following complicated instructions, but the workarounds were arduous. This was problematic for government, where most staff use Microsoft software but were apparently never shown how to make it work with UTF-8. More recent versions of Microsoft software employed encodings related to UTF-8 but not compatible with it.
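The kind of workaround involved can be sketched in a few lines. This is a minimal illustration, not the instructions departments were given: the filenames and contents are hypothetical, and it assumes the exported file really is Windows-1252, Excel's usual default.

```python
import csv

# Hypothetical data: create a Windows-1252 CSV of the kind Excel's
# default export produces, standing in for a departmental spend file.
rows = [["supplier", "amount"], ["Café Direct Ltd", "100"]]

with open("spend_cp1252.csv", "w", encoding="cp1252", newline="") as f:
    csv.writer(f).writerows(rows)

# The workaround itself: read the file with the legacy codec it was
# actually written in, then write it back out as UTF-8 so any open
# data tool can read it cleanly.
with open("spend_cp1252.csv", encoding="cp1252", newline="") as src, \
     open("spend_utf8.csv", "w", encoding="utf-8", newline="") as dst:
    csv.writer(dst).writerows(csv.reader(src))
```

The hard part in practice is not the conversion but knowing which legacy encoding to name, which is exactly the knowledge most staff were never given.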

"Popular spreadsheet applications", as Tennison put it, made it hard for users to encode their data in a format that would be universally compatible.

Technical obstacles

"When you export from popular spreadsheet applications you don't get control over encoding and it usually chooses a bad one," she said. "It usually won't be UTF-8. It will usually be something like Windows 1252."

Windows 1252 was an old, proprietary Microsoft encoding. The result, said Tennison, was that the data contained characters incomprehensible to other people and programs. Their systems - unless they were using Microsoft Excel on a Microsoft Windows computer - interpreted the incomprehensible characters as "garbage".
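The failure Tennison describes can be reproduced in a couple of lines. In Windows-1252 a pound sign is the single byte 0xA3; in UTF-8 that byte on its own is invalid, so a tolerant UTF-8 reader substitutes the U+FFFD replacement character - the "garbage".

```python
# A pound sign saved in Windows-1252 is one byte, 0xA3.
raw = "£100".encode("cp1252")
print(raw)  # b'\xa3100'

# A UTF-8 reader cannot interpret that lone byte: it is an invalid
# start byte, so tolerant decoders substitute U+FFFD.
print(raw.decode("utf-8", errors="replace"))  # '�100'

# The reverse mistake is just as common: UTF-8 bytes read as 1252.
print("£100".encode("utf-8").decode("cp1252"))  # 'Â£100'
```

Either direction of the mistake leaves visible junk in the published data.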

"It can cause problems matching stuff up," she said. "If you have the name correct in some data and not in other data then you can't match those two names together. And therefore you can't put the data together accurately."

Ian Makgill, managing director of Spend Network, a start-up trying to clean up government spending data, concurred with the ODI.

"A lot of the problems are with Microsoft Excel not being able to output open [data] because it likes proprietary formats," he said.

"It's damaging. Microsoft's handling of these things is a problem. Different versions of Microsoft Excel have different formats.

"They default to proprietary formats... because that makes data available in other products," said Makgill, who is regularly cited by other prominent UK data experts as the leading authority on  government spending data quality.

Makgill and other experts said the Microsoft problem was not only its handling of UTF-8, but the difficulties it created for people who wanted to publish their open data in a universally compatible file format. HM Treasury said in 2010 its open data should be published in .csv file format (comma-separated values). But Microsoft didn't handle this most simple of file formats well. This had further helped degrade the UK's open data quality.
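One widely used workaround for the .csv problem is worth noting here. Excel tends to assume a legacy codepage when opening a .csv file unless the file begins with a UTF-8 byte-order mark, so publishers who must round-trip through Excel often write that mark deliberately. A sketch, with a hypothetical filename, using Python's "utf-8-sig" codec:

```python
import csv

# "utf-8-sig" writes the three-byte UTF-8 BOM (EF BB BF) at the start
# of the file. Excel treats the BOM as a signal to read the .csv as
# UTF-8 rather than assuming a legacy codepage such as Windows-1252.
with open("payments.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows([["supplier", "amount"],
                             ["Café Direct Ltd", "100"]])
```

It is an ugly compromise - the BOM itself can confuse other tools - which is part of why .csv publishing degraded data quality rather than protecting it.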

Hushed words

Computer Weekly learned through an unofficial government channel that the UK Cabinet Office, which is responsible for the UK's open source, open data and open standards policy, also blamed Microsoft's software for hindering its work.

"There are several issues with saving UTF-8-compliant .csv files from Excel," said a source close to the Cabinet Office.

Another Cabinet Office source said government data was going out with mistranslated pound signs after being exported by Excel. Government guidance in 2010 said departments should leave pound signs off their payment amounts. But departments still put them in. So their output was garbled. Makgill said apostrophes caused similar problems.
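Both symptoms have the same cause. Windows-1252's "smart" apostrophe (byte 0x92) and its pound sign (0xA3) are each invalid as standalone bytes in UTF-8, so any consumer expecting UTF-8 garbles them. A sketch with a hypothetical record:

```python
# An Excel export in Windows-1252: curly apostrophes are byte 0x92,
# the pound sign is 0xA3 - neither is valid standalone UTF-8.
exported = "O’Brien’s £250".encode("cp1252")

# Read as UTF-8, every offending byte becomes replacement garbage.
print(exported.decode("utf-8", errors="replace"))  # 'O�Brien�s �250'
```

Hence the Treasury's 2010 advice to leave pound signs off amounts entirely: the guidance was working around the encoding fault rather than fixing it.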

These hushed words, by the way, came from officials in a government that stands for transparency. Its transparency applies only in areas where causing disruption serves its own interests. It does not extend to itself.

"Microsoft data files are always a bit of a challenge," said Harvey Lewis, head of data analytics at consulting firm Deloitte.

But data quality was not a big issue for Lewis.

The government had rushed its data out in 2010 in response to Sir Tim Berners-Lee's famous geek plea for "raw data now!", made at a 2009 conference for the sci-tech elite in California.

The government had always intended to get its data out first and then clean it up later.

And, said Lewis, open data had been for government primarily an innovation policy - a means to stimulate the economy. For companies like Spend Network to thrive by selling linked data services built on government data that had to be cleaned up before it could be linked, the public might have to accept that government will go on spewing out raw data.

Treasury oopsy

HM Treasury did indeed tell civil servants they should publish data now and perfect it later. It even referenced Sir Tim's own advice.

"The focus of the guidance is on how, pragmatically, to make the data available quickly
rather than seeking to achieve full alignment across every entity," it said.

"Publishing raw data quickly is an immediate priority, but we are working towards producing structured, regularly updated data published using open standards," it said.

People in and around the Cabinet Office said the ongoing problem is that people don't know how to persuade their Microsoft software to output in a universally compatible format. Four years on, they still needed training. And UK data was still rubbish.

But HM Treasury, overseen by the National Archives, established the conditions for its own data initiative to struggle when it issued the guidance that set it off in 2010. It instructed government officers to publish their data in a standard Microsoft Windows encoding, assuming they would be using Microsoft software, and imagined alternative encodings only as a future possibility.

Bigger picture

Even the W3C has meanwhile struggled to establish UTF-8 as a standard way of encoding .csv files on the web.

It set up a working group last December that won't publish its conclusions until August 2015. It does have more to contend with than character encodings. But character encoding was one of its most thorny issues, said Tennison, who co-chairs the CSV on the Web Working Group, which is addressing the issue for the W3C.

"We are leaning in the direction of UTF-8," she said. "It should be UTF-8".

ODI has simultaneously been trying to persuade government departments to clean their data up using a tool it produced, and to join a certification scheme to improve other elements of their data publications. Departments have shown little interest, despite the poor state of government data.

Vested interests

Some departments have been so reluctant even to release data that Spend Network had to demand its release under Freedom of Information law. Wigan Council would only release spend data after the Information Commissioner intervened. The Ministry of Justice fought all the way to an Information Tribunal.

A similar initiative by London Mayor Boris Johnson foundered for six years because civil servants refused to allow their data to be published. The open data initiative was part of the Conservative Party's plan to break up the public sector. Gordon Brown's proposals were not dissimilar.

Johnson put it in his 2008 manifesto with help from Cameron's campaign team. His Greater London Authority's Oversight Committee said last June that something ought to be done about London's poor spending transparency. Civil servants were not co-operating. It traced the problem to the vested interests of the companies whose business dealings were exposed in the spending records. Civil servants might also have had an interest in not co-operating with the means of their own demise: the coalition plan has aimed for 80 per cent cuts in operational civil service jobs.

The coalition claimed on coming to government that its primary interest was challenging the vested interests of corporate IT suppliers. Those interests have prevented it even publishing its own data effectively. Its grander plan to challenge vested interests it saw in the public sector was consequently obstructed.

Microsoft would not talk about either UTF-8 encoding or its problem with .csv files.

"Modern versions [of Microsoft software] support the most popular standard document formats including PDF, ODF, and Open XML," it said in a written statement.

This, it said, meant applications such as Excel would export "to other programmes which use open standards". It said people should contact their Microsoft supplier if they had any issues.
