The EC's antitrust stance is laudable, its action over Google & Android is laughable

bryang
As a football fan, one of my primary experiences of European antitrust law comes through watching the English Premier League on TV.

Once upon a time, I paid Sky Sports for a subscription and saw any and all of the games that the Premier League allowed to be broadcast live.

Then the European Commission got involved and deemed the Premier League's process for selling live broadcast rights to be anti-competitive because only one TV company was allowed to win.

As a result, live English football from the top division is now broadcast in the UK by two companies - Sky and BT. Therefore, as a football fan I have to take out two subscriptions to watch all the Premier League football I want. Strangely, the combined value of the two is more than I used to pay for one subscription.

And because of that extra competition in the market, the price paid by broadcasters for Premier League rights has ballooned. So, the price of a subscription to Sky Sports is going up more than usual to pay for the huge bid Sky submitted to maintain its position as the main football broadcaster.

Therefore, thanks to the European Commission's intervention to stop the Premier League's anti-competitive behaviour, consumers now have to pay far more money, the broadcasters have to bid far higher sums of cash, and that terror of anti-competitiveness, the Premier League, is forced by those European regulations to, erm, make an even bigger profit than it ever had before.

In 2009, the European Commission forced Microsoft to promote alternative web browsers to Windows users, having previously bundled Internet Explorer, after an investigation into anti-competitive behaviour. As a result, Windows users were forced to go through a "browser choice" screen when setting up their PCs, offering a range of alternative browsers. Most consumers had no idea what they were being asked to do, and by this time most of those who did understand had already deserted IE anyway.

In 2013 Microsoft was fined €561m by the European Commission after a "technical error" meant the browser choice function was omitted from Windows 7. By this time, the growth of smartphones already meant most people were browsing the web on a mobile anyway.

Microsoft had previously been fined €497m in 2004, for anti-competitive behaviour that involved bundling Windows Media Player into the operating system, thereby restricting competition for rival media player software. The software firm was forced thereafter to sell two versions of Windows - one with its own media player, and one without. Needless to say almost nobody bought the one without.

Did any of those actions against Microsoft make the slightest bit of difference to the European technology market? Not at all.

Now it's Google coming under the EC's antitrust microscope for the way it packages Android onto smartphones and the deals it strikes with phone makers to use the operating system.

There are plenty of people understandably eager to see Google - and other US internet giants - brought down a peg or two, and it's a good thing that the EU is willing to stand up to the aggressive marketing that some of these companies employ.

But does anybody really think that forcing Google to unbundle elements of Android - in particular its search engine - will make the slightest difference to the market? Will consumers benefit, or will they just end up as confused as the users suddenly told to choose whether they wanted a version of Windows with or without a media player, or unexpectedly made to select from a list of browsers when they probably never even realised they were using IE before?

It's difficult to argue against the principle of preventing anti-competitive behaviour by companies like Microsoft and Google that have achieved dominance in a particular part of the technology market. Nobody is suggesting they should be allowed to abuse that dominance to the detriment of rivals or customers.

But does the world really need to force Android users to select which search engine they want to use, or which app store?

It's worth noting that Apple is not considered anti-competitive for disallowing such choices on an iPhone, because Apple doesn't allow any competition in the iOS market. Because Google allows more openness in the Android ecosystem, it is therefore not allowed to abuse its own dominance of that ecosystem.

It's entirely daft.

Who will benefit if the Commission is successful here? Will consumers, who may face more complexity or even more costs as a result? Will rival providers of mobile phone search engines benefit? (At this point, please chip in if you can list more than one - and if you can, I hope you're enjoying being a Windows Phone user.)

Providers of rival Android app stores will benefit - though many of those stores are riven with malware, but that's another matter.

Providers of rival forks of Android might benefit - unlucky consumers if they do, as the market gets split by confusing Android incompatibilities.

The principles that the Commission seeks to uphold are laudable. The practicalities are laughable.

Besides, Android users can't afford any extra costs if we're already having to stump up all that extra cash for our wonderfully competitive live TV market for English football.

Liam Maxwell: The man who checks the homework

bryang
Former Cabinet Office minister Francis Maude used to call Liam Maxwell "the man who checks the homework".

First as an advisor to Maude, then as government CTO, Maxwell was responsible for making sure Whitehall IT buyers did what they were told and writing their report card when they didn't. He was never afraid to tell them there was room for improvement.

Holding the power to veto IT purchases through his sign-off on departmental technology spending, Maxwell rooted out the knee-jerk practice of giving huge, mega-outsourcing deals to a small number of large and inefficient system integrators. His mobile phone became famous for the "What is the user need?" sticker on its back, which he would brandish at errant IT buyers during meetings.

It was an essential task, and Maxwell was very good at it. Thanks largely to him, we can hope we have broken once and for all the oligopoly of IT suppliers that dominated government technology and sucked out billions of taxpayer pounds every year for little return.

He is moving to become the UK's first national technology advisor, responsible for co-ordinating policy and plans across Whitehall to grow the digital economy and attract tech companies and investment into the country. It's a huge and important role - one that Computer Weekly and others have called for on many occasions to give a high-level focus to the growth of a digital Britain.

With his move, the Government Digital Service (GDS) has now seen almost a clean sweep of departures among its key pre-election leaders.

Maxwell follows GDS chief Mike Bracken, deputy director Tom Loosemore, director of strategy Russell Davies, design director Ben Terrett, transformation director Mike Beaven, deputy CTO Magnus Falk and G-Cloud head Tony Singleton out of the door of Aviation House, the GDS HQ opposite Holborn tube station in London.

When Bracken announced his departure last summer - closely followed by a co-ordinated mass resignation involving Loosemore, Davies and Terrett that left a bad feeling among some of those who remained - it raised inevitable questions about the direction of digital government and the future of GDS.

Bracken effectively secured GDS's immediate future before he left at the end of September - his business case for the government as a platform (GaaP) strategy helping to secure a £450m budget from chancellor George Osborne.

But Maxwell's departure comes at a time when questions still linger around GDS, prompted in part by the continued absence of a business plan for spending that £450m pot, which was originally meant to be published in December last year.

Maxwell leaving GDS is not as seismic a shock as when Bracken quit, however. Insiders suggest that Maxwell had been operating increasingly at arm's length from the rest of GDS for some time, with key lieutenants like deputy CTO Andy Beale and director of common technology services Iain Patterson fronting the work.

In recent months it has seemed at times that there are two camps around GDS. There are those who support Maxwell and point to his achievements in enforcing spending controls, open standards, and technology governance as key to the £599m savings attributed to GDS by government auditors.

Then there is the digital camp - the proponents of Bracken's legacy who point out how digital has become central to the transformation of Whitehall departments and is at the heart of public sector and civil service reform in the years to come.

In truth, both have of course played a huge part - and in many respects, both probably moved on at the right time, having put their own strengths to the best use.

Bracken was the visionary who sold the concept of digital government and helped embed digital and agile thinking and skills into Whitehall.

Maxwell was the bulldog who bashed heads together and stopped the plainly stupid and destructive technology purchasing habits of the past.

GDS now is becoming a more collaborative partner - a supporter of departments rather than the agent of change; a setter of standards and governance rather than an enforcer to stop bad practices. It could be argued (and both camps have) that Bracken and Maxwell's strengths are not so well suited to this next phase of GDS's development.

Maxwell will be looked back on as the right man for the right time in government technology. He came into Whitehall initially as an advisor to Francis Maude, who in turn provided the political muscle that allowed Maxwell to become the destroyer of old ways.

Inevitably though, being the person who says "no" means you fall out with people along the way. Many are the tales from government insiders citing arguments and breakdowns in relationships.

Maxwell was forced to bear the brunt of the highest-profile example of such a breakdown as senior responsible owner for the Rural Payments Agency (RPA) digital service when it collapsed in March 2015. He was later labelled "Mr Fancypants" by MP Richard Bacon during a Public Accounts Committee hearing that revealed "personal rifts" and "counter-productive behaviours" between RPA and GDS staff.

Maxwell admitted to the committee that he is not a "conciliatory person when there are issues to be resolved".

For all his achievements, there will be people happy to see the back of Maxwell as he leaves GDS - a fact that Maxwell himself would probably acknowledge in private. But he is respected by ministers - he was close to Maude even before the Tories came to power as part of the Coalition government, and he is close to Maude's successor, Matt Hancock, too. As a former Tory local councillor - and ex-IT manager of Eton College - Maxwell understands the political classes.

In an interview with Computer Weekly soon after he quit last year, Bracken made a prescient observation about Maxwell's strengths: "Liam is a brilliant figurehead for technology policy. I can't think of anyone better to advise ministers on encryption, the digital single market, or to accompany ministers on international visits. Liam is brilliant at those."

Bracken was unwittingly writing the job description for Maxwell's new role as national technology advisor in that one statement nine months ago.

So now Maxwell goes from checking the homework of government IT buyers to checking the homework of government digital economy policy. If he shakes up a few people along the way, good for him. If he brings positive changes in policy and planning as a result, then good for us.

Universal Credit shows it is time to make all major government projects open and transparent

bryang
After four years of the Department for Work and Pensions (DWP) spending taxpayers' money on legal fees to prevent the release of key Universal Credit documents, the only surprise revealed by their eventual publication this week was that there was so little to be surprised about.

We knew already that, in the early years of the controversial welfare reform programme, the IT development was a disaster, that key security and fraud functionality was overlooked, that there were significant delays and budget over-runs. And we already knew that, despite all this, DWP officials and ministers had continued to publicly state that all was well, when they knew that was not true.

Thanks to the persistence of independent project manager John Slater and former Computer Weekly journalist Tony Collins, the Universal Credit risk register and issues log from 2012 - finally published under freedom of information laws - mostly served to confirm what we had known all along.

The DWP's response - after fighting through the courts for years - was to claim that the documents have no bearing on how the programme is running today, and to point out they are "five years old and out of date". The fact the documents are actually four years old might tell you something about the department's numeracy too.

DWP is right to say that Universal Credit is being run more effectively now, and its gradual, safety-first roll-out allows the department to better manage risk around what remains a hugely challenging initiative.

But still we have to take it on trust that this is the case. The veil of secrecy around Universal Credit - highlighted and criticised by MPs on the Public Accounts Committee as recently as February this year - has yet to be lifted.

The committee called for more details of the digital service for Universal Credit to be published, including timescales and key milestones. Nothing has yet been forthcoming. There will be an issues log and a risk register for the digital service - but we can't afford to wait four years and have a legal fight to see them published.

There has been speculation that the resignation of Iain Duncan Smith - the architect and chief cheerleader of the programme - as DWP secretary of state might herald a new spirit of openness. We have yet to see whether his successor, Stephen Crabb, will respond.

Universal Credit will have cost taxpayers nearly £16bn by the time it's fully rolled out. There is one more thing we already know - it's beyond time that all major government projects should be open and transparent and forced by Parliament to be so.

In the digital revolution, the Luddites are the ones in charge

bryang
Between 1811 and 1813, English textile workers and weavers conducted a campaign of protests, sabotage and occasional rioting against the spread of new technology that threatened their livelihoods during the Industrial Revolution. At times they also tried to prevent lower-paid immigrants from taking their jobs.

Their actions were in part provoked by the difficult economic climate caused by Britain's involvement in the war against Napoleon. In the end, their movement was put down by a combination of the British army and show trials that often led to execution or being sent to the colonies.

The protesters came to be known as the Luddites, a term we are familiar with these days, and which is often used to describe anyone attempting to slow or stop the march of the digital revolution. Technologies may have changed in the past 200 years, but many people will recognise the obvious parallels.

There is, though, one insidious difference. In the digital revolution, governments aren't turning to armies or judges to put down today's Luddites, they're turning those digital technologies back on them.

Look at Turkey, where a new data protection law has seen opponents claim that citizens will be "stripped naked" by the degree of surveillance the new law enshrines.

Look in the UK, at the ongoing debate about the Investigatory Powers Bill, and the degree of mass data collection the government seeks to legitimise.

Look also in the US, where the FBI finally backed down in its attempt to force Apple to create a backdoor into the iPhone, although that battle of privacy versus security has only been delayed and not yet won.

So how are today's common folk striking back? Well, digital technologies are proving very useful. The Panama Papers investigation, in which national newspapers around the world exposed the murky world of offshore finance, saw advanced data visualisation software used to analyse 11.5 million documents amounting to 2.6TB of data. It's also rumoured those files were stolen by well-meaning hackers.

The results are showing the degree to which many of the so-called elites - often the ones turning digital tech back on the rest of us - are hiding their cash in tax havens.

The digital revolution is proving to be a huge democratising force, breaking down old hierarchies in favour of a networked, globalised society. Much like the Industrial Revolution, that process is - barring catastrophe - unstoppable and inevitable.

But those at the top of the hierarchies are increasingly realising how technology can be used in an attempt to protect what they see as the established order.

Historians remember the Industrial Revolution as a time of great social unrest, but one that ultimately led to huge economic, social and cultural progress that benefited everyone.

Fortunately we've not seen serious social unrest in the digital revolution - although there are commentators around who think we yet might. We can see, though, the enormous opportunity for everyone.

But the real difference between now and events 200 years ago is that this time the Luddites are the ones in charge.

Are there more changes ahead for the Government Digital Service?

bryang
As we all know, the Government Digital Service (GDS) was awarded £450m by George Osborne in his Autumn Statement last year. A business plan detailing how that money will be spent was due to be published in December.

It never appeared.

Computer Weekly was told in January that the plan would instead be published for GDS's Sprint 16 event in February, when the Cabinet Office unit laid out its priorities for the next year. No plan was forthcoming. So far it still has not appeared.

One insider who claims to have seen an early version of the plan called it "rubbish", and said that it was full of "meaningless marketing fluff."

Barely four months after Osborne surprised many people - including some in GDS - by giving such significant financial backing, there are already rumours circulating over the future of GDS, with speculation that further changes to the organisation and its remit may be under consideration.

Computer Weekly sources suggest that Civil Service CEO and Cabinet Office permanent secretary John Manzoni is still not entirely convinced about GDS's remit. Manzoni has in the past publicly stated that he thinks expertise is best placed in Whitehall departments, not at the centre, and some sources close to the situation speculate that he feels more of GDS's digital development work would be better reallocated to departmental digital teams.

Some of the digital suppliers that have worked with GDS have also privately expressed frustration at what they see as a lack of strategy and vision for where GDS is going next.

At the heart of GDS's plans is the government as a platform (GaaP) strategy - developing a series of standardised components for use across Whitehall to eliminate duplication of functionality. Early examples include Gov.uk Pay, a common payments platform, and Gov.uk Notify, for sending status updates on the progress of transactions.

However, according to sources, some departments are already pushing back on these initiatives.

There are concerns about the cost and effort for departmental IT teams in unstitching existing tools used for payments and having to integrate a different platform such as Gov.uk Pay into their wider applications.

One source suggested that GDS is unwilling to own the risk of any problems with Gov.uk Notify and offer guaranteed service levels to departments - effectively, those departments expect GDS to take responsibility if anything goes wrong. If they were contracting an external supplier to provide similar software, that supplier would inevitably offer a service-level agreement and ownership of risk (at a cost, of course).

While nobody criticises the intent of the GaaP plan, nor the capability of the platforms being developed, the reality is that these new tools can't always be easily plugged in to existing software applications. If departments are to use these new platforms, they are likely to do so only when they come to replace their existing systems, which for many could be several years away.

If that is the case, it makes the business case for those platforms much harder to justify within the timescale of a parliamentary cycle.

Perhaps the highest profile platform is Gov.uk Verify, the identity assurance service that will allow citizens to prove they are who they say they are when logging in to government websites.

Verify has been in public use as a beta test version since October 2014, and is due to transition to "live" status next month, with as many as 50 digital government services using the system in the coming months.

But still there are question marks. The Health and Social Care Information Centre (HSCIC), the IT authority for the NHS, told Computer Weekly that Verify is not secure enough for use in the NHS, and is likely to take a "hybrid" approach to meet the high security expectations of medical records.

The backchat around Verify suggests that some departments find it too complex, and that it is yet to make the grade for the most secure applications such as passports and benefit claims.

GDS's programme director for Verify, Janet Hughes, is a fantastic champion and advocate for the project. In conversation with Computer Weekly recently, she accepted there are criticisms of Verify but stressed that the aim was never to please everyone right from the start - the iterative nature of its development means that any concerns will be addressed, and more services will come on board in time.

In some ways, that is a fair assessment of the wider challenge that GDS faces. The benefits of digital government are widely acknowledged, but the agile, iterative nature of the digital approach does not always sit comfortably with civil service management culture.

The fiasco around rural payments was the perfect example, where a digital-by-default approach was forced on the project where perhaps it was not appropriate for a large, complex programme with fixed requirements and a fixed EU deadline to meet.

GDS has had to change its own culture in the months since last year's general election. With the departure of former chief Mike Bracken it has shifted from being a visionary, disruptive organisation challenging departments to change, to a more collaborative, conciliatory operation under Bracken's successor, Stephen Foreshew-Cain. The mantra now is less "We're here to change how government works" and more "We're here when you need us, and we're with you all the way". Foreshew-Cain often uses a slide in public presentations that summarises the approach as, "We've got your back".

There is no suggestion that anyone in government is backing away from its commitment to digital services - in addition to the £450m GDS budget, Osborne also gave £1.8bn for digital transformation across departments, plus £1.3bn to HM Revenue & Customs for its digital tax strategy.

But many people close to the government digital community feel that the management and organisation of digital delivery across Whitehall is not yet the best it could be. Until the GDS business plan is unveiled, it's inevitable that such questions will continue to be asked.

What now for Universal Credit - and could Iain Duncan Smith quitting lift the veil of secrecy?

bryang
Amid all the political fallout and the carnage within the Conservative Party since the shock resignation of work and pensions secretary Iain Duncan Smith last week, many observers have started asking questions about the future of Universal Credit - the controversial welfare reform scheme created and championed by Duncan Smith.

Having followed the scheme's stuttering progress for the last five years, I suspect that anyone wondering (or hoping) if Universal Credit was in some way a contributory factor in IDS's departure will be disappointed.

But it does seem likely that without him at the helm, there may be a change in attitude, and perhaps even some much-needed openness and greater scrutiny of the programme.

Think about it this way - now that he is out of the Cabinet, Duncan Smith is free to talk about Universal Credit without the restrictions of being a minister. And given his developing feud with George Osborne, and rumours that the chancellor has never been as wedded to the scheme as IDS, you can be sure that Duncan Smith will shout from the backbenches if he sees any decisions that will tarnish the reform that he has always wanted to be his great political legacy.

Let's look at the context for Universal Credit as IDS steps away from it.

£15.8bn lifetime costs

When the programme is fully rolled out - or perhaps that should be "if" - it will have cost you and me as taxpayers £15.8bn. That's the most recent published estimate for the lifetime cost of the controversial welfare reform programme.

The main reason that astonishing figure has not received greater scrutiny is the consistent line from the Department for Work and Pensions (DWP), and in particular from Duncan Smith, that the economic benefits to the UK will amount to £7bn per year.

Of course, the champion of the scheme is now gone. And given the changes affecting Universal Credit from successive budgets and welfare cuts, you have to wonder how realistic that benefits figure is too. Duncan Smith had always justified the cost on the basis that the UK economy would benefit from getting people back into work sooner and so making a productive contribution to GDP and paying more taxes.

But George Osborne's planned austerity cuts to Universal Credit benefits - effectively a replacement for the widely criticised tax credits cuts from which the chancellor made a dramatic U-turn last year - must surely raise questions over the rationale behind the business case. That's a business case that is not due to be finally signed off until September 2017.

What's happening with the roll-out?

As of 11 February - the latest date for which figures have been published - 203,392 benefit claimants were on Universal Credit, most of whom are single people who don't own a home, making them the simplest type of claimant to process.

The original plan was for four million claimants on Universal Credit by April 2014.

The DWP trumpets the roll-out of Universal Credit to all Jobcentres nationwide, a process underway now. But the IT system they are rolling out will mostly be thrown away before the scheme is fully implemented - currently targeted to be March 2021 - to be replaced by the "full service" or digital version of the scheme.

The digital version has so far only been tested in three or four limited areas in London, and is due to begin a national roll-out in May 2016, starting with just five job centres a month. No figures have yet been published about the number of claimants being processed by the digital system.

We know that £130m of IT work will have been thrown away by the time the digital version is fully rolled out - and it's highly likely this figure will prove to be a lot higher.

According to the Public Accounts Committee (PAC), there are still no published milestones for the further roll-out of the digital version: "The lack of specific and timely plans for digital service roll-out - and only being able to say that roll-out will happen 'soon' - not only affects local authorities, it also creates uncertainty for claimants and those whom they turn to for advice," said the committee in February this year.

PAC chair Meg Hillier, MP, added: "The lack of transparency surrounding a programme with such wide-reaching implications for so many people is completely unacceptable."

Migration risks

At some point, the millions of people claiming existing benefits will have to be transferred to Universal Credit - an enormous physical, logistical and technical task to undertake. Estimates from 2013 suggested that more than 200,000 benefit claimants would need to be transferred on to the new IT system every month in the year leading up to its full launch. That scale of migration is unprecedented in UK government history, and must surely present a huge risk to subsequent timescales.

The Office for Budget Responsibility has previously said that every further delay wipes out great chunks of the expected economic payback.

Throughout all the public scrutiny of Universal Credit from multiple PAC hearings and National Audit Office reports, Iain Duncan Smith has consistently insisted that the programme is on track and under budget - even if those milestones have been something of a moving target.

So given the new reality of an IDS-less DWP, and the risks that still exist for further controversy and embarrassment to the government, there must be people within DWP now willing to think the previously unthinkable and question whether the current plans are sustainable.

From the IT side, there seems to be a quiet confidence that the latest technology is finally going in the right direction. While DWP still resists releasing any detailed information about the digital tests, there are few whispers to suggest problems with the system - unlike the period when the original IT was going down the drain and everyone was trying to cover their backs.

But even with the current confidence in the digital system, it has yet to be tested at any sort of scale.

A delicious irony?

DWP has for the last four years been fighting against a freedom of information request to publish three critical planning documents from the early stages of Universal Credit, back in 2012. At that time, Duncan Smith was very publicly insisting all was well - as he has continued to do even when it was shown that all was not well.

The question for him and DWP is whether those key documents will show that even as IDS was telling a positive story to Parliament, he and his senior officials knew that was not the case.

Such a revelation might even have been enough to force a ministerial resignation, had the minister not gone already.

But now, Duncan Smith no longer has to defend the current government plans for the scheme, and can instead point fingers elsewhere if he believes that others - notably in the Treasury - have effectively scuppered the ambitions he once laid out for Universal Credit.

He can present himself as the man who hit the brakes when officials were failing with the early IT, and who put the plans back on track with the infamous "reset" of the programme in 2013. He can point to Treasury meddling and cuts ever since that have thrown the future of his great reform into doubt.

Wouldn't it be a delicious irony if the man responsible for five years of obfuscation and lack of transparency around Universal Credit now proved to be the catalyst for its secrets being revealed?

The future is here, now - if only our political and business leaders had the vision to see it

bryang
If you had the time-travelling opportunity to read the forthcoming issue of Computer Weekly magazine just a few years ago, you might have assumed it was a script for the BBC's Tomorrow's World. There's virtual reality, artificial intelligence (AI) for financial advice, smart cities, internet of things, autonomous delivery robots - even robotic concierge services in hotels.

These are all technologies that not so long ago were considered tantamount to science fiction - yet here they are, every one of them being used in real-life situations and on the verge of becoming mainstream.

We will have to lament the lack of a Star Trek transporter system for some time to come, but for many of us, things that in our lifetime were once considered fantastical and futuristic, are now very real.

For any organisation - whether a business or a public body - there are so many technologies available to help take a step up in competitiveness, profitability, efficiency, customer service, cost-effectiveness or any other core objectives. Technology is helping us get more value from our time and our assets - through so-called sharing economy services such as AirBnB or Uber, for example.

The scope for innovation is greater than it has ever been.

We're on the cusp of an enormous economic displacement from what you might call an analogue economy to a digital one, with huge amounts of GDP already transferring from low-tech activities - such as print advertising - to technology-enabled ones, such as search ads.

So what's stopping us?

For all the unwarranted scaremongering about robots or AI stealing our jobs, the biggest single factor holding us back as an economy and a society from taking this great leap forward in innovation is people. There just aren't enough people with the skills, awareness and vision leading governments and businesses to make it happen - nor enough with the technical skills to put it all together and make it work.

This week saw an effort around International Women's Day to encourage more women into technology. There are so many such initiatives now that if every one of them had pushed just 10 women each into the sector, we could stop talking about the diversity gap - but still it persists.

Getting more women into IT is just one obvious way to bridge the people and skills gap, but the people problem is not going away without real leadership vision - especially in government policy. Next week sees George Osborne's latest Budget - you can be sure it will lack any such vision for the future.

The adoption of all these amazing technologies is eventually going to happen - the momentum is unstoppable. But it's too slow, and every lost minute is a wasted opportunity for us all, economically, culturally and personally. We all need to call on our leaders to step up.

Blockchain will bring a radical rethink of banking - but not yet

bryang
Rarely has an emerging technology experienced both the levels of hype and the levels of anti-hype that exist around blockchain at the moment.

For every supporter proclaiming the distributed ledger technology as the future of banking, there is a critic decrying it as overblown and unworkable.

Anyone would think it was the Conservative party debating EU membership, for all the differing opinions.

Such arguments are increasingly typical for any technology that has the disruptive potential ascribed to blockchain. There will always be pioneers excited by the opportunity, and laggards who see it as a threat.

The common ground between the two is that it will take many years before blockchain becomes a mainstream technology in financial services - this is not an industry prone to rapid adoption of new technologies, as we have seen with the early reluctance around cloud.

This week a consortium of 40 banks announced successful tests of competing blockchain products in a wholesale banking environment. The main reason for their interest is the potential to take huge costs out of their back-office infrastructure - replacing the costly and complex financial settlement systems that make us all wonder why it takes days to clear a cheque.

If the claims of blockchain supporters are correct, the savings would be in the many billions across the sector.
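For the technically curious, the core idea behind these distributed ledger trials can be illustrated with a toy sketch - the names and structures below are mine, purely for illustration, and bear no relation to the actual products the banks tested. Each settlement record commits cryptographically to its predecessor, so past entries cannot be quietly rewritten:

```python
import hashlib
import json

def block_hash(contents):
    # Deterministically hash a block's contents, including the previous hash.
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

class Ledger:
    """A toy append-only ledger: each entry commits to the one before it."""
    def __init__(self):
        self.chain = []

    def append(self, payment):
        prev = self.chain[-1]["hash"] if self.chain else "0" * 64
        block = {"payment": payment, "prev": prev}
        block["hash"] = block_hash({"payment": payment, "prev": prev})
        self.chain.append(block)

    def verify(self):
        # Tampering with any earlier payment breaks every later link.
        prev = "0" * 64
        for b in self.chain:
            if b["prev"] != prev or b["hash"] != block_hash({"payment": b["payment"], "prev": b["prev"]}):
                return False
            prev = b["hash"]
        return True

ledger = Ledger()
ledger.append({"from": "Bank A", "to": "Bank B", "amount": 100})
ledger.append({"from": "Bank B", "to": "Bank C", "amount": 40})
assert ledger.verify()
ledger.chain[0]["payment"]["amount"] = 999  # tampering is detectable
assert not ledger.verify()
```

It is that tamper-evidence, shared across every participant rather than held in one bank's back office, that makes the settlement-cost savings plausible.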

Critics cite the complex regulatory environment as a blocker to blockchain, but the fear factor for them is the potential for blockchain to do to established banking hierarchies what Netflix did to Blockbuster, or what the web has done to print media.

To become a bank these days - even the many challenger banks now appearing - takes years and huge amounts of cash. Blockchain, if it works and continues to mature, could theoretically enable an organisation to set up a banking transaction network near instantly, if regulators allowed it.

The big banks' involvement means there is plenty of money being invested in testing the potential - but it also acts as a brake against too much change too quickly. This is a technology that will take time to prove itself - rightly so, given the scale of the financial systems it could one day supersede.

Just as the web has not entirely killed print, blockchain will not entirely replace the status quo. Context is everything in assessing its true potential. But it's certain that blockchain is going to bring a radical rethink of how the banking sector of the future will look.

Ofcom sets ambition for fully fibred broadband Britain - now industry must respond

bryang
Ofcom's much-anticipated review of the communications market is a smart and pragmatic attempt by the regulator to accelerate fibre-to-the-premises (FTTP) broadband across the UK - but as ever with such a wide-ranging exercise, the devil will be in the detail.

The essence of Ofcom's recommendations is twofold.

First, BT has been given a last chance to prove that its Openreach division can operate as if it were a fully independent company while remaining part of the BT Group. If it fails to do so, Ofcom will pursue the nuclear option and refer BT to the competition watchdog.

Second, the regulator is happy for BT to sweat its copper assets in the "last mile" connections to homes and businesses if the telco chooses to do so, but it is unwilling to wait for BT to create a market for FTTP.

It was reassuring to hear Ofcom CEO Sharon White on the BBC saying we should compare our broadband infrastructure to world leaders like Japan or South Korea, and not to laggards in the EU as the government and BT like to do. White has set the ambition for a fully fibred digital Britain over the next 10 years - it's now up to the communications industry to make it happen.

But many questions remain, of course.

The core of creating an FTTP broadband market lies in allowing other ISPs to use BT's poles and ducts to lay their own fibre to homes - something they have been allowed to do since 2011. Nobody has taken up that offer so far, complaining early on that BT's terms and conditions made the scheme unworkable.

Ofcom has to find a way to make access to ducts and poles cost-effective, straightforward and transparent - something BT will gently resist for as long as it can.

The hope is that fibre investment by BT's rivals will spur BT/Openreach to respond - or will alternatively make Openreach redundant if BT sticks firmly to its copper guns.

While many observers will focus on BT's unwillingness to further the structural and operational separation of Openreach, Ofcom has cleverly put pressure on the likes of Sky, Vodafone and TalkTalk to put their money behind their complaints about Openreach.

If rival ISPs truly believe that BT's lack of FTTP investment is the reason their ambitions for broadband are thwarted, Ofcom is giving them an opportunity to go their own way and ignore BT. If they do not, then their arguments about splitting off Openreach lose all credibility.

The Ofcom review guarantees nothing. If ISPs don't invest in their own fibre, we're stuck with Openreach and copper cables. But the watchdog has made a positive statement that Britain needs a fully fibred digital infrastructure within the next 10 years, and it has laid down a challenge to which the industry - and not just BT - must respond.

Will Ofcom break up BT? Probably not - but it must enable fibre broadband fit for the future

bryang
Next week sees an announcement that will set the scene for the next 20 years of the UK's digital infrastructure.

Ofcom's review of the communications market is due out within days and its potential impact spreads well beyond the telecoms companies affected by the watchdog's regulatory canvas.

The UK broadband market is largely a success - competitive for the consumer, with widespread adoption that puts us ahead of most of our European rivals. But it has serious flaws too - a lack of support for rural and hard-to-reach areas, and the natural tendency of a former telecoms monopoly to want to sweat its copper assets instead of investing in a fibre infrastructure fit for the next 50 years.

The big question that has dominated the headlines since the review was announced is the future of Openreach, BT's "last mile" infrastructure provider, and whether it should be split from its parent.

I've written here before about why I feel BT should divest Openreach and allow it to become the Network Rail of broadband Britain (although hopefully rather more effective). But it's a complicated situation and there are equally strong arguments against - Ofcom will know that forcing BT to sell off Openreach would be hugely controversial and lead to years of legal wrangling.

However, Ofcom and its regulatory predecessors have been the catalyst for the growth of broadband in the UK. First, Oftel forced local-loop unbundling (LLU) on BT in 2001, delivering competition and price cutting in the consumer market. Then in 2005 Ofcom forced the structural separation of Openreach within BT Group to level the playing field for its emerging rivals.

For all the investment BT has made in broadband, it has to be remembered that those significant steps were taken through regulatory pressure, not BT's altruism. For all BT's resistance, Ofcom may yet feel that the logical next step in that process is full separation of Openreach. The regulator doesn't have the power to make that happen, but it can refer the issue to the competition watchdog, which does.

In reality, I suspect Ofcom will not force the issue yet. Instead, it will give BT a final opportunity to prove that a wholly owned Openreach can deliver better customer service, full rural coverage, and investment in a fully fibred future.

For a start, Ofcom might push for full unbundling of BT's fibre to the cabinet (FTTC) services - there are complex technical considerations to doing so, but such a move is likely to generate more competition in FTTC than the current "virtual unbundling" offered to the likes of Sky and TalkTalk.

For all the government's boasts around the roll-out of superfast broadband, adoption of FTTC still lags behind and more competition and price pressure will grow its take-up in the same way that LLU boosted ADSL broadband use.

Ofcom also must find a way to force Openreach to provide better support for smaller fibre broadband firms - the so-called altnets - as well as local community initiatives such as B4RN. These innovative providers should not feel like they are fighting BT at every step in areas that BT has deemed economically unviable for itself.

The regulator also needs to create an environment where telecoms companies are encouraged to invest in fibre to the premises knowing that the return on that investment may be very long term. Let BT sweat its copper for as long as it likes - but we need a long-term plan to replace that copper last mile with fibre, whether BT wants to do so or not.

The Ofcom review will only be the start of a process and its recommendations will be analysed and picked over for many months. But the importance of getting this right cannot be overstated - Britain's digital future is at stake.

Why are so many organisations bringing outsourced IT back in-house?

bryang
To outsource or not to outsource? That, for many IT leaders, has long been something of a religious question. You're either a follower or you're not.

But we are no nearer to answering the question of whether outsourcing works. In recent months, Computer Weekly has talked to numerous large organisations that have brought large-scale outsourcing arrangements back in-house with enormous benefits.

AstraZeneca has saved $350m a year from its IT budget by insourcing. The DVLA expects to save £300m over 10 years from a similar exercise. Daimler anticipates €150m annual savings. Even General Motors, which practically invented large-scale IT outsourcing and was owner of outsourcing giant EDS for many years, is slowly insourcing about 90% of its previously contracted-out operation.

Clearly if you can make savings of that size, there is something fundamentally wrong with the outsourcing model at scale.

And yet, in the last six months of 2015, UK public sector outsourcing leapt 55% compared to the first half of the year, and across the year was up 26%. In contrast, private sector outsourcing spend fell by 42% during the second half of 2015 - but was still worth £688m in new deals.

In the public sector - and in particular local government - there seems more confusion than ever. In Whitehall, the Government Digital Service issued an edict against large-scale contracts and is encouraging moves to bring IT in-house. And yet it's also conducting a review into how to handle the billions of pounds worth of outsourcing deals set to expire in the course of the current parliament.

In local government, we've seen Cornwall, Dorset, Bournemouth, Liverpool, Birmingham and others move away from outsourcing - often after a realisation that their suppliers are unable to deliver austerity cuts and still make a profit. But we've also seen many councils signing up to new long-term contracts on a promise of delivering cuts.

Of course, a big part of outsourcing success relies on the buyer being an intelligent customer. But too often, organisations outsource the very IT skills they need to manage suppliers effectively.

Like so much in the digital world, IT chiefs need to take a much more granular approach to sourcing. It may be that certain routine, predictable, process-oriented IT tasks are very well suited to being run by a specialist outsourcer at scale. But you might struggle if your customer-facing web or mobile software shop is run by a supplier when you need fast response, agile development and rapid iteration.

Digital transformation is a huge challenge for traditional outsourcers and a threat to their business model. Those suppliers need to go through a lot of change - belatedly - or they will be swept away by smaller, more agile alternatives.

Outsourcing has its place - but the "all or nothing" approach is surely dying.

Privacy Shield is no solution for data protection - EU should put personal data in the hands of its citizens

bryang
Many US and European businesses no doubt breathed a sigh of relief when the European Commission announced it had agreed a basis for replacing the defunct Safe Harbour data protection agreement with the US government.

Now we have a Privacy Shield in place to ensure that European citizens' personal data is subject to comparable data protection principles when transferred into a US-located database. At least, that's the theory.

However, as privacy campaigners point out, there seems little more to the new arrangement than a letter from America promising that EU residents' data will not be subject to mass surveillance. "Honest it won't," said a draft of the letter. "We really promise, pinky swear, that we won't use it for mass surveillance. Because hey, we don't do mass surveillance anyway! (Hey Chuck, do you think they'll notice we had our fingers crossed?)"

OK, maybe that's not what the letter said after all. But given the US government's attitude to data privacy, a nicely worded note might not prove to be the basis for a lasting and secure agreement.

"A couple of letters by the outgoing Obama administration is by no means a legal basis to guarantee the fundamental rights of 500 million European users in the long run, when there is explicit US law allowing mass surveillance," wrote campaigner Max Schrems, whose legal case brought about the demise of Safe Harbour.

But the underlying problem has not been addressed - that current approaches to data protection remain a 20th century attempt to solve a very 21st century problem; analogue solutions in a digital world.

Data protection laws are entirely predicated on the assumption that corporations and governments hold all our personal data, and are thereby granted conditional rights to use that data as they wish. It's based on the concept that all the data that matters is held in big databases, somewhere apart from the person whose data has been collected.

The UK Data Protection Act, for example, says that we have the right to see a copy of the information that an organisation holds about us and how it is being used - but only if we ask nicely in writing and pay a fee.

How quaint. Surely the digital solution is the right to log in to an organisation's website and see all our data and how it is being used - and moreover, to be able to edit or remove it if we wish?

Data protection remains a database-centric approach to regulation, when the "digital way" is data-centric - a very important distinction. Data-centric means that laws and IT systems start from the data itself, not from a centralised place in which that data is aggregated with everyone else's.

There has been a lot of discussion about personal data stores - still an emerging technology that allows us to hold all our relevant information in a location controlled by us, from which we set the rules and permissions about who can access our data and for what purpose. It's the data equivalent of an online bank account, through which we control who can take our money and for what purpose.

This data-centric approach puts control in the hands of the individual and negates the need for international data protection agreements because if you're happy for your data to be accessed or stored by a US company, that's your decision. You might trust Google, but not Facebook - the choice should be yours.

If all you want is to allow access for the purpose of a single transaction, you could say that too - so the company uses a copy of the data to complete the transaction, then deletes it. Maybe if they offer you a good discount, you might be willing to let them keep a copy for a while? After all, why should a company's desire for data analytics to better target its marketing at you be a reason for them to keep all your data as they see fit? The choice should be yours.
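As a rough illustration of that permission model - every name here is hypothetical, and a real personal data store would add authentication, encryption and audit trails - the individual holds the data and hands out scoped, revocable grants:

```python
class PersonalDataStore:
    """Toy sketch: the individual holds the data and grants scoped access."""
    def __init__(self, data):
        self._data = data
        self._grants = {}  # company -> set of fields it may read

    def grant(self, company, fields):
        # The individual decides which fields a company may see.
        self._grants.setdefault(company, set()).update(fields)

    def revoke(self, company):
        # ...and can withdraw that access at any time.
        self._grants.pop(company, None)

    def read(self, company, field):
        if field not in self._grants.get(company, set()):
            raise PermissionError(f"{company} may not read {field}")
        return self._data[field]

store = PersonalDataStore({"name": "Alice", "email": "alice@example.com"})
store.grant("Google", {"email"})
assert store.read("Google", "email") == "alice@example.com"
store.revoke("Google")  # the choice stays with the individual
```

The point is not the code but where the control sits: access is granted outwards from the citizen, rather than negotiated between governments over databases the citizen never sees.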

The Safe Harbour / Privacy Shield row only serves to show that legislators are a very long way from understanding the potential to put control of our data back into our own hands. A truly digital solution would be to legislate for the introduction of technology - whether personal data stores or something else - to make that happen. The choice should be yours.

Google tax debate is just an early example of social upheaval from the digital revolution

bryang
In many ways it is a shame that the most likely reason for a technology company to hit the headlines in national newspapers and on TV is that it doesn't pay enough tax.

For all the largely justified criticism of Google's £130m tax settlement with the UK government, surely most people in the technology community would rather be reading about the great innovations from Google et al, and how they are changing the way we live and work for the better (mostly).

But the tax argument is nonetheless one that demonstrates the scope of the technological disruption facing society and how the digital revolution will force governments, companies and individuals to re-evaluate many of the old norms we have taken for granted.

At its heart, the Google tax debate rests on an accounting principle, not a technical one - the ability of country subsidiaries of a multinational corporation to charge for services between themselves in order to shift the point of profit to a lower-tax regime. It surely can't be beyond lawmakers to legislate to restrict those internal transfer fees so that profits are more accurately recorded at the point of consumption of goods and services, not of delivery.

But until technology made it so easy to split consumption and delivery of digital services, it was barely an issue.
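A toy calculation makes the mechanism concrete - the figures and tax rates below are invented purely for illustration, not taken from any company's accounts:

```python
def tax_due(revenue, costs, internal_fee, rate_local, rate_haven):
    """Toy transfer-pricing model: the local subsidiary pays an 'internal fee'
    to a sister company in a low-tax jurisdiction, moving profit there."""
    local_profit = revenue - costs - internal_fee
    haven_profit = internal_fee
    return local_profit * rate_local + haven_profit * rate_haven

# £100m local revenue, £60m real costs, 20% local rate, 5% haven rate
no_shifting = tax_due(100, 60, 0, 0.20, 0.05)     # all profit taxed locally
with_shifting = tax_due(100, 60, 35, 0.20, 0.05)  # £35m 'licensing fee' charged internally
print(no_shifting, with_shifting)  # 8.0 vs 2.75 - same business, far less tax
```

Nothing in the transaction with the customer changes; only the internal fee does - which is why the argument is an accounting one.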

When Apple launched the mobile app era in 2008, who could have predicted that one outcome just eight years later would be French taxi drivers on strike and burning tyres on a Paris ring road because a piece of software - Uber - is threatening their livelihoods?

For all the excitement about the great innovations we are using and those yet to be created, it's almost impossible to predict the true social and business impact they will have - only that there will be huge upheavals as we absorb these new capabilities into our lives.

We hear plenty of scaremongering about the potential effects of robots and artificial intelligence, but think too about more mundane developments like the internet of things and how that could empower individuals like never before with information about the world around them.

Governments and legislators will always react too slowly even with so much evidence of the pace of digitally inspired social change in front of them. There are digital King Canutes everywhere - but a disturbing concentration of them among our leaders, even as they hope for the glory by association that comes from close involvement with digital innovators - look at David Cameron and George Osborne's very public fondness for Google.

The only certainty is that taxation will not be the only social tenet that is challenged by technological change in the years ahead. As a society, we need to be prepared for further upheavals that few are likely to predict.

We need coordination between old economy job cuts and digital economy job creation

bryang
BT has one of its main contact centres in a tower block in Swansea city centre - it's the highest office building for miles around, just a short stroll to the sea front. From its upper-floor windows you can see across Swansea Bay to the lights and smoke from the giant steel works in neighbouring Port Talbot.

Today we've heard that BT is creating 100 jobs in that Swansea office. We also learned that, sadly, Tata is culling 750 jobs at its steel plant across the bay.

If ever there was a situation demonstrating the benefits of better co-ordination between old-economy job losses and digital economy job creation, it's that South Wales microcosm.

I have no idea how much steelworkers earn compared to BT call centre workers, but it's certain there will be no local demand for unemployed people with steelwork skills. Surely it's not that difficult to put some of those redundant Port Talbot staff in touch with BT tomorrow? With some training and new skills, 100 people could move from a declining industry to a growing one.

This is a scenario that will be replicated across the country over the coming years.

We also heard today from the World Economic Forum (WEF) in Davos that its research suggests seven million jobs could be lost as a result of technological developments in major economies in the next five years, part of what WEF calls the "fourth industrial revolution".

Meanwhile, the European Commission tells us that Europe needs to find a further 900,000 skilled IT workers by 2020.

It's getting a bit boring writing about this - see here and here, for recent examples - but surely it is not that difficult to achieve some form of co-ordination to take people losing their jobs in one industry and retrain them with skills needed in the technology sector?

You would like to think the government might do so - but there's no evidence the current UK government has any inclination for such an initiative. You would hope at least that local authorities could team up in their regions to make something happen - but most are focused elsewhere and struggling under austerity cuts. It would be nice to think that big business might take an interest - they stand to benefit, after all. But nothing happens.

We talk endlessly about skill shortages in IT when the truth is we have a training shortage - both in training new entrants from other industries, and in training existing people in IT with the new digital skills that are most in demand today.

Almost every time I write something about IT skills shortages I hear in response from an out-of-work IT expert complaining that they can't find a job - often it's because their skills are different from those being recruited by IT employers today. They could do with help in retraining too.

Wouldn't it be great if someone picked up the phone at the BT office in Swansea - it is a call centre after all, there are plenty of telephones there - and put in a local call to HR at the Tata steelworks about those jobs?

Wouldn't it be even better if there was somewhere every redundant worker could go to be retrained in digital economy skills and help them find a job with a future?

The government's patrician approach to privacy risks a spiral into ever greater surveillance

bryang
It must be 15 years since the first time I wrote the phrase, "Privacy will be one of the defining challenges of the internet age". In the intervening years, that challenge has grown enormously.

Allied with privacy is trust, and it's clear nobody trusts a word the government has to say on the subject. Home secretary Theresa May's recent appearance in front of a committee of MPs examining her Investigatory Powers Bill showed as much.

She insisted that the UK government does not conduct mass surveillance of its citizens, for example, provoking widespread cynicism across social media in the light of Edward Snowden's revelations about GCHQ snooping.

May also insisted that government has no plans to ban encryption or legislate for the introduction of backdoors. However, she did say that companies would be expected to act on a lawful warrant ordering the disclosure of data to the authorities - something that many providers who offer end-to-end encryption would find very difficult, if not impossible.
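To see why disclosure is so hard for such providers, consider a toy sketch - illustrative only, using a one-time pad rather than the authenticated encryption a real messaging protocol would use. With end-to-end encryption, the provider relays and stores only ciphertext, so even a lawful warrant served on it cannot yield the plaintext:

```python
import secrets

def xor(data, key):
    # Toy one-time-pad XOR; illustrative only, not a real messaging protocol.
    return bytes(a ^ b for a, b in zip(data, key))

# Sender and recipient share a key; the provider never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))
ciphertext = xor(message, key)          # this is all the provider relays or stores
assert xor(ciphertext, key) == message  # only the endpoints can decrypt
assert ciphertext != message            # a warrant on the provider yields only this
```

The provider could hand over everything it holds and still disclose nothing readable - short of weakening the encryption itself, which is exactly the backdoor May says is not planned.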

There is widespread agreement that security services need to access internet data to help keep the UK safe. The challenge, of course, is where to draw the line between necessary surveillance and acceptable levels of privacy.

Law enforcement inevitably feels that the loss of privacy is worth it. An FBI cyber security expert recently told me exactly that - ensuring security is a price worth paying and citizens should accept that governments will be able to store, access and analyse their data. But he would say that, wouldn't he? Such an attitude will make most of us feel deeply uncomfortable.

The more authoritarian governments become in their surveillance, the more the technology community will find ways to protect people from such intrusion - new forms of encryption, additional layers of data security - that make it harder to intercept our data, which in turn pushes governments to introduce even more extreme legislation.

There is a real danger of an ongoing spiral into ever greater surveillance powers.

The Investigatory Powers Bill will be an important landmark but will not be the end of the privacy debate. As individuals inevitably become more security-aware, and technologies emerge to give us more control over who can see our data and what they can do with it, this will become an ever more personal issue for all of us.

Governments will need to be less patrician about how they approach privacy and surveillance, and do more to earn our trust. They can be sure that technology will give us all much greater control of our data, and without that trust the essential work of the security services only becomes harder.

Will 2016 be the year we find out which 'traditional' IT suppliers survive the digital revolution?

bryang
The combined revenue of the five biggest global corporate IT suppliers declined by more than $12bn over their past four financial quarters - at a time when the use of technology worldwide is booming.  

That $12bn isn't money that has disappeared from the market - it's simply being spent elsewhere. A growing number of IT leaders are turning to new sources of innovation - whether from startups, fast-growing firms in emerging technology areas such as cloud, or greater use of open source and in-house resources.

So perhaps the biggest question for 2016 is whether the traditional IT suppliers - the ones you formerly couldn't get sacked for buying from - are able to reverse that slide by proving they can compete in a new digital world. If not, then by the end of this year we may be preparing a series of corporate obituaries for their slow, inexorable, inevitable declines.

Look at the corporate structures of some of these old behemoths and read their marketing messages - it's all product, product, product. Yet there has rarely been a time when IT leaders are less interested in the latest product.

For most of the history of corporate IT, change has been driven by the newest products from the big suppliers. You all remember the acronyms - ERP, CRM, BPM and all the others. Each represented a new wave of software products, and a new source of income for their suppliers. Microsoft Windows was always the classic example - product sales driven by whenever Microsoft decided to release a new version.

That's left many of those suppliers in the unenviable position of spending most of their multibillion-dollar research and development budgets on simply developing new functionality for their existing products - and their sales reps forced to push those products endlessly.

As a result, the amount of genuine innovation coming from those companies has dwindled - how many game-changing technologies have they produced in the last five years? And they all missed cloud, mobile, big data, social media - and will miss many future trends too.

The balance of power has changed, but traditional IT suppliers are having the same conversations with IT leaders they were 10 years ago. Those IT chiefs - dependent as they are on the suppliers' legacy systems, for now - listen politely, but increasingly take their transformational digital spending elsewhere.

Some observers would say it's already too late for many IT dinosaurs - and in some cases, they're right. But there is no doubt the clock is ticking and the asteroid is approaching for all of them. By the end of this year, the purchasing decisions of IT leaders will give us a clearer view of their fate.

GDS must respond openly and honestly to NAO criticism

bryang
MPs are often at their most lyrically creative when they sit on the Public Accounts Committee (PAC) and are given free rein to criticise those they see as miscreants wasting taxpayers' money.

So this week we saw yet another problem IT project under scrutiny, described as "some form of a Greek tragedy" and a senior government IT chief labelled as "Mr Fancypants".

We also saw an unprecedented interjection from the head of the National Audit Office (NAO), Amyas Morse, stressing how serious and unusual it was for the NAO to criticise the personal behaviour of senior individuals involved in the project.

The programme in question was the new digital service for the Rural Payments Agency (RPA), which was suspended earlier this year in favour of paper forms, is now 40% over budget and risks costing the UK millions more in EU fines.

We're all depressingly used to reading NAO reports critical of government IT projects, and many of the findings were wearily familiar to anyone who has read similar reports over the years. But the most disturbing thing is that, by now, it was all meant to be different.

The Government Digital Service (GDS) was not alone in receiving criticism from the NAO, nor can it be solely to blame. But for all the good things GDS has done elsewhere, the RPA project was the first and biggest programme, with the sort of complexity long associated with government IT, that GDS was brought in to resolve.

Indeed, previous RPA projects that failed expensively were held up as precisely the sort of thing GDS would ensure never happened again. Yet here, GDS's involvement seemed only to make the situation worse.

Last week, the NAO also reported on the failed e-Borders project - an initiative that first went wrong before GDS was created. But problems still continue around the replacement for the UK's ageing border systems, and while the NAO did not specifically mention GDS's more recent involvement, insiders say that borders sits alongside RPA as the sort of failure that was not meant to happen now GDS is on the scene. The NAO report will be discussed by PAC next week - expect more lyrical fireworks.

GDS has just received £450m to drive the digital transformation of government, and will be a major influence in the £1.8bn given to Whitehall departments for the same purpose. This is, of course, a good thing.

But RPA and e-Borders have raised serious questions about GDS's capabilities in more complex digital projects, as well as highlighting the lack of digital skills across the whole civil service.

GDS likes to be open about its successes, but has too often kept quiet about its failures. With billions of pounds of taxpayers' cash to be spent on digital projects in the next five years, GDS needs to show that it has learned the painful lessons of RPA and e-Borders, and to do so publicly and honestly.

All of us need to play our part to influence the role of technology in the UK economy

bryang | 2 Comments
| More
Influence is a wonderfully subjective measure by which to gauge successful people in the UK technology scene.

Influence can be negative as well as positive. You might be influenced by a particular IT leader, while others may see the same person as thoroughly lacking that quality.

But there is little doubt that influence is key to growing the UK's digital economy - influence over politicians, over boardrooms, business strategies, regulators, investment decisions, skills and recruitment; all of these are needed for technology to expand its own influence on UK growth and public service delivery.

Not everyone agrees with the choice of Computer Weekly's readers and our expert panel of independent judges that BT CEO Gavin Patterson is the most influential person in UK IT over the next 12 months - and in these days of social media, they're happy to tell us so. But plenty of people agree completely, too.

BT has often been a source of controversy and generates strong opinions, but as the UK's biggest indigenous technology company you can't deny its influence. We think that the Ofcom review of the UK communications market and its potential effect on BT will be one of the most significant events of 2016, critical to the future of the digital economy and our mobile and broadband ecosystem. Others feel differently - and that's great.

Every year we hope to stimulate debate with the latest UKtech50 list of the most influential people in UK IT, because this is a debate that affects us all. While the influence of technology in the UK economy has grown inexorably and inevitably, there is still much more to do.

We need to be talking about how to influence public policy and corporate strategy to take advantage of the digital revolution. We need to discuss how technology is changing society for good and for ill; about how our personal data should be used and protected; about the boundaries between privacy and security. And we need role models to lead the way and to encourage more people to join the profession to make sure we have the skills needed to make the UK a world leader.

We're proud to laud Gavin Patterson as the most influential person in UK IT, but the other 49 people on the list are just as important, as are thousands of others making decisions every day on the future of technology. Between us, let's make the tech community the most influential voice it can be.

GDS gets a £450m budget boost - and a £3.5bn incentive to prove digital really works

bryang | No Comments
| More
Even people close to the Government Digital Service (GDS) seem surprised - pleasantly so - by the announcement of a £450m budget over the rest of this Parliament.

While it's still not clear exactly how that cash will be allocated, it's a far cry from expectations in the months leading up to chancellor George Osborne's spending review. From the gloom and despondency of the summer when former GDS chief Mike Bracken and his senior lieutenants quit amid rumours of huge cuts, GDS has received its biggest ever budget boost and a commitment for the next four years.

As recently as September, GDS executives were expecting to be "turning down the volume". Osborne just turned it up to 11.

Assuming the £450m runs from the 2016 to 2020 financial years, that's a 94% increase from the most recent £58m a year. And in a further surprise, that GDS budget is in addition to the £1.8bn allocated by Osborne for digital transformation across Whitehall departments - early assumptions were that GDS was part of that figure.
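That 94% figure can be sanity-checked with some quick arithmetic - the £450m total and £58m annual budget come from the figures above, while the even spread over four financial years is, as noted, an assumption:

```python
# Rough check on the GDS budget increase, assuming the £450m
# is spread evenly across the four financial years 2016-2020.
total_budget_m = 450      # £450m over the rest of the Parliament
years = 4                 # 2016/17 through 2019/20
previous_annual_m = 58    # most recent annual budget, £58m

new_annual_m = total_budget_m / years  # £112.5m per year
increase = (new_annual_m - previous_annual_m) / previous_annual_m

print(f"New annual budget: £{new_annual_m:.1f}m")
print(f"Increase: {increase:.0%}")  # roughly 94%
```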

So Osborne has given significant backing to both GDS and the wider digital programme across government - but now the pressure is really on to deliver the promise of digital change. For that £450m, Computer Weekly has learned that GDS is expected to return at least £3.5bn in savings; HM Revenue & Customs is spending £1.3bn on its digital strategy and must return £1bn every year in additional tax revenue.

The government has claimed big savings from digital during the last Parliament - Cabinet Office minister Matt Hancock mentions a figure of £3.5bn. But that's a highly contentious claim - for one thing, it's compared to a "2010 baseline", meaning it is money that would have been spent if government still worked like it did in 2010 under Labour.

Some of the savings are clearly genuine - Gov.uk costs a lot less to run than the multitude of websites it replaced - but much is clever accounting. For example, £600m of savings in 2014/15 was attributed to the spending controls introduced by GDS. But what that's actually saying is - "Someone wanted to spend £100m on an IT project, we said no, and they spent £20m instead; therefore we saved £80m". It's not a "saving" in terms of reducing the amount of money government used to spend - it's a saving compared to what it would have spent if the controls did not exist.

GDS is working on a new business plan, expected by the end of the year, which will give more detail on how that £450m will be spent. But there can no longer be any doubt about this government's commitment to a strong centre for digital government, technology and data - a commitment questioned in August by Mike Bracken, and which led to his departure.

GDS and its digital advocates have said all along that the potential benefits are huge - they now have four years to prove it.

Software is never perfect - and that includes the Post Office's controversial Horizon system

bryang | 2 Comments
| More
Software goes wrong. Every developer knows that. Even the most thoroughly tested piece of software can encounter an unexpected set of circumstances that causes it to behave in an equally unexpected way. Sometimes those cases can be so unusual, they are impossible to replicate.

It is difficult to believe that any large-scale, complex software application is entirely and completely free of any possible flaws arising from unforeseen circumstances, no matter how well it performs in the vast majority of usage.

This, essentially, is at the heart of the ongoing dispute between subpostmasters and the Post Office over its Horizon IT system.

The Post Office has consistently said there are no systemic flaws in Horizon, and certainly none that would have caused the accounting discrepancies that led to subpostmasters receiving fines and even jail terms for alleged false accounting.

The organisation has pointed out that affected postmasters are a "tiny" proportion of the number who use it successfully to process millions of transactions every day.

And in turn, that is exactly the point that campaigners make in response - that all it takes is a tiny number of unexpected, unusual circumstances that perhaps cannot be replicated. There are about 11,500 sub-Post Offices in the UK, and just 150 subpostmasters in the Post Office mediation scheme - that's 1.3% - although many others claimed to have been affected.

Many businesses would be pretty happy with a 98.7% success rate for their core software - but it would take just one problem among the thousands of otherwise successful transactions handled by each of those 150 people, which implies a far lower failure rate per transaction.
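The gap between a per-office rate and a per-transaction rate can be illustrated with some rough arithmetic. The office counts are the figures above; the transactions-per-office number is a purely hypothetical assumption chosen to make the point:

```python
# Why a per-office failure rate overstates the per-transaction rate.
sub_post_offices = 11_500   # approximate number of UK sub-Post Offices
affected_offices = 150      # subpostmasters in the mediation scheme

office_rate = affected_offices / sub_post_offices
print(f"Per-office rate: {office_rate:.1%}")  # about 1.3%

# Suppose (hypothetically) each office processed 100,000 transactions
# over the period, and each affected office hit just one bad one:
transactions_per_office = 100_000
per_transaction_rate = affected_offices / (sub_post_offices * transactions_per_office)
print(f"Per-transaction rate: {per_transaction_rate:.9f}")
```

On those assumptions the per-transaction failure rate is orders of magnitude below the headline 1.3% - small enough that a genuine, unrepeatable fault could plausibly hide in it.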

The Post Office says, "The Post Office takes its responsibilities towards its postmasters extremely seriously and wholeheartedly rejects any suggestion to the contrary.

"Neither the Post Office nor other parties have identified any transactions caused by a technical fault in Horizon which have resulted in a postmaster wrongly being held responsible for a loss."

And they are correct - none have been identified in those cases. But that doesn't necessarily mean that in 1.3% of sub-Post Offices, there wasn't some undetected, unrepeatable problem that affected Horizon - user error, a power spike, a momentary hardware glitch, coffee spilled on a keyboard.

This week, Computer Weekly revealed the Post Office knows about a recent flaw that can cause accounting errors, and it's being fixed. So it is possible for a problem in Horizon to occur that could lead to a similar situation to that faced by the affected postmasters. But, as the Post Office stresses, there is no evidence to show that it did so in their specific cases.

The lesson for all is that no organisation should assume that its software is perfect.
