Amazon Web Services is showing traditional IT players how they need to change

Amazon Web Services (AWS) is clearly doing something right. The e-commerce giant has split out AWS revenues for the first time in its latest financial results, revealing a $5bn business growing at nearly 50% year on year.

AWS has shown the big, traditional IT players the way to do public cloud - defining the market for infrastructure as a service (IaaS) and platform as a service (PaaS) along the way, and forcing the likes of IBM, HP, Oracle and Microsoft to respond. AWS is far and away the dominant public cloud player, and when you consider that it is also Amazon's most profitable division, the scope for further growth, innovation and lower prices suggests it is still in the early stages of its development as a business.

Perhaps unsurprisingly, there's a certain resentment towards AWS in parts of the IT industry. It has not played the game by the same rules as its increasingly distant competitors. It has constantly cut prices - the more customers AWS has, the bigger the economies of scale, and the lower the unit cost for every customer. Every new AWS user is, eventually, helping to cut costs for every other user. You're not meant to do that as a traditional IT supplier - imagine if the cost of software licences fell when more customers bought that software; it just doesn't happen.

AWS has done very little marketing, relying on word of mouth among IT leaders. It's common at Computer Weekly events for our CIO guests to be heard advising their peers, "Why don't you just put it on Amazon?" AWS users seem not only to be happy, but to be happy to tell others about it too.

AWS applies a retail mindset to the provision of technology services - the "pile it high, sell it cheap" approach behind its e-commerce success. It exploits the commoditisation of IT to develop new products and services that build its ecosystem using the same commoditised pricing. And it encourages others to use that ecosystem to build their own open services that can be fed back into the ecosystem. It's a bit like Microsoft asking third-party Windows developers to include all their applications in Windows, for free.

Amazon has achieved $5bn of cloud revenue at a time when there are still widespread fears about cloud - related particularly to security and data protection - that prevent many large organisations, especially in heavily regulated sectors like financial services, from moving to public cloud. But those fears will be overcome; the sceptics will be convinced; the laggards will be forced to catch up. A tipping point is approaching.

AWS has proved, so far, to be an impressive technology business, and its potential to further shake up corporate IT is huge. But we do need more competition - a challenge the rest of the industry must better respond to. As long as Amazon Web Services continues to do things in ways the traditional IT players find anathema, it is going to keep eating into rivals' profits and embedding itself into IT leaders' strategic plans.

The missing election manifesto promise - educating MPs on the UK's digital opportunity

Who is going to be the big digital winner after the 2015 general election? Congratulations to anyone involved with smart ticketing on public transport - you're in for a good time. The one area to which all three main parties - the Conservatives, Labour and the Liberal Democrats - have committed in their election manifestos is nationwide smart ticketing, particularly on the railways. Who knew?

As for the rest of the technology sector, while there is not quite the same universal commitment as for those lucky smart ticketing types (and by the way, sign up whoever does their lobbying now), there is enough common purpose to get a view of the digital priorities of the next government, whoever they may be.

High-speed broadband roll-out is assured - if you're among the 95% of easiest-to-reach properties, at least. If you fall outside that area, best vote for the Lib Dems and their promise of 99.9% household availability.

The future of the Government Digital Service (GDS) is secure - even if Labour has taken the opportunity to score a few political points in blaming the outgoing coalition for GDS failing to hit its target of having 25 "digital by default" exemplar services live by the election. The Lib Dems' manifesto was the only one to specifically mention extending the GDS remit to local government, but Labour has previously made a similar promise and Tory chancellor George Osborne made it a Budget commitment last month.

The parties all recognised the importance of backing and investment - in varying degrees - for the UK's science and technology research base, for digital skills and apprenticeships, and for helping to create and grow tech startups.

But the most contentious topic remains that of data privacy and state surveillance. The Lib Dems stand out as the party offering most change, with their digital bill of rights. With Labour and the Tories, we're likely to see continuance of internet snooping by the security services, albeit with slightly more oversight under Labour.

But has any party really given us a vision of a future digital Britain? Not really - it's mostly different flavours of the same trends seen across the past five years. There are still so many more things we could do for the benefit of us all, if only our political leaders had a grasp of the potential.

So here is one policy we would like to see delivered by the time we come to the next scheduled election in 2020 - for politicians to be educated in the true opportunity for technology to radically transform the UK for the better.

Would GDS like to airbrush rural payments out of its digital exemplar history?

Last month, I wrote in this blog about the problems that caused the new rural payments digital service to be withdrawn from use by farmers and replaced by paper forms. The system was one of the 25 "exemplars" with which the Government Digital Service (GDS) intended to showcase its digital transformation drive.

GDS needs to be challenged to live up to its mantra of "make things open, it makes them better" by being open about the problems, what caused them, and what is being done to resolve them.

Just today I came across the latest GDS quarterly progress report, published on 27 March, a week after the rural payments service was withdrawn. It was released quietly, without fanfare, in the run-up to the dissolution of Parliament, and GDS, along with the rest of the Civil Service, is in "purdah" during the election campaign and unable to respond to journalists' queries. I haven't seen any other publications cover the quarterly report, so it seems little attempt was made to publicise its release.

If you read the report, you would think that the rural payments problems had simply never existed. You might even fail to realise that rural payments was one of the critical exemplar services in the first place.

A section of the report titled "Service transformation" highlights that "seven more exemplar services went live this quarter" - but the list fails to include rural payments, which went live in January.

The report continues: "By the end of March, a total of 20 exemplar services were available for public use. Four services are in beta development and one is in alpha. Home Office, Department for Work and Pensions, HMRC and Department for Business Innovation & Skills /Land Registry will continue work to deliver these remaining five exemplars, building digital by default services that meet the needs of their users."

No mention whatsoever of the work that Defra is having to do to fix the rural payments problems so the service can be used in future.

Under a section titled "Exemplar projects: what we're learning" the report says: "We want departments to learn from this transformation work and to use exemplars as templates when redesigning their own digital services. We're also looking at how organisational structures and culture need to adapt and staff skills need to improve."

And that's it - no mention of the lessons that may need to be learned from rural payments, one of the exemplar services, and why it failed to meet user needs.

Eventually rural payments does get a mention in the report, as one of the first services using the new Gov.uk Verify system for identity assurance - but again, no mention of the problems farmers had using the Verify service.

Admittedly, the report was no doubt close to being finished when the rural payments service was withdrawn, and GDS was under time pressure to publish the report before purdah began a week later.

But the total absence of any mention of the rural payments problems in an official GDS quarterly progress report is extraordinary. Not even one sentence, added at the last minute, to acknowledge what happened. It leaves GDS open to accusations that it's not just failing to be open when things go wrong, but seems instead to be airbrushing rural payments out of its exemplar history altogether.


To make Britain digital we need leaders who are part of the network, not apart from it

It's great to see so much debate taking place on social media - and hopefully in the real world - following Martha Lane Fox's Dimbleby Lecture broadcast on BBC1 last night. Regardless of what anyone thinks about what she said, if her talk starts a widespread debate about the role of the internet and the digital economy in the UK, then it was a success.

I completely agree with the principles and aims that Lane Fox outlined - better digital inclusion; more women in IT; tackling ethics and privacy; better broadband; public service reform; and more politicians who understand the opportunities and issues around digital technologies.

I'd like to think these are all issues that Computer Weekly has consistently highlighted. Let's all get talking about them. If anyone can catalyse the debate, it's Lane Fox, with her public profile, drive to succeed, and contacts with business and government leaders.

But I'm far less convinced about Lane Fox's proposal for a national institution to tackle some of these issues - what she has called "Dot everyone". It seems a very old world solution to a very 21st century challenge. It risks accusations of elitism - gathering the digerati into one great public body to tell everyone else how great digital is.

Let's not forget that the internet became what it is due to ground-up support - nobody in positions of power or influence decided that we would all use the web, that it would become so central to everyday life for so many people. If some great public body had said that 20 years ago, it would probably have doomed the internet to failure.

Equally, the criticism of today's internet giants - Google, Facebook, Apple and the rest - and the idea that they need an institutional counterpoint ignore the fact that those companies became giants because we all use them. That doesn't absolve any of them from criticism as they exhibit increasing tendencies towards corporate megalomania, but the great thing about the internet is not only that anyone can use it, but that anyone can stop using it. Facebook grew from nowhere because it engaged people; if we all get fed up with it and use something else, we can still make it go away.

To me, the idea of a national institution places it apart from the people it wants to influence. If there is one thing the internet has taught us, it's that in a digital world, leaders need to be a part of the network, not apart from it.

The real challenge is that politicians need and expect a hierarchical society - indeed, they will do everything they can to protect and maintain it. If the social impact of the internet is truly ground-up, then at some point the irresistibly rising digital tide will meet the immovable hierarchical rock of institutionalised establishment, and then things get really interesting.  

Perhaps proponents of a national digital institution will say its function is just that - to be the bridge between those two forces, engaging with the establishment in terms it can understand, while empowering the network to enable change. But the danger is an establishment-backed institution instead becomes a barrier to keep the hierarchies of power at the top. We already have plenty of public institutions, and you only have to look at some, like the BBC or the NHS, to see what happens when the establishment decides it doesn't really like them the way they are.

In the UK we have yet to see the emergence of the sort of ground-up political movement that can only exist in a digital world. The closest we've seen elsewhere is the Pirate Party, which rose to prominence in libertarian Sweden and has gathered a following more widely, winning seats in the European Parliament and 5% of the popular vote in Iceland's 2013 election.

Perhaps that suggests the UK has not reached the digital maturity needed for that sort of change and that degree of challenge to the establishment. Plenty of people (me included) throw around the phrase "digital revolution" very easily and carelessly, when perhaps the natural process of social change is more measured. But that change will come - inevitably, inexorably, unstoppably.

As the UK gears up for what promises to be the most dramatic and unpredictable general election in a generation, we can already see signs that the public is turning against the old way of things with its rejection of two-party politics. Nonetheless, we know the country will still be led by one of only two men - both steeped in their own form of establishment background. The stirrings of ground-up change - or at least, the desire for change - are there.

A country governed by the principles of the network, not the hierarchy, would solve many of today's economic challenges. A new, federated model of central and local government to address devolution, enabled by "government as a platform" technologies, is just one example. A national broadband infrastructure built to connect everyone, rather than to serve the commercial needs of one or two semi-monopolistic telecoms suppliers, is another. Investment in digital skills to help tackle unemployment and prepare in advance (for once) for the automation of blue-collar jobs, is yet another. I could go on.

Whoever forms the new administration, the next five-year Parliamentary cycle will see the most digital government ever. It will also see the generation that grew up on the internet reach their late 20s and start to emerge as young business leaders and budding politicians. Many of them will be bashing their networked heads against the hierarchical walls of establishment. It's going to happen.

So let's keep going with the debate that Martha Lane Fox has started; let's make noise, make headlines, broaden the network, engage with everyone. In her lecture, Martha challenged journalists and editors to do their bit. That, for sure, is a challenge I hope we can rise to.

After rural payments embarrassment - the test for GDS is to 'make things open, make it better'

The failure of the rural payments digital service last week - and its subsequent replacement by paper forms - is not the IT disaster some would claim, but it is an embarrassment for the Government Digital Service (GDS).

The project, budgeted to cost around £154m, has not seen all that money wasted. Only £73m has been spent so far, and very little of that has been wasted either. The system will still be developed, and will be used for next year's round of farming subsidy claims - or that's the plan at least.

The cost of failure this year amounts to the unbudgeted processing of paper forms, and the cost and effort involved in trying to correct the performance problems that have so far proved insurmountable. That's probably several million pounds that nobody wants to have spent, but it's tiny compared to past IT disasters.

GDS chief Mike Bracken's words in a speech to the Institute for Government in October last year are worth quoting now:

"No policy or service we civil servants think up will ever work in practice the way we thought it would in theory. We must start out humble, and rapidly iterate in response to the messy reality of real users using real services," said Bracken.

"We should say to critics in the media or elsewhere that failure is an essential part of government, just as it is in private enterprise. And the cost of failure should be tiny, dwarfed by its rewards," he said.

"The cost of failure is only enormous if you plan to launch with a big bang on a fixed date in a couple of years' time, with the world's media and public watching - but before you've really started the work to understand how to best meet the needs of the people who will use the service. Big bang was fine in 1986. It is a disaster waiting to happen in 2014."

Those apposite words frame the three issues that need to be addressed in the light of the rural payments problems.

1 - How does agile work with immovable deadlines?

For many years, one of the most frequent causes of government IT failures was the immovability of political deadlines. When the tax credits system fell over under the Labour administration, it turned out that testing was bypassed to meet the political deadline set by then-chancellor Gordon Brown. That's a great example of what Bracken meant when he talked about avoiding fixed dates and big bangs. But the "messy reality" of government is that some deadlines will always be fixed.

If it weren't for an EU deadline of 15 May (since shifted to 15 June) for farmers to make claims under the new Basic Payment Scheme, the rural payments system might have had time to be fixed. In the end, that fix didn't happen in time. GDS is left red-faced because farmers have been complaining about performance problems with the digital mapping tool since they started using it earlier this year.

Few would argue that avoiding big bang launches is a bad thing. Similarly, few would argue that the iterative approach preferred by GDS's agile strategy is a bad thing. But in government, there will always be fixed deadlines, like it or not - and GDS needs to show that the iterative approach can still work in such circumstances when there are problems along the way.

2 - Was this a prototype or not?

The digital service launched to farmers was, in GDS parlance, still only a beta version. According to the GDS Service Design Manual - the current bible for government IT developments, mandated by Cabinet Office minister Francis Maude - a beta is: "A fully working prototype which you test with users. You'll continuously improve on the prototype until it's ready to go live, replacing or integrating with any existing services." Only after a service has passed its public beta phase is it classified as a "live" system by GDS. But farmers were told that this was the mandatory route for making their claims.

The rural payments service was launched to all 110,000 farmers and 1,200 land agents to be used for the very live act of applying for their annual subsidy payments. As one source put it: "Beta is bullshit in this context".

Farmers wouldn't understand the digital concept of alphas and betas - all they wanted was a system that worked. Why wasn't the service launched as a beta to a smaller group of users - preferably the more digitally literate - who knew they were using a test service, while planning from the start to offer a paper-based alternative for the remainder of users? That way the developers would have learned, and could have better prepared for scaling up the service and for what Bracken has called the "edge cases" of farmers who need more digital assistance or live in rural areas with poor broadband.

Iteration, and learning as you go along, is commendable - but was it appropriate for the circumstances here, with a system launched to the entire user base, as the only option for making a claim, with a fixed deadline ahead?

3 - Make things open

Mike Bracken also wrote last year about "making things open, making things better". One of GDS's most prized - and widely applauded - principles is "make things open". It refers to open source, open standards, coding in the open, and an open culture, with GDS staffers regularly blogging about the projects they work on, in a reversal of the civil service's historic culture of secrecy.

But now, when something has gone wrong, the shutters seem to have closed. The Cabinet Office press office suggested Computer Weekly talk to Defra, the department responsible for the policy. The Defra press office said we should talk to the Cabinet Office.

Now is the time for GDS to be completely open about what has happened. There is a growing perception among farmers and the media that the rural payments service is a failure, that the money has been wasted, and that all the work done so far has been abandoned. In a vacuum of information from GDS and Defra, rumour and speculation turn into damaging fixed perceptions.

Rural payments is arguably the biggest hiccup for the GDS digital strategy so far. Other services that have gone live have generally worked well - register to vote, carer's allowance, power of attorney, prison visits and online driving licence details have all been successful digital launches. But rural payments is perhaps the biggest and most complex of the GDS digital exemplars to reach this stage so far.

The model for this service is classic GDS - multiple, smaller suppliers instead of one or two big system integrators; agile development; multiple off-the-shelf products instead of heavily customised versions. This is what we are told will be the model for much bigger services to come, such as online tax accounts and the Universal Credit digital service - systems that would cause national political repercussions if they failed. It's the model by which the future "government as a platform" will be built.

Furthermore, critics of GDS have warned of an over-focus on the web front-end and user experience, and a lack of attention to the thorny, IT-led area of scaling back-end systems and integrating with legacy IT. We know that the core of the problem with rural payments lay in difficulties between the front-end mapping tool and the back-end rules engine. Servers were hitting 100% utilisation and falling over, which suggests a scaling issue in the back-end software or the integration layer.

But what were the problems exactly? Were the suppliers to blame? What are the next steps to a resolution? What was broken? We just don't know.

That is not "making things open". Openness is to be welcomed, but it cannot only apply to the good news. The true test of openness for GDS is now, when something has seemingly gone badly wrong.

So over to you GDS - make things open, so we can see how you are making things better.







George Osborne set the UK on a digital roll - now a new government must accelerate the momentum

It's rare that I'm inclined to congratulate George Osborne, but recognition is due to the Chancellor for making his latest budget the most tech-friendly ever.

Every year, Computer Weekly is inundated with press releases before the Budget from various interest groups calling on the Chancellor to include this or that technology policy. Typically, the day after the Budget we then get the follow-up release chastising the Chancellor for failing to deliver on their hopeful wish-lists.

But this year, few could complain - the list of supportive announcements was long. Tech startups, science and technology research, internet of things, driverless cars, smart cities, skills, broadband, mobile networks - all received funding or government support of some form.

The government's digital strategy even underpinned one of Osborne's headline-grabbers - the abolition of tax returns, made possible by the planned introduction of personal online tax accounts and HM Revenue & Customs' real-time information system for tax collection.

There is little doubt that whoever wins the election in May, digital and technology will play a bigger role in the next government than ever before.

Labour must be a little frustrated - since the party's digital government review was released last year, the coalition has slowly nicked most of Labour's most popular recommendations. Osborne added another - extending the remit of the Government Digital Service (GDS) to help local authorities with their digital plans (although the Cabinet Office was unable to offer any further details on this, which makes you wonder how much they knew about it beforehand).

Some of the announcements promise to be truly transformational - not least a commitment to produce a standard banking API to open up the big retail banks' data and systems to new entrants.

We are, slowly, getting the UK onto a digital roll. Whoever wins the election must commit early to protecting, continuing, and preferably accelerating this momentum. The UK has a genuine opportunity to be a world leader in the digital economy, for the betterment of everyone - creating jobs, wealth and social opportunity; improving healthcare and education; making this country a base for science and technology innovation that is the envy of the world.

We look forward to whichever party or coalition of parties is willing to accept and deliver on this defining challenge for the next Parliament.  

Does IT have a problem with people?

Does IT have a problem with people?

That's a question I found myself asking, after listening to the speakers at a BCS event last week. It's not a new question by any means, but when you consider a few topical challenges facing IT leaders and the IT industry, it's one that perhaps gives an insight into the radical changes taking place through the technology supply chain.

In companies, the question goes to the heart of the changing relationship between IT departments and their users - the rise of "shadow IT" and "bring your own device" (BYOD) have come about as a rebellion against the historic "command and control" culture of most IT teams.

In government, it relates to the mantra of "user need" espoused by the Government Digital Service (GDS); it goes into how GDS wants to change the procurement and delivery of IT in Whitehall, and the irritation that seems to be causing for IT suppliers.

Let me explain.

One of the speakers at the BCS event was Will Whitehorn, Richard Branson's right-hand man and currently president of Virgin Galactic, Branson's ambitious space travel business.  His simple explanation of Virgin's approach to risk and failure should be mandatory listening for anyone involved in IT. He highlighted lessons that the IT sector - typically so risk averse - is only starting to learn.

From Whitehorn and other speakers, there were, for me, three key points to consider.

IT today is led by products not people

Whitehorn gave an example of Virgin's focus on customer needs throughout its supply chain, which came from his time setting up Virgin Trains, and in particular its West Coast main line franchise. Virgin wanted a totally new train design to meet its goals for speed and passenger service - the end result was the Pendolino tilting trains in use today.

Whitehorn went to the suppliers, outlined what Virgin's customers wanted, and told the train makers to design and build something to meet that user outcome. He felt that was delivered.

Consider by comparison the typical conversation between an IT leader and their suppliers, where an explanation of business outcomes and user needs typically leads to the offer of a "solution" based on a series of products. "You want to increase customer service levels by 10%? Here - buy my CRM software!"

The IT industry has yet to understand and manage risk well enough to meet that user need challenge - and its thinking all too often is not yet mature enough to go beyond products. There must be a huge opportunity for any suppliers that can genuinely offer user-focused outcome-based solutions in that way - but where are they?

IT thinks "big" when it needs to deliver "small"

"Risk means accepting failure" said Whitehorn, and explained how this was a fundamental principle behind the success of Virgin Group.

The company has grown big by keeping its parts small. Every Virgin-branded business operates independently of every other, so if one fails, the overall brand is not harmed.

Failure is a particular challenge for Whitehorn at the moment, after the tragic crash of Virgin Galactic's test plane last year - a disaster that Whitehorn said is likely to be attributed to human error, not equipment failure. But he said that Virgin always knew that going into space was likely to bring high-profile failures along the way, and that the end goal is worth the risk.

But the way they manage that risk is by keeping things small.

Compare that with the historic approach to IT delivery: "big bang" projects specified to death at the beginning and then failing to meet user needs once finally completed. The move to agile - or, for those who recoil at the word, the breaking down of projects into smaller tasks, individually managed and delivered within an overall architecture - is an overdue response to that monolithic approach to IT.

It's not about waterfall vs agile - you can use waterfall techniques on smaller tasks just as much as agile ones. You can see it coming through in the datacentre too, with the growth of microservices and container technology - keep things small, even within the constraints of something inevitably large such as a corporate datacentre (or better still, just put it in the cloud).

The IT industry has always liked to use engineering analogies to explain how it works - that delivering IT projects is like building a bridge or a skyscraper. I've never been comfortable with that analogy. A much better one I've heard recently comes from consultant Mark Foden, who said IT has to be grown like a garden, not built like a bridge.

A garden needs a design and an architecture - big plants at the back, small ones at the front; matching colours; hard landscaping mixed with soft - but each element of that garden is independent of the others while intrinsic to the whole. Each plant needs individual nurturing; each pathway needs laying and maintaining. Making the whole garden a success needs different approaches for different elements - shady plants that don't need much water; sun-lovers that need regular feeding; flowers that bloom at different times - as long as you treat each according to its needs.

In IT, you see too many adherents to a single methodology - we are waterfall; we are agile; we use Six Sigma; we use Lean; and so on. Ironically, IT is not binary; it's not one thing or the other. It needs a flexible (and often complex) mix that recognises the different needs of each element, focused on users, not on a finite set of supplier products that in reality vary very little from one competing offering to the next.

Trust your people

Perhaps it's not surprising that so many IT leaders are not entirely trusted by their CEOs. Too many big failures, too many overruns, too many unhappy users. It is fair to say that too few boardrooms really trust their CIO.

Another BCS speaker, Phil Pavitt, global CIO of Specsavers and former HM Revenue & Customs CIO, made the point that the people in your own IT organisation almost always have the answers to the questions that matter - they just need to be trusted, and to have an opportunity to step forward.

Pavitt bemoaned the typical state of big corporate IT, with expensive consultants brought in to gather information from employees, then report back what they said as if the consultants were the enlightened ones.

I recently talked to a highly experienced CIO, someone with success in major multinationals, who had been employed to overhaul IT strategy and delivery at one of the highest profile organisations in the UK. He has recently left - of his own accord - as a direct result of a new leader of that organisation who insisted on bringing in a consultancy to audit every element of that strategy after it had been completed.

Risk aversion and a lack of trust towards IT are endemic. IT needs to earn that trust, and to convince CEOs it deserves it.

So what?

Perhaps it's inevitable that technologists feel more comfortable with neatly defined products that meet a clearly defined purpose. People are complex and unpredictable - how can you deliver IT to satisfy such flaky users who don't really understand IT in the first place?

But this is where IT sits today - at the cusp of change from a product-focused industry to one that is able to deliver user-focused, outcome-oriented business (and personal) offerings composed of small, interchangeable components, with no single points of failure. IT still wants to dictate to its users; its users now know enough to say no.

Perhaps that's the biggest challenge for the maturing IT community as it seeks to exploit the digital revolution and sit at the top table of business and government - to become less risk averse, to manage failure better, and to put people first.

Mobile and cloud adoption is accelerating - don't miss out

With 90,000 people attending this year, Mobile World Congress has become one of the definitive events on the technology calendar. Nobody is surprised to hear experts saying mobile is the number one issue - and the same applies to corporate IT.

Computer Weekly's annual survey of our readers' IT spending priorities shows that mobility has leapt to become the top priority for IT leaders, with 42% of respondents implementing mobility projects this year - well ahead of the second-placed issue, compliance, at 31% (see graph below). As recently as 2012, mobility didn't even feature in the top 10 IT spending priorities.

[Graph: 2015 UK IT and cloud spending priorities]

As mobility has risen, so has cloud. In our latest survey, for the first time more organisations are increasing spending on cloud than on on-premises IT - a significant milestone. In 2012, the research described IT leaders' cloud plans only as "modest and moderate".

But tracking the survey over the last three years shows not only that mobility and cloud have risen naturally to the top, but that they are accelerating at the expense of almost every other category of spending. The strategic shift to mobile and cloud in corporate IT is really happening, and is now unstoppable.

[Graph: IT spending priorities tracked over the past three years]

Slicing into the survey data also shows a notable fact - demonstrated in the graph below. If you look at readers' responses on technologies typically associated with digital transformation - areas such as collaboration, big data, social media and virtualisation - you see there is a significantly higher tendency for spending in such product categories among those companies also prioritising cloud.

[Graph: spending on digital transformation technologies, split by cloud priority]
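To make clear what that kind of slicing involves, here is a minimal sketch in Python using pandas. The column names and sample values are invented stand-ins for the survey data, purely to illustrate the approach - this is not the actual Computer Weekly analysis or schema.

    import pandas as pd

    # Toy stand-in for the reader survey; column names and values are
    # illustrative assumptions, not the real survey schema or results.
    survey = pd.DataFrame({
        "prioritising_cloud":    [True, True, True, False, False, False],
        "investing_in_big_data": [True, True, False, True, False, False],
    })

    # Share of respondents spending on a digital transformation technology,
    # split by whether they are also prioritising cloud.
    breakdown = survey.groupby("prioritising_cloud")["investing_in_big_data"].mean()
    print(breakdown)

A higher share in the cloud-prioritising group is the pattern described above; the same slice can be repeated for collaboration, social media or virtualisation spending.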

Cloud is not just a major shift in IT delivery, it is a signifier of greater intent to transform the IT estate using emerging technologies. For IT suppliers, that's hugely significant. If they are not having the cloud conversation with their customers - or are simply tagging a "cloud" label onto their existing products - they are missing out on nearly every other major IT change taking place at that organisation.

For companies such as IBM, HP and others, desperately trying to protect revenues from legacy products and unable to grow sales from emerging technologies sufficiently fast to replace them, that's a huge problem. Who would seriously have thought five years ago that Amazon would become the dominant force in cloud, at the expense of such bellwethers?

For those IT leaders whose employers are still reluctant to invest in mobile and cloud, it may soon be too late - their competitors are changing the game at an accelerating pace. 

Do the UK government's SME spending figures make sense?

The government this week touted its success at delivering on a policy objective to put 25% of all procurement spend through SMEs - in 2013/14, that amounted to £11.4bn, slightly ahead of target at 26.1% of all spending.

But, at the same time, many SMEs are up in arms about the way they are being treated by Whitehall, and in particular the Cabinet Office procurement agency, Crown Commercial Service (CCS). The numbers published by the Cabinet Office tell one story, the market seemingly tells another. So which is true? Let's examine the official figures that make up that 25%.

Direct vs indirect spending

First, it's important to understand what the goal was. That 25% consists of direct spending - contracts between government and SMEs - plus indirect spending, which means large firms passing some of their government business to SMEs as subcontractors. Of the 26.1%, direct spend was 10.3% and indirect 15.8%. So in contractual terms, about 90% of all contracts by value still go to big companies.

But government relies on those big firms to report back how much they spend with SMEs - the number is not audited, and is based only on a survey of its 500 largest suppliers. The Cabinet Office admits that "the approach to indirect spend should be regarded as indicative". The indirect figures add a few further caveats:

  • "MoD total procurement and direct spend figures are for the core department only, therefore excluding its Executive Agencies and NDPB."
So, the Ministry of Defence - by far the biggest-spending department - does not measure the SME spend through its many external agencies and non-departmental public bodies (NDPBs), of which there are 29.

  • "FCO direct spend is based on UK spend only"
The Foreign and Commonwealth Office (FCO) - most of which is, inevitably, housed overseas - does not measure direct SME spend in all those overseas operations.

  • "Data as reported by suppliers for central government with no departmental association"
This is the description given for £1.5bn of indirect SME spend - 22% of the total figure - that is not attributed to any Whitehall department. This means that large suppliers claimed they passed £1.5bn of money to SMEs, but cannot account for which departments that work is attributed to. Really? For more than one-fifth of all the sub-contracts to SMEs across all large suppliers, the prime contractor has no idea what department that work is for, or cannot associate the work with any department? That sounds somewhat convenient for those big suppliers, all of which have been pressured by the Cabinet Office to demonstrate an increase in their SME spend. The government admits that the indirect spending figures have not been supplied by departments, so it appears we are to take the word of the large suppliers entirely on trust over that £1.5bn portion of the spend.

Direct spend

Look next at the direct spend - the numbers the government can account for itself, through contracts placed with SMEs. Those figures have flatlined - representing 10%, 10.5% and 10.3% of total spending respectively in each of the last three financial years.

In the year before - 2010/11 - only 6.8% of spend went direct to SMEs; the prior year it was 6.5%. The jump from 2010/11 to 2011/12 went from £3,200m to £4,439m - an extra £1.2bn. The Cabinet Office attributes that leap to the new coalition policies introduced at the start of the parliamentary cycle taking effect. It was suggested to me by a source that the measurement methodology for tracking SME direct spend changed between 2011 and 2012 - although the Cabinet Office denied this.

So what did cause that jump? The two main policies introduced in 2011, alongside some changes to process such as a mystery shopper service, were:

  • Advertising tenders below £100,000 (tenders greater than £100,000 were already advertised)
  • Abolishing pre-qualification questionnaires (PQQs) for contracts below £100,000 to make it less onerous for SMEs to bid for business.
So, in theory, the only real change for SMEs was in their ability to win contracts worth up to £100,000. There is not yet any published evidence to suggest that from 2011 SMEs suddenly won a greater share of the contracts above £100,000 that were already being advertised.

It's true that in government IT there has been a push to smaller contracts, which opens up more opportunities for SMEs - but the main vehicle for that is the G-Cloud framework, which launched in February 2012, so cannot have accounted for any of the £1.2bn increase in the prior year.

G-Cloud has been widely applauded and has awarded more than 50% of its spending to SMEs - but the total amount spent through G-Cloud since its inception is still only £431m to the end of 2014. Clearly G-Cloud made no contribution to that £1.2bn leap in 2011/12 - and has not had a material effect as overall direct spending has flatlined since its launch.

How many SMEs?

So, we are left to assume that most of that £1.2bn additional annual spend came about as a result of SMEs winning more contracts worth up to £100,000. Even if you take a generous outlook and say that the average contract value was the highest amount of £100,000, that would imply 12,000 new contracts won by SMEs. That's 1,000 per month, or about 50 every working day. Somebody in CCS would be getting through a lot of ink signing all that paperwork.

Of course, if a lot of SMEs won a lot of contracts worth over £100,000, that would account for a greater chunk of the £1.2bn - but that would have happened anyway, as the 2011 policy changes were mostly focused on opening up smaller contracts. If you were generous, then perhaps 1,000 SMEs won contracts worth £1m on average that had never before been won by SMEs, and that would account for most of the £1.2bn. That's still four contracts every working day.

But the government has not been able to say how many SMEs have been awarded direct contracts - Computer Weekly has asked, we've not yet had an answer - but it seems unlikely to be 1,000 additional firms, and even more unlikely to be 12,000. So where are all the SMEs that won that extra £1.2bn?

It's also possible the definition of an SME includes individuals providing temporary services through a company of which they are the only employee - fairly common practice for contractors, to reduce their tax burden. But does direct spend with individual contractors really support the ethos behind the desire to grow UK small businesses through the government estate?

If you apply the same logic to the indirect spend - which increased from £2,946m (6%) in 2011/12 to £6,909m (15.8%) in 2013/14 - that's an increase of nearly £4bn annually. At an assumed average contract value of £100,000, that's 40,000 additional contracts with SMEs let by large suppliers in a year. Again, you might ask - really? Where are they all?
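For anyone who wants to check the back-of-envelope sums above, or rerun them with different assumptions, here is a minimal sketch in Python. The spending figures are the Cabinet Office numbers quoted in this post; the £100,000 average contract value and the 250 working days a year are illustrative assumptions, as they are in the paragraphs above.

    # Back-of-envelope check of the SME contract arithmetic above.
    # Spending figures are the Cabinet Office numbers quoted in this post;
    # average contract value and working-day count are illustrative assumptions.

    WORKING_DAYS_PER_YEAR = 250  # rough assumption: ~5 days a week, ~50 weeks a year

    def implied_contracts(extra_annual_spend, avg_contract_value):
        """Return the implied extra contracts per year, per month and per working day."""
        per_year = extra_annual_spend / avg_contract_value
        return per_year, per_year / 12, per_year / WORKING_DAYS_PER_YEAR

    # Direct SME spend jump, 2010/11 to 2011/12: £3,200m to £4,439m (~£1.2bn extra)
    direct_extra = (4_439 - 3_200) * 1_000_000
    print(implied_contracts(direct_extra, 100_000))    # ~12,000 a year, ~1,000 a month, ~50 a working day

    # Indirect SME spend rise, 2011/12 to 2013/14: £2,946m to £6,909m (~£4bn extra)
    indirect_extra = (6_909 - 2_946) * 1_000_000
    print(implied_contracts(indirect_extra, 100_000))  # ~40,000 extra contracts a year

Change the assumed average contract value and the implied contract counts scale accordingly - which is precisely why the unanswered question of how many SMEs actually won that work matters.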

Not lies, just statistics

There is no evidence to suggest the government is lying about its SME spending - but even a casual, entirely non-forensic analysis like the one above raises some significant questions over the extent to which government largesse has really been spread around the UK's SME community.

Of course, the existence of a policy to increase spending with SMEs is welcome, and is clearly having an effect - even if nearly three-quarters of all spending still goes to large suppliers.

G-Cloud has been hugely popular - most SMEs that have engaged with it love it, and some have grown so much as a result that they can no longer be categorised as SMEs. But G-Cloud was not created by CCS, which still appears to prefer old-style procurement frameworks managed by outsourcing giants such as Capita - frameworks that are coming increasingly under fire from disgruntled SMEs who say they are losing millions of pounds in revenue.

Plenty of SMEs who deal with CCS say they are not convinced the purchasing agency has bought into the SME policy, believing that CCS still feels more comfortable dealing with big suppliers, where it can aggregate demand and use that scale to negotiate better prices.

The progress the coalition government has made on growing SME spending is very positive, let's be clear about that - but it may not be quite as impressive as the government would like us to believe. There is still much room for improvement.

Why it's time for the IT community to engage with politicians

No blood was spilled, but listening to the three main political parties debate their digital policies together for the first time this week laid out the battleground for their likely general election technology manifestos.

Computer Weekly, with help from TechUK and the BCS, brought together the Conservative, Labour and Liberal Democrat parties to face questions from an audience of IT executives about their priorities for supporting the tech sector and growing the UK's digital economy.

Reassuringly, there was much agreement between the three. On further support for the burgeoning tech startup sector; on expanding the work of the Government Digital Service (GDS) and supporting local authorities in their digital plans; and on the need for reforms around data protection and privacy - all three parties concurred.

You can read all about the debates in our coverage here:

And you can watch video highlights from the event here:

But perhaps the most significant theme from the debate was the need for better engagement between politicians and the IT sector.

In past elections, the IT community has been at the back of the room, desperately waving its hand in the air for the politicians to take notice, trying to tell them how important it was. This year, for the first time, the main parties have invited us forward to sit closer to the front.

There is widespread recognition - at last - that technology can and must play a major role underpinning some of the major reforms needed in the UK in the next five years, in the economy, health and social care, national security, education and welfare.

The IT community has, justifiably, pointed to the lack of digital literacy in Parliament, urging MPs to become more aware of how technology can help deliver the changes they all call for. But all three of our panellists - digital economy minister Ed Vaizey, shadow digital government minister Chi Onwurah, and the Lib Dems' Julian Huppert - agreed that IT itself needs to become more politically aware too. "The challenge for the tech industry is to meet politicians half way," said Vaizey.

And they're right. For the digital community, your time is now - the doors are open at last. There has never been a better opportunity to engage with the UK's political leaders and show them how technology can make all our lives better.







Banking industry is finally ripe for digital disruption

The financial services industry has yet to experience the digital disruption that radically changed other sectors, such as retail and media. But the influence of the commoditisation of communications, processing power and storage - more popularly known as the internet and the cloud - is going to hit finance too, that much is inevitable and unstoppable.

Arguably, the 2008 crash protected the banking industry from such disruption - increased regulatory scrutiny made it impossible for new entrants and restricted innovation from existing players. But everyone can see the cracks emerging from the lack of digital investment - those ageing, batch-processing mainframe transaction engines at the core of every major bank were simply not designed to handle real-time web and mobile services.

Governments have realised, too, that they need to introduce more competition to the market, to reduce the dependence on the global players that let down the world economy so badly.

And what is the one, proven way to increase competition, and break down barriers to entry? It's technology, of course. Once you could sell online without the cost of a physical store, it transformed retail. Once you could publish content on the web without the cost of a printing press, paper and ink, it transformed the media.

We are seeing the emergence of new, challenger banks based on digital technology, which do not suffer from the complex, legacy IT that the incumbents depend on. Also, banks find some of their services being cherry-picked by the new tech giants - payment services such as Apple Pay, for example, not to mention PayPal. Research suggests more of us will use smartphones to pay for things than credit or debit cards by 2020.

Some banks are responding, of course. Barclays is pushing hard on new mobile services, and even providing training in branches for children to learn coding. Santander is going further, taking on the cloud suppliers at their own game by offering cloud storage to corporate clients. If data is the new currency, there's a logical progression to store it with your bank - even if it is unlikely that any one bank could build the scale of cloud infrastructure needed to compete with Amazon, Google or Microsoft.

Banks still control the global flow of money - but trends like Bitcoin are starting to demonstrate that even here, technology can offer alternatives.

The big finance players are vulnerable - even if they don't want to see it - in the same way that Woolworths, Blockbuster, Comet and others were vulnerable and missed the digital boat. Expect the banks now to go through a decade that will change their industry as much as the past 10 years have changed so many other sectors.

Apple success shows the importance of making IT invisible

It's been impossible to ignore the biggest technology headline of the week - especially as it became the biggest business headline in many places too. Apple declared the largest quarterly profit in corporate history - not bad for a company dismissed as a failure 15 years ago and derided as behind the times when it launched its first smartphone.

Apple sold 74.5 million iPhones in the last three months of 2014 - meaning 1% of all the people in the world bought an iPhone in a single quarter. You can't say anything other than that it was a remarkable achievement - perhaps the zenith, so far at least, of the consumer technology revolution.

I must admit I wasn't convinced by the iPhone when it first launched. I feel the same way about the Apple Watch, which means it's guaranteed to be an instant blockbuster product.

Apple's success has been widely attributed to its focus on product design - making tech cool, fashionable and desirable for the first time. But great design underpins the real reason Apple has changed the tech world - it made the technology secondary, invisible even.

People bought iPhones in their millions because they didn't need a user manual - the intuitive nature of the product meant they looked at its utility, not its functionality or technicalities. Even the simplest Nokia phones in the past typically needed you to read through the manual first. That's also part of the reason Google Glass failed - it was the complete opposite of invisible.

Technology works best when you don't notice it. The suppliers that realise this will be the winners from the digital revolution - I'm not sure, for example, that IBM, HP and Microsoft quite get this yet.

The smartest CIOs that Computer Weekly meets get this - and in some respects, it's what makes them stand out from their IT leadership peers. If your CEO notices your technology, it's usually because it's gone wrong, or when it causes endless frustration to use. "We need to turn that irritant into something that has value," as one such CIO said to me recently.

Technology is great, we all love it, and it's a part of our lives. But the technology that works best - and in a corporate environment in particular - is invisible. For IT to lead in business, it needs to disappear.

Windows 10 proves this is a different Microsoft - but is it different enough?

The software company with 14% share of the global operating systems market announced the latest version of its flagship product this week. It's called Windows - you might have heard of it.

It's not that long ago - barely six or seven years - that Microsoft could claim that Windows ran on more than 90% of all the computers in the world. Since those computers disappeared into our pockets, its global influence has plummeted like no product ever before. More than 90% of PCs still run Windows - but that's a declining market.

Of course, those statistics don't tell the true story of Windows today, and in particular of the Windows 10 launch this week. A 14% market share or not, Windows is still every bit as important as iOS and Android - especially for business technology buyers - even if it has failed to extend its PC dominance into the mobile market.

Perhaps the most important reaction to the latest announcements was not the product reviews, but the widespread acknowledgement that, under CEO Satya Nadella, Windows 10 proves conclusively that Microsoft is a very different company from the one it was under his predecessor, Steve Ballmer. Nadella has taken the company a long way in his first year in charge.

We now have Office apps for Apple and Android devices - anathema in the Windows-centric Ballmer world - and this week we even have the first version of Windows that will be given away for free, albeit only for its first year, and to Windows 7 and 8 users. Better to get that user base onto Windows 10 for nothing than lose them to non-Microsoft devices.

Most of the coverage of the Windows 10 launch focused on the mobile and consumer features, but actually much of the work went into satisfying the enterprise buyer, with advancements in security, cloud and mobile device management.

The failure of Windows Phone compared to iOS and Android hasn't dented Microsoft's revenue or share price, and corporate sales of Windows and related products on PCs remain the bedrock for the Seattle supplier. The HoloLens "holographic" headset will excite gadget lovers, but will be much less important to IT managers than the claimed software portability for Windows 10 applications across every device from smartphone to tablet to PC.

Much like Windows 7 was the operating system everyone hoped Vista would be, it looks like 10 will make up for the botched job that was Windows 8. So, most importantly for Microsoft, Windows 10 ensures it stays firmly in the plans of its corporate customers. And yet...

For Microsoft to really prosper in the new age of mobile and cloud, it still has to shake off some old habits. The 12-month giveaway for Windows 10 will be welcomed, but there are too many lucrative Software Assurance deals in place for Microsoft to make Windows free forever, like iOS, Android or Apple's OS X for Macs. The company described its approach to supporting 10 as "Windows as a service", which implies it is slowly moving towards a pay-as-you-go or subscription service, rather than the complexity of Software Assurance - a move already evident with Office 365.

If Microsoft really wants to win IT managers over once and for all, a radical simplification of its software licensing would be the number one priority.

The availability of Office on rival mobiles was a big step towards existing in a multi-vendor world - but it was also a defensive measure to protect Office revenues against the "bring your own device" (BYOD) trend in corporate IT. Ease of integration between Microsoft's enterprise products has always been its biggest attraction for IT managers, but the future is far more heterogeneous than that, and IT leaders would far prefer to see more Microsoft products unbundled and better able to integrate with rival software.

I would expect that in five years' time, Office will be a far more strategic product for Microsoft than Windows. You will only pay for the PC version of Windows, and that market will be a lot smaller as more employees use tablets for work - even if they are Windows tablets, where the operating system will be effectively free, with corporate integration features charged extra.

Windows is still too important for Microsoft to ever admit this, but the operating system is no longer the long-term future for the company - that's going to be Azure and Office 365.

The new, different Microsoft is welcome and necessary, and Nadella deserves plaudits for making it happen so quickly. But a lot of this is about playing catch-up - all he has done is bring the company back up to date and restore a little of its old buzz. The Microsoft of the near future is going to be even more different yet.

Privacy vs surveillance debate is nuanced and needs more education, less tribalism

It doesn't feel that long ago that the information security community were bemoaning the lack of attention they received from the government, national press and wider public. No danger of that happening now, is there?

Data protection, privacy and surveillance are leading front pages and parliamentary debates, particularly after recent high-profile incidents such as the Sony Pictures hack and internet snooping by the intelligence services.

The Paris terror attacks have brought widespread calls from politicians for greater powers to monitor our internet activities, countered by privacy campaigners pointing out the terrible irony of terrorism causing a reduction in our civil liberties as a result.

David Cameron's naïve and careless call to outlaw "communication between people which we cannot read" has rightly led to criticism of what would be a technically unfeasible and highly dangerous attempt to ban encryption.

I can remember writing nearly 15 years ago that privacy would be the defining challenge of the internet era, and so it has proved.

Nobody can argue that targeted electronic surveillance is anything but a good thing for fighting crime and terrorism, but blanket recording of all our communications - even if it is only the metadata - on the basis that the data is stored "just in case" is self-evidently a step too far in a liberal democracy.

When the Regulation of Investigatory Powers Act (RIPA) was passed in 2000, many observers warned that its loose language and broad powers could be misused. Politicians assured us that no such thing would happen, relying on the common sense and altruism of the authorities.

Fifteen years later, we have seen how the law has been abused, just as those experts warned, with councils citing RIPA to snoop on parents trying to get their children into schools outside their catchment area, and the police using it to uncover journalists' emails and expose their legitimate sources.

Let's not forget, too, that the French authorities already have greater surveillance powers than the UK, and even those were not enough to prevent the Paris attacks by known extremists.

There is no easy solution, and none will be found in knee-jerk reactions or a tribal approach that creates a binary debate when nuance is needed. Both politicians and public need to understand the arguments and issues, and to reach an informed consensus on how best to balance privacy and national security. That debate is not currently taking place, and more education and awareness is needed before it can be conducted sensibly and fruitfully.

This, then, is the opportunity for the information security community. They are, finally, at the centre of the debate they have always called for. They need to lead, to educate and to listen - and most importantly, we and the UK authorities need to listen to them.

A 2015 technology prediction: Nothing much will happen

bryang | No Comments
| More
It's usual at this time of year for journalists to make predictions - something I've tended to avoid as a fool's game beset by needless optimism, over-excitement or the patently obvious. But for 2015, I'll make an exception and offer one forecast as a counterpoint to some of the hyperbole you will no doubt have come across.

My prediction is this: nothing particularly different will happen in technology in 2015.

Yeah, I know, a bit dull. Sorry.

We will continue to hear a lot about the buzzwords we heard a lot about in 2014 - cloud, mobile, big data, digital, internet of things, wearable technology - all of which are at different stages of adoption and maturity, and all of which will continue their merry progress along those paths.

The giant Consumer Electronics Show in Las Vegas this week was dominated by announcements related to the internet of things (IoT), but nothing that is going to break through into the mainstream this year. Wearables will get plenty of headlines, particularly when the Apple Watch goes on sale, but aside from the inevitable Apple fanboys and early adopter geeks, it's not going to set the world alight yet - the form factor is too unproven.

Cloud adoption will continue to grow - as it continued to grow in 2014. Mobile will continue to expand in enterprise IT, as it did last year. Big data will still be big, but mostly for the big companies with big budgets. More companies - and government bodies - will come to realise their future is digital, and wend their way in that direction at varying rates of change. But these are all journeys that are well underway, and are mostly unstoppable trends now.

Having said all that, we are on the cusp of something very big. Not this year, I suspect, but starting in 2016, and then progressing over the subsequent two or three years, such that in 2020 we will look back on this five-year period as a dramatic time of change - the time the digital revolution really accelerated.

Today, we're still waving red flags in front of motor cars to warn passers-by, in terms of where we are in that revolution. But the digital equivalent of the Ford Model T is very close. It needs cloud to gather more trust, mobile to be more secure, data to be better protected, and the internet of things to commoditise. It needs digital skills to be more widely available, and companies to be confident enough in the economy to invest again in innovation - although many will not, and will fall by the wayside as a result, including some very big names.

So spend 2015 doing what you're doing, but in the process take a deep breath and get ready, because the inevitable, unstoppable and dramatic acceleration of the digital age is getting very close.

Is BT buying EE a step towards selling off Openreach?

bryang | 1 Comment
| More
BT is changing. Under CEO Gavin Patterson, the former monopoly telecoms giant has expanded into content through BT Sport, paying heavily for broadcast rights to English Premier League matches, and now is on the verge of splashing out £12.5bn on EE to get back into the mobile market that it exited when it sold O2 (then known as BT Cellnet) in 2001.

There has also been speculation that BT wants to merge its wholesale division with Openreach, the regulated subsidiary that manages its national telephone and broadband network. BT Wholesale sells access to some Openreach network services to other telcos and ISPs.

That would be a more complicated move, given the strict Ofcom rules under which Openreach operates. But what it would effectively also do is give Openreach greater in-house sales capability, separate from the main mothership of the consumer- and business-facing BT Group.

BT continues to generate controversy and opprobrium in equal measure among rural broadband campaigners over its dominance of the government's BDUK programme to roll out superfast broadband in areas BT does not consider commercially viable for fibre-to-the-cabinet (FTTC) services. Critics are equally keen to point out that BT's copper network is an effective monopoly, is outdated, and will eventually have to be replaced by an all-fibre network as bandwidth-hungry services such as Netflix and the BBC iPlayer place ever greater demands on it.

BT, of course, argues its case equally fervently. I discussed the arguments over BT and broadband last year - you can read the article here to save repetition - but concluded then that the problem is the lack of competition in the wholesale telecoms market, which can only be solved, in my opinion, by divesting Openreach.

I just have a sneaking suspicion that Gavin Patterson is moving in that direction.

His new BT is becoming more like a modern, integrated, internet-savvy communications provider - offering high-speed broadband, landline telephony (itself a diminishing market), 4G mobile, and online and broadcast content. That's a model more like Virgin Media or Sky than a traditional telecoms infrastructure player. Do BT's long-term shareholders really want to invest in a creaking, heavily regulated copper network that will inevitably require billions of pounds of spending to upgrade its core infrastructure? I suspect not.

I think Patterson can see the writing on the wall for Openreach - hence merging it with Wholesale gives it an opportunity to become a standalone company, similar to National Grid in the energy sector, which owns the UK's main electricity and gas transmission networks. National Grid has been able to expand internationally as an energy infrastructure player, using its freedom as a publicly quoted company to buy similar businesses overseas, particularly in the US.

It's a model that could offer a future for Openreach outside of BT.

FTTC-based "superfast" broadband is going to last the UK for a few years yet - as it turns out, roll-out is well ahead of consumer adoption, which lags behind several European countries - but by 2020 the cracks will start to show. Even though 5G mobile does not even exist yet, it is equally inevitable that mobile networks will in future offer connection speeds far ahead of what even FTTC broadband currently provides. No way will BT want to keep spending on a declining, heavily-regulated asset in those circumstances - hence the purchase of EE.

A BT free of Openreach and its regulatory handcuffs becomes a very different proposition - but more importantly, so does an Openreach freed from BT; free also to expand internationally and invest sensibly and prudently in all-fibre networks with its own access to capital and debt. And without the parental relationship between BT and Openreach, perhaps other telcos large and small will be more enthusiastic about setting up wholesale network competitors in the UK.

BT will deny this of course - I can already imagine the emails from its press office, similar to the reaction to my article last year. But it feels to me like the new BT knows that, in the long run, it won't need and doesn't want the legacy of owning Openreach. And if Gavin Patterson does go down that route, then good for him - and good for the UK's communications infrastructure.

Introducing the Devereux-Hodge shambolicness scale for rating progress of Universal Credit

bryang | No Comments
| More
If Universal Credit were made into a Hollywood rom-com, you just know that Margaret Hodge and Robert Devereux would be the central characters.

Thrust into repeated opposition on different sides of a heated Public Accounts Committee (PAC) table, the two star-crossed combatants would bicker and argue in ever-increasing circles of conflict. Then, in the final act, in a moment of outrageous serendipity - such as Universal Credit actually going fully live - they would realise their true feelings for each other.

You can decide which outcome - Universal Credit going live, or a future romance between PAC chair Margaret Hodge, MP and Department for Work and Pensions (DWP) permanent secretary Robert Devereux - is the more likely. (Remember this - nobody would have believed John Major and Edwina Currie...)

Last week the soap opera continued as Devereux was once more hauled in front of MPs by Hodge to discuss the latest National Audit Office (NAO) report into the troubled welfare reform programme. The latest act ended with Hodge insisting Universal Credit is a shambles, and then shutting down the meeting before Devereux had one last opportunity to deny the accusation. For the two pugilists, "shambles" is a binary measure - it either is (Hodge) or it absolutely isn't (Devereux).

On Universal Credit at least, it's increasingly clear that "shambolic" is in fact a sliding scale, with the programme veering in one direction or the other to different degrees of shambles. Let's call it the Devereux-Hodge Scale of Shambolicness, to represent the end-points at either extreme, where 100% Devereux represents everything tickety-boo and working as expected, and 100% Hodge represents a total shambles.

So, based on recent revelations from the NAO and the PAC hearing, where on the scale is the project right now?

Wasted spending or value for money?

The discussions last week in the PAC meeting between Devereux, the Treasury's Sharon White, the NAO and Hodge veered into the arcane terminology of accounting and economic modelling. At one stage, Hodge highlighted that £697m has been spent on Universal Credit so far - that's all costs, not just IT costs - and yet DWP said only £34m of that will make it onto the balance sheet as an asset as a result of the "twin-track" approach now being taken, and once the digital system currently being developed is live.

At face value, that certainly seems close to 100% Hodge on the shambolicness scale. Some reports claimed this means DWP could "write off" £663m, but that's not the case. As White - soon to leave the Treasury to become the new CEO of Ofcom - explained, much of that £697m is normal operating expenditure which would never be recorded as an asset, and part of it is "Plan B" contingency spending so the existing Pathfinder system can stay in place if the digital system is delayed or doesn't work.

We know for sure that £131m of assets will be written off by the time Universal Credit is fully live, but it is likely that the true figure will end up much higher.

Much of that £697m has gone on things like staff costs - money which would have been spent anyway - and up-front design and planning work. The only way that expenditure will have been wasted is if Universal Credit is scrapped entirely. That's unlikely as Labour is entirely supportive of the policy, if not the implementation plan.

But Hodge is absolutely right to say that spending £697m and still not yet having a fully agreed business case is a shambolic way to spend money. In the unique language of the Civil Service, DWP has so far had its Strategic Outline Business Case approved; next year the Outline Business Case is due for approval; and then in 2016, the Business Case should finally be approved.

Devereux, on the other hand, maintains that spending £697m is chicken feed compared to the £7.7bn that will be saved as a result of the twin-track approach compared to the other alternatives considered when the programme was "reset" last year. That £7.7bn figure is derived from those arcane economic models based on more benefit claimants being on Universal Credit sooner, and the assumption that more of them will go back into employment more quickly.

If you want to consider the programme on the basis of a good old-fashioned return on investment calculation, then Devereux's figures make such spending seem minor compared to the promised returns. This has, consistently, been the DWP line - that for all the spending, all the write-offs, and all the delays, the benefits to the UK of Universal Credit vastly outweigh the problems in getting us there.

So how do you decide where this stands on the Devereux-Hodge scale? You could consider the views of the independent Office for Budget Responsibility (OBR), which stated last week that in its opinion "there remains considerable uncertainty" around the plans for Universal Credit, and that "weighed against the recent history of optimism bias in Universal Credit" it expects at least a further six-month delay to the latest revised timescales - which are themselves considerably delayed compared with the original plans when the project was launched in 2011.

So in terms of the value for money so far, we're very much at the Hodge end of the shambles scale - but if you're looking long term (and believe the DWP economists) then it's closer to the Devereux end.

The business case; or, does anyone actually know what they are doing?

As mentioned above, the phrase "business case" seems to have a flexible definition in the Civil Service. But there is no disputing that DWP has yet to produce what the NAO calls a "target operating model" for Universal Credit.

In layman's English, this effectively means the DWP has yet to define what Universal Credit will actually do, how it will work, and what its processes and workflows will be. And you'd have to say that if this were the business case in a company, you wouldn't get the project past first base if you had not defined what the end result would be. Even an agile project has a good idea of the desired outcome.

One of the independent advisors to Labour's review of Universal Credit told me that for any project of this scale, failing to determine right up front what the target operating model will be and how the future processes will work, is a fundamental error. Labour has promised a three-month pause to the project should it win the general election in 2015, and one of the primary reasons for that is to step back and define that target model.

DWP would counter that it is pursuing a "test and learn" approach - a vaguely agile concept whereby it introduces new features and functions, then learns from their trial implementation, feeding the results back into the roll-out process. Bear in mind of course, that "test and learn" only came about when the project was so out of control that it was halted, reviewed and "reset" because it was without any doubt 100% a Hodge-level shambles.

In many ways, this comes down to the IT argument of waterfall versus agile, and if ever a project was trying to shoehorn both approaches into one, it's Universal Credit. Agile is not a panacea; as the Labour advisor put it, it's "horses for courses". And in its desire to be seen to be agile, Universal Credit forgot some of the basics - namely, agreeing up front what they were all meant to be aiming for.

The NAO's previous September 2013 report made the same criticism, stating: "Throughout the programme the department has lacked a detailed view of how Universal Credit is meant to work... The department was warned repeatedly about the lack of a detailed 'blueprint', 'architecture' or 'target operating model' for Universal Credit."

Work has started to address that gap - the Treasury refused to sign off even the Strategic Outline Business Case without it - but the truth is that no matter how much testing and learning takes place, it's difficult to say with certainty how much of the work completed so far will be relevant for a target operating model that has yet to be fully defined.

So on the business case, you're looking at 80% Hodge so far.

The digital Holy Grail

The digital system being developed as part of the twin-track approach will eventually replace all but £34m of the IT assets created to support the Pathfinder trials - that's just 17% of the £196m of IT assets created so far being retained, according to the NAO.

Digital has become the Holy Grail for Universal Credit - the knight in shining armour riding over the horizon to rescue the programme. It's easy for this to be the bright future when it's only just passed its "alpha" stage of development, and the initial trial of the system in Sutton is processing just 17 claimants so far, with a fair amount of manual intervention still required. It is simply too soon to tell how well the digital development is going - especially since the pilot started six months later than planned, based on a timescale established only 12 months ago.

Insiders are saying good things about the management of the digital project under DWP's digital transformation director Kevin Cunnington. The delay stemmed from problems recruiting suitably skilled digital expertise into DWP - Computer Weekly reported back in January that the initial recruitment plans were already proving over-optimistic.

It's worth remembering at this point that the Cabinet Office and the Government Digital Service (GDS) had recommended that DWP put all its Universal Credit eggs into the digital basket and scrap the Pathfinder system entirely. The latest NAO report shows that the DWP has managed to play with its accounting and economic models well enough to demonstrate that the twin-track approach will save the UK £7.7bn more than if it had waited for the digital system to be ready and stopped the current roll-out last year, as GDS advised.

At the PAC meeting, Hodge set out her wariness over the optimistic promises around the digital system, compared with the reality of how the programme has gone in the past. For Devereux, digital can be the bright future for only a short time before it has to prove itself.

The NAO, meanwhile, warned that failure to complete the digital system, and relying instead on the existing system for full roll-out, comes with a £2.8bn bill to taxpayers.

So, 50-50 on the Devereux-Hodge scale so far.

The moving target of roll-out timescales

Gauging the shambolicness of the Universal Credit roll-out to date depends very much on the tint of the glasses through which you view progress.

Based on the original plans, over four million claimants were meant to be on Universal Credit by April 2014. So far, fewer than 18,000 people are claiming the benefit. By any standard, that is 100% Hodge of a shambles.

But through DWP-tinted glasses, it is far more important to get Universal Credit right in the end, than to adhere to an unachievable timescale - even if that was DWP's own timescale you're talking about.

As with the digital system, it's easy for DWP to be confident about its latest roll-out plans because most of the work (and the risk) is so far away that within the confines of a House of Commons committee room everything can easily be 100% Devereux. Even Hodge finds it hard to disagree when asked, "What would you prefer, that we get it right in the end, or we get it wrong on any timescale?"

The roll-out plans are still very risky because they are so back-ended. Millions of claimants will have to be migrated onto new systems over an 18-24 month period - and even then, migration of tax credits has been further delayed until 2019. That scale of migration is simply unprecedented under any government.

The NAO report said that a further six-month delay (which is what the OBR expects to happen) will mean the loss of £2.3bn in potential economic benefits to the UK.

If nothing else, you have to give DWP credit for not blindly sticking to its wildly unrealistic former timescales. But based on progress to date, it's been a comfortable 75% Hodge shambles - and the risk of further delays is at the same end of the scale.

The Devereux-Hodge Shambles Rating

Overall, based on progress to date, Universal Credit sits very much nearer to the Hodge extreme of shambles than to Devereux. The DWP's argument is based very heavily on future promises, of jam tomorrow - and of course, if they are right, that's going to be one heck of a tasty jam sandwich.
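For anyone who fancies quantifying that verdict, here is a purely illustrative sketch - my own back-of-the-envelope arithmetic in Python, not anything the NAO or PAC would recognise - showing how the per-topic ratings above might be rolled into a single overall reading. The equal weighting is an assumption; adjust it to suit your own view of what matters most.

    # Toy Devereux-Hodge calculator: 0% = pure Devereux (all tickety-boo), 100% = pure Hodge (total shambles).
    # Category scores are taken from the article; the equal weights are an illustrative assumption only.

    def overall_shambles(scores, weights):
        """Return the weighted-average 'Hodge percentage' across all categories."""
        return sum(scores[topic] * weights[topic] for topic in scores) / sum(weights.values())

    scores = {
        "value_for_money": 90,  # "very much at the Hodge end" - assumed ~90%
        "business_case": 80,    # "80% Hodge so far"
        "digital_system": 50,   # "50-50 on the Devereux-Hodge scale"
        "roll_out": 75,         # "a comfortable 75% Hodge shambles"
    }
    weights = {topic: 1 for topic in scores}  # equal weighting - an assumption

    print(f"Overall: {overall_shambles(scores, weights):.0f}% Hodge")  # roughly 74% Hodge

On that crude basis the programme comes out at roughly three-quarters Hodge - consistent with the overall verdict above.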

It's not too late to swing Universal Credit towards the Devereux end of the scale. But with national roll-out of the most basic benefit claims due to take place in 2015, and the early promise of the digital system set for its biggest test, we will learn in the next six to 12 months if the needle is going to move along the scale towards Devereux any time soon.

The key themes from this year's UKtech50: digital, recruitment, and out-of-touch suppliers

bryang | No Comments
| More
Soon after Computer Weekly launched our UKtech50 programme to identify the most influential people in UK IT, over four years ago, we were approached by a couple of the female IT leaders that made the list at that time. They were in a clear minority - only eight or nine women on the list, even if that was reflective of the meagre 17% of IT professionals who are female.

That conversation was about how we find more female role models in IT and give them the recognition and profile that will help to encourage women into IT. That led to our first programme to highlight the 25 most influential women in UK IT. That first poll was won by Jane Moran, then the global CIO of Thomson Reuters.

We were, therefore, especially pleased to see that three years later, Moran - now global CIO at Unilever - this year became the first woman to top the overall UKtech50 list as the most influential person in UK IT. What's more, 16 of this year's top 50 are female - almost one-third.

But of course, Moran did not top the list because she is a woman - she won because she is a high-profile IT leader, driving digital and technology innovation at one of the UK's most important companies, one that touches most of our lives through its consumer products every day. Her gender, in this context, is not the issue. But nonetheless, it is great to see more recognition for the women driving the role of technology in the UK economy, and as examples of the digital glass ceiling being shattered.

Our UKtech50 event to announce the final list also heard talks from 12 of the top CIOs and CTOs in the country, and a few clear themes emerged across those presentations.

The first is the rise of "digital" - the IT leaders acknowledged that digital is already becoming one of those buzzwords that means different things to different people, but all agreed that it is a trend that is transforming how IT is managed and delivered, and its role in the organisation.

That leads directly to the second major theme - the changing skills and organisational profile of the corporate IT team. Increasingly, IT chiefs see two distinct functions. There are the back-end operations teams, running the traditional IT infrastructure under well-governed processes that focus on stability and reliability. And increasingly there is the digital team, often using agile methods to rapidly respond to business needs, developing software close to the customer, iterating, testing, experimenting, learning as they go - but highly reliant on those back-end experts for infrastructure.

Some speakers talked of "dual-speed" IT, or what Gartner calls "bimodal" - although not everyone agrees with such terminology - but the clear message was that a new technology team is emerging.

From that comes the challenge of recruitment - finding people with the new digital skills needed. And here, the IT leaders shared successful experiences of growing their teams. Royal Mail, for example, needed to find 300 new IT staff during a skills shortage but, through smart recruitment using non-conventional methods, attracted nearly 30,000 applicants.

The message for skills recruitment was to target diversity and to recruit different profiles from those traditionally brought into IT - not just engineering or computer science students, but linguists, historians, economists, psychologists and so on - reflecting the importance of technology across all aspects of culture and society.

The final theme concerned IT suppliers - and really should concern IT suppliers too. Many of the IT leaders felt their traditional providers have not changed with the times and are stuck in a model of software licensing, hardware products and expensive and often unproductive consultancy services.

One speaker, Bank of England CIO John Finch, cited a large supplier brought in to help with a software audit, which ended up sending a £2.5m bill because the Bank was using virtualisation and cloud services in contravention of the licensing terms.

IT leaders agree that their world is changing, but they do not feel their key suppliers are changing with them. This is why speakers like John Lewis IT director Paul Coby highlighted the work he is doing with tech startups, and why government CTO Liam Maxwell flagged that over 50% of purchases put through the G-Cloud framework have gone to SMEs.

The best IT leaders - those featured in our UKtech50 list - are leading digital change, driving innovation, and establishing the best practices for others to follow. They are not short of challenges - and they are certainly not short of work to do. But everyone on our UKtech50 list shows that IT leadership in the UK is thriving and leading the world.

DWP still has lessons to learn to make Universal Credit IT a success

bryang | No Comments
| More
"Unacceptably poor management" and "wasted time and taxpayer's money" - that's how Margaret Hodge, MP, the chair of the Public Accounts Committee described Universal Credit this week.

The latest National Audit Office (NAO) report into the troubled welfare reform programme simply added to the catalogue of concerns. It is easy for secretary of state Iain Duncan Smith to continuously say that the project is on time, when the timescales are put back every six months. At any point in time, delivery is on target - until it's delayed and is back on target again.

The headline findings from the report have been widely covered - still unable to determine value for money; no contingency plan should the new digital service fail to work; lack of an overall blueprint for delivery of the policy.

But reading through the 60-page NAO report also reveals some startling nuggets of information that the Department for Work and Pensions (DWP) would no doubt prefer to have kept under wraps.

For example, in April 2014 a software update caused an increase in incorrect payments to benefit claimants, which meant that every payment had to be manually checked for three months. The cause was an unnamed supplier releasing an update containing "significant changes" that the DWP had not been told about, and which were therefore not properly tested.

The supplier is likely to be one of IBM, Accenture, HP or BT, the four key vendors supporting the flaky system that will be mostly replaced by the future digital service. Those suppliers have received far too little criticism or scrutiny of their role - even being allowed to audit their own work, what one MP called "marking their own homework".

Only 17% of the work that these suppliers have done will be used once Universal Credit is fully live, according to the NAO.

In January this year, Computer Weekly revealed that DWP was already struggling to recruit the skills it needs to develop the digital service - a fact the DWP denied at the time. The NAO revealed that this has been the key reason for delays in the progress of the digital system, and that recruitment is still required to reach the necessary capacity. The report said that DWP was offering maximum salaries between 8% and 22% lower than the market average.

The DWP Digital Academy programme launched by the department's digital transformation chief, Kevin Cunnington, is proving to be successful in training internal staff - but it's a longer-term solution.

The real problem is that DWP chose to ignore the warnings and recommendations of the Government Digital Service for too long - blundering along with poor project management and misfiring suppliers. Now that digital has been placed at the centre of the programme, it's been a case of catch-up in a recruitment market where digital experts are in short supply and high demand.

The risks around Universal Credit remain, and if future progress follows past history, the cost to the taxpayer of those risks being realised will be significant. There are chinks of light emerging and insiders say they have been impressed by Cunnington's approach. With full roll-out of the new benefits now put back until 2019, there is time to get things right, but only if the DWP has finally learned its lessons over implementing modern, digital government IT.

Cyber security: Can businesses put a value on trust?

bryang | No Comments
| More
Who can you trust in a digital world? The most dispiriting part of the technology revolution is the growing lack of trust felt by individuals and businesses as a result of cyber security threats.

For example, I learned this week that HP is being asked by some large customers to provide legal assurances that its products do not have backdoors - the IT giant even received such a request from a Nato country.

When one global financial services firm held an executive meeting in a Middle East country, its security chiefs were so concerned about industrial espionage, they transported an entire office communications setup from the US - but even then found they were not allowed to bring their Cisco routers into the country and had to source them locally.

While anybody with knowledge of the information security world will see such concerns as understandable, surely we should ask ourselves why we have allowed trust to deteriorate to such an extent.

State-backed hacking and the Edward Snowden revelations about internet surveillance are important stories, but awareness of them has helped to create a culture of distrust that will be increasingly difficult to reverse.

We seem to have accepted that, to do business in the digital world, you have to assume you can trust nobody. But at what cost?

Companies are recognising the need to spend money on cyber security, but it is seen as a cost. How, instead, might they put a value on trust? How much more business might you be able to do by using technology as a way to gain the trust of your customers? What is the economic value to governments of making their country an internationally respected place for trusted digital business?

UK Cabinet Office minister Francis Maude said this week that the government's cyber security strategy aims to "make the UK one of the safest places in the world to do business and ensure that our economy and society continues to benefit from the ongoing digital transformation."

That is absolutely the right justification for the government plan - but how much is it undermined by Snowden's revelations about GCHQ snooping?

Royal Bank of Scotland learned the cost of losing trust with a £56m fine for technology failures that prevented customers accessing their money. How much value would the state-owned bank gain from becoming a digitally trusted institution?

Building trusted technology will still cost, and will still need to use the same security systems and risk management methods - but it could represent an important change in mindset and corporate culture. Could the business benefit of trust be the key to putting extra cash into information security budgets to better address those risks?

The IT industry has become used to responding to fear of cyber threats. How much better would it be for everyone if it focused instead on helping its users to build trust?
