When the G-Cloud framework was introduced back in spring 2012, its core aim was to shake up government IT procurement, so that high-value, multi-year hardware contracts awarded to the same old big-name enterprise suppliers became a thing of the past.
Backed by a Central Government-wide cloud-first mandate, the public sector was actively encouraged to use the framework to source cloud-based alternatives to on-premise technologies via the Digital Marketplace (formerly known as CloudStore) and a much larger pool of suppliers.
Initially, supplier contracts were only allowed to last 12 months, to prevent the public sector from falling back into buying habits synonymous with the old way of doing things, but this was later extended to two years.
It was a move that was warmly welcomed by users at the time, as it meant buyers could avoid having to retender for services so frequently, which some suppliers had flagged as a barrier to G-Cloud adoption within certain quarters of the public sector.
Against this backdrop, it’s not difficult to see why the Crown Commercial Service’s (CCS) new 20% rule around G-Cloud contract extensions seems to have caused so much upset within supplier circles.
The regulation, which is set to be introduced when the seventh iteration of G-Cloud goes live on 23 November, means framework users will be forced to retender for a service they already use if a proposed contract extension looks set to exceed 20% of the original procurement's value.
To avoid this, buyers would have to work out in advance how their use of a particular cloud service is likely to grow within their department over the course of the two-year contract, which, it is feared, could lead to over-provisioning and surplus IT being procured.
This, suppliers argue, is directly at odds with the pay-as-you-go ethos of both G-Cloud and cloud computing more generally, and harks back to the dark days of government IT procurement.
“As part of a G-Cloud procurement, buyers should always work out the ‘cost of success’ when they are shortlisting and selecting their cloud providers,” John Glover, sales and marketing director at G-Cloud provider Kahootz, told Ahead In the Cloud (AitC).
“For example, if a project team initially only need to consume 100 users, but expect to expand that to 2,000 over the contract term, that should be factored in.
“But, to ask them to now order and pay for 2,000 users upfront takes us back to the bad old days when public sector organisations committed large sums of capital on ‘shelfware’,” he added.
Why is it being introduced?
When pressed as to why the measure is being introduced, the Cabinet Office fed Computer Weekly a woolly line about how the government is “always improving the framework to make it easier for suppliers and buyers,” before confirming that it will be carefully considering any feedback it receives on the matter.
The insinuation, though, that the 20% cap could be considered an “improvement” would undoubtedly be contested by Kahootz, and many others within the G-Cloud community who have already taken steps to make the Cabinet Office aware of their disapproval.
For example, G-Cloud suppliers Skyscape Cloud Services and EduServ have both put their misgivings about the rule change in writing to the Cabinet Office, while the G-Cloud working group inside trade association EuroCloud UK issued a statement this week, expressing its concerns.
It is a shame the Cabinet Office hasn’t revealed more at this time about the motivation for the move, as every supplier AitC has spoken to seems at a loss to explain it, although they have their theories.
EuroCloud UK, for instance, floated the idea that the rule could be the result of people unfamiliar with the origins of G-Cloud, but with a stake in government procurement, getting involved.
Others, meanwhile, have apportioned blame to the recent tightening of the EU Procurement Regulations around how much variance is permitted within contracts once they’ve been agreed.
Where this theory falls down is that these regulations actually permit contract variations of up to 50%, which raises further questions about why CCS is intent on enforcing a 20% cap.
Whatever the reason, given the furore the move has caused so far, there’s every chance CCS and the Cabinet Office will backtrack, as both have shown a willingness in the past to tweak the workings of the framework in response to supplier and buyer feedback.
It is doubtful the 20% rule could be revoked before G-Cloud 7 goes live, but if they choose not to do so then, or in any future iteration of the framework, they might have something of a revolt on their hands.
AitC has already heard from several suppliers who’ve said that, while G-Cloud 6 continues to run, they’ll be pushing it to buyers as their preferred framework until it ceases to exist in February 2016. Admittedly, that hardly constitutes a long-term solution to the problem.
What’s at stake?
Many of those who’ve voiced their opposition to the changes share the same concern: that the introduction of the 20% cap could end up undoing all of the good work the Cabinet Office has achieved with G-Cloud to date, and ultimately put the public sector off using the framework at all.
The amount of money spent via the framework since its creation now stands at £806m, with £53m of that generated by transactions in September alone. It would be a shame if all the momentum built up so far were to go to waste.
Particularly when the success G-Cloud has had to date seems to be gaining wider industry recognition, and reports continue to circulate about how other European countries are looking to emulate the model for their own public sector IT procurement needs.
So, here’s hoping the Cabinet Office is taking notice of what suppliers have to say on this matter, as the Digital Marketplace won’t work without them.