A recently announced global standard will let CIOs benchmark the structural quality of their software and prevent system glitches from turning into headline news, as the digital transformation of organisations results in ever more customer-facing information systems affecting ever more end users.
With organisations no longer able to fix their IT problems behind closed doors, reliable systems that avoid both downtime and security incidents will act as differentiators in the digital world.
In September the Consortium for IT Software Quality (CISQ) announced new measurement specifications based on the detection of weaknesses in the reliability, security, performance efficiency and maintainability of software.
The specification offers a benchmark for organisations looking to understand the quality of their software. Without such a benchmark to help measure the success of IT investments, businesses are reluctant to spend money on quality measurement tools.
CISQ is made up of IT executives from Global 2000 companies as well as system integrators, outsource service providers and software suppliers.
Currently, billions of pounds are spent on the functional testing of software to make sure it does what it is supposed to do. But the resilience, security, efficiency and changeability of the underlying code are not measured. As a result, businesses do not know the structural quality of the core systems to which new applications are being linked.
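The distinction between functional and structural quality can be made concrete. The sketch below is illustrative only and not drawn from the CISQ specification: a lookup function passes its functional test, yet it builds SQL by string concatenation, a classic structural weakness of the kind static quality measurement is designed to flag, while a parameterised version behaves identically for normal inputs but removes the flaw.

```python
import sqlite3

def find_customer(conn, name):
    # Functionally correct for expected inputs, but structurally weak:
    # the query is assembled by string concatenation, so crafted input
    # can change its meaning (an injection-style weakness).
    query = "SELECT id, name FROM customers WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_customer_safe(conn, name):
    # Structurally sound: a parameterised query keeps user input as data,
    # with no change in observable behaviour for normal inputs.
    return conn.execute(
        "SELECT id, name FROM customers WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob')")

# A typical functional test cannot tell the two versions apart:
assert find_customer(conn, "Alice") == find_customer_safe(conn, "Alice")

# But crafted input exposes the structural weakness in the first version:
malicious = "x' OR '1'='1"
print(len(find_customer(conn, malicious)))       # weak version returns every row
print(len(find_customer_safe(conn, malicious)))  # safe version returns none
```

Both functions pass the same functional test suite, which is precisely why measuring only "does it do what it is supposed to" leaves the structural weakness invisible.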
In those industries where a software failure is literally a matter of life and death, the quality of underlying code is constantly measured. At software quality measurement company Cast’s recent European CIO Forum in Brussels, where CIOs and other experts met to discuss the challenges brought by digital transformations, Cast CEO Vincent Delaroche said he rated the airline industry as the best at ensuring the engineering quality of embedded software used in aircraft simply because that quality is “life-critical”.
By contrast, he added, the IT industry is happy to buy software without full details of the underlying code and its quality.
Quality is a brand issue
The move to digital products and services will mean that the failure of a system can become public knowledge very quickly indeed. System outages in industries like banking damage credibility and deter customers. “Software is becoming life-critical to businesses,” Delaroche declared.
“In the past CIOs could deal with problems behind closed doors. But because many systems will be market-facing after digital transformations, problems will be public,” he said.
One result of this will be to heap pressure on CIOs. “Today we hear about a major software outage every week, but in a few years there will be one every day,” Delaroche said.
“I would not like to be a CIO today because there is a lot of exposure and if you are not running to digital transformation you are seen as a bad guy.”
According to IT consultant and former ABN Amro Belgium CIO Geert Ensing, speaking at the event, digital is about “creating new business practices with new and old software”.
He said new systems offering digital services to customers have to link to the systems holding the data. Although the latest technology might be tested to the extreme, the quality of code in the underlying systems might not be known.
“Many of the legacy systems in our organisations suffer from a huge technology debt and complexity is increasing,” Ensing warned.
He said that CIOs must be able to measure the structural quality of software because this information is vital. “When I became a CIO about 20 years ago I very quickly found myself in a discussion about the cost and quality of IT, and I soon realised I had to have data [about structural quality of software].”
According to Ensing, such data should provide more than a snapshot of a single moment in time, and software measurement will work only as part of an ongoing improvement process.
Danielle Jacobs, director at the Belgian association of IT managers, Beltug, said that software is at the core of most businesses today and is very complex. “You have legacy systems, mobile apps, in-house developments, outsourced software, packaged software, and now agile,” she said.
She pointed out that different industries develop and use software in very different ways, and added: “One generalisation from all discussions with our members is that they want to do much more about software quality but they are not there yet.”
She said that organisations need to pick out their most important software and measure its quality.
Andrew Agerbak, associate director at the Boston Consulting Group, agreed that digital transformations will make software glitches “a lot more visible over the next few years”.
He said security problems at Target and Sony as well as the software glitch suffered in 2012 by the Royal Bank of Scotland, when customers were locked out of accounts, could all be traced back to software quality issues.
Most security breaches, he said, are the result of hackers taking advantage of structural software weaknesses. “In the old days you could put walls around everything, but in the digital world you can’t because you want your apps to be used by customers and other members of a business ecosystem.
“People have a lot less information about the quality of software than they have for other technology they buy. If a CIO is buying a mainframe, he or she has all the speeds and feeds.
“Having a robust way of assessing it is a very valuable tool. It is concerning for businesses making big decisions if they do not have robust data on the software.”