Study: IT often fails to meet secure software development requirements

In a recent study conducted by Veracode Inc., more than half of the systems tested failed to meet secure software development standards published by bodies such as OWASP and SANS.

A detailed analysis of nearly 3,000 working enterprise applications from around the world shows that more than half contain basic coding flaws that leave them open to exploitation by hackers.

The exercise was carried out by Veracode Inc., which operates a cloud-based application risk management service. The company looked at 2,922 live applications submitted for review by its customers during an 18-month period.

It found that 57% of the applications failed to meet an acceptable level of security under Veracode's rubric, which counted the number of well-known software flaws each application contained and weighted the scores by the application's level of mission-criticality. In addition, eight out of 10 Web applications failed to meet the secure development requirements of the Open Web Application Security Project (OWASP) Top 10 list of Web application security risks.

"These errors are well known, well documented and easy to fix," said Matt Peachey, vice president EMEA for Veracode. "Developers either don't seem to understand security, or they don't care."

The study results also showed that industry sectors such as banking and insurance, which might be expected to exhibit higher levels of security awareness, fared little better than other industries when the criticality of their applications was factored into the score.

One big problem uncovered by the study is the poor quality of outsourced code and third-party code. Only 7% of outsourced code was found to be acceptable on first submission to the Veracode scan, compared with 35% of commercial software, 42% of open source code, and 46% of internally developed applications.

Even when companies claimed to be submitting internally developed code, the scan found that up to 76% of the code could actually come from outside sources, either as open source modules or as components from commercial shared libraries. There was also a nesting effect: some third-party components contained other third-party components within them, further complicating the task of security checking.

Cross-site scripting tops the list
Cross-site scripting (XSS), which allows attackers to inject malicious script into Web pages viewed by other users, was by far the most prevalent vulnerability found, accounting for 51% of all vulnerabilities and occurring in 40% of all applications scanned. XSS was most common in .NET applications, resulting from the use of .NET controls that do not automatically encode output.
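The .NET specifics aside, the underlying fix is the same in any language: encode user-supplied data before writing it into a page. A minimal illustrative sketch in Python, using the standard library's `html.escape` (the `render_comment` helper is hypothetical):

```python
import html

def render_comment(user_input: str) -> str:
    # Encode HTML metacharacters so injected markup is rendered
    # as inert text instead of being executed by the browser.
    return "<p>" + html.escape(user_input) + "</p>"

# An attempted script injection comes out harmless:
print(render_comment('<script>alert(1)</script>'))
# <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

Frameworks that encode output by default close this hole automatically; the errors Veracode flags arise when controls or templates leave encoding to the developer.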

Unencrypted or weakly encrypted data was another weakness, discovered in 41% of all applications scanned. A vulnerability to SQL injection, one of hackers' favourite methods for breaking into systems via Web-based applications, was discovered in 24% of applications.
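SQL injection succeeds when user input is spliced directly into a query string, letting the input rewrite the query itself. The standard defence is a parameterised query, sketched here in Python with the built-in `sqlite3` module (the table and data are purely illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "' OR '1'='1"

# Vulnerable pattern: string concatenation lets the input alter the query.
# query = "SELECT * FROM users WHERE name = '" + attacker_input + "'"

# Safe pattern: a bound parameter is treated purely as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(rows)  # [] -- the injection string matches no user
```

The same placeholder mechanism exists in every mainstream database driver, which is why SQL injection is counted among the "easy to fix" errors.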

Buffer overflows were a potential threat in 8% of applications, and 8% had potential backdoors, usually due to a hard-coded password left in the code.
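A hard-coded password of the kind the scan flags is shown as the commented-out anti-pattern below, alongside a sketch of a safer alternative that reads the secret from the deployment environment. The `APP_ADMIN_PASSWORD` variable name and `authenticate` helper are assumptions for illustration:

```python
import hmac
import os

# Anti-pattern flagged by the study: a credential baked into the code
# ships with every copy and is recoverable from source or binaries.
# ADMIN_PASSWORD = "letmein"

def authenticate(supplied: str) -> bool:
    # Safer: read the secret from the environment at runtime, and
    # compare in constant time to avoid leaking it via timing.
    expected = os.environ.get("APP_ADMIN_PASSWORD")
    return expected is not None and hmac.compare_digest(supplied, expected)
```

If the variable is unset, every login attempt is rejected rather than silently accepted, which is the failure mode a forgotten debug password creates.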

Peachey said that although the overall quality of code seems to have improved little during the last six months, the time required to remediate code has started to shorten. Open source code still fares well, taking an average of only 12 days to fix errors, while in-house code is fixed in an average of 15 days and commercial code in 19 days. The figures show a considerable improvement since Veracode last measured remediation times.

He added that industry standards such as the OWASP Top 10 and the CWE/SANS Top 25 list of most dangerous software errors should serve as minimum thresholds for assessing code.
