Software developers are still making fundamental coding mistakes that allow hackers to easily carry out SQL injection and cross-site scripting attacks, according to new research by application security specialist Veracode.
The company analysed 4,900 applications submitted by customers over the last six months, and found that developers still have a lot to learn about building basic security into their applications.
“It’s surprising that we still see so many SQL injection and cross-site scripting flaws, when they are fairly easy to fix,” said Matt Peachey, Veracode’s VP of EMEA.
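The fix Peachey alludes to for SQL injection is well established: use parameterised queries instead of concatenating user input into SQL strings. A minimal sketch of the difference, using Python's built-in sqlite3 module with a hypothetical users table:

```python
import sqlite3

# In-memory database for illustration; the table and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: string concatenation lets attacker-controlled input
    # rewrite the query, e.g. name = "' OR '1'='1" matches every row.
    query = "SELECT role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # FIXED: a parameterised query treats the input purely as data,
    # so the same payload matches nothing.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks all rows: [('admin',)]
print(find_user_safe(payload))    # returns []
```

The same principle (bound parameters rather than string building) applies in Java's `PreparedStatement` and .NET's `SqlParameter`, the platforms covered by Veracode's exams mentioned later in the article.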
More than half of all applications failed to meet acceptable security standards, and more than eight out of 10 Web applications contained vulnerabilities listed in the OWASP Top 10, which ranks the most critical security risks in Web applications.
This is the third biannual study from Veracode, and the results show the level of application insecurity has remained steady, although SQL injection vulnerabilities have dropped slightly. Cross-site scripting, however, remains the most common error.
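Cross-site scripting, like SQL injection, has a standard remedy: encode user-supplied content before placing it in a page. A minimal sketch using Python's standard-library `html.escape` (the function names and payload here are illustrative, not from the report):

```python
import html

def render_comment_unsafe(comment):
    # VULNERABLE: user input is inserted into the page verbatim, so a
    # comment containing <script> executes in the victim's browser.
    return "<p>" + comment + "</p>"

def render_comment_safe(comment):
    # FIXED: escaping converts markup characters to HTML entities,
    # so the browser renders the payload as inert text.
    return "<p>" + html.escape(comment) + "</p>"

payload = "<script>alert('xss')</script>"
print(render_comment_unsafe(payload))  # script tag survives intact
print(render_comment_safe(payload))
# -> <p>&lt;script&gt;alert(&#x27;xss&#x27;)&lt;/script&gt;</p>
```

In practice this is usually handled by a templating engine that escapes by default; XSS typically persists because developers bypass that escaping or build HTML by hand.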
The study’s results show commercially supplied applications tend to be even less secure than those developed internally. Only 12% of commercial applications met acceptable levels of security upon their first submission for analysis, while 16% of internally developed code was deemed acceptable.
The study measures acceptable security by a combination of factors, including the seriousness of the vulnerabilities, and the criticality of the application as determined by the customer. It also measures how quickly customers remedy the flaws and re-submit the applications for a second analysis.
The software industry itself does not emerge with flying colours. Of the applications submitted by companies in the general software industry, 66% on average were found to have unacceptable security levels, and this figure was even higher for companies specialising in security software, at 72%. However, the security companies did well by fixing flaws within an average of three days, while the average time it took companies in the general software industry was around a month.
Peachey said that one reason for the apparent lack of progress is the sheer volume of applications that organisations need to check. “Companies are taking the problem more seriously, but it takes time,” he said. “They have to deal with a huge legacy in their application estate and their software ecosystem, plus they have new applications coming along.”
The research also reveals that some companies are taking much more interest in the quality of the software used by their partners. “Companies are becoming more aware of the software ecosystem, and they are getting us to audit third-party code,” Peachey said. “Enterprises need to enforce their internal security standards with their alliance partners.” According to the figures, only 25% of software submitted from third parties met the acceptability test on the first analysis.
According to Peachey, developers still appear to lack a basic understanding of security. “There are a lot of new, bright young people coming out of university with creative minds who understand new development languages, but have no experience of the world and who know nothing about security,” he said.
This view is supported by the results of tests taken by developers for some of Veracode’s customers. These show that 50% of developers taking Veracode’s application security fundamentals exam received a grade of C or lower. More than 30% received a failing grade of D or F.
Performance on other exams was a little better: 48% of developers scored grade C or worse on Secure Coding for Java, while around 36% did so on Secure Coding for .NET and on Introduction to Cryptography.
Ian Glover, president of the Council of Registered Ethical Security Testers (CREST), a professional body, was unsurprised at the findings. “The testing community is bored by having to deal with the same old problems over and over again in applications,” he said.
He predicted that the situation will become worse as hackers develop more sophisticated techniques, such as advanced persistent threats, while developers continue to make the same mistakes. He said much of the shrink-wrapped software his members are asked to test is found to be insecure and unusable.
Glover added that CREST is working with a number of UK universities to improve the teaching of secure coding, including Leeds, Southampton and Royal Holloway.