Website vulnerabilities declined in the past year, but most websites studied had at least one vulnerability, the latest WhiteHat Security Website Security Statistics Report has revealed.
The report correlates vulnerability data from tens of thousands of websites from more than 650 organisations with software development lifecycle (SDLC) activity data from 76 survey respondents.
The data showed that the average number of serious vulnerabilities per website continued to decline, going from 79 in 2011 down to 56 in 2012.
Despite this, 86% of websites tested were found to have at least one serious vulnerability exposed to attack every single day of 2012.
Serious vulnerabilities are those in which an attacker could take control of all or part of a website, compromise user accounts on the system, access sensitive data or violate compliance requirements.
Of the serious vulnerabilities found, on average 61% were resolved, down from 63% in 2011, and only 18% of websites were vulnerable for fewer than 30 days in 2012.
On average, resolving these vulnerabilities took 193 days from the first notification.
“Organisations need to better understand how various parts of the SDLC affect the introduction of vulnerabilities,” said Jeremiah Grossman, co-founder and CTO of WhiteHat Security.
“This is the first time we can correlate various software security controls and SDLC behaviors to vulnerability outcomes and breaches. The results are both insightful and complex,” he said.
Content spoofing the most common serious vulnerability
The most prevalent vulnerability was content spoofing, identified in 55% of websites. Cross-site scripting, which had topped the list for the past two years, was a close joint second with data leakage, each found in 53% of websites.
These were followed by cross-site request forgery (26%), brute force (26%), fingerprinting (23%), insufficient transport layer protection (22%), session fixation (14%), URL redirector abuse (13%), and insufficient authorisation (11%).
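Cross-site scripting, near the top of that list, typically arises when user-supplied input is written into a page unescaped. The sketch below is illustrative only (the function names are invented, not drawn from the report) and shows the vulnerable pattern alongside the standard output-escaping mitigation:

```python
# Illustrative only: how a reflected XSS flaw arises, and the common
# output-escaping mitigation. Helper names are invented for this sketch.
import html

def render_comment_unsafe(comment: str) -> str:
    # Vulnerable: user input is interpolated directly into page markup,
    # so a <script> payload survives intact and runs in the browser.
    return f"<p>{comment}</p>"

def render_comment_safe(comment: str) -> str:
    # Mitigated: HTML-escape user input before it reaches the page,
    # turning markup characters into inert entities.
    return f"<p>{html.escape(comment)}</p>"

payload = "<script>alert('xss')</script>"
print(render_comment_unsafe(payload))  # script tag survives, executable in a browser
print(render_comment_safe(payload))    # &lt;script&gt;... rendered as plain text
```

Escaping at the point of output is only one layer; real applications combine it with input validation and a content security policy.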
SQL injection fell out of the 10 most prevalent vulnerabilities to 14th position, found in 7% of websites, down from 11% in 2011.
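SQL injection, the vulnerability class now in decline, typically occurs when user input is concatenated into a query string; parameterised queries are the standard defence. A minimal sketch using Python's `sqlite3` (illustrative only, not drawn from the report):

```python
# Illustrative sketch of a SQL injection flaw and the parameterised-query
# mitigation. The table and inputs are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is concatenated into the SQL string, so the
# injected OR clause turns a single-user lookup into a dump of every row.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{attacker_input}'"
).fetchall()
print(unsafe)  # every user is returned

# Mitigated: the placeholder keeps the input as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "nobody' OR '1'='1"
```

The same placeholder pattern exists in every mainstream database driver, which is one reason the vulnerability has become easier to avoid.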
With the exception of sites in the IT and energy sectors, all industries found fewer vulnerabilities in 2012 than in past years.
The IT industry experienced the highest number of vulnerabilities per website (114), while government websites had the fewest with eight on average per website.
Entertainment and media websites had the highest remediation rate (the average percentage of serious vulnerabilities resolved), at 81%.
In previous years, the banking industry had the fewest vulnerabilities and fixed the most of any industry, but this year it came in second, with an average of 11 serious vulnerabilities found per website and a remediation rate of 54%, below the 61% average across all industries.
In correlating the survey results with vulnerability data, WhiteHat Security could see how software security controls, or “best practices”, affected the actual security of organisations.
The study found only 57% of organisations surveyed provided some amount of instructor-led or computer-based software security training for their programmers.
These organisations experienced 40% fewer vulnerabilities, resolved them 59% faster, but exhibited a 12% lower remediation rate.
The 39% that performed some amount of static code analysis on their websites' underlying applications experienced 15% more vulnerabilities, resolved them 26% slower, and had a 4% lower remediation rate.
The 55% that had a web application firewall (WAF) experienced 11% more vulnerabilities, resolved them 8% slower, and had a 7% lower remediation rate.
Best practices do not necessarily equal better security
This data implies that best practices such as software security training are effective, yet some of the statistics show that following best practices does not necessarily lead to better security.
The correlated data revealed that compliance is the primary driver for organisations to resolve vulnerabilities, but also the number one reason organisations do not resolve vulnerabilities.
In other words, vulnerabilities are fixed if required by compliance mandates; however, if compliance does not require a fix, the vulnerability remains, despite possible implications to the security posture of the site.
“This collective data has shown that many organisations do not yet believe they need to proactively address software security,” said Grossman.
“It is apparent that these organisations take the approach of ‘wait-until-something-goes-wrong’ before kicking into gear unless there is some sense of accountability,” he said.
Grossman believes there is an opportunity for a new generation of security leaders to emerge and distinguish themselves with an understanding of real business and security challenges.
“Our hope is that they will address these issues we have identified and base their decisions on a foundation of data to improve the state of web security over time,” he said.