In 2012, the average number of serious vulnerabilities per website continued to decline, going from 79 in 2011 down to 56 in 2012. Despite this, 86 percent of all websites tested were found to have at least one serious vulnerability exposed to attack every single day of 2012.
Of the serious vulnerabilities found, on average 61 percent were resolved and only 18 percent of websites were vulnerable for fewer than 30 days in 2012. On average, resolving these vulnerabilities took 193 days from the first notification.
A closer look revealed that:
- With the exception of sites in the IT and energy sectors, all industries found fewer vulnerabilities in 2012 than in past years.
- The IT industry experienced the highest number of vulnerabilities per website at 114.
- Government websites had the fewest serious vulnerabilities with eight detected on average per website, followed by banking websites with 11 on average per website.
- Entertainment and media websites had the highest remediation rate (the average percentage of serious vulnerabilities resolved) at 81 percent.
- In years past, the banking industry had the fewest vulnerabilities and fixed the most vulnerabilities of any industry. This year, banking came in second with 11 serious vulnerabilities found on average per website and a below-average remediation rate of 54 percent (the average across all industries is 61 percent).
“Website security is an ever-moving target, and organizations need to better understand how various parts of the SDLC affect the introduction of vulnerabilities, which leave the door open to breaches,” said Jeremiah Grossman, co-founder and CTO of WhiteHat Security. “This report – comprising survey and website vulnerability data – is the first time we can correlate various software security controls and SDLC behaviors to vulnerability outcomes and breaches. The results are both insightful and complex.”
Top ten vulnerability classes
The two most prevalent vulnerability classes in 2012 were Information Leakage and Cross-Site Scripting, identified in 55 percent and 53 percent of websites respectively. The next eight most prevalent were:
- Content Spoofing – 33 percent
- Cross-Site Request Forgery – 26 percent
- Brute Force – 26 percent
- Fingerprinting – 23 percent
- Insufficient Transport Layer Protection – 22 percent
- Session Fixation – 14 percent
- URL Redirector Abuse – 13 percent
- Insufficient Authorization – 11 percent
SQL Injection continued its downward slide from 11 percent in 2011 to 7 percent in 2012, no longer making the Top 10.
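Though SQL Injection has dropped out of the Top 10, it remains a useful illustration of the vulnerability classes discussed above. The sketch below (illustrative only; the table and query are hypothetical, not drawn from the report) shows how string concatenation lets attacker input rewrite a query, and how a parameterized query avoids it:

```python
import sqlite3

# Minimal illustration of the SQL Injection class, using an in-memory
# SQLite database. All names here are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

malicious = "nobody' OR '1'='1"

# Vulnerable pattern: user input concatenated directly into the SQL string,
# so the injected OR clause becomes part of the query and matches every row.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Standard fix: a parameterized query treats the input as data, not SQL,
# so the literal string matches no row.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()

print(unsafe)  # [('alice',)] – injection succeeded
print(safe)    # [] – input treated as an ordinary string
```

Remediation of this class is mechanical (switch to bound parameters), which is consistent with its steady decline in prevalence.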
Best practices may not result in better security
In correlating the survey results with vulnerability data, WhiteHat Security could see how software security controls, or "best practices," impacted the actual security of organizations. Some of the findings include:
- 57 percent of organizations surveyed provide some amount of instructor-led or computer-based software security training for their programmers. These organizations experienced 40 percent fewer vulnerabilities, resolved them 59 percent faster, but exhibited a 12 percent lower remediation rate.
- 39 percent of organizations said they perform some amount of Static Code Analysis on their websites' underlying applications. These organizations experienced 15 percent more vulnerabilities, resolved them 26 percent slower, and had a 4 percent lower remediation rate.
- 55 percent of organizations said they have a Web Application Firewall (WAF) in some state of deployment. These organizations experienced 11 percent more vulnerabilities, resolved them 8 percent slower, and had a 7 percent lower remediation rate.
Accountability and compliance
In the event an organization experiences a website or system breach, WhiteHat Security found that 27 percent said the Board of Directors would be held accountable. Additionally, 24 percent said Software Development, 19 percent the Security Department, and 18 percent Executive Management.
In organizations that both provide software security training to their programmers and perform static code analysis, Software Development was held most accountable in the event of a breach.
Additionally, the correlated data revealed that compliance is the primary driver for organizations to resolve vulnerabilities, but also the number one reason organizations do not resolve vulnerabilities. In other words, vulnerabilities are fixed if required by compliance mandates; however, if compliance does not require a fix, the vulnerability remains, despite possible implications to the overall security posture of the site.
“This collective data has shown that many organizations do not yet consider they need to proactively do something about software security. It is apparent that these organizations take the approach of ‘wait-until-something-goes-wrong’ before kicking into gear unless there is some sense of accountability,” said Grossman. “This needs to change, and we believe there is now an opportunity for a new generation of security leaders to emerge and distinguish themselves with an understanding of real business and security challenges. Our hope is that they will address these issues we have identified and base their decisions on a foundation of data to improve the state of Web security over time.”
The complete report is available here (registration required).