Why Counting Flaws is Flawed

Once or twice each year, some security company trots out a “study” that counts the number of vulnerabilities found and fixed in widely used software products over a given period, and then ranks the worst offenders in a Top 10 list that is supposed to tell us something useful about the relative security of these programs. And nearly without fail, the security press parrots this information as if it were newsworthy.

The reality is that these types of vulnerability count reports — like the one issued this week by application whitelisting firm Bit9 — seek to measure a complex, multi-faceted problem from a single dimension. It’s a bit like trying to gauge the relative quality of different Swiss cheese brands by comparing the number of holes in each: The result offers almost no insight into the quality and integrity of the overall product, and in all likelihood leads to erroneous, even humorous, conclusions.

The Bit9 report is more notable for what it fails to measure than for what it does, which is precious little: The applications included in its 2010 “Dirty Dozen” Top Vulnerable Applications list had to:

  • Be legitimate, non-malicious applications;
  • Have at least one critical vulnerability that was reported between Jan. 1, 2010 and Oct. 21, 2010; and
  • Be assigned a severity rating of high (between 7 and 10 on a 10-point scale in which 10 is the most severe).

The report did not seek to answer any of the questions that help inform how concerned we should be about these vulnerabilities, such as:

  • Was the vulnerability discovered in-house — or was the vendor first alerted to the flaw by external researchers (or attackers)?
  • How long after being initially notified or discovering the flaw did it take each vendor to fix the problem?
  • Which products had the broadest window of vulnerability, from notification to patch? (A sketch of how this metric might be computed follows the list.)
  • How many of the vulnerabilities were exploitable using code that was publicly available at the time the vendor patched the problem?
  • How many of the vulnerabilities were being actively exploited at the time the vendor issued a patch?
  • Which vendors make use of auto-update capabilities? And for those that do, how long does it take for a given percentage of customers to be updated to the latest, patched version?
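
To make that “window of vulnerability” question concrete, here is a minimal sketch of the per-flaw arithmetic such a study would need, written in Python. Every CVE identifier, vendor name and date in it is invented for illustration and describes no real product’s record.

    from datetime import date

    # Hypothetical sample records: these CVE numbers, vendor names and
    # dates are invented for illustration only.
    flaws = [
        {"id": "CVE-2010-1111", "vendor": "Vendor A",
         "notified": date(2010, 1, 15), "patched": date(2010, 2, 9)},
        {"id": "CVE-2010-2222", "vendor": "Vendor B",
         "notified": date(2010, 3, 2), "patched": date(2010, 3, 5)},
    ]

    def window_of_vulnerability(flaw):
        """Days between the vendor learning of the flaw and shipping a patch."""
        return (flaw["patched"] - flaw["notified"]).days

    for flaw in flaws:
        days = window_of_vulnerability(flaw)
        print(f"{flaw['vendor']} ({flaw['id']}): {days} days from notification to patch")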

The reason more security companies do not ask these questions is that finding the answers is time-consuming and difficult. I should know: I volunteered to conduct this analysis on several occasions over the past five years. A while back, I sought to do this with three years of critical updates for Microsoft Windows, an analysis that involved learning when each vulnerability was reported or discovered, and charting how long it took Microsoft to fix the flaws. In that study, I found that Microsoft actually took longer to fix flaws as the years went on, but that it had succeeded in convincing more researchers to disclose flaws privately to Microsoft (as opposed to simply posting their findings online for the whole world to see).

I later compared the window of vulnerability for critical flaws in Internet Explorer and Mozilla Firefox, and found that for a total of 284 days in 2006 (or more than nine months out of the year), exploit code for known, unpatched critical flaws in pre-IE7 versions of the browser was publicly available on the Internet. In contrast, I found that Firefox experienced a single period lasting just nine days during that same year in which exploit code for a serious security hole was posted online before Mozilla shipped a patch to fix the problem.
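
Totals like that 284-day figure are not a simple sum of each flaw’s window: when several exploit windows overlap, a given day should be counted only once. Here is a minimal sketch of that calculation, again in Python and again with invented dates rather than the actual 2006 browser data.

    from datetime import date, timedelta

    def days_exposed(windows, year):
        """Count days in `year` with at least one unpatched, publicly exploited flaw.

        Each window is an (exploit_published, patch_shipped) pair of dates.
        Overlapping windows are counted once, which is why simply summing
        each flaw's own window would overstate the true exposure.
        """
        exposed = set()
        first, last = date(year, 1, 1), date(year, 12, 31)
        for opened, closed in windows:
            day = max(opened, first)
            while day <= min(closed, last):
                exposed.add(day)
                day += timedelta(days=1)
        return len(exposed)

    # Invented windows for illustration; not the real 2006 data.
    windows = [
        (date(2006, 1, 10), date(2006, 4, 11)),
        (date(2006, 3, 20), date(2006, 9, 12)),
    ]
    print(days_exposed(windows, 2006))  # prints 246: the overlap counts once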

Bit9’s vulnerability count put Google Chrome at the Number 1 spot on its list, with 76 reported flaws in the first 10 months of this year. I’d like to propose that — by almost any objective measure — Adobe deserves to occupy the first, second and third positions on this grotesque vulnerability totem pole, thanks to vulnerabilities in and incessant attacks against its PDF Reader, Flash and Shockwave software.

For one thing, Adobe appears to have had more windows of vulnerability and attack against flaws in its products than perhaps all of the other vendors on the list combined. Adobe even started this year on the wrong foot: On Dec. 15, 2009, the company announced that hackers were breaking into computers using a critical flaw in Reader and Acrobat. It wasn’t until Jan. 7 — more than three weeks later — that the company issued a patch to fix the flaw.

This happened again with Adobe Reader, whose users were similarly exposed for 20 days in June and for 22 days in September. Just yesterday, Adobe issued a critical update for Reader that fixed a flaw hackers had been exploiting since at least Oct. 28.

True, not all vendors warn users about security flaws before patches are available, as Adobe, Microsoft and Mozilla do; in many ways, that disclosure makes these vendors easier to hold accountable. But I think it’s crucial to look closely at how good a job software vendors do at helping their users stay up-to-date with the latest versions. Adobe and Oracle/Sun, the vendors on the list with the most-attacked products today, both have auto-update capabilities, but these updaters can be capricious and slow.

Google and Mozilla, on the other hand, have helped to set the bar on delivering security updates quickly and seamlessly. For example, I’ve found that when I write about Adobe Flash security updates, Google has already pushed the update out to its Chrome users before I finish the blog post. The same is true when Mozilla issues patches to Firefox.

Marc Maiffret, CTO at eEye Digital Security, also took issue with the Bit9 report, and with Google’s position at #1.

“While many vulnerabilities might exist for Chrome, there are very few exploits for Chrome vulnerabilities compared to Adobe,” Maiffret said. “That is to say that while Chrome has more vulnerabilities than Adobe, it does not have nearly the amount of malicious code in the wild to leverage those vulnerabilities.”

There is no question that software vendors across the board need to do a better job of shipping products that contain far fewer security holes from the start: A study released earlier this year found that the average Windows user has software from 22 vendors on her PC, and needs to install a new security update roughly every five days in order to use these programs safely. But security companies should focus their attention on meaningful metrics that drive the worst offenders to improve their record, making it easier for customers to safely use these products.
