Friday, November 17, 2006

Further Thoughts on SANS Top 20

It seems my earlier post, Comments on SANS Top 20, struck a few nerves, e.g., this one and others.

One comment I'm hearing is that the latest Top 20 isn't "just opinion." Let's blow that idea out of the water. Sorry if my "cranky hat" is on and I sound like Marcus Ranum today, but Marcus would probably agree with me.

First, I had no idea the latest "Top 20" was going to be called the "SANS Top-20 Internet Security Attack Targets" until I saw it posted on the Web. If that isn't a sign that the entire process was arbitrary, I don't know what is. How can anyone decide what to include in a document if the focus of the document isn't determined until the end?

Second, I love this comment:

Worse still, Richard misses the forest completely when he says that “… it’s called an ‘attack targets’ document, since there’s nothing inherently ‘vulnerable’ about …”. It doesn’t really matter if it’s a weakness, action item, vulnerability or attack. If it’s something you should know about, it belongs in there. Like phishing, like webappsec, and so on. Don’t play semantics when people are at risk. That’s the job of cigarette and oil companies.

This shows me the latest Top 20 is just a "bad stuff" document. I can generate my own bad stuff list.

Top 5 Bad Things You Should Worry About


  • Global warming

  • Lung cancer

  • Terrorists

  • Not wearing seat belts

  • Fair skin on sunny days


I'm not just being silly with this list; there's a real thought here. How many of these are threats? How many are vulnerabilities? (Is the difference even explicit?) How many can you influence? How many are outside your control? How did they end up on this list? Does the ranking make any difference? Can we compare this list in 2006 with a future list in 2007?

Consider the last point for a minute. If the SANS Top 20 were metric-based, and consisted of a consistent class of items (say vulnerabilities), it might be possible to compare the lists from year to year. You might be able to delve deeper and learn that a class of vulnerabilities has slipped or disappeared from the list because developers are producing better code, or admins are configuring products better, or perhaps threats are exploiting other vectors.

With these insights, we could shift effort and resources away from ineffective methods and focus our attention on tools or techniques that work. Instead, we're given a list of 20 categories of "something you should know about." How is that actionable? Is anyone going to make any decisions based on what's in the Top 20? I doubt it.
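
For illustration only, here is a minimal sketch in Python of the kind of year-over-year comparison a metric-based list would enable. Every category name and rank below is invented for the example; none of it comes from an actual SANS list.

    # Sketch: comparing two hypothetical, metric-based "Top N" vulnerability
    # lists year over year. All category names and ranks are made up for
    # illustration; they do not come from any real SANS list.

    # Hypothetical rankings: category -> rank (1 = worst)
    top_2006 = {
        "buffer_overflows": 1,
        "sql_injection": 2,
        "default_passwords": 3,
        "unpatched_services": 4,
    }
    top_2007 = {
        "sql_injection": 1,
        "unpatched_services": 2,
        "cross_site_scripting": 3,
        "default_passwords": 4,
    }

    def compare(old, new):
        """Report which categories appeared, disappeared, or changed rank."""
        for cat in sorted(set(old) | set(new)):
            if cat not in new:
                print(f"{cat}: dropped off the list (was #{old[cat]})")
            elif cat not in old:
                print(f"{cat}: new entry at #{new[cat]}")
            else:
                delta = old[cat] - new[cat]
                # "up" means the category moved toward #1, i.e., got worse
                trend = "up" if delta > 0 else "down" if delta < 0 else "flat"
                print(f"{cat}: #{old[cat]} -> #{new[cat]} ({trend})")

    compare(top_2006, top_2007)

The code is trivial by design. The point is that a consistent item class and an explicit, metric-based ranking are what make this sort of trend analysis possible at all.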

Third, I'm sure many of you will contact me to say "don't complain, do something better." Well, people already are. If you want to read something valuable, pay attention to the Symantec Internet Security Threat Report. I am hardly a Symantec stooge; I like their approach.

I will point out that OWASP is working in the right direction, but their single category ("Web Applications") is only one of the 20 items on the SANS list.

I realize everyone is trying to do something for the good of the community, and everyone involved is a volunteer. My point is that proper focus and rigor would have produced a document with far more value.
