April 06, 2007

What to do about responsible disclosure?

A slight debate has erupted over Adam's presentation "Security Breaches are good for you", which makes it a success. Of course, Adam means good for the rest of us, not the victims. One can consider two classes of beneficiaries of breach information:

  1. The individuals directly concerned. They can work out how their risks change in light of past events. For them to do anything to protect themselves, they have to be told; a company can't do it for them because it doesn't understand their situation.
  2. Companies that have similar setups. They can work out better how to defend against future events. This comes from both understanding the weaknesses and exploits, and also the risks and costs. This viewpoint is due to Stuart E. Schechter and Michael D. Smith, "How Much Security is Enough to Stop a Thief?", Financial Cryptography 2003, LNCS Springer-Verlag. At least that's where I first saw a rational explanation, even though I have been thinking about the "secrecy is bad for you" angle for many years.

SB1386 and its copies address the first case, but what about the second? Kenneth F. Belva simply doesn't agree with it. He blogs that we can't release all this information for security reasons: we don't release security-sensitive info because it gives an attacker an advantage.

This, I claim, is a facade. I believe that when people say "we cannot release that for security-sensitive reasons," there are generally other, real reasons. Primarily, it is self-protection by the people doing the job: they can work much more happily if they can avoid the scrutiny that disclosure forces. Adam says (PDF):

So, why can’t we share? There are four reasons usually given: liability, embarrassment, customers would flee, and we might lose our jobs.

I quote Dan Geer as saying “No CSO on Wall Street has the authority to share even firewall logs because no General Counsel can see the point.”

Beyond all the excuses, I think there is a much bigger danger: that security people can do their jobs much more badly when protected by secrecy. Disclosure forces all of us to defend our positions and to be judged by our peers, and trotting out the "secrecy" argument is IMO almost always done to cover bad work, not to manage a real security risk.

That's not to say that any particular professional is bad, but that the profession as a whole, as a guild, promotes bad professional behaviour, behind the curtain of secrecy. That said, there is an element of truth in the issue. We don't disclose the root password, so where is the line drawn? Kenneth says:

Just as we have an understanding of responsible disclosure now for technical information security vulnerabilities, we need the same for breach disclosure.

If not through a centralized (not necessarily government) body Adam, what do you propose that would allow for better, more accurate and confidential disclosure that does not leak sensitive information?

Disclosures are not possible if done via some "controlled" channel. As Adam points out and as economic theory has it (following Stigler), all closed channels become controlled, stifled and then neutered (and what happens after is sometimes even worse). This is as true in the infosec world as any other.

So maybe we need a law to force public disclosure? Right?

I call it wrong. SB1386 was a big win because it forced point 1 above. This was in no small part because the issue was simple: just write to the victims and tell them what was lost. The harm was clear, the victims were directly known, and they just needed to be told.

Part 2 is much harder. I doubt a law can do it, and I further doubt a law can do any good there at all. There is no victim being targeted, there is no simple harm, and there is indeed no direct or contractual relationship here at all.

We simply have no guidance to make such a law. SB1386 was a lucky break, it spotted a precise niche, and filled it. We should *not* take SB1386 as anything but an extraordinarily lucky break, and we should not expect to repeat it.

So where does that leave us with respect to responsible disclosure? It then falls to the companies to decide what to do ... and obviously it is far, far safer to say nothing than to say something. That doesn't make it right, just safer for those whose jobs are at risk.

I talk about solutions in silver bullets. Adam dares:

I am challenging everyone to face those fears, and work to overcome them. I believe that there's tremendous value waiting to be unlocked.

Lack of solid info on breaches is one reason it is so hard to defend against real risks; e.g., it is why phishing was able to develop unimpeded: nobody dared talk about it when it happened to them, and anyone who did was pooh-poohed by those who hadn't figured it out yet.

(Quick quiz -- what's your defence against MITB? is it (a) never heard about it, in which case you are a victim of the secrecy myth, (b) have one but can't say, in which case you are perpetuating unsafety through secrecy, or (c) something else?)

Posted by iang at April 6, 2007 04:35 PM
Comments

Four quick points before the weekend:

1. You put me on the side of secrecy. This is skewed. I'm for partial disclosure, but not full disclosure. This is not secrecy.

2. You quote me as saying that: "He blogs that we can't release all this information for security reasons; we don't release security-sensitive info because it gives an attacker an advantage."

Your inference is only partially correct. I wrote that as information security professionals we are averse to disclosing information and that secrecy is a valid security practice. The specific case I cited dealt with giving an attacker an advantage. One of the cornerstones of our profession is Confidentiality (the CIA triad). Sometimes the information gives an attacker an advantage; other times disclosure has other consequences (see number 3).

3. You write that: "I believe that when people say "cannot release that for security sensitive reasons," there are generally other real reasons. Primarily, this is the safety of the people doing the job in that they can do their job much more happily if they can avoid the scrutiny that disclosure forces."

I believe Dan Geer was closer to the mark with his General Counsel remark (which you quote) than your statement; it's more corporate liability than incompetence. In addition, see number 4.

4. The question is secrecy/disclosure to whom. Most corporations and InfoSec departments are audited by internal and external auditors on a regular basis, especially in the US since SOX.

Posted by: Kenneth F. Belva at April 6, 2007 07:14 PM

> (Quick quiz -- what's your defence against MITB? is it (a) never heard
> about it, in which case you are a victim of the secrecy myth, (b) have
> one but can't say, in which case you are perpetuating unsafety through
> secrecy, or (c) something else?)

Man in the browser? I guess I ought to turn off Javascript and Java before typing in my bank password. I used to run with those turned off by default, but I gradually gave in and started leaving them on by default as the number of sites that I wanted to use that required them increased. No doubt the same thing will happen with Flash, although it will take longer because there is not a Free Software flash implementation yet, so I don't even have it installed on my primary platform. Also because the web is a helluvalot uglier with flash -- blinking ads everywhere. Ugh. Of course, that was part of my objection to Javascript back in the day, too.

But to come back to your query, I'm not currently spending energy to defend myself against Man in the Browser, but if I were, my defense would be to use a smaller, less featureful browser.

Posted by: Zooko at April 7, 2007 01:37 PM

Hi Kenneth, thanks for your reply, some responses!

1. OK :)

2. Right, in that there are plenty of ways to discuss why it is better to not disclose. Secrecy is a valid practice, I agree.

The point, though, is that the secrecy argument is way overdone. It's hard to argue against "the need to preserve secrecy as a valid defence," so it is a frequently trotted-out excuse. Only professionals can argue against it, and they won't. Try it some time, and see how you cause your own reputation and professionalism to be questioned.

3. OK, you probably thought I was saying that "information security professionals are incompetent." Heaven knows, it looks that way ... but that's only the outside perception.

In practice, the job is difficult for a number of reasons -- contradictions almost -- and the delivery of results is hard to be objective about. In such an environment, all reasoning to not do something becomes a useful defence against the lack of tangible deliverables.

What Dan Geer points to may well be entirely accurate. But how many professionals have gone to battle on this issue? Versus, how many have said, "oh, that's that, then, we can wash our hands of the damage being done to others...?" (See comment on Adam's original post for more on this.)

My point is that secrecy is a great excuse. Dan Geer's point is another great excuse. The great thing about excuses is that the more the better; the bad thing is that the underlying problem -- sharing information about breaches that might help other companies -- goes unaddressed.

4. I believe I addressed that by listing whom to disclose to (points 1 and 2 in the post). Disclosure of breach info to auditors -- why is that useful? A serious question!

Here's one view: It is totally worthless, presuming that the information security professionals are already doing their job and fixing the problem. Is the only value from disclosure to auditors, then, found if the security professionals aren't doing their job, or, perhaps, if they need the "fear of the auditor" in order to do their job in the face of management skepticism?

Posted by: Iang at April 7, 2007 02:05 PM

A little drift

Data Breach Notification Laws: A State-by-State Perspective
http://www.intelligententerprise.com/channels/infomanagement/showArticle.jhtml?articleID=198800638
Congress and Data Breaches
http://computerworld.com/action/article.do?command=viewArticleBasic&taxonomyName=privacy&articleId=9015819&taxonomyId=84&intsrc=kc_feat

this is somewhat related to an old "security proportional to risk" posting
http://www.garlic.com/~lynn/2001h.html#61

recent posts about having gotten exposed to some of California's notification legislation (both the opt-in/opt-out stuff related to privacy matters as well as the breach stuff) when we were called in to help wordsmith the electronic signature legislation:

http://www.garlic.com/~lynn/2007f.html#72 Securing financial transactions a high priority for 2007
http://www.garlic.com/~lynn/2007g.html#8 Securing financial transactions a high priority for 2007

Posted by: Lynn Wheeler at April 9, 2007 03:05 PM

After our last conversation I decided to try web browsing with Javascript turned off. It turns out to be surprisingly painless so far! I have two on-line banks -- Wells Fargo and Chase -- and Wells Fargo has traditionally had a simpler (better) user interface. Interestingly, Chase fails noisily without Javascript. After logging in it gives you a .jsp redirect error message or some such bogosity. Wells Fargo gets even simpler (even better), although it does add a note to the top of the page recommending that you turn Javascript back on.

P.S. I'd like to try Frank O. Trotter III's bank, next time I save up enough cash to open another bank account...

Posted by: Zooko at April 12, 2007 12:43 PM