April 13, 2010

Ruminations on the State of the Security Nation

In an influential paper, Prof Ross Anderson proposes that the _Market for Lemons_ is a good fit for infosec. I disagree, because that market is predicated on the seller being informed, and the buyer not. I suggest the sellers are equally ill-informed, leading to the Market for Silver Bullets.

Microsoft and RSA have just published some research, commissioned from Forrester, that provides some information, but it doesn't help to separate the two positions:

CISOs do not know how effective their security controls actually are. Regardless of information asset value, spending, or number of incidents observed, nearly every company rated its security controls to be equally effective — even though the number and cost of incidents varied widely. Even enterprises with a high number of incidents are still likely to imagine that their programs are “very effective.” We concluded that most enterprises do not actually know whether their data security programs work or not.

Buyers remain uninformed, something we both agree on. Curiously, it isn't an entirely good match for Akerlof, as the buyer of a Lemon is uninformed beforehand and regrettably over-informed afterwards. No such enlightenment for the CISO.

Which leaves me with an empirical problem: how to show that the sellers are uninformed? I provide some anecdotes in that paper, but we would need more than anecdotes to test the prediction.

It should be possible to design an experiment to reveal this. For example, drawing on the above logic, if a researcher were to ask similar questions of both buyer and seller, and could show a lack of correlation between the suppliers' claims and the buyers' incident rates, that would say something; a rough sketch follows.
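Assuming we could run a matched survey, with each supplier rating the effectiveness of its own product and each corresponding buyer reporting its incident count, the test itself is trivial. The field names and figures below are entirely made up, purely to illustrate the shape of it:

    # Hypothetical test: do suppliers' effectiveness claims track buyers' incidents?
    # If sellers were informed, claimed effectiveness should correlate (negatively)
    # with the incident rate at their customers; a flat, insignificant correlation
    # is what the Market for Silver Bullets would predict.
    from scipy.stats import spearmanr

    # Made-up survey data, one entry per supplier/buyer pair.
    claimed_effectiveness = [5, 4, 5, 5, 3, 4, 5, 4, 5, 5]   # supplier self-rating, 1..5
    incidents_per_year    = [2, 7, 1, 9, 3, 4, 6, 2, 8, 0]   # buyer's recorded incidents

    rho, p = spearmanr(claimed_effectiveness, incidents_per_year)
    print(f"Spearman rho = {rho:+.2f}, p = {p:.2f}")

    if p > 0.05:
        print("No significant correlation: the claims tell us nothing about outcomes.")
    else:
        print("Claims and outcomes correlate: the sellers appear to know something.")

The arithmetic is the easy part; the hard part, as the next paragraph suggests, is finding anyone with the incentive to collect honest numbers from both sides.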

The immediate problem, of course, is: who would do this? Microsoft and RSA aren't going to, as they are sell-side, and their research was obviously focussed on their own needs. Which means it might be entirely accurate, but it might not be entirely complete; they aren't likely to want to measure their own performance too clearly.

And, if there is one issue that is extremely prevalent in the world of information security, it is the lack of powerful and independent buy-side institutions who might be tempted to do independent research on the information base of the sellers.

Oh well. Moving on to the other conclusions:

  • Secrets comprise two-thirds of the value of firms’ information portfolios.
  • Compliance, not security, drives security budgets.
  • Firms focus on preventing accidents, but theft is where the money is.
  • The more valuable a firm’s information, the more incidents it will have.

The second and third were also predicted in that paper.

The last is hopefully common sense, but unfortunately, as someone used to say, common sense isn't so common. Which brings me to Matt Blaze's rather good analysis of the threats to the net in 1995, written as the afterword to Applied Cryptography, the seminal red book by Bruce Schneier:

One of the most dangerous aspects of cryptology (and, by extension, of this book), is that you can almost measure it. Knowledge of key lengths, factoring methods, and cryptanalytic techniques make it possible to estimate (in the absence of a real theory of cipher design) the "work factor" required to break a particular cipher. It's all too tempting to misuse these estimates as if they were overall security metrics for the systems in which they are used. The real world offers the attacker a richer menu of options than mere cryptanalysis; often more worrisome are protocol attacks, Trojan horses, viruses, electromagnetic monitoring, physical compromise, blackmail and intimidation of key holders, operating system bugs, application program bugs, hardware bugs, user errors, physical eavesdropping, social engineering, and dumpster diving, to name just a few.

Right. It cannot have escaped anyone's notice by now that the influence of cryptography has been huge, but its success in security has been minimal. Cryptography has not really been shown to secure our interactions that much, having missed the target as many times as it might have hit it. And with the rise of phishing and breaches and MITBs and trojans and so forth, we are now in the presence of evidence that the institutions of strong cryptography have cemented us into a sort of Maginot line mentality. So it may be doing more harm than good, although such a claim would need a lot of research to give it some weight.

I tried to research this a little in Pareto-secure, in which I asked why the measurements of crypto-algorithms receive such slavish attention, to the exclusion of so much else. I found an answer, at least, and it was a positive, helpful answer. But the far bigger question remains: what about all the things we can't measure with a bit-ruler?
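To make the point concrete, here is roughly everything the bit-ruler can say. This is a back-of-the-envelope sketch, purely illustrative, with a made-up attacker speed; it derives a generic work factor from key length alone:

    # The bit-ruler: generic work-factor estimates from key length alone.
    # Trying every key in a k-bit key space costs about 2**k trials; here we
    # translate that into time at an assumed, entirely illustrative, rate of
    # 10**12 trials per second. The number is easy to compute, which is exactly
    # why it gets the slavish attention, and it says nothing about protocol
    # flaws, trojans, user error, or the rest of Blaze's menu above.

    TRIALS_PER_SECOND = 10**12           # assumed attacker speed, illustrative only
    SECONDS_PER_YEAR  = 365 * 24 * 3600

    def years_to_exhaust(key_bits: int) -> float:
        """Years to try every key in a key_bits-bit space at the assumed rate."""
        return (2 ** key_bits) / TRIALS_PER_SECOND / SECONDS_PER_YEAR

    # Nominal key sizes only; no claim about the best known attack on each.
    for name, k in [("DES", 56), ("AES-128", 128), ("AES-256", 256)]:
        print(f"{name:8s} {k:3d}-bit key -> ~2^{k} trials, ~{years_to_exhaust(k):.3g} years")

The calculation is precise and reproducible, and, as Blaze says, almost entirely beside the point for the systems the ciphers actually live in.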

Matt Blaze listed 10 things in 1996:

  1. The sorry state of software.
  2. Ineffective protection against denial-of-service attacks.
  3. No place to store secrets.
  4. Poor random number generation.
  5. Weak passphrases.
  6. Mismatched trust.
  7. Poorly understood protocol and service interactions.
  8. Unrealistic threat and risks assessment.
  9. Interfaces that make security expensive and special.
  10. No broad-based demand for security.

In 2010, today more or less, he said "not much has changed." We live in a world where, if MD5 is shown to be a bit flaky because it has fewer bits than SHA-1, the vigilantes of the net launch pogroms on the poor thing, committees of bitreaucrats write up how MD5-must-die, and the media breathlessly runs around claiming the sky will fall in. Even though none of this is true: there is no attack possible at the moment; when an attack does become possible, it will still be so unlikely that we can ignore it ... and even if it does happen, the damages will be next to zero.
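For the record, and to show how little the "fewer bits" claim says on its own, here is the generic arithmetic, again illustrative only and ignoring the specific published attacks on either function:

    # What "fewer bits" amounts to in generic terms: the birthday bound puts a
    # collision on an n-bit digest at roughly 2**(n/2) work. This is a property
    # of the digest length, not of any particular weakness, and it says nothing
    # about what a collision would cost anybody in dollars.

    for name, n in [("MD5", 128), ("SHA-1", 160), ("SHA-256", 256)]:
        print(f"{name:8s} {n:3d}-bit digest -> generic collision work ~2^{n // 2}")

Useful to know, perhaps, but it is a statement about bits, not about the dollar losses the next paragraph is concerned with.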

Meanwhile, if you ask for a user-interface change, because the failure of the user interface to identify false end-points has directly led to billions of dollars of damages, you can pretty much forget any discussion. For some reason, bit-strength dominates dollar losses in every conversation.

I used to be one of those who gnashed my teeth at the sheer success of Applied Cryptography, and the consequent generation of crypto-amateurs who believed that the bit count was the beginning and end of all. But that's unfair, as I never got as far as reading the afterword, and the message is there. It looks like Bruce made a slight error, and should have made Matt Blaze's contribution the foreword, not the afterword.

A one-word error in the editorial algorithm! I must write to the committee...

Afterword: rumour has it that the 3rd edition of Applied Cryptography is nearing publication.

Posted by iang at April 13, 2010 02:25 AM | TrackBack
Comments

Interesting piece.
This relates to a short piece I wrote on Information Asymmetry in Information Security.

I made it free here:

Security derivatives: the downward spiral caused by information asymmetry
http://www.the451group.com/intake/securityderivatives/


For the author, I'd be interested in comparing notes - as I am working on a larger look at the suboptimal economic ecosystem in Information Security.

Posted by: Joshua Corman at May 14, 2010 08:59 AM