Bruce Schneier's popular Crypto-Gram is out with the essay that I challenged earlier. Here's another point that I missed early on:
I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that "won" weren't the most secure firewalls; they were the ones that were easy to set up, easy to use and didn't annoy users too much. Because buyers couldn't base their buying decision on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren't the most secure, because buyers couldn't tell the difference.
This is simply the dominance of Kerckhoffs' 6th principle, the most important of the six (as I've said many times before), although cryptographers still believe the 2nd is more important. One could validly ask how many product and industry evolutions we have to watch before we work this out. Will this happen again and again?
For those who do not recall, K6 is simply that the tool has to be usable:
6. Finally, it is necessary, given the circumstances that command its application, that the system be easy to use, requiring neither mental strain nor the knowledge of a long series of rules to observe.
By users. It's almost as if, as suggested above, buyers cannot base their buying decisions on the security metrics, so they cast around and decide on usability instead.
But not quite. The truth is far more fundamental than mere ignorance and random behaviour by users. It is that a tool that is used delivers security, and a tool that is not used cannot deliver any security at all. So an easy-to-use tool will generally win over a hard-to-use tool, and this will dominate the security equation.
Or to put it in other terms (another hypothesis!) the most common security failure of all is to not be used. The security business is more about business rules than security rules, which leads me into today's hypothesis:
#6.4 Compromise on Security before Delivery

Security experts will tell you that if you don't build in security at the beginning, then you'll have trouble building it in later. This is true, but it hides a far worse evil: if you spend too much effort on security up front, you'll either lose time and lose the market, or you'll create a security drag on the application side before it has shown its true business model.
The security-first argument speaks to security convenience; the business-first argument speaks to survival. Make sure you understand your business's need for survival, and pray that they understand the inconveniences you put up with to ensure that goal.
This will be unpopular. But read the snippet above again. Kerckhoffs wrote K6 in 1883, so we've got plenty of time. Again and again.
Posted by iang at May 15, 2007 06:30 AM | TrackBack

i was once at a monthly meeting where somebody talked about security evaluations (the old C2, B2, A2, etc. type stuff and the newer replacement, common criteria and protection profiles). supposedly a purpose of the security evaluations would be to allow customers to make some comparable security assessment about different security products from different vendors.
One of the comments was that there had been something like 64 common criteria evaluations of particular types of product, and of those 64, 62 evaluations had some sort of unpublished deviations and/or exceptions (devaluing the usefulness of security comparisons).
Posted by: Lynn Wheeler at May 15, 2007 09:39 AM