April 29, 2007

Security Expertise from Cryptographers: the Signals of Hubris

Over on the crypto forum, an old debate restarted when some poor schmuck made the mistake of asking why CBC mode shouldn't use an IV of all zeros.

For the humans out there, a "mode" is a way of using a block cipher ("encrypts only a block") to encrypt a stream. CBC mode chains each block to the next by feeding the result of the last encryption into the next ("cipher-block-chaining"). But we need an initial value (IV) to start the process, and in the past, zeros were suggested as good enough.

The problem is that if you use the same starting key, and the user sends the same packet, then the encrypted results will be the same (a block cipher being deterministic, generally). So we use a different IV for every use of the secret key, in order to produce different results and hide the fact that the same packet is being sent. (There are also some other difficulties, but these are not only much more subtle, they are beyond my ability to write about, so I'll skip past them blithely.)
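To make the repeated-packet problem concrete, here's a toy sketch. The "block cipher" below is just a hash truncated to 16 bytes -- not invertible and not secure, purely a deterministic stand-in so the CBC chaining is visible. With a fixed all-zeros IV, the same packet under the same key encrypts to the same bytes every time; with a fresh random IV, it doesn't.

```python
import hashlib
import os

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Toy stand-in for a block cipher: deterministic 16-byte output.
    # (Not invertible, not secure -- just enough to show the chaining.)
    return hashlib.sha256(key + block).digest()[:16]

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    assert len(plaintext) % 16 == 0, "toy: caller must pad to 16 bytes"
    prev, out = iv, b""
    for i in range(0, len(plaintext), 16):
        # XOR each plaintext block with the previous ciphertext block
        # (the IV plays "previous block" for the first one), then encrypt.
        block = bytes(a ^ b for a, b in zip(plaintext[i:i+16], prev))
        prev = toy_block_encrypt(key, block)
        out += prev
    return out

key = b"0123456789abcdef"
packet = b"ATTACK AT DAWN!!"   # the same 16-byte packet, sent twice

# All-zeros IV: both transmissions encrypt identically.
zero_iv = bytes(16)
c1 = cbc_encrypt(key, zero_iv, packet)
c2 = cbc_encrypt(key, zero_iv, packet)
print(c1 == c2)   # True -- an eavesdropper sees the repeat

# Fresh random IV per message: the ciphertexts differ.
c3 = cbc_encrypt(key, os.urandom(16), packet)
c4 = cbc_encrypt(key, os.urandom(16), packet)
print(c3 == c4)   # False (with overwhelming probability)
```

The eavesdropper learns nothing about the plaintext in either case, but in the first case she learns that the same message was sent twice -- which is exactly the leak the IV exists to prevent.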

Back to the debate. Predictably, there was little controversy over the answer ("you can't do that!"), some controversy over the fuller answer (what to do), but also more controversy over the attitude. In short, the crypto community displayed what might be considered a severe case of hubris:

"If you don't know, then you had better employ a security expert who does!"

There are many problems with this. Here are some of them:

1. At a guess, designing secure protocols is 50% business, 40% software, 10% crypto. So when you say "employ a security expert" we immediately have the problem that security is many disciplines. A security expert might know a lot about cryptography, or he might know very little. What we can't reasonably expect is that a security expert knows a lot about everything.

We have to cut something out, and as security -- the entire field -- only rarely and briefly depends on cryptography, it seems reasonable that your average security expert knows only the basics of cryptography.

Adi Shamir, in his list of misconceptions in cryptography, said:

By implementers: Cryptography solves everything, but:
  • only basic ideas are successfully deployed
  • only simple attacks are avoided
  • bad crypto can provide a false sense of security

What then are the basics? Block ciphers are pretty basic. Understanding all the ins and outs of modes and IVs, however, is not. Where we draw the line is in knowing our own limitations, so the original question ("what's wrong with an IV of all zeros?") indicates quite good judgement.

2. But what about software? We can see it in the cryptographers' view:

"Just use a unique IV."

Although this sounds simple, it is a lot harder to implement than to say, and the person who gets to be responsible for the reality is the programmer. In glorious, painful detail.

Consider this: say you choose to fill the IV with random numbers, and you specify that the protocol must have a good RNG ("random number generator"). Then, how do you show that this is what actually happens? How can you show that the user's implementation has a good RNG? Big problem: you can't. And this has been an ignored problem for basically the entire history of Internet cryptography.

Dilbert on Randomness
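A sketch of how this goes wrong in practice. The names below are hypothetical, but the failure mode is classic: an implementer satisfies the "random IV" requirement with a general-purpose PRNG seeded from something guessable (a coarse clock, say), and two devices or two restarts quietly produce the same "random" IV. The spec said RNG; the running code says otherwise, and nothing in the protocol can detect it.

```python
import os
import random

def bad_iv(seed: int) -> bytes:
    # Looks like "random" in the source code, but is fully
    # determined by the seed -- the classic implementation trap.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))

# Two boxes that both seed from the same coarse clock value
# (e.g. a timestamp truncated to the second) collide exactly
# where the spec's "unique IV" requirement needed them not to.
iv_a = bad_iv(1177862400)
iv_b = bad_iv(1177862400)
print(iv_a == iv_b)   # True -- "random" on paper, repeated in practice

# The OS entropy source doesn't have this failure mode.
print(os.urandom(16) == os.urandom(16))   # False
```

The designer who wrote "must have a good RNG" never sees either line of output; only the programmer does, and only if he knows to look.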

This would suggest that if you are doing secure protocols, what you want is someone who knows more about software than cryptography, because you want to know the costs of those choices more than you want to know the snippety little details.

3. Unfortunately, cryptography has captured attention as the critical element of security. Software, on the other hand, is less interesting; just ask any journalist.

Because of this, crypto has become a sort of rarefied, unchallengeable beast in technological society, an institution displaying more religious belief than cold science. Mere software engineers can rarely debate the cryptographers as peers, even the most excellent ones (like Zooko, Hal Finney and Dani Nagy) who can seriously write the maths.

So while I suggest that software is more important to the end result than cryptography, the programmers still lose in the face of assumed superiority. "Employ someone better than you" smacks of false professionalism, guildmaking. It's simple enough economics; if we as a group can create an ethos like the medical industry, where we could become unchallenged authorities, then we can lock out competitors (who haven't passed the guild exam), and raise prices.

Unfortunately, this raises the chance to near certainty that we also lock out people who can point out when we are off the rails.

4. And we can forget about the business aspects completely. That is, a focus on requirements is just not plausible for people who've buried themselves in mathematics for a decade; they don't understand where the requirements come from. Talking about risk is a vague, hand-wavy subject, better left alone because of its lack of absolute provability.

By way of example, one of the questions in the recent debate was

"you haven't heard of anyone breaking SD cards have you?"

From a business perspective this makes complete sense: it is a standard risk perspective with theory to back it up (albeit a simplification). From a cryptographer's perspective it is laughable.

Who's right? Actually, both perspectives have merit, and choosing the right approach is again the subject of judgement.

5. Hence, a strong knee-jerk tendency to fall back on old saws like the CIA of confidentiality, integrity and authentication. But it is generally easy to show that CIA and similar things are nice for crypto school but a bad start in real life.

E.g., in the above debate, it turns out that the application in question is DRM -- Digital Rights Management. What do we know about DRM?

If anything has been learnt at massive cost over the last 10 years of failed efforts, it is that DRM security does not have to be strong. Rather it has to create a discrimination between those who will pay and those who won't. This is a pure business decision. And it totally blows away the C, confidentiality, in CIA.

Don't believe me? Buy an iPod. The point here is that if one analyses the market for music (or ringtones, or cell phones, apparently) then the last thing one cares about is being able to crack the crypto. If one analyses the market for anything, CIA turns out to be such a gross simplification that you can pretty much assume it to be wrong from the start.

6. If you've subscribed to the above, even partly, you might then be wondering how we can tell a good security expert from a bad one? Good question!

Even I find it difficult, and I design the stuff (not at the most experienced level, but way beyond hobby level). It is partly for that reason that I name financial cryptographers on the blog: to develop a theory of just how to do this.

How is a manager who has no experience in this going to pick from a room full of security experts, who mostly disagree with each other? Basically, you're stuffed, mate!

So where does that leave us? Steve Bellovin says that when we hire a security expert, we are hiring his judgement, not his knowledge of crypto. Which means knowing when to worry, and especially _when not to worry_, and I fully agree with that! So how then do we determine someone's judgement in security matters?

Here's a scratched-down emerging theory of signals of security judgement:

BAD: If someone says "you need to hire a security expert, because you don't know enough, and that could be dangerous" that's a red-flag.

GOOD: If they ask what the requirements are, that's good.

BAD: If they assume CIA before the requirements, that's bad.

GOOD: If they ask what the application is, then a big plus.

BAD: If they say "use a standard protocol" then that's bad.

GOOD: If they say "this standard protocol may be a good starting point" then that's good.

BAD: If they know what goes wrong when an IV is not perfect, that's a danger sign.

GOOD: If they know enough to say "we have to be careful of the IV" then that's good.

BAD: If they don't know what risk is, they cannot reduce your risk.

GOOD: If they know more software than crypto, that's golden.

Strongly oriented to weeding out those who know too much about cryptography, it seems.

Sadly, the dearth of security engineers who don't subscribe to the hubristic myths leaves us with an unfortunate result: the job is left up to you. This has been the prevalent result of so many successful Internet ventures that it is a surprise it isn't more widely recognised. I'm sufficiently convinced that I think it has reached the level of a Hypothesis:

"It's your job. Do it."

Hubris? Perhaps! It's certainly your choice to either do it yourself, or hand it over to a security expert, so let's hear your experiences!

Posted by iang at April 29, 2007 05:42 PM | TrackBack


I've always maintained that DRM is mainly a business problem, not a technical one. My reasoning is that in DRM you are struggling with two opposed objectives, access and denial, at the same time in an environment in which making a perfect copy of the object is dirt cheap.

Posted by: mario at May 3, 2007 07:07 PM