September 09, 2007

The Failure of the Academic Contribution to Security Science

This blog frequently presses the case that security is a dysfunctional family, and even presents evidence. So much so that we've gone beyond the evidence and the conclusion, and are now more interested in the why.

Today we have insights from the crypto layer. Neal Koblitz provides his thoughts in an article named "The Uneasy Relationship Between Mathematics and Cryptography" published in something called the Notices of the AMS. As perhaps everyone knows, it's mostly about money, and Koblitz identifies several threads:

  • bandwagon effect
  • NSA-supplied money
  • "the power that an aura of mathematical certainty can have over competitive solutions" a.k.a. provable security
  • the unfortunate effect of computer science on cryptography

We've certainly seen the first three, and Koblitz disposes of them well. Definitely recommended reading.

But the last thread takes me by surprise. I would have said that cryptographers have done precisely the reverse, by meddling in areas outside their competence. To leap to computer science's defence, then, permit me to turn Koblitz's own evidence against him:

Here the word “protocol” means a specific sequence of steps that people carry out in a particular application of cryptography. From the early years of public key cryptography it has been traditional to call two users A and B of the system by the names “Alice” and “Bob.” So a description of a protocol might go as follows: “Alice sends Bob..., then Bob responds with..., then Alice responds with...,” and so on.

I don't think so! If you think that is a protocol, you are not a computer scientist. Indeed, if you look at the designs for protocols by cryptographers, that's what you get: provable security and a "protocol" that looks like Alice talking to Bob.

Computer science protocols are not about Alice and Bob, they are about errors. Errors happen to include a bunch of things: Alice and Bob, yes, and Mallory and Eve, but also many, many other things. As the number of things that can go wrong far exceeds the number of cryptographic friends on the planet, we would generally suggest that computer scientists should write the protocols, so as to avoid the Alice-Bob effect.
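
To make the contrast concrete, here is a minimal sketch, in Python, of where a computer-science-style protocol handler spends its effort. Every name, field width and constant below is hypothetical, invented purely for illustration; it is not any particular protocol, just the shape of one:

    # A hypothetical message handler: most of the steps are about errors,
    # not about Alice talking to Bob.
    import hmac, hashlib, time

    MAX_CLOCK_SKEW = 300          # seconds; illustrative value
    seen_nonces = set()           # replay cache (would be bounded in real code)

    def handle_message(packet, key):
        # 1. Transport errors: truncated or malformed input.
        if packet is None or len(packet) < 40:
            return "drop: malformed"
        mac, nonce, timestamp, body = (packet[:32], packet[32:36],
                                       packet[36:40], packet[40:])
        # 2. Integrity errors: reject anything whose MAC does not verify.
        expected = hmac.new(key, nonce + timestamp + body, hashlib.sha256).digest()
        if not hmac.compare_digest(mac, expected):
            return "drop: bad MAC"
        # 3. Replay errors: a perfectly valid message seen twice is still an attack.
        if nonce in seen_nonces:
            return "drop: replay"
        seen_nonces.add(nonce)
        # 4. Freshness errors: stale messages are suspicious too.
        if abs(time.time() - int.from_bytes(timestamp, "big")) > MAX_CLOCK_SKEW:
            return "drop: stale"
        # Only now does "Alice sends Bob ..." actually happen.
        return ("deliver", body)

Four of the five steps are about things going wrong; the Alice-and-Bob part is the last line.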

Just to square that circle with yesterday's post, it is OK to talk about Alice-Bob protocols, in order to convey a cryptographic idea. But it should be computer scientists who put it into practice, and as early as possible. Some like to point out that cryptography is complex, and therefore you should employ cryptographers for this part. I disagree, and quote Adi Shamir's 3rd misconception. Eventually your crypto will be handled by developers, and you had better give them the simplest possible constructions they can deal with.
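
As a sketch of what "simplest possible construction" might mean at the developer's desk (everything here is hypothetical: the function names, the token layout, and the stand-in primitives, which a real design would replace with whatever the cryptographer actually blesses), the interface handed over should look closer to two calls than to a toolbox of modes, paddings and parameters:

    # A hypothetical two-call interface: the developer sees seal() and open_()
    # and nothing else. Nonces, integrity checks and algorithm choices live inside.
    import os, hmac, hashlib

    def _keystream(key, nonce, length):
        # Stand-in keystream built from hash(key, nonce, counter); a real seal()
        # would use an authenticated cipher chosen by a cryptographer.
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:length]

    def seal(key, plaintext):
        """Return one opaque token; the developer never looks inside it."""
        nonce = os.urandom(16)
        body = bytes(p ^ s for p, s in zip(plaintext, _keystream(key, nonce, len(plaintext))))
        tag = hmac.new(key, nonce + body, hashlib.sha256).digest()
        return nonce + body + tag

    def open_(key, token):
        """Return the plaintext or raise ValueError; there is no third outcome."""
        nonce, body, tag = token[:16], token[16:-32], token[-32:]
        if not hmac.compare_digest(tag, hmac.new(key, nonce + body, hashlib.sha256).digest()):
            raise ValueError("message rejected")
        return bytes(c ^ s for c, s in zip(body, _keystream(key, nonce, len(body))))

The developer writes token = seal(key, message) on one side and message = open_(key, token) on the other, and the number of ways to hold it wrongly shrinks accordingly.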

Back to Koblitz's drive-by shooting on computer science:

Cryptography has been heavily influenced by the disciplinary culture of computer science, which is quite different from that of mathematics. Some of the explanation for the divergence between the two fields might be a matter of time scale. Mathematicians, who are part of a rich tradition going back thousands of years, perceive the passing of time as an elephant does. In the grand scheme of things it is of little consequence whether their big paper appears this year or next. Computer science and cryptography, on the other hand, are influenced by the corporate world of high technology, with its frenetic rush to be the first to bring some new gadget to market. Cryptographers, thus, see time passing as a hummingbird does. Top researchers expect that practically every conference should include one or more quickie papers by them or their students.

Ouch! OK, that's a fair point. However, the underlying force here is the market, and computer science itself is not to blame; rather, the product of its work happens to sit a lot closer to the market than, say, mathematics does. Koblitz goes on to say:

There’s also a difficulty that comes from the disciplinary culture of cryptography that I commented on before. People usually write papers under deadline pressure — more the way a journalist writes than the way a mathematician does. And they rarely read other authors’ papers carefully. As a result even the best researchers sometimes publish papers with serious errors that go undetected for years.

That certainly resonates. When you spot an error embedded in an academic paper, it stays for years. Consider a random paper on my desktop today, one by Anderson and Moore, "On Information Security -- and beyond." (Announced here.) It is exceedingly well-written, and quite a useful summary of current economic thought in the academic security fields. Its pedigree is the best, and it seems unassailable.

Let us then assail. It includes an error: the "lemons" myth, which has been around for years now. Indeed, someone has pointed out the flaw, but we don't know who:

In some cases, security is even worse than a lemons market: even the vendor does not know how secure its software is. So buyers have no reason to pay more for protection, and vendors are disinclined to invest in it.

In detail, "lemons" only applies when the seller knows more than the buyer, this being one of the two asymmetries referred to in asymmetrical information theory. Other things apply in the other squares of the information matrix, and as with all such matrices, we have to be careful not to play hopscotch. The onus would be on Anderson and Moore, and other fans of citric security, to show whether the vendors even know enough to qualify for lemon sales!
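
For the record, here is that information matrix as a crude sketch; the quadrant labels are my own shorthand for illustration, not Akerlof's:

    # The four squares of the information matrix. "Lemons" (adverse selection)
    # is only one of them; the quoted passage puts software security in the
    # square where neither side knows, which is exactly where lemons stops applying.
    quadrants = {
        ("seller knows quality", "buyer knows quality"): "ordinary market",
        ("seller knows quality", "buyer ignorant"):      "lemons / adverse selection",
        ("seller ignorant",      "buyer knows quality"): "reverse asymmetry (think insurance)",
        ("seller ignorant",      "buyer ignorant"):      "shared uncertainty, not lemons",
    }
    for (seller, buyer), outcome in quadrants.items():
        print(f"{seller:22} | {buyer:20} | {outcome}")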

All of which supports Koblitz's claims, at least partially. The question then is: why is the academic world of cryptography so divorced from its subject? Again, Anderson and Moore hint at the answer:

Economic thinkers used to be keenly aware of the interaction between economics and security; wealthy nations could afford large armies and navies. But nowadays a web search on ‘economics’ and ‘security’ turns up relatively few articles.

Ironically, their very care and attention are reflected in the list of cited references at the end of the paper: one hundred and eight references!!! Koblitz would ask if they had read all those papers. Assuming yes, they must have covered the field, right?

I scanned through quickly and found three references that were not from reputable conferences, university academics and the like. (These 3 lonely references were from popular newspapers.)

What does a references section that only references the academic world mean? Are Anderson and Moore ignoring everything outside their own academic world?

I hasten to add, this is not a particular criticism of those authors. Many, if not all, academic authors, conferences, and peer-review committee chairs would plead guilty to the same crime; proudly, even. By way of example, I have on my desk a huge volume called Phishing and Countermeasures, edited by Jakobsson and Myers. The list of contributors reads like a who's who of universities, with the occasional slippage to Microsoft, RSA Laboratories, etc.

Indeed, Koblitz himself might be bemused by this attack. Why is the science-wide devotion to academic rigour a criticism?

Because it's security, that's why. Security against real threats is the point where scientific integrity, method and rigour unravel.

Consider a current threat, phishing. The academic world turned up to phishing relatively late, later than practically everyone else. Since arriving, they've waltzed around as if they'll solve it soon, just give them time to get up to speed, and have proceeded to mark out the territory. The academics have presented work that is sometimes interesting but rarely valuable. They've pretty much ignored everything that was done beforehand, and they've consequently missed the big picture.

Why is this? One reason is above: academic work is only serious if it quotes other academic work. The papers above are reputable because they quote, only and fulsomely, other reputable work. And the work is only rewarded to the extent that it is quoted ... again, by academic work. The academics are caught in a trap: work outside academia and be rejected, or perhaps worse, ignored. Or work with academic references, and work from an irrelevant rewards base.

And be ignored, at least by those who are monetarily connected to the field. By way of thought experiment, consider how many peer-review committees at security conferences include the real experts in the field. If you scan the lists, you don't find names like "Ivan Trotsky, millionaire phisher" or perhaps "John Smith, a.k.a. Bob Jones and 32 other aliases, wanted for tera-spamming in 6 states." Would we find "Mao Tze Ling, architect of last year's Whitehouse network shakedown?"

OK, so we can't talk to the actual crooks, but it's just a matter of duplicating their knowledge, right? In theory, it's just a matter of time before the academics turn the big guns onto the threat, and duplicate the work done outside academia. They can catch up, right?

Unfortunately not. Consider another of Koblitz's complaints, above, where cryptography is

"influenced by the corporate world of high technology, with its frenetic rush to be the first to bring some new gadget to market."

Actually, there are two forces at work here: the market and the attacker. In short, a real attack in this field migrates on a timescale of a month. Even with an accelerated paper cycle of three months to get the work peer-reviewed and into the next party of student quickies, to use Koblitz's amusing imagery, the attacks migrate faster.

The academic world of security is simply too far away from their subject. Do they know it? No, it seems not. By way of example, those in the academic world of security and economics claim that the field started only recently, in the last few years. Nonsense! It has always been there, and received massive boosts with the work of David Chaum, the cypherpunks, Digicash, the Amsterdam hackers and many others.

What was not done was to convert this work into academic papers. The literature is there, but not in conference proceedings.

What was done was to turn the academic thought process into hugely beneficial contributions to security and society. All of the work mentioned above has led directly, traceably, to the inventions we now know as PayPal, eBay, WebMoney and the gold community, and it is slowly moving through to the mass of society in the form of NFC, mobile and so forth. (The finance and telco sectors move slowly to accommodate these innovations, but do they move more slowly than the academic sector?)

The problem is that since the rise of the net, the literature for fast-paced work has moved out of the academic sphere into other means of distribution: shorter essays, private circulations over email, business plans, open source experiments, focussed mailing lists and ... of course, blogs. If you are limiting yourself to conference proceedings on security, you are dead in the water, to take a phrase from naval security of a bygone age.

So much so that when you add up all these factors, the conclusion suggested is that the academic world will possibly never be able to deal with security as a science. Not if they stick to their rules, that is. Is it possible we now live in a world where today's academia cannot practically and economically contribute to an entire sector of science? That's the claim; let's see how it pans out.

Posted by iang at September 9, 2007 02:17 PM | TrackBack
Comments

Some criticisms also apply to Koblitz's paper, but I couldn't fit them in the context of the drive-by shooting above. Here's one:

Maybe mathematics is so conveniently peaceful as to support an academic tradition, but why is it that on the first page, he waxed longingly and generously on the invention of public key cryptography, but without mentioning the authors of the paper? He might say that Diffie and Hellman did not include any mathematics in their paper, but I say "tosh!"

Posted by: Iang at September 9, 2007 11:16 AM

Hey,
I have noticed that your blog doesn't read right on my Feed Reader. There are no spaces, no block divisions etc., it's just one long paragraph. Maybe you could fix that?

Regards,
duryodhan

Posted by: duryodhan at September 9, 2007 11:48 PM

Thought the following would be of (some?) interest.

"Someone writes in with the following question:

I've been studying Information Technology risk for some time now and so your work is of great interest. In IT risk we have several problems that a Bayesian approach would seem to help us address. Namely:" Further in the link.

http://www.stat.columbia.edu/~cook/movabletype/archives/2007/09/bayes_and_risk.html

Posted by: Darren points to risk at September 11, 2007 10:33 AM

I enjoyed Koblitz's personal perspective on fast-paced cryptography "development" as part of computer science a lot. But that doesn't necessarily mean that he's right in every way.

> > As perhaps everyone knows, it's mostly about money, and Koblitz
> > identifies several threads:
> > * bandwagon effect

True, it's difficult to restrain yourself from profiling and selling your work to the industry when strapped for research funds.

> > * NSA-supplied money

Bof, US-specific situation. The crypto world is bigger than work done in the US alone.

> > * "the power that an aura of mathematical certainty can have over
> > competitive solutions" a.k.a. provable security

Sigh, mathematicians appear to be better salespeople than one would expect. Well, if you want to fool yourself, go ahead.

> > * the unfortunate effect of computer science on cryptography
> >
> > We've certainly seen the first three, and Koblitz disposes of them
> > well. Definitely recommended reading.

- snip -

> > the number of things that can go wrong far exceeds the number of
> > cryptographic friends on the planet, we would generally suggest that
> > computer scientists should write protocols, so as to avoid the
> > Alice-Bob effect.

Well, that is interesting, as I agree completely with you. But as I wrote above, Koblitz's outlook is limited. There is interesting research going on that takes the problems with the Dolev-Yao abstraction by the horns!

Look for recent work by Backes, Pfitzmann and Waidner. This team recognizes the neglect of the lower levels of protocol representation and instantiation.

For example, the Dolev-Yao abstraction can't deal with the peculiarities of XOR as an encryption function as it crosses from the abstraction to the actual bit representation of protocol message instances.
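
(A toy illustration of that point, my own and not taken from the BPW papers: symbolically, enc(k, m) is an opaque blob, but with XOR as the "encryption function" anyone can flip chosen plaintext bits without ever touching the key.)

    # XOR "encryption": c = m XOR k. In a symbolic model this is just enc(k, m),
    # but at the bit level the ciphertext is malleable without the key.
    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    key        = b"\x5a" * 16                 # secret, never seen by the attacker
    message    = b"PAY BOB $0000100"
    ciphertext = xor(message, key)

    # Knowing the message format and the original amount (but not the key),
    # the attacker edits the amount straight through the ciphertext.
    delta    = xor(b"PAY BOB $0000100", b"PAY BOB $9999100")
    tampered = xor(ciphertext, delta)
    assert xor(tampered, key) == b"PAY BOB $9999100"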

Therefore Koblitz's generalisation is not completely warranted. Besides, the intrepid trio BPW themselves say that they are just beginning and don't claim to have solved the problem (as good researchers should, to protect their jobs and funding ;-)

> >
> > Just to square that circle with yesterday's post, it is OK to talk
> > about Alice-Bob protocols, in order to convey a cryptographic idea.
> > But it should be computer scientists who put it into practice, and as
> > early as possible. Some like to point out that cryptography is
> > complex, and therefore you should employ cryptographers for this part.
> > I disagree, and quote Adi Shamir's 3rd misconception. Eventually your
> > crypto will be handled by developers, and you had better give them the
> > simplest possible constructions they can deal with.

ok


I wish I had the time to respond to the remainder.

Oh boy I wish I had the time to write so much text ;-)

Maybe my comments will be continued, if I'm granted time.

Posted by: Twan at September 11, 2007 04:41 PM

The habit of academic papers to cite only other academic papers is a huge problem, similar to the tendency of patents to only cite other patents.

The excuse in both cases is that writers need some way to prune their research. With Google, is that any longer a viable excuse?

Of course, blogs often don't cite the academic papers or anything else they are based on, so the problem is mutual.

Posted by: nick at September 20, 2007 09:58 PM