September 10, 2005

Open Source Insurance, Dumb Things, Shuttle Reliability

(Perilocity reports that) Lloyd's and OSRM are to issue open source insurance, covering among other risks that of being attacked by commercial vendors over IP claims.

(Adam -> Bruce -> ) an article, the "Six Dumbest Ideas in Computer Security," by Marcus Ranum. I'm not sure who Marcus is but his name keeps cropping up - and his list is pretty good:

  1. Default Permit (open by default)
  2. Enumerating Badness (cover all those we know about; see the sketch after the list)
  3. Penetrate and Patch
  4. Hacking is Cool
  5. Educating Users
  6. Action is Better Than Inaction
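
Items 1 and 2 are really two sides of the same design choice: a default-permit posture with a blocklist of known badness, versus a default-deny posture with an allowlist of known goodness. As a rough illustration (mine, not Ranum's; the ports and function names are invented for the example), a minimal Python sketch of the two postures:

    # Hypothetical packet-filter decisions contrasting the two postures.
    BAD_PORTS = {135, 445, 1433}      # enumerating badness: always incomplete
    ALLOWED_PORTS = {22, 80, 443}     # enumerating goodness: small and knowable

    def default_permit(port):
        # Dumb ideas #1 and #2: everything passes unless we already know it's bad.
        return port not in BAD_PORTS

    def default_deny(port):
        # The alternative: nothing passes unless we explicitly decided it should.
        return port in ALLOWED_PORTS

    # A brand-new attack on port 31337 sails through the first filter
    # but is stopped by the second, with no rule change needed.
    assert default_permit(31337) is True
    assert default_deny(31337) is False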

I even agree with them, although I have my qualms about these two "minor dumbs:"

  • "Let's go production with it now and we can secure it later" - no, you won't. A better question to ask yourself is "If we don't have time to do it correctly now, will we have time to do it over once it's broken?" Sometimes, building a system that is in constant need of repair means you will spend years investing in turd polish because you were unwilling to spend days getting the job done right in the first place.

The reason this doesn't work is basic economics. You can't generate revenues until the business model is proven and working, and you can't secure things properly until you have (a) the revenues to do so and (b) the proven business model worth protecting! The security field is littered with businesses that secured themselves properly from the start, but very few of them were successful, and often that securing-first was sufficient reason for their failure.

Which is not to dispute the basic observation that most production systems defer security until later, and later never comes ... but there is an economic incentive at work here that more or less explains why: only valuable things get secured, and a system that is not yet in production is not yet valuable.

  • "We can't stop the occasional problem" - yes, you can. Would you travel on commercial airliners if you thought that the aviation industry took this approach with your life? I didn't think so.

There are several errors here, starting with a badly formed premise that leads to can/can't arguments. Secondly, we aren't in general risking our lives, just our computers (and, of late, our identities). Thirdly, it's a risk-based thing: there is no Axiomatic Right From On High to secure computing, nor to a safe drive to work, nor to a safe flight.

Indeed, no less than Richard Feynman is quoted (in support of #3) on how to deal, and misdeal, with the occasional problem.

Richard Fenyman's [sic] "Personal Observations on the Reliability of the Space Shuttle" used to be required reading for the software engineers that I hired. It contains some profound thoughts on expectation of reliability and how it is achieved in complex systems. In a nutshell its meaning to programmers is: "Unless your system was supposed to be hackable then it shouldn't be hackable."

Feynman found that the engineering approach to Shuttle problems was (often, or at least sometimes) to rewire the procedures: instead of fixing the faults, the engineers would move the problems into the safety zone conveniently created by the design tolerances. Add to that the normal management pressures, including the temptation to call the reliability 1 in 100,000 when 1 in 100 is more likely! (And even that seems too low to me.)
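
To see what is at stake between those two figures, here is a back-of-the-envelope sketch (my own illustration; only the two per-flight estimates come from the paragraph above) of the chance of losing an orbiter at least once over a 100-flight programme:

    # Probability of at least one loss in n independent flights: 1 - (1 - p)^n
    def prob_of_at_least_one_loss(p_per_flight, flights):
        return 1 - (1 - p_per_flight) ** flights

    print(prob_of_at_least_one_loss(1.0 / 100000, 100))  # ~0.001 under the 1-in-100,000 claim
    print(prob_of_at_least_one_loss(1.0 / 100, 100))     # ~0.63 under the 1-in-100 estimate

Three orders of magnitude in the per-flight figure is the difference between a freak event and a loss that is more likely than not over the life of the programme.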

Predictably, Feynman suggests not doing that, and finishes with this quote:

"For a successful technology, reality must take precedence over public relations, for nature cannot be fooled."

A true engineer :-)

See earlier writings on failure in complex systems, and compare Feynman's comments on software and hardware reliability with this article and earlier comments.

Posted by iang at September 10, 2005 08:32 AM
Comments

When I read about Marcus, I thought of his use of plaintext passwords :) http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1113473,00.html

Posted by: Jens Kubieziel at September 10, 2005 06:47 PM

I finally got around to reading the "Six Dumbest Ideas...".
I would like to emphasize the "hacking is cool" part. Offensive thinking and defensive thinking are quite different and there's no guarantee that a good attacker is any good at defense. After reading some of K. Mitnick's -- otherwise very entertaining and even somewhat educational -- books, I would say that he's a very poor security expert and I wouldn't trust him to secure my assets.
Most of Kevin's recommendations wouldn't have worked against him. Simple systems with no human in the decision loop would have probably frustrated most of his efforts at penetration.

Posted by: Daniel A. Nagy at September 11, 2005 09:47 PM

>When I read about Marcus, I thought of his use of
>plaintext passwords

As usual, there's more behind a story than meets the eye. Up until shortly before the incident occurred, I had been hosting my email at a friend's service and he'd been too overloaded to set up IMAP+SSL for the mail server. Usually at conferences I only use the wireless to surf to skanky porn sites ;) but I accidentally kicked off Eudora to get some info out of my in-box and, of course, it immediately went to check my mail. Certainly, a fumble.

What's ironic is that my friend had to lose my hosting business over the incident, and it was one of his other friends who was doing the sniffing.

But you're focusing on the wrong issue and you're trying to blame the victim. Of course I know how to secure my Email but I had chosen not to. In fact, as weird as it sounds, it is my right to leave my Email insecure - but it's a federal crime (violation of USC18, ECPA and also the Federal Wiretap Statute) to access it. Because I periodically have used my computer (in my capacity as a consultant) for federal government projects, the violation is actually more severe than just ECPA and FWS. Should I have secured my Email? Maybe. Should a reputable security professional and self-professed privacy advocate have been sniffing passwords? Never.

As you can imagine, lots of people have thrown this incident in my face as if it somehow means that - well - what does it mean? That I don't know anything about security? No. That I don't practice what I preach? No. I don't lock my door at home, either. You're welcome to come out and walk right in. The things I choose to do and the things my customers should do are sometimes different. Indeed, sometimes the things my customers choose to do and the things they should do don't line up.

mjr.

Posted by: Marcus Ranum at September 12, 2005 04:24 PM

mjr wrote:
> As usual, there's more behind a story than meets the eye.

Sure, I think this is just one of those things. Just because one deals in security doesn't mean that one has to spend all one's life eating *only* one's own dog food.

OTOH, I think there are limits to asking people to be "responsible" when at conferences such as those. It's not that it is pointless or unprofessional or whatever, it's that there needs to be an escape valve somewhere, somehow. Conferences seem to be it; and if it wasn't there, I'd say it would have to be somewhere else.

Otherwise, I'd simply conclude that it was all talk, all conspiracy theory (either way) and all sales. All of that and no real way to determine if any of it is of any value whatsoever.

Posted by: Iang at September 14, 2005 04:13 PM

>I'm not sure who Marcus is but his name keeps cropping up

I can't believe you don't know who the greatest pioneer in firewalls/proxies and NIDS is?

Marcus is a bit of a hero of mine. He shoots straight. Calls B.S. when it's B.S.
He also makes the best modern Sun Tzu quotes.

Posted by: Pe5kyTac0 at September 16, 2005 01:56 PM