Comments: How the Classical Scholars dropped security from the canon of Computer Science

around the time we were doing some work on the original payment gateway (for what has since come to be called e-commerce)

I frequently commented that taking a standard, well-tested application and turning it into an industrial strength dataprocessing "service" typically required 4-10 times the original effort. as mentioned in the above references ... two of the people that we had worked with when we were doing our high-availability (ha/cmp) product ... were now responsible for the commerce server. old posting about an old ha/cmp meeting

misc. past posts mentioning ha/cmp

we have also made the statement that the payment gateway could be considered the original SOA ... part of that comes from having come up with 3-tier architecture in the 80s

when we started ha/cmp we did a detailed failure mode analysis on the tcp/ip protocol and code. some of the items we turned up showed up in attacks nearly 10 years later .... i.e. from the standpoint of doing industrial strength dataprocessing ... a failure mode analysis is a superset of some kinds of security threat analysis .... or, put another way, it upgrades threat analysis to cover any type of problem ... not solely stuff related to human attackers.
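a toy sketch of the point (python, hypothetical names and bounds -- not the actual ha/cmp analysis): a length-prefixed message parser that treats every malformed input as a failure mode. the same checks that handle truncation from line noise or partial reads also happen to block the length-field attacks that surfaced years later.

```python
import struct

MAX_PAYLOAD = 64 * 1024  # hypothetical upper bound from an assumed protocol spec

def parse_message(buf: bytes) -> bytes:
    """Parse a length-prefixed message, treating every malformed input
    as a failure mode -- whether caused by line noise or an attacker."""
    if len(buf) < 4:
        raise ValueError("truncated header")      # failure mode: short read
    (length,) = struct.unpack(">I", buf[:4])
    if length > MAX_PAYLOAD:
        raise ValueError("length exceeds bound")  # also blocks overrun-style attacks
    if len(buf) < 4 + length:
        raise ValueError("truncated payload")     # failure mode: partial packet
    return buf[4:4 + length]
```

the failure mode analysis asks "what if the length field is wrong for any reason"; a narrower threat analysis might only ask "what if an attacker forges it" -- the former subsumes the latter.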

of course part of the history was having done a lot of work in the 60s & 70s on operating systems that were used for commercial time-sharing services

where the implicit assumption was that the full community of users could have hostile intent with regard to each other ... compared to today, where very few people take seriously the default assumption that the standard operating environment is extremely hostile.

there used to be the old joke that computer science institutional memory is no more than five years. this somewhat goes along with the current idea that free software is something new ... as opposed to being the default in the 50s, 60s, and 70s. some amount of the change was gov. litigation and the resulting 23jun69 unbundling announcement that started the trend of charging for software ... some number of past posts about the transition from free software to charge-for/licensed software

some past posts mentioning identifying failure modes ... some of which didn't actually materialize (in attacks) until nearly ten years later: Infiniband - practicalities for small clusters command line switches [Re: [REALLY OT!] Overuse of symbolic constants] [Lit.] Buffer overruns Caller ID "spoofing" The Pankian Metaphor

some number of past posts mentioning it taking 4-10 times the effort to turn a well-tested application into an industrial strength service Test and Set (TS) vs Compare and Swap (CS) Buffer overflow Buffer overflow Wanted: the SOUNDS of classic computing IBM says AMD dead in 5yrs ... -- Microsoft Monopoly vs. IBM A Dark Day The BASIC Variations Mars Rover Not Responding Automating secure transactions Vintage computers are better than modern crap ! "Perfect" or "Provable" security both crypto and non-crypto? Systems software versus applications software definitions Systems software versus applications software definitions Systems software versus applications software definitions [Lit.] Buffer overruns Development as Configuration Data communications over telegraph circuits The System/360 Model 20 Wasn't As Bad As All That

Posted by Lynn Wheeler at October 5, 2006 02:17 PM

there is something of an analogy with driving ... long ago you could have an environment that didn't have bumpers, safety glass, collapsible steering column, padded dashboard, safety belts, door locks, anti-theft devices, alarms, air bags, guard rails, impact barriers, crush zones, stop signs, and a ton of other countermeasures for all the possible bad things that might happen.

part of the current programming state of the art ... is that nearly all of the countermeasures appear to have to be re-invented from scratch for every new effort (if there is any attempt to address the issues at all).

note that in some of the scenarios ... a particular piece of software is more like a single component ... like the engine ... so a lot of the necessary countermeasures aren't part of the engine operation ... they would be part of other components.

a basic issue is designing and implementing an overall infrastructure that has all the necessary countermeasures designed in as part of the fundamental operation .... aftermarket piecemeal is significantly less effective than having it all designed in from the start.

re: How the Classical Scholars dropped security from the canon of Computer Science

another scenario ... is the x9.59 paradigm change.

if you take the security acronym PAIN

P -- privacy (or, in the alternative CAIN acronym, confidentiality)
A -- authentication
I -- integrity
N -- non-repudiation

much of the current retail payment environment is dependent on privacy (information hiding and encryption). the x9a10 financial standard working group was given the requirement to preserve the integrity of the financial infrastructure for all retail payments (as part of the x9.59 standard work)

x9.59 changed the paradigm and requires integrity and authentication ... but no longer requires data hiding encryption (privacy) as part of the transaction in order to preserve the integrity of the financial infrastructure for all retail payments.
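a toy illustration of the paradigm (python, hypothetical names -- and with an HMAC standing in for the digital-signature mechanism; this is not the actual x9.59 protocol): the transaction travels completely in the clear, yet any tampering is detected, so integrity and authentication are preserved with no data-hiding encryption at all.

```python
import hmac
import hashlib
import json

def authenticate_transaction(txn: dict, key: bytes) -> dict:
    """Attach an authentication code to a transaction that travels in
    the clear -- integrity and authentication without encryption."""
    body = json.dumps(txn, sort_keys=True).encode()
    mac = hmac.new(key, body, hashlib.sha256).hexdigest()
    return {"txn": txn, "mac": mac}

def verify_transaction(msg: dict, key: bytes) -> bool:
    """Recompute the code over the cleartext transaction; any change
    to amount, payee, etc. makes verification fail."""
    body = json.dumps(msg["txn"], sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["mac"])
```

the point being that once the transaction can't be altered or replayed by anybody who merely sees it, hiding it stops being a precondition for preserving the integrity of the infrastructure.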

this is somewhat the past comments about moving away from naked payment/transaction paradigm: New ISO standard aims to ensure the security of financial transactions on the Internet Naked Payments IV - let's all go naked Microsoft - will they bungle the security game? Naked Payments IV - let's all go naked Naked Payments IV - let's all go naked Naked Payments IV - let's all go naked Naked Payments IV - let's all go naked Naked Payments IV - let's all go naked Naked Payments IV - let's all go naked DDA cards may address the UK Chip&Pin woes DDA cards may address the UK Chip&Pin woes DDA cards may address the UK Chip&Pin woes Interesting bit of a quote Naked Payments IV - let's all go naked Naked Payments II - uncovering alternates, merchants v. issuers, Brits bungle the risk, and just what are MBAs good for? More Brittle Security -- Agriculture Identity v. anonymity -- that is not the question RSA SecurID SID800 Token vulnerable by design WESII - Programme - Economics of Securing the Information Infrastructure OpenSSL Hacks OT - J B Hunt the personal data theft pandemic continues the personal data theft pandemic continues the personal data theft pandemic continues

Posted by Lynn Wheeler at October 5, 2006 03:14 PM

i guess i have to admit to being on a roll.

one of the big issues is inadequate design and/or assumptions ... in part, failing to assume that the operating environment is an extremely hostile environment with an enormous number of bad things that can happen.

i would even assert that the enormous amounts of money spent attempting to patch an inadequate implementation can be orders of magnitude larger than the cost of doing it right in the first place.

some of this is security proportional to risk ... where it is also fundamental that what may be at risk is correctly identified.

this is my old standby security proportional to risk posting

these are postings in a pki/ca security proportional to risk thread from today

and as i've observed before, part of the "yes card" exploits possibly stems from the assumption that the attacks would be on the chipcard ... where, in fact, the "yes card" attacks are on the terminal and infrastructure (and the chipcard attack countermeasures pretty much failed to address terminal/infrastructure attacks at all)
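a toy sketch of the trust placement involved (python, hypothetical names -- a caricature, not the actual EMV protocol): the terminal delegates the PIN decision to the card itself, so a cloned card that replays valid-looking static data and answers "yes" to any PIN sails through, no matter how hard the genuine chip is to attack.

```python
class GenuineCard:
    """Real chipcard: only the correct PIN verifies."""
    def __init__(self, pin: str):
        self._pin = pin

    def verify_pin(self, pin: str) -> bool:
        return pin == self._pin

class YesCard:
    """Cloned card: replays valid-looking static data and
    answers 'yes' to any PIN -- the 'yes card' attack."""
    def verify_pin(self, pin: str) -> bool:
        return True

def offline_terminal_accepts(card, entered_pin: str) -> bool:
    # the terminal asks the card whether the PIN is correct --
    # exactly the trust placement the "yes card" exploits
    return card.verify_pin(entered_pin)
```

hardening the chip does nothing here; the countermeasure has to live in the terminal/infrastructure side (e.g. an online check with the issuer) rather than in the component that was assumed to be the target.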

Posted by Lynn Wheeler at October 5, 2006 07:43 PM

But the real solution is to make the right folks responsible for the task. We had a big problem with stolen credit cards until the credit card providers were financially responsible for (almost) all the losses incurred. Suddenly, security measures were taken which reduced the problem to whatever the losers were willing to bear. That works. But when they can make the problem an "externality" (as happens with identity theft these days) the problem will not be contained. The people who can contain it lack the motivation. End of story.

Bruce Schneier writes clearly about these topics, and you can read it for free.

Posted by Richard Karpinski at October 6, 2006 11:21 AM

Funny... yes, incentives are part of the story, but not the whole part. It's the Internet, remember, and the attackers don't like your funny incentives. Also, your defenders need to get their training somewhere. And if you don't do that, then the company goes out and hires some company like Counterpane or Symantec or ..., and we're back with muddled incentives again.

Here's something you should read: someone reaching out and communicating with the blackhats. Coz he needs their help, and they might even do it ... not because he got his ethics out of a cornflakes packet.

Posted by Reading is free, learning costs... at January 3, 2007 03:06 PM


Teaching Viruses

Over two years ago, George Ledin wrote an essay in "Communications of the ACM," where he advocated teaching worms and viruses to computer science majors: "Computer science students should learn to recognize, analyze, disable, and remove malware. To do so, they must study currently circulating viruses and worms, and program their own. Programming is to computer science what field training is to police work and clinical experience is to surgery. Reading a book is not enough. Why does industry hire convicted hackers as security consultants? Because we have failed to educate our majors."

This spring semester, he taught the course at Sonoma State University. It got a lot of press coverage. No one wrote a virus for a class project. No new malware got into the wild. No new breed of supervillain graduated.

Teaching this stuff is just plain smart.


Posted by from Schneier's crypto-gram at June 25, 2007 12:51 AM