TLS Stapling looks like it will reduce the load OCSP places on CAs, since OCSP responses have to be signed like certificates, which makes them expensive to turn around in real time.
Even if the CA does not pregenerate and cache OCSP responses, TLS Stapling will allow the web server to cache the OCSP response - so it does not have to be signed afresh for every single relying party.
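A minimal sketch of that caching idea, assuming a hypothetical fetch_ocsp_response() callback (not a real library API) that returns the CA-signed response bytes plus their expiry time; the server hands the same signed bytes to every client until they expire:

    import time

    class StapleCache:
        def __init__(self, fetch_ocsp_response):
            # fetch_ocsp_response() -> (der_bytes, next_update_epoch_seconds)
            self._fetch = fetch_ocsp_response
            self._staple = None
            self._expires = 0.0

        def current_staple(self):
            # Only go back to the CA's responder when the cached response
            # has expired; until then the CA signs nothing new.
            if time.time() >= self._expires:
                self._staple, self._expires = self._fetch()
            return self._staple

So the CA signs once per validity period per certificate, rather than once per relying party.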
Posted by Morgan Collett at April 18, 2006 07:01 AM

when we were doing x9.59
http://www.garlic.com/~lynn/subpubkey.html#x959
http://www.garlic.com/~lynn/x959.html#x959
and aads
http://www.garlic.com/~lynn/x959.html#aads
starting in the mid-90s we called it *parameterized risk management* ... sort of an extension of transaction fraud scoring ... looking at individual transactions ... the amount of the transaction, where the transaction originated, what kind of merchant the transaction was coming from, possibly what kinds of purchases ... as well as sequences/aggregations of transactions, patterns, etc. ... the types of things you can do in an online paradigm
(and can't do with an offline, stale, static, redundant and superfluous certificate paradigm).
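a toy sketch of that kind of per-transaction scoring (my illustration; the weights and feature names are invented, not from any deployed system) ... note that the recent-history input is exactly the part that requires an online system:

    def transaction_risk(amount, origin_risk, merchant_risk, recent_amounts):
        # amount: this transaction's value in dollars
        # origin_risk, merchant_risk: scores for where the transaction
        #   originated and what kind of merchant it came from
        # recent_amounts: this account's recent transaction amounts ...
        #   the sequence/aggregation part only online state can supply
        score = min(amount / 1000.0, 10.0)                # larger amounts score higher
        score += origin_risk
        score += merchant_risk
        score += min(sum(recent_amounts) / 5000.0, 10.0)  # rapid aggregate spending
        return score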
this was then coupled with the overall integrity level of the token and the number of authentication factors involved
(potentially requesting additional authentication factors for higher risk operations). furthermore, with a token whose integrity level is integrated dynamically (as opposed to a fixed, recorded integrity level), the token integrity level could be updated in real time as circumstances changed and evolved (a token good last month for a million dollar transaction might only be good for a five hundred dollar transaction this month ... if some vulnerability in that particular kind of token had been discovered).
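a sketch of how the dynamic token integrity level and the number of authentication factors might feed the decision (the thresholds are invented for this example) ... the point being that token_integrity is looked up online at transaction time, so a newly discovered vulnerability lowers it immediately for every holder:

    def authorize(risk_score, token_integrity, auth_factors):
        # token_integrity: current (not fixed/recorded) integrity score,
        #   updated in real time as vulnerabilities are discovered
        # auth_factors: how many authentication factors were presented
        allowed = token_integrity * auth_factors
        if risk_score <= allowed:
            return "approve"
        if risk_score <= allowed * 1.5:
            return "request additional authentication factor"
        return "decline"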
If it was a token that had to be used with terminals (POS terminal, ATM terminal) ... it could also take into account the integrity characteristics of the terminal (as well as changes to the terminal's integrity as technology advances over time) and various other physical location characteristics of the terminal that might increase or decrease the occurrence of risk/fraud.
a trivial example was the state-of-the-art in biometrics at the time. biometric matching is fuzzy, and the infrastructure chose a fixed biometric scoring threshold ... above the threshold a match would be accepted, below the threshold it would be rejected (this is somewhat the source of all the stuff in biometrics about false negatives and false positives). parameterized risk management instead took the biometric scoring value and included it with the rest of the calculations. if a particular biometric scoring value, combined with all the other factors, fell below the dynamic threshold for the operation ... the transaction might be rejected or another biometric sample requested. the biometric state-of-the-art of the period would have required totally different deployed infrastructures for low-value and high-value payments. parameterized risk management used the same online, realtime infrastructure, dynamically adjusting the requirements to meet the overall risk of a particular operation.
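contrasting the two approaches in a sketch (the numbers are invented for illustration): a fixed threshold gives one global accept/reject cut-off, while the parameterized version scales the required match quality with the risk of the particular operation:

    FIXED_THRESHOLD = 0.80

    def fixed_decision(biometric_score):
        # period state-of-the-art: one global cut-off, the source of the
        # false-accept/false-reject tradeoff
        return biometric_score >= FIXED_THRESHOLD

    def parameterized_decision(biometric_score, risk_score):
        # required match quality rises with the risk of this operation;
        # a weak match can pass a low-value purchase but trigger a
        # request for another sample on a high-value one
        required = 0.50 + min(risk_score / 20.0, 0.45)
        if biometric_score >= required:
            return "accept"
        return "request another biometric sample (or reject)"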
"dynamic authentication" is essentially just a small subset of parameterized risk management.
the commonly used and well-known example of fraud scoring
from the period was a credit card being used for a $1 fuel purchase in LA (checking, at no risk to the thief, whether a lost/stolen card had been deactivated) followed within 20 minutes by the purchase of certain other kinds of goods.
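that pattern reads directly as a sequence rule ... a sketch (field names invented for illustration):

    def looks_like_card_test(history, now, window_secs=20 * 60):
        # history: list of (timestamp, merchant_category, amount), oldest first
        # flag the current transaction if a tiny fuel purchase (a thief
        # testing whether the card is still live) occurred within the window
        for ts, category, amount in history:
            if category == "fuel" and amount <= 1.00 and now - ts <= window_secs:
                return True
        return False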
of course, i've always claimed that parameterized risk management is just another application of Boyd for agile and adaptable operation.
misc. past posts mentioning Boyd
http://www.garlic.com/~lynn/subboyd.html#boyd
misc. references to Boyd from around the web
http://www.garlic.com/~lynn/subboyd.html#boyd2
the other piece comes out of the dynamic adaptive resource management (aka a kind of dynamic scoring of everything going on) that I did as an undergraduate in the 60s. It was picked up and included in products shipped at the time. In the mid-70s, some of it was dropped in the mainframe operating system transition from 360s to 370s. However, I was allowed to re-introduce it within a couple of years as the Resource Manager product ... the 30th anniversary of the blue letter announcing the Resource Manager comes up on May 11th.
Posted by Lynn Wheeler at April 18, 2006 11:44 AM