Comments: Finally, the media gets it: The cyber-jihad that the NSA brought to hometown America

Having been on the inside for over a decade, I blame the entire thing on accountability, or the lack of it. FISMA, for all its bloat and misguided garbage, did force NIST to put out some solid process documents which, if actually followed, would do wonders to decrease the threat profile and increase the defensive posture of the Federal networks. Sure, the law mandated that agencies follow and apply them (hell, see the latest FY13 FISMA/CyberScope guidance from OMB, which reinforces this), but the RMF is intentionally vague and misguided, simply allowing the authorities (who don't like security anyway) to ignore it under the guise of risk acceptance. It's hard to secure a government when the majority of the links are all equally weak as a result of "risk accept not doing anything because security interferes with operations", i.e. "I can't install that latest malware widget" or "I'm required to actually read and do something (i.e. work)".

I'm not going to argue about absolute security by compliance, etc., but the simple fact is that compliance will get you most of the way there, so you can focus your limited resources on the sexy zero-day / APT / manual hacks / obscure stuff. Until compliance with the basics like patch and configuration management happens, who cares about the rest.

Posted by Peter at October 29, 2012 08:31 PM

IanG,

It's nice to see the link off to "Bruce" ;-)

But yes, I would agree the NSA has a lot to answer for over the lamentable state of security on the Internet.

It was not just the "hand on the shoulder" routine over research; it was other little tricks such as "lies of omission".

If you think back to NIST's AES competition, the NSA were NIST's technical advisors, not just in reviewing the candidates but also in setting up the competition requirements.

One of which was that candidate evaluation code would be put up on the NIST site for free download; another was that one of the candidate requirements for AES would be high performance on standard processors etc.

Now the NSA must have been more than aware of the following:

1. Efficiency -v- Security: as a general rule, when one goes up the other goes down, in much the same way as Usability -v- Security.

2. Timing side channels arising from the use of cache memory (see the sketch below).

3. Nearly all implementations would use the speed-optimised, freely downloadable code.

4. Nearly all users would use AES in "online mode" enciphering, not "offline mode".

5. Nearly all "code libraries" would either use the freely downloaded code or their own "speed-optimised" code.

Thus you could say they either "rigged the contest" or failed to warn of its consequences, which we are still living with.
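To make point 2 concrete: the speed-optimised code does its S-box / T-table lookups at secret-dependent addresses, and the cache remembers which addresses were touched. Here is a minimal sketch in Python (a toy, not real AES; SBOX and first_round_lookups are illustrative names of my own, not from any library):

```python
# Toy illustration of why table-driven cipher code leaks through the cache:
# the table index depends on secret key bytes, so which cache lines the
# lookups touch -- and hence the timing an attacker can observe -- varies
# with the key.

SBOX = bytes(range(256))   # stand-in for a 256-entry S-box or 1 KB T-table

def first_round_lookups(plaintext: bytes, key: bytes) -> list:
    """Table entries touched in the 'first round' of a toy table-driven cipher.

    In T-table AES the first-round index is plaintext_byte XOR key_byte, so
    the memory access pattern is a function of the secret key.
    """
    return [SBOX[p ^ k] for p, k in zip(plaintext, key)]
```

An attacker sharing the cache (prime+probe and friends) learns which table lines each lookup hit, and with known plaintext that narrows down the key bytes; that is exactly the Efficiency -v- Security trade of point 1.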

Oh, and we have good reason to believe the NSA are fully aware of this: if you go and look at the specs for the equipment they use AES in, such as the Inline Media Encryptor (IME), they are careful to say it's only certified for "data at rest", not whilst it's encrypting or decrypting.

Some analysis has shown that, of all the AES finalists, the worst offender as far as side channels go is the one finally selected...

Oh, and there are a few other interesting little tricks they have pulled in the past as well.

If I were the NSA, with their required "schizophrenic" behaviour, these days I would look at attacking not the individual algorithms or how they are implemented, but the standards for the protocols they are to be used in. Both SSL and TLS have had protocol weaknesses, and I expect there to be others.

The thing about protocol failures is that they are in many systems, and thus are vulnerabilities that will still be around in thirty years thanks to the likes of embedded systems in "smart meters" and "Implantable Medical Devices". Systems that talk to those old, broken protocols will likewise have to be around for a similar period of time. And this gives rise to a nice little man-in-the-middle attack on "fallback" in "initial protocol negotiation".

That is, when you connect to another system the first thing that happens is that the protocol negotiates to find the strongest compatible algorithms etc.; the protocols and algorithms selected are the strongest in the mutual subset. Now, if you sit in the middle while this negotiation goes on, you can ensure that the subset only contains the weakest or broken protocols and algorithms. And you will get away with this most times, because commercial software providers will put in everything they can think of, including the "kitchen sink of plain text", just to ensure they have maximum compatibility. But further, to stop the users worrying their pretty little heads and making lots of tech support calls, the software providers will make the negotiation process "invisible" to the user, so the user has no idea that they are using the weakest, not the strongest, of the protocols and algorithms...
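To see how cheaply that works, here is a toy model of the "strongest mutual subset" negotiation and of what the man in the middle does to it. Everything in it (the suite names, the preference order, the negotiate() function) is illustrative, not any real protocol's wire format:

```python
# Strength order from strongest to weakest; names are made up for the sketch.
PREFERENCE = ["AES256-GCM", "AES128-GCM", "3DES", "RC4", "NULL"]

def negotiate(client_offer, server_supported):
    """Return the strongest suite both sides claim to support."""
    for suite in PREFERENCE:
        if suite in client_offer and suite in server_supported:
            return suite
    raise ValueError("no common suite")

client = ["AES256-GCM", "AES128-GCM", "RC4", "NULL"]   # "kitchen sink" offer
server = ["AES256-GCM", "3DES", "RC4", "NULL"]

print(negotiate(client, server))          # -> AES256-GCM

# The man in the middle simply strips the strong entries from the offer in
# transit; both ends still "agree" on something, and the user sees nothing.
tampered_offer = [s for s in client if s in ("RC4", "NULL")]
print(negotiate(tampered_offer, server))  # -> RC4
```

Neither endpoint has to be compromised; the attacker only edits the offer in flight, and because the negotiation is invisible the user never learns they ended up on the weakest rung.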

Oh, and the chances are the software provider won't make it easy for you to remove or lock out the weak protocols and algorithms you don't want to use...

Why do we know this? Well, look at the debacle with web browsers and CA certs, and just how difficult it is to do those useful things like locking out the CAs and certs you don't want to use...
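For what it's worth, the knobs usually do exist a layer below the user interface, they are just not surfaced. A minimal sketch assuming Python's standard ssl module (the minimum_version and set_ciphers controls shown here come from Python releases well after this comment, so treat it purely as an illustration that lockout is technically possible, not as what any browser does):

```python
import ssl

# Build a client context and refuse to fall back to older protocol versions
# or to the weak/anonymous suites a "kitchen sink" offer would include.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE+AESGCM:!aNULL:!eNULL")  # OpenSSL cipher-string syntax
```

The point is not the particular API but that the fallback set is a policy decision the software could expose to the user and mostly doesn't.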

Posted by Clive at November 25, 2012 06:45 PM