I have often wondered why it couldn't be transparently added to web servers, such that any service offered on the HTTP port would also be available on HTTPS, with browsers configured to try HTTPS first. Ideally, the only configuration required would be the ability to flag pages as accessible only via HTTPS.
On the negative side, IT department policies might inhibit a move to greater security. It seems the US parent company of the firm where I am now has just rolled out a 'Websense' configuration which blocks HTTPS connections, presumably because they can't monitor the content...
They have also just blocked 'Skype' and all other non-Microsoft Internet telephony applications. A quick web search does seem to confirm news reports of a 'Microsoft/Websense' team-up...
Posted by DigbyT at March 4, 2011 05:41 PM

Reminds me of dad saying to mom, with three or four young children present, "Debbie found the C-o-o-k-i-e-s today, so I hid them in the H-a-l-l C-l-o-s-e-t."
The internet giants and commercial software industry have always cooperated with the banking industry and government to make computers just barely secure enough to keep complete amateurs from accessing each other's computers, but never secure enough for anything worth more than about five cents, on average. I really think it has been the banks preventing the emergence of security all along.
There is a problem with many of these "standardizing security" ideas.
On the face of it, using a more "secure" system sounds good, but... two little words, "backwards compatibility", combined with one simple attack vector, "man in the middle", can make it worse than useless.
Backwards compatibility is needed for "legacy support": the latest version of a product still has to work with earlier versions; that is a market given. The question is how far back you should go. The reality is usually right back to day zero, to prevent loss of customers. And early versions will use weaker or even broken crypto...
So a modern product will have broken or non-existent crypto/security built in. The question then becomes: how does an attacker take advantage of this?
Well, there is this little-talked-about "auto-negotiation phase" where, to make the software "user friendly", the client and server software negotiate compatible crypto etc. The way auto-negotiation is arranged, it ends up with the "first common" algorithm, starting from the strongest and working downwards.
Ever asked yourself what happens if there is no "common" algorithm?
Well, guess what: there is usually "plain text" at the bottom of the list, for testing etc., so that can be the "common" algorithm...
To use it, all a man-in-the-middle attacker has to do is upset the auto-negotiation phase so that the lowest common denominator is picked by the client and server, and then "step aside" by becoming effectively transparent for the rest of the user's session. The damage is done, the lowest common denominator has been forced, and thanks to "user friendly operation" the chances are the user is blissfully ignorant of it...
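The downgrade described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any real protocol: the cipher names, the `negotiate` function, and the `mitm_strip` attacker are all invented for the example.

```python
# Both sides list ciphers strongest-first; "PLAIN" sits at the
# bottom of the list "for testing".
CLIENT_CIPHERS = ["AES-256", "AES-128", "3DES", "PLAIN"]
SERVER_CIPHERS = ["AES-256", "AES-128", "3DES", "PLAIN"]

def negotiate(client_offer, server_supported):
    """Naive auto-negotiation: return the first cipher in the
    client's strongest-first offer that the server also supports."""
    for cipher in client_offer:
        if cipher in server_supported:
            return cipher
    raise ValueError("no common cipher")

# Normal case: the strongest common cipher wins.
print(negotiate(CLIENT_CIPHERS, SERVER_CIPHERS))   # AES-256

def mitm_strip(offer):
    """The man in the middle rewrites the offer in transit, removing
    every strong cipher so "PLAIN" is the only common choice. Because
    neither endpoint authenticates the offer list, neither notices."""
    return [c for c in offer if c == "PLAIN"]

print(negotiate(mitm_strip(CLIENT_CIPHERS), SERVER_CIPHERS))  # PLAIN
```

The key weakness is that the negotiation messages themselves are unauthenticated, so the attacker never has to break any cipher, only to edit a list.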
Thus when people say "use security Tech X" they should also say at what minimum level, and ensure that "Tech X" has a way to mandate this requirement...
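What mandating a minimum level might look like, as a sketch continuing the hypothetical negotiation example above (the ranking, function name, and default floor are all invented for illustration):

```python
# Hypothetical strongest-first ranking; index 0 is the strongest.
RANKING = ["AES-256", "AES-128", "3DES", "PLAIN"]

def negotiate_with_floor(client_offer, server_supported, minimum="AES-128"):
    """Negotiate as before, but never consider any cipher ranked
    below the mandated minimum; fail loudly instead of falling back."""
    floor = RANKING.index(minimum)
    for cipher in RANKING[: floor + 1]:   # ciphers at or above the floor only
        if cipher in client_offer and cipher in server_supported:
            return cipher
    raise ValueError("no common cipher at or above the mandated minimum")
```

With a floor in place, a stripped offer containing only "PLAIN" produces a visible failure rather than a silent downgrade, which is exactly the property the policy is meant to guarantee.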
This is one of the reasons I keep banging on about NIST and frameworks. If properly implemented, frameworks allow these requirements not just to be set as policy; they also allow older, insecure algorithms, methods and protocols to be completely removed. Importantly, frameworks ensure that products cannot be built whose algorithms, methods and protocols cannot be updated, since otherwise they would be non-compliant and people would not buy them.
Posted by Clive Robinson at March 19, 2011 08:23 PM