From a couple of sources posted by Lynn:
- A single run only hits 0.0005 percent of users.
- 1% of customers will follow the phishing links.
- 0.5% of customers fall for phishing schemes and compromise their online banking information.
- The monetary losses could range between $2.4 million and $9.4 million annually per one million online banking clients.
- "in average ... approximately 832 a year ... reached users' inboxes."
- Costs estimated at up to $9.4 million per year per million users.
- Based on data collected from "3 million e-banking users who are customers of 10 sizeable U.S. and European banks."
The primary source was a survey run by an anti-phishing software vendor, so caveats apply. Still interesting!
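As a quick sanity check, the headline figures can be cross-multiplied. A minimal sketch, where every input is a claim taken from the bullets above, not independently verified data:

```python
# Back-of-envelope check of the survey's claims; all inputs come from
# the figures quoted above, none are independently verified.
users = 1_000_000            # online banking clients
compromise_rate = 0.005      # 0.5% fall for phishing and are compromised

compromised = int(users * compromise_rate)

# Claimed annual loss range per one million clients.
loss_low, loss_high = 2_400_000, 9_400_000

# Implied average loss per compromised account.
per_account_low = loss_low / compromised
per_account_high = loss_high / compromised

print(compromised)          # 5000 compromised accounts per million users
print(per_account_low)      # 480.0  -> ~$480 per account at the low end
print(per_account_high)     # 1880.0 -> ~$1,880 at the high end
```

So the quoted range implies an average of roughly $480 to $1,880 lost per compromised account per year.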
For more meat on the bigger picture, see this article: Ending the PCI Blame Game, which reads like a compressed version of this blog! Perhaps the thing that has been staring the financial operators in the face has finally started to hit home, and they are really ready to sound the alarm.
Posted by iang at December 5, 2009 06:35 PM
With respect to "Ending the PCI Blame Game":
I think the author is slightly "over-egging the pudding" in his conclusion that this will be "the death of the internet"; however, I do agree that it has national security implications.
That said, I regard his five solutions as short-term measures, akin to rubbing ointment on a stress-related skin disorder (or, more seriously, sticking a "band aid" on a broken bone).
That is, they are aimed at alleviating the apparent symptoms, not curing the actual cause of the symptoms.
Now take a leap of faith with me for a moment when I say the issue is:
"Not charging for outbound data packets on the Internet."
Yes, I can hear the WTF thought from here 8)
I tend to regard the software development process as extremely immature and lacking in security (assurance).
I have said before that the design, test, and build process of the intangible-product software industry should be the same as that of the tangible-product manufacturing industry, such as that for Fast Moving Consumer Electronics (FMCE).
It is very clear it is not.
In being "marketing feature" and "time to market" driven it has similarities to FMCE, but the resemblance stops there: FMCE design is usually driven more by quality and reliability concerns.
The question is why?
The answer is that the asymmetric cost of returns on tangible goods critically affects profitability.
There are three costs to a warranty repair:
1, Cost of good return.
2, Cost of good rework.
3, Cost of developing the rework.
Of these three, the first does not even feature in software "patching".
The second is mainly borne by the customer, not the software manufacturer, whose cost is running the servers from which the patch can be downloaded; we are talking maybe a few cents per customer, not dollars of manpower costs.
The third cost is borne by the manufacturer.
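To make the asymmetry concrete, here is a toy comparison of those three costs. Every figure is an assumption picked purely for illustration, not data from any source:

```python
# Toy model of the three warranty costs above.  All figures are
# illustrative assumptions, not real data.
units = 100_000            # defective units in the field
dev_cost = 200_000.0       # 3: cost of developing the rework (fixed)

# Tangible good: the vendor pays all three costs.
return_shipping = 15.0     # 1: cost of good return, per unit
rework_labour = 40.0       # 2: cost of good rework, per unit
tangible_per_unit = return_shipping + rework_labour + dev_cost / units

# Software patch: cost 1 vanishes, cost 2 shifts to the customer's
# download; the vendor's marginal cost is a little server bandwidth.
bandwidth = 0.02           # vendor's per-download cost, assumed
software_per_unit = 0.0 + bandwidth + dev_cost / units

print(tangible_per_unit)   # 57.0 dollars per defective unit
print(software_per_unit)   # ~2.02 dollars per defective unit
```

Under these made-up numbers the tangible-goods vendor eats nearly thirty times the per-unit defect cost the software vendor does, which is exactly the economic gap the argument turns on.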
However, the relatively low cost of patching has a knock-on effect: a quick fix will do. That is, the software industry has trained its customers to take the bulk of the defect cost on the chin every "Patch Tuesday" etc.
Thus there is no economic cost incentive to break the endless patch cycle.
However, the profit-wiping-out cost of warranty returns on tangible goods forced Quality Assurance into FMCE and other tangible-goods manufacture.
If software manufacturers had to pay for the outbound traffic of their patches, there would be a bottom-line incentive to clean up their development process.
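For a sense of scale, a sketch of what that bill might look like. The patch size, installed base, and per-gigabyte price are all hypothetical figures, chosen only to illustrate the shape of the incentive:

```python
# Hypothetical cost of a vendor paying for its own patch traffic.
# Every figure here is an assumption for illustration.
patch_size_mb = 50            # one cumulative patch
installed_base = 100_000_000  # users who download it
price_per_gb = 0.05           # assumed outbound transfer price, $/GB

total_gb = patch_size_mb * installed_base / 1024
cost = total_gb * price_per_gb

print(f"{total_gb:,.0f} GB shipped, ~${cost:,.0f} per patch cycle")
```

Roughly a quarter of a million dollars per patch cycle at those assumed rates; small against a big vendor's revenue, but unlike today it scales directly with how often you have to ship fixes.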
And, importantly, it would kick out the single leg of the article author's argument (trust) on which he bases his "death of the internet" prediction.
After all, it's not as though we don't know how to do it: "embedded systems" design by (real, not software) engineers has been done quite happily for years.
So have safety system design engineers and those who design high assurance and availability systems.
OK, most of these systems are not pretty, have limited functionality, and have longer design and test cycles, but they don't go wrong.
Sadly, we are now seeing "patching on appliances" such as high-end mobile phones. Again it is the customer, not the manufacturer, who is paying for what is a warranty repair.
Replace the current nigh-on-zero cost with a significant "asymmetric" cost (approximately the equivalent of the boxed consumer product price), and patching would cease very rapidly. As a result, testing would go up, as would reliability; thus availability and high-assurance design would fairly quickly become the norm.
Oh, and "code cutters" would actually be forced to stop being "temperamental diva-ish artists" or "bodge it on till it stops breaking" artisans and become real engineers (if they can, or burger flippers if not).
The downside is that product development cycles would slow (no bad thing), superfluous features would tend to disappear as well (again no bad thing), and software would become more robust, approaching the universal functionality of cars and phones, which your average user would appreciate.
Yes, there would be downsides: the AV and malware industry would take a significant hit, as would the training and certification industry, and people would have to start paying realistic costs for things.
One aspect of media solutions is that web designers would put in fewer bells and whistles, again improving security. And live media would actually start using multicasting properly (which I regard as a win).
It would also, importantly, make long-term economic sense, as the increased stability would bring in more traditional and well-understood business models. (A side note: if you look at organisations that are actually making money on the Internet, you will find they are running traditional, well-understood business models.)
So hopefully the WTF thought has been replaced with a more thoughtful view.
As always, I'm open to counter-arguments; at the end of the day, if we don't start talking about the issue, it will forevermore remain the elephant in the room, and that is in nobody's long-term interest.