September 25, 2008

another quiet week in finance

This curious article bears out some of the predictions made previously:

In fact, most Wall Street computer models radically underestimated the risk of the complex mortgage securities, they said. That is partly because the level of financial distress is “the equivalent of the 100-year flood,” in the words of Leslie Rahl, the president of Capital Market Risk Advisors, a consulting firm.

But she and others say there is more to it: The people who ran the financial firms chose to program their risk-management systems with overly optimistic assumptions and to feed them oversimplified data. This kept them from sounding the alarm early enough.

Top bankers couldn’t simply ignore the computer models, because after the last round of big financial losses, regulators now require them to monitor their risk positions. Indeed, if the models say a firm’s risk has increased, the firm must either reduce its bets or set aside more capital as a cushion in case things go wrong.

In other words, the computer is supposed to monitor the temperature of the party and drain the punch bowl as things get hot. And just as drunken revelers may want to put the thermostat in the freezer, Wall Street executives had lots of incentives to make sure their risk systems didn’t see much risk.

“There was a willful designing of the systems to measure the risks in a certain way that would not necessarily pick up all the right risks,” said Gregg Berman, the co-head of the risk-management group at RiskMetrics, a software company spun out of JPMorgan. “They wanted to keep their capital base as stable as possible so that the limits they imposed on their trading desks and portfolio managers would be stable.”

One way they did this, Mr. Berman said, was to make sure the computer models looked at several years of trading history instead of just the last few months. The most important models calculate a measure known as Value at Risk — the amount of money you might lose in the worst plausible situation. They try to figure out what that worst case is by looking at how volatile markets have been in the past.
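The lookback-window trick is easy to see with a toy historical-simulation VaR. The sketch below is not any firm's actual model; it uses entirely synthetic returns (a long calm past followed by a volatile recent stretch, all invented numbers) to show how a multi-year window dilutes recent volatility and reports a smaller VaR than a short window would:

```python
# Minimal sketch of historical-simulation Value at Risk (VaR).
# All figures are synthetic; real desks feed in actual P&L history.
import random

random.seed(42)

# Synthetic daily returns: several calm years, then a volatile recent stretch.
calm_past       = [random.gauss(0.0, 0.005) for _ in range(750)]  # ~3 years, low vol
volatile_recent = [random.gauss(0.0, 0.03) for _ in range(125)]   # ~6 months, high vol
history = calm_past + volatile_recent

def historical_var(returns, confidence=0.99):
    """VaR as the loss at the (1 - confidence) quantile of past returns."""
    ordered = sorted(returns)                 # worst losses first
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]                    # report the loss as a positive number

# A long lookback drowns the recent turmoil in years of calm;
# a short lookback sees only the turmoil.
var_long  = historical_var(history)           # full multi-year window
var_short = historical_var(history[-125:])    # last ~6 months only

print(f"99% one-day VaR, long lookback:  {var_long:.4f}")
print(f"99% one-day VaR, short lookback: {var_short:.4f}")
```

Under these assumptions the long window reports the smaller number, which is exactly why a desk wanting stable limits and a stable capital base would prefer it.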

So, what's going on here? It's a simple cycle.

  1. Something goes wrong.
  2. Someone creates a fix.
  3. Something goes wrong.
  4. We discover that those who fixed it were OK, and those who didn't failed.

At this point, the bureaucrats and worry-warts leap into action and demand that the fix be regulated. But then what happens is this:

  1. Something goes wrong, and some fail as above.
  2. The fix is mandated.
  3. The fix is implemented.
  4. Someone bypasses the fix, creatively, because it reduces profits.
  5. Something goes much wronger because the system is now more complex.
  6. Those who bypassed the fix demand a bailout.

Why is this? Managers took their eye off the ball of risk in 4 above. But, they followed the rules! Perversely, then, they can credibly go back and insist they did all that was asked of them. Therefore, the bailout is necessary, because the responsibility for risk is now passed from the risk takers to the rule makers.

In time this pervades the market, so we end up with this:

  1. For everything that goes wrong, a new fix is mandated by the rule makers.
  2. For every mandated fix that is implemented, those that reduce profits are bypassed, creatively.
  3. The system is now much more complex.
  4. The complexity exceeds the ability of the rule makers, because while they understand the rules, they do not understand the bypasses.
  5. The complexity exceeds the ability of the risk takers, as they understand the bypasses, but have lost sight of the reasons for the fixes.
  6. We enter the territory known as "fragility to Black Swans".
  7. Black Swan arrives.

None of this is any surprise to engineers. Complexity makes things collapse in bigger, more complex ways. The other solution is somewhat simpler:

  1. Something goes wrong.
  2. Those who covered the issue survive. Those who didn't, die.
  3. The strong survive.

Think of it as a plane with more than one engine... But this is really only possible when regulators and the public alike realise that these are complex systems, and are not amenable to the notions of total reliability. This is the territory where redundancy is king, and failure is encouraged.
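The multi-engine analogy can be made concrete with a line of arithmetic. Assuming each engine fails independently with some probability (a hypothetical figure below, and a big assumption, since common-mode failures are exactly what complexity introduces), the chance that every redundant component fails at once shrinks geometrically:

```python
# Sketch: why redundancy beats chasing total reliability.
# Assumes failures are independent, which complex coupled systems violate.
def prob_total_failure(p, engines):
    """Probability that all engines fail on the same flight."""
    return p ** engines

p = 0.01  # hypothetical per-engine failure probability
for n in (1, 2, 4):
    print(f"{n} engine(s): P(all fail) = {prob_total_failure(p, n):.2e}")
```

The design point is that redundancy only pays while failures stay independent; tightly coupled systems re-correlate them, which is the fragility this cycle produces.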

Bankruptcy is healthy to the ecosystem. If you try to avoid it, watch out for a bigger failure later on. Another salutary lesson comes from the auditors. They were supposed to protect the public investor from the managers in the firm; when Arthur Andersen was caught out for allegedly protecting the managers from the investors ... it collapsed. When KPMG found itself on the wrong side of someone's stakeholder list, it almost folded, but wise regulators said we can't lose another one.

The message here is very clear: It's a cycle thing again:

  1. Become very friendly with those who can save you.
  2. Make lots of money.
  3. Call in the favours when it goes wrong.

To close, another perspective on all this comes from Nassim Nicholas Taleb's The Black Swan. Here's an introductory article on Black Swans, pointed out by Twan, around 20 pages, describing the statistical anomaly that causes complex systems to fail spectacularly. This is no source of mystery to engineers, but for finance people it is worth reading.

Posted by iang at September 25, 2008 12:42 PM | TrackBack

> The message here is very clear: It's a cycle thing again:
> 1. Become very friendly with those who can save you.
> 2. Make lots of money.
> 3. Call in the favours when it goes wrong.

This sounds like the current Fortis / ABN Amro affair... They bought ABN for 24 billion, now sell it for 10, and get 4 billion in support from the Dutch government (and 7 from Belgium and Luxembourg).

Just a reminder: every year for the last few years, the NET profit of ABN was >4 billion... Somehow there is something fishy going on...

Posted by: BigMac at September 29, 2008 06:58 AM

Hi Ian

Interesting times.

Re the limits of statistics, the ET site has this: ;sid=2008/9/24/1456/73379



Posted by: ET: on the limits of statistics at September 29, 2008 07:06 AM

I used to manage the risk system for XXX when it went bust in the mid 1990s - the figures were pure fantasy - no one understood them - as long as they were roughly the same, no one really bothered. If they changed drastically, it was usually a feed failure - some piece of data went awry - so once that was fixed, as long as the figures returned to around the same level, all was well. In reality, traders were exchanging data and models on floppy disks - no one knew the risk.

Then one day - the three heads of risk were fired - and the bank collapsed. YYY bought them out and it became known as ZZZ - then as people's memories faded - it morphed to AAA ZZZ - then AAA, now one of the national champions. Of course - it could never happen again...

Posted by: Highly Anonimised at September 29, 2008 09:18 AM

somewhat long-winded post repeated a few places ... archived flavors:


so on one side now there is:

Risk Management Failings Spur Big Financial IT Investments

and on the other side there is:

How Wall Street Lied to Its Computers

(postings to the above blog have added some number of examples)

I've sporadically drawn the parallel between CDOs and obfuscating "observation" in OODA-loops: CDOs subverting Boyd's OODA-loop

... CDOs were used two decades ago in the S&L crisis to obfuscate underlying values and unload the properties.

This is a long-winded, decade-old post about many of the current problems, including the need for visibility into CDO-like instruments:

There is a possibility that one of the people responsible for the analysis that resulted in Citibank getting out of the home mortgage business (in the troubles two decades ago, mentioned in the above post) ... was involved with increasingly sophisticated analytics:

How Conventional CDO Analytics Missed the Mark

from (20Dec2007) above:

"Two years ago the Wall Street Journal in a page 1 story pointed out the dangers in relying on the copula approach for CDO valuation, but investors were slow to realize the magnitude of their model risk"

... snip ...

the above isn't inconsistent with claims that computerized risk was manipulated.

however, somewhat the other side (although again not necessarily inconsistent)

Subprime = Triple-A ratings? or 'How to Lie with Statistics'

disclaimer, in the past we've visited these guys' offices (nice location); misc. past posts mentioning Kamakura: Newsweek article--baby boomers and computers; As Expected, Ford Falls From 2nd Place in U.S. Sales; Computer Science Education: Where Are the Software Engineers of Tomorrow?; Toyota Sales for 2007 May Surpass GM; independent appraisers; dollar coins

Then there is (from last March) the following, which mentions that Wall Street took $137B out in bonuses during the period (a reward for creating the current situation? the proposed $700B would in part be used to replenish the amount taken out of the system by those bonuses):

The Fed's Too Easy on Wall Street

The GAO has started a database of the increasing numbers of financial restatements (despite SOX). Basically, financial statements are inflated ... and bonuses for top executives are taken based on the inflated statements. Later the numbers are restated, but the bonuses are not forfeited. An example: Freddie was fined $400m in 2004 for $10B of inflation in its statements, and the CEO was replaced ... however, the CEO kept tens of (hundreds of?) millions.

Part of inflating the bottom line, as a method of boosting bonuses, included enormous leveraging ... despite its creating enormous long-term risk. It could be considered consistent with the theme that such extremely risky behavior is rewarded and there is no corresponding downside risk (and therefore it promotes moral hazard).

There was a business school article from last spring that estimated approx. 1000 executives are responsible for 80 percent of the current crisis, and that it would go a long way to correcting the situation if the government figured out a way for them to lose their jobs.

Posted by: Lynn Wheeler at September 29, 2008 09:58 AM

Long-tailed events are not Black Swans. It's a nice attempt to introduce the unlearned to the concept, but there are plenty of useful priors for long-tailed events (or else they wouldn't be represented in the distribution).

Posted by: Nigel Mellish at September 29, 2008 10:52 AM