
"The housing bubble was BLATANT. No black swans involved."

Yes, but what was not blatant was that the housing bubble popping would kill Lehman Brothers, cause runs on banks and almost bring down the whole world economy.

Taleb's point is that you can't go around behaving as if any system is, to borrow a phrase, "too big to fail" - or rather, too cleverly thought out/regulated to fail. To some extent this is simple engineering. I can test the resilience of my cluster to network failure by pulling out an ethernet cable and seeing what happens - maybe there's a small hiccup while systems fall back onto local resources, maybe nothing fails over properly and I am hosed for 12 hours. As I understand Taleb, his argument is that the economic powers (bankers, traders, politicians) spend too much time trying to figure out how to prevent network failure, rather than making sure that when the network does go out, it doesn't take your whole mission down with it. Because, no matter how well you think you have engineered the system, the chance of your network going out somehow, sometime, is non-zero.
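The pull-the-cable experiment above can be sketched in a few lines. This is a hypothetical toy, not anyone's real infrastructure: `fetch_remote` stands in for the network, the `fail` flag is the yanked ethernet cable, and the fallback to a local cache is the "don't take the whole mission down" part.

```python
def fetch_remote(fail=False):
    # Simulated remote call; fail=True is our injected network fault
    # (the software equivalent of pulling the ethernet cable).
    if fail:
        raise ConnectionError("network is down")
    return "fresh data"

def fetch_with_fallback(local_cache, fail=False):
    # Design for failure: when the remote call dies, degrade to
    # possibly-stale local data instead of crashing the mission.
    try:
        return fetch_remote(fail=fail)
    except ConnectionError:
        return local_cache

# Fault injection: verify the system degrades rather than dies.
print(fetch_with_fallback("stale data", fail=False))  # fresh data
print(fetch_with_fallback("stale data", fail=True))   # stale data
```

The point isn't the fallback logic itself, which is trivial; it's that you only learn whether it works by actually forcing the failure, rather than reasoning about why the network "can't" go down.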

I do agree with the other posters that Taleb overstates the cleverness of his own insight; but unfortunately, in his field, it seems people really were too dumb/greedy to grasp it. Irrespective of what you think of his interpretation of events or the validity of his suggestions (and I too question them), I think as systems engineers we can appreciate the risk of designing political and economic systems under the assumption that they can't fail.



The problem isn't that people were too dumb/greedy to notice. It's that they weren't punished for not noticing. You'd be amazed what a little incentive does to people's willingness to self-educate.




