Much of the research into humans' risk-avoidance machinery shows that it is antiquated and unfit for the modern world; it is made to counter repeatable attacks and learn from specifics. If someone narrowly escapes being eaten by a tiger in a certain cave, then he learns to avoid that cave. Yet vicious black swans by definition do not repeat themselves. We cannot learn from them easily.
The first flaw is the error of excessive and naïve specificity. By focusing on the details of the past event, we may be diverting attention from the question of how to prevent future tragedies, which are still abstract in our minds. To defend ourselves against black swans, general knowledge is a crucial first step.
The second flaw is that infinite vigilance is not possible. Negligence in any specific case needs to be compared with the normal rate of negligence for all possible events at the time of the tragedy, including those events that did not take place but could have. Before 9/11, the risk of terrorism was not as obvious as it seems today to a reasonable person in government (which is part of the reason 9/11 occurred). Therefore the government might have used its resources to protect against other risks, with invisible but perhaps effective results.
The third flaw is related. Our system of rewards is not adapted to black swans. We can set up rewards for activity that reduces the risk of certain measurable events, like cancer rates. But it is more difficult to reward the prevention (or even reduction) of a chain of bad events (war, for instance). Job-performance assessments in these matters are not just tricky; they may be biased in favor of measurable events. Sometimes, as any good manager knows, avoiding a certain outcome is an achievement.
The greatest flaw in the commission's mandate, regrettably, mirrors one of the greatest flaws in modern society: it does not understand risk. The focus of the investigation should not be on how to avoid any specific black swan, for we don't know where the next one is coming from. The focus should be on what general lessons can be learned from them. And the most important lesson may be that we should reward people, not ridicule them, for thinking the impossible. After a black swan like 9/11, we must look ahead, not in the rear-view mirror.
Nissim has valuable insights, but he, like all the money managers I am studying, from Soros to Victor, talks lots of bullshit. People give them their money to manage because they have confidence in their competence, knowledge, and academic titles, and the more they talk in priestly Latin (or Aramaic) the more confidence they generate. By the way, I have been thinking about probability and risk for fifty years, and I don't understand it.