It's not about connecting dots, it's about management and leadership

It's become apparent that the media has jumped all over this 'spy agencies didn't connect the dots' meme about the Xmas Detroit flight bomber. They act shocked, shocked that this has happened. Wasn't this the whole problem with 9/11 that was fixed with the new security procedures, etc.?

No, actually the unconnected dots story is old and seems immune to modern attempts to correct it. And it is not just a national security story. And it is not about connecting the dots. The truth is that this is a failure of management and leadership, not analysis and data collection. See if you analysts out there can figure out the pattern here:

1986: The space shuttle Challenger explodes supposedly because no one connected the dots on the O-ring and freezing temperatures. NASA's investigation actually points the blame finger at how decisions are made and problems with management sacrificing safety for keeping to the launch schedule.

1999: The dot-com bubble causes the stock market to skyrocket and then tank despite data showing irrational exuberance. People lament the Fed's failure to connect the dots, but the real problem was that Greenspan didn't act by raising interest rates or calling for more regulation of lending.

2001: 9/11 happens because no one connected the dots on al Qaeda plots. The country solemnly concludes that the problem wasn't in collecting the information, or analyzing it, but in acting on it. Ditto for Enron, MCI, etc.

2003: The shuttle Columbia explodes because no one 'connected the dots' on damage to the heat shield. But, hint hint, the investigation concludes that management failed to act, not that the analysis was lacking.

2003: We go to war in Iraq looking for weapons of mass destruction, even though all of the reliable analysis points to there being none. Attempts to blame the analysis fall on deaf ears when it becomes clear that management dismissed the analysis before it was even started.

2008: The housing market bubble pops, taking the stock market down with it. If only the Fed, Fannie Mae, Freddie Mac and the rest of the government had connected the dots, right? It turns out that all of the data was collected and the analysis was done, but the decision-makers and action-takers chose to ignore it because they wanted to let the good times roll.

Spring 2009: Mexican authorities repeatedly and desperately alert the World Health Organization about an outbreak of a new flu. WHO drops the ball and does not follow its own procedures for dealing with it, and the delay in their response allows a global pandemic to spring forth, killing thousands.

Xmas 2009: A near plane bombing happens not because the analysis was not done, but because at each step the decision-makers did not act when their own procedures told them to: from our embassy in Nigeria, to the NSA intel guys who knew about Nigerian terrorists training in Yemen, to the DHS people who are supposed to flag passengers who pay for one-way tickets with cash and check no luggage. Any one of these pieces of intel was enough to blow the whistle, revoke the visa, do the extra search, etc. and block the guy from boarding the plane. But no one pulled the trigger. Why? They didn't think their piece of information was serious or sufficient enough, even though every known procedure and common sense said it was. That is a decision-making failure, not an analytical one. And it happened about a half dozen times in this one case alone.

Why does this keep happening? The answer is pretty straightforward, if you connect the dots.

A) Collecting data is much easier than analyzing it. We are really good at collecting data.

B) Analysis is not all that hard either. In all of these cases, the analysis was done and the conclusions were straightforward enough that a kid in elementary school would know what to take away. (Heat shield probably busted: don't reenter atmosphere!) But the analysis (and analysts) are easy scapegoats for the real problem, which is that...

C) The authorities dismiss the analysis because they don't want to do what it calls on them to do. Acting on the analysis requires decision-makers who are conditioned, trained and promoted to move incrementally, cautiously and by consensus to instead do something drastic, do it quickly, and piss a lot of people off. It's just more convenient for them to dismiss the analysis and justify the dismissal under the guise of prudence, pragmatism, etc. And as a result, the disaster happens.

Solution: we need decision-makers who will act when needed, who don't pass the buck or shy away from their responsibility. No procedural reform, mild incentive change or organizational reshuffling will make that happen. This is where the nitty-gritty of leadership comes in. Executives need a line of Captain Kirks, David Farraguts and George S. Pattons who will act swiftly, and they need to back those people up when they blow the whistle. Lives are on the line.

For you, dear reader, a bonus example where you get to play the manager, with your own money:

2010: the stock market is likely overvalued. Here's some analysis in the NYT that is the equivalent of announcing that a 20% downturn is due, according to PE ratios. You now have the data and the analysis. What will you do? If the analysis is right (and I am not saying that it is) and you sit on your hands and take that 20% hit in your stocks' value, then you have little standing to complain when the decision-makers above you fail to act on their analysis in exactly the same way.
