Ever since America's intelligence services blew 9/11 so badly, Americans have wanted to know how and why they could have blown it so badly. A conventional wisdom has arisen that, for once, has the virtue of being almost correct.
According to this interpretation, America failed because for the last thirty years, we have emphasized technical over human intelligence: satellites, phone taps, and gee-whiz gadgetry over the dreary, ugly business of recruiting foreign spies, infiltrators, etc. We did this partly because of our technological superiority (nations go with what they've got); partly because of overly fastidious legal and bureaucratic restrictions (less here than meets the eye); and partly because of sheer bureaucratic inertia. In addition, terrorists operate in that gray area between law enforcement as traditionally conceived and foreign military threat, again as traditionally conceived. All those inter-agency working groups and SOPs notwithstanding, timely intelligence sharing and all-source fusion are still difficult affairs at best.
Other experts add, not wrongly, that the intelligence community is suffering a post-Cold War institutional depression that only a generational change (and a major infusion of cash) can cure.
But deeper problems also account for the failure. Not problems, exactly: just the utterly paradoxical nature of the intelligence craft. A few simple examples may illustrate.
The more effective your warning system, the more inevitable a catastrophic failure. Imagine you're a battalion commander, defending some hill. Your intelligence officer comes charging into your tent. "Sir," he puffs, "all indicators show the bad guys will attack tonight." You order full alert. The bad guys see you're ready and call it off. The attack didn't stay away because your intelligence officer was wrong; it stayed away because he was right. Now, repeat this pattern a few times. Soon nobody believes the intelligence officer about anything . . . and then the bad guys charge through the wire, and everybody blames the lousy intelligence.
Something similar seems to have happened in the weeks preceding 9/11, both within the United States and overseas. Something similar will happen again.
An extremely hazardous course of action becomes less hazardous by virtue of its risk. Analysts naturally equate the most risky enemy options with the least likely ones. Wrong again. From Hannibal crossing the Alps to Pearl Harbor, thence to Anwar Sadat's 1973 attack on Israel, the seemingly suicidal, especially the very complicated and seemingly suicidal, segues into the spectacularly successful. And when you're dealing with threats that use deliberate suicide as a weapon . . . enough said.
You have to get your sources to tell you what they don't know. This is tricky. Intelligence analysis is often a matter of piecing together a very complex puzzle when you don't know what the final picture should or might be. And often you're dealing with sources who don't know the picture, either. Basic doctrine says that the credibility of a human source is determined in part by whether or not he's in a position to acquire the proffered information. But that's never enough. For example, say an enemy prisoner tells you that he saw a convoy on the road before he was captured. What kinds of vehicles? "Well, there was this strange truck, very long, with odd fenders and a canister on top." To the prisoner, it may be just a strange truck. To an analyst: mobile missile transporter. In other words, the information you need is where you find it, not where it ought to be.
You have to be able to zeroize your brain. From time to time, it's vital to put aside all preconceptions and experiences and look at the sources and the data anew. Analysts and operatives can be a lot like professors and pundits; they fall in love with their theories. They can also be a lot like adolescents. It's not that they don't know. It's that they don't want to know.
Personality matters. Some analysts are Cassandras, others perpetual disbelievers, others buckaroos, faddists, obsessives. And yet you have to be able, intuitively at least, to take seriously the most outlandish (and unwelcome) ideas from the most suspect sources, then make the right leaps. Imagine an anonymous flight-school instructor sends you an email, telling you that a certain Middle Eastern student wants simulator training on how to fly a 767, but not how to land it. He also pays cash for everything. Prior to 9/11, what might you have concluded? Some oil-rich Arab getting ready for a high-stakes video-game contest?
And now to the final dilemma. The consumers of intelligence, from the president on down, must let the producers be wrong. In the end, the power of the intelligence community resides in its ability to influence the government and the military. Since the stakes are so high, and influence so easily lost, and careers so easily shattered, the natural producer tendency is: better to be useless than wrong. And the natural consumer tendency: "You blew it once, I'll never trust you again."
But never is a long time, and we're all in this together. And how often it's true that the best way to make people trustworthy is to trust them.