Spy IQ

Ever since America’s intelligence services blew 9/11 so badly, Americans have wanted to know how and why they could have blown it so badly. A conventional wisdom has arisen that, for once, has the virtue of being almost correct.

According to this interpretation, America failed because for the last thirty years we have emphasized technical over human intelligence: satellites and phone taps and gee-whiz gadgetry over the dreary, ugly business of recruiting foreign spies, infiltrators, etc. We did this partly because of our technological superiority (nations go with what they’ve got); partly because of overly fastidious legal and bureaucratic restrictions (less here than meets the eye); and partly because of sheer bureaucratic inertia. In addition, terrorists operate in that gray area between law enforcement as traditionally conceived and foreign military threat, again as traditionally conceived. All those inter-agency working groups and SOPs notwithstanding, timely intelligence sharing and “all source fusion” are still difficult affairs at best.

Other experts add, not wrongly, that the intelligence community is suffering a post-Cold War institutional depression that only a generational change (and a major infusion of cash) can cure.

But deeper problems also account for the failure. Not problems, exactly: just the utterly paradoxical nature of the intelligence craft. A few simple examples may illustrate.

The more effective your warning system, the more inevitable a catastrophic failure. Imagine you’re a battalion commander, defending some hill. Your intelligence officer comes charging into your tent. “Sir,” he puffs, “all indicators show the bad guys will attack tonight.” You order a full alert. The bad guys see you’re ready and call it off. The attack didn’t fail to come because your intelligence officer was wrong. It failed to come because he was right. Now, repeat this pattern a few times. Soon nobody believes the intelligence officer about anything . . . and then the bad guys charge through the wire, and everybody blames the lousy intelligence.

Something similar seems to have happened in the weeks preceding 9/11, both within the United States and overseas. Something similar will happen again.

An extremely hazardous course of action becomes less hazardous by virtue of its risk. Analysts naturally equate the most risky enemy options with the least likely. Wrong again. From Hannibal crossing the Alps to Pearl Harbor, thence to Anwar Sadat’s 1973 attack on Israel, the seemingly suicidal, especially the very complicated/seemingly suicidal, segues into the spectacularly successful. And when you’re dealing with threats that use deliberate suicide as a weapon – enough said.

You have to get your sources to tell you what they don’t know. This is tricky. Intelligence analysis is often a matter of piecing together a very complex puzzle when you don’t know what the final picture should or might be. And often you’re dealing with sources who don’t know the picture, either. Basic doctrine says that the credibility of a human source is determined in part by whether or not he’s in a position to acquire the proffered information. But that’s never enough. For example, say an enemy prisoner tells you that he saw a convoy on the road before he was captured. What kinds of vehicles? Well, there was this strange truck, very long with odd fenders and a canister on top. To the prisoner, it may be just a strange truck. To an analyst: mobile missile transporter. In other words, what you need to know is where you find it, not where it ought to be.

You have to be able to zeroize your brain. From time to time, it’s vital to put aside all preconceptions and experiences and look at the sources and the data anew. Analysts and operatives can be a lot like professors and pundits; they fall in love with their theories. They can also be a lot like adolescents. It’s not that they don’t know. It’s that they don’t want to know.

Personality matters. Some analysts are Cassandras, others perpetual disbelievers, others buckaroos, faddists, obsessives. And yet you have to be able, intuitively at least, to take seriously the most outlandish (and unwelcome) ideas from the most suspect sources, then make the right leaps. Imagine an anonymous flight school instructor sends you an email telling you that a certain Middle Eastern student wants simulator training on how to fly a 767, but not how to land it. He also pays cash for everything. Prior to 9/11, what might you have concluded? Some oil-rich Arab getting ready for a high-stakes video game contest?

And now to the final dilemma. The “consumers” of intelligence, from the president on down, must let the producers be wrong. In the end, the power of the intelligence community resides in its ability to influence the government and the military. Since the stakes are so high, and influence so easily lost, and careers so easily shattered, the natural producer tendency is: better to be useless than wrong. And the natural consumer tendency: You blew it once, I’ll never trust you again.

But never is a long time, and we’re all in this together. And how often it’s true that the best way to make people trustworthy is to trust them.

Philip Gold

Dr. Philip Gold is a senior fellow of the Discovery Institute and director of the Institute's Aerospace 2010 Project. A former Marine, he is the author of Evasion: The American Way of Military Service and over 100 articles on defense matters. He teaches at Georgetown University and is a frequent op-ed contributor to several newspapers. Dr. Gold divides his time between Seattle and Washington, D.C.