Defense Media Network

Prevailing Beliefs: Why Intelligence Analysis Sometimes Fails

There’s a rule of thumb in social science methodology that applies to intelligence communities worldwide, says James Wirtz.

“If I know what people are thinking, I can pretty much predict what they’ll do, regardless of the information they have. The way human cognition works, you’re more receptive to information you expect and more likely to dismiss information that doesn’t fit with your preexisting beliefs.”

Accounting for their biases is one of the things America’s intelligence analysts struggle with. It’s a challenge Wirtz, of the Naval Postgraduate School in Monterey, Calif., recently discussed in the journal Intelligence and National Security. Writing about postmortem investigations, which examine the performance of intelligence gathering following intelligence failures and successes, Wirtz references the work of Columbia University professor Robert Jervis, who in the 1970s wrote the book Perception and Misperception in International Politics.

The book brought Jervis to the attention of the CIA, which asked him to do a postmortem of the agency’s intelligence estimates that failed to foresee the fall of the Shah of Iran in 1979. Jervis’ postmortem emphasized that contemporary U.S. analysts expected the Shah to respond forcefully to revolutionary protesters. When the Shah failed to act, American analysts read it as an indication that the protesters weren’t very powerful. However, they failed to appreciate how much pressure the Carter Administration was placing on Iran’s ruler not to respond with force. American analysts assessed the situation within Iran but did not appreciate the effects U.S. policy was having on the situation.

Iranian Revolution

Iranian protestors during the Iranian Revolution, ca. 1978. CIA intelligence analysts failed to foresee the collapse of the Shah of Iran’s regime. Sajed photo

Despite some initial resistance, Wirtz says Jervis’ report was taken to heart, and the intelligence community has since embraced postmortems. In 2003, Jervis was again asked to do an autopsy, this time on the failure of the Iraq National Intelligence Estimate (NIE), and Wirtz, who had studied under Jervis, joined the analysis. Their work examined the psychological biases affecting the NIE, essentially trying to identify what was on people’s minds at the time of analysis and explaining how they interpreted the information they received.

Every intelligence estimate is unique, Wirtz acknowledges, but certain mistakes recur. For example, when policymakers get wind of an estimate that conflicts with their expectations, they often dismiss it as harebrained, too unconventional for the enemy to undertake, he says.

At Pearl Harbor, Wirtz asserts, American analysts “knew the Japanese wanted to seize parts of Southeast Asia, but they thought it made no sense for the Japanese to attack and draw the U.S. into the war. Their concern was how the U.S. would [justifiably] get involved if the Japanese did undertake conquest in the Pacific. They were worried about that.” Cue the Adm. Husband E. Kimmel/Gen. Walter Short supporters.

A similar kind of “rationality bias” applied in the Iraq NIE, Wirtz argues.

“It was apparent to analysts that Saddam was putting a lot of effort into denial and deception at very real political cost to impede UNSCOM inspectors’ search for weapons of mass destruction [WMD]. They believed that no rational [actor] would bear those political costs unless they had something to hide.”

The idea that Hussein’s deception was evidence of WMDs meant that peripheral evidence to the contrary wasn’t seriously considered. The picture made perfect sense to analysts, Wirtz says, who lost track of an alternative analytical framework. “No one stepped back and said, ‘we can’t disconfirm this anymore.’”

But recognizing your own biases is very difficult, he admits. “It’s like being on a moving walkway at an airport. When you’re in that flow of events, one of the hardest things you can do is to step outside your own cognitive framework and look at the world from a different perspective.”

Facing an intelligence failure, Wirtz adds, implies a failure of your policy. Despite a dozen Blue Ribbon panels/congressional committee reports in the decade preceding 9/11 citing a disconnect between foreign/defense intelligence and domestic law enforcement, policymakers weren’t motivated to correct the issue.

“We knew there was a problem, and Al Qaeda drove an airplane through it,” Wirtz observes.

Intelligence analysts must perpetually deal with the problem that policy makers seek definitive statements because they’re the ones who have to act, Wirtz explains. While this may lead to the assumption that policymakers tend to select advisers who confirm their perceptions, that isn’t always the case.

“We looked for politicization in the Iraq NIE and we really found no evidence of it. It’s a strange case. The intelligence community and the Bush Administration agreed on the fundamentals of what was going on in Iraq. They believed Hussein would do everything in his power to get weapons of mass destruction and use them. If left to his own devices, he probably would have restarted his [WMD] programs.”

The intelligence community was never pressured into its WMD conclusion. It was pressured to produce evidence that Saddam Hussein and Al Qaeda were working together, and it fought back hard against that idea, which confirms its independence, Wirtz maintains. Still, it didn’t calibrate correctly.

Some of that miscalibration can be explained by organizational bias, Wirtz explains. Organizations, like individuals, focus on a dominant line of analysis to the exclusion of alternatives. The bias is natural human psychology amplified by organization.

Another factor was that, as in 1979, the intelligence community did not do a net policy assessment.

“In all honesty,” Wirtz allows, “there was not enough information to disconfirm the idea that Hussein would obtain WMD sooner or later. He wanted them. What they failed to do, and what we still don’t do well in the U.S. government, is assess how well U.S. policy is working.”

Operation Iraqi Freedom

U.S. Marines assigned to C Company, 1st Battalion, 5th Marines, 5th Regimental Combat Team (RCT-5), man a defensive fighting position in northern Kuwait, wearing Mission-Oriented Protective Posture response level 4 (MOPP-4) gear during the first phase of Operation Iraqi Freedom, March 20, 2003. The intelligence community misinterpreted Saddam Hussein’s deceptions regarding WMD as confirmation that he possessed them. DoD photo by Sgt. Kevin R. Reed

The U.S. had effectively denied the regime WMD through a decade of counter-proliferation policy stretching from the first Bush Administration through the Clinton Administration and into George W. Bush’s early tenure. International sanctions and military campaigns (Operation Desert Fox, Operations Northern Watch and Southern Watch), underpinned by international support, kept Hussein from restarting WMD programs.

“The irony of the second Gulf War is that it was prompted in part by our lack of recognition of how successful our policy was,” Wirtz reflects.

The intelligence community in the U.S. doesn’t see policy assessment as its function. “The only way we do that in this country is through elections,” Wirtz stresses. But he says the CIA and others are working hard to get around the problem of psychological bias, embracing the work of social scientists like Richards Heuer Jr., whose books on analytic tradecraft are widely discussed. The notions that “fresh eyes” must periodically be brought in, and that improving analysts’ skills will yield better results than another bureaucratic reshuffle, are taking hold.

“The U.S. is a leader in this sort of work,” Wirtz confirms. “Many intelligence communities don’t even try to do this.” Still, bias problems will persist. “It raises its head in different ways every time,” he admits. “And it’s difficult to catch it until it’s too late.”

Eric Tegler is a writer/broadcaster from Severna Park, Md. His work appears in a variety...