From Timothy Noah’s “Why No More 9/11s?: An interactive inquiry about why America hasn’t been attacked again” (Slate: 5 March 2009):
… I spent the Obama transition asking various terrorism experts why the dire predictions of a 9/11 sequel proved untrue and reviewing the literature on this question. The answers boiled down to eight prevailing theories whose implications range from fairly reassuring to deeply worrying.
I. The Terrorists-Are-Dumb Theory
“Acts of terrorism almost never appear to accomplish anything politically significant,” prominent game theorist Thomas C. Schelling observed nearly two decades ago. Max Abrahms, a pre-doctoral fellow at Stanford’s Center for International Security and Cooperation, reaffirmed that conclusion in a 2006 paper for International Security titled, “Why Terrorism Does Not Work.” Abrahms researched 28 groups designated “foreign terrorist organizations” by the U.S. State Department since 2001, identifying among them a total of 42 objectives. The groups achieved those objectives only 7 percent of the time, Abrahms concluded, and the key variable for success was whether they targeted civilians. Groups that attacked civilian targets more often than military ones “systematically failed to achieve their policy objectives.”
In a 2008 follow-up essay, “What Terrorists Really Want,” Abrahms explained that terrorist groups are typically incapable of maintaining a consistent set of strategic goals, much less achieving them. Then why do they become terrorists? To “develop strong affective ties with fellow terrorists.” It’s fraternal bonds they want, not territory, nor influence, nor even, in most cases, to affirm religious beliefs. If a terrorist group’s demands tend to sound improvised, that’s because they are improvised; what really matters to its members—even its leaders—is that they are a band of brothers. Marc Sageman, a forensic psychiatrist and former Central Intelligence Agency case officer in Afghanistan, collected the biographies of 400 terrorists who’d targeted the United States. He found that fully 88 percent became terrorists not because they wanted to change the world but because they had “friendship/family bonds to the jihad.” Among the 400, Sageman found only four who had “any hint of a [psychological] disorder,” a lower incidence than in the general population. Think the Elks, only more lethal. Cut off from al-Qaida’s top leadership, they are plenty dangerous, but not nearly as task-oriented as we imagine them to be.
II. The Near-Enemy Theory
Jihadis speak of the “near enemy” (apostate regimes in and around the Middle East) and the “far enemy” (the United States and the West generally). The man credited with coining these terms, Mohammed Abd al-Salam Faraj, did so largely to emphasize that it was much more important to attack the near enemy, a principle he upheld by organizing the 1981 assassination of Egyptian President Anwar Sadat. (The Egyptian government affirmed the same principle in executing Faraj.) In 1993, a militant Egyptian group called al-Gama’a al-Islamiyya (“the Islamic Group”), which had extensive ties to al-Qaida, broke with the “near enemy” strategy and bombed the World Trade Center. In 1996, al-Qaida followed suit and formally turned its attention to the far enemy. But according to Fawaz A. Gerges, an international affairs professor at Sarah Lawrence and author of The Far Enemy: Why Jihad Went Global, other jihadist groups around the world never really bought into this shift in priorities. Even al-Gama’a al-Islamiyya had by late 1999 declared a cease-fire, a move that outraged its incarcerated spiritual leader, Omar Abdel-Rahman (“the blind sheikh”) and caused the group to splinter. With the 9/11 attacks, Bin Laden hoped to rally jihadis outside al-Qaida’s orbit to join the battle against the far enemy. Instead, he scared them off.
III. The Melting-Pot Theory
In the absence of other evidence, we must conclude that inside the United States, homegrown, al-Qaida-inspired terrorist conspiracy-mongering seldom advances very far.
That record stands in stark contrast to that of the United Kingdom, which since 9/11 has incubated several very serious terrorism plots inspired or directed by al-Qaida. … Even when it isn’t linked directly to terrorism, Muslim radicalism seems more prevalent—and certainly more visible—inside the United Kingdom, and in Western Europe generally, than it is inside the United States.
Why the difference? Economics may be one reason. American Muslims are better-educated and wealthier than the average American. In Europe, they are poorer and less well-educated than the rest of the population—in Germany, only about 10 percent of the Turkish population attends college. The United States has assimilated Muslims into its society more successfully than Western Europe—and over a longer period. Arabs began migrating to the United States in great numbers during the second half of the 19th century. Western Europe’s Arab migration didn’t start until after World War II, when many arrived as guest workers. In Germany and France, a great many Muslims live in housing projects segregated from the rest of the population. In the United States, Muslims are dispersed more widely. An exception would be Detroit, which has a large Muslim community but not an impoverished one.
The relative dearth of Islamist radicalism in the United States is at least as much a function of American demographics as it is of American exceptionalism. Muslims simply loom smaller in the U.S. population than they do in the populations of many Western European countries. Muslims account for roughly 3 percent of the population in the United Kingdom, 4 percent in Germany, and 9 percent in France. In the United States, they’re closer to 1 percent and are spread over a much larger geographic area. As both immigrants and descendants of immigrants, Muslims are far outnumbered in the United States by Latinos. It’s quite different in Western Europe. Muslims represent the largest single immigrant group in France, Germany, Belgium, the Netherlands (where they constitute a majority of all immigrants), and the United Kingdom (where they constitute a plurality of all immigrants).
Somewhere between one-quarter and one-half of U.S. Muslims are African-American. Historically, American-born black Muslims have felt little kinship with Arab and foreign-born Muslims, and while al-Qaida has sought to recruit black Muslims, “there’s no sign” they’ve met with any success, according to Laurence. … Among foreign-born Muslims in the United States, nearly one-quarter are Shiite—many of them refugees from the 1979 Iranian revolution—and therefore harbor little sympathy for al-Qaida’s Sunni following. Europe’s Muslim population, by contrast, is overwhelmingly Sunni, hailing typically in France from Algeria and Morocco; in Germany from Turkey; and in the United Kingdom from Pakistan and the subcontinent.
All right, then. American Muslims are disinclined to commit acts of terror inside the United States. Why don’t American non-Muslims pick up the slack?
Actually, they do. In April 1995 Timothy McVeigh and Terry Nichols bombed a federal building in Oklahoma City, killing 168 people and injuring 500 more. In April 1996, Ted Kaczynski, the “Unabomber,” was arrested for killing three people and wounding 22 others. In July 1996, a former Army explosives expert named Eric Rudolph set off a bomb at the Olympics in Atlanta, killing one person and injuring 111; later, he set off bombs at two abortion clinics and a nightclub frequented by gay men and women, killing a security guard and injuring 12 others. In September and October 2001, somebody sent anthrax spores to media outlets and government offices, killing five people. The FBI believes it was an Army scientist named Bruce Ivins, who killed himself as the investigation closed in on him. These are just the incidents everybody’s heard of. The point is that domestic terrorism inside the United States is fairly routine. The FBI counted 24 terror incidents inside the United States between 2002 and 2005; all but one were committed by American citizens.
IV. The Burden-Of-Success Theory
In fact, the likelihood of nuclear terrorism isn’t that great. John Mueller, a political scientist at Ohio State, points out that Russian “suitcase bombs,” which figure prominently in discussions about “loose nukes,” were all built before 1991 and ceased being operable after three years. Enriched uranium is extremely difficult to acquire: over the past decade, Mueller argues, there were only 10 known thefts, and the material stolen weighed a combined 16 pounds, nowhere near enough to build a bomb. Once the uranium is acquired, building the weapon is simple in theory (anti-nuclear activist Howard Morland published a famous 1979 article about this in the Progressive) but quite difficult in practice, which is why entire countries have labored for decades to acquire the bomb, only sometimes meeting with success. (Plutonium, another fissile material, is sufficiently dangerous and difficult to transport that nonproliferation experts seldom discuss it.)
V. The Flypaper Theory
The 9/11 attacks led to a U.S. invasion of Afghanistan, whose Taliban regime was sheltering al-Qaida. That made sense. Then it led to a U.S. invasion of Iraq. That made no sense. The Bush administration claimed that Iraq’s Saddam Hussein had close ties to al-Qaida. This was based on:
a) allegations made by an American Enterprise Institute scholar named Laurie Mylroie, later discredited;
b) an al-Qaida captive’s confession under threat of torture to Egyptian authorities, later retracted;
c) a false report from Czech intelligence about a Prague meeting between the lead 9/11 hijacker, Mohamed Atta, and an Iraqi intelligence agent;
d) Defense Secretary Donald Rumsfeld’s zany complaint at a Sept. 12, 2001, White House meeting that “there aren’t any good targets in Afghanistan, and there are lots of good targets in Iraq”;
e) certain Oedipal preoccupations of President George W. Bush.
VI. The He-Kept-Us-Safe Theory
A White House fact sheet specifies six terror plots “prevented in the United States” on Bush’s watch:
- an attempt to bomb fuel tanks at JFK airport,
- a plot to blow up airliners bound for the East Coast,
- a plan to destroy the tallest skyscraper in Los Angeles,
- a plot by six al-Qaida-inspired individuals to kill soldiers at Fort Dix Army Base in New Jersey,
- a plan to attack a Chicago-area shopping mall using grenades,
- a plot to attack the Sears Tower in Chicago.
The Bush administration deserves at least some credit in each of these instances, but a few qualifications are in order. The most serious terror plot listed was the scheme to blow up airliners headed for the East Coast. That conspiracy, halted in its advanced stages, is why you aren’t allowed to carry liquids and gels onto a plane. As noted in “The Melting-Pot Theory,” it originated in the United Kingdom, which took the lead in the investigation. (The undercover agent who infiltrated the terror group was British.) We also learned in “The Melting-Pot Theory” that the plan to bring down the Sears Tower was termed by the Federal Bureau of Investigation’s deputy director “more aspirational than operational” and that the prosecution ended in a mistrial.
The JFK plot was unrelated to al-Qaida and so technically infeasible that the New York Times, the airport’s hometown newspaper, buried the story on Page A37. The attack on the Library Tower in Los Angeles was planned in October 2001 by 9/11’s architect, Khalid Sheikh Mohammed, who recruited volunteers from South Asia to fly a commercial jetliner into the building. But Michael Scheuer, a veteran al-Qaida expert who was working at the Central Intelligence Agency in 2002, when the arrests were made, told the Voice of America that he never heard about them, and a U.S. government official told the Los Angeles Times that the plot never approached the operational stage. Moreover, as the story of United Flight 93 demonstrated, the tactic of flying passenger planes into buildings—which depended on passengers not conceiving of that possibility—didn’t remain viable even through the morning of 9/11 (“Let’s roll”).
The Fort Dix plot was inspired by, but not directed by, al-Qaida. The five Muslim conspirators from New Jersey, convicted on conspiracy charges in December, watched jihadi videos. They were then foolish enough not only to make one of their own but to bring the tape to Circuit City for transfer to DVD. A teenage clerk tipped off the FBI, which infiltrated the group, sold them automatic weapons, and busted them. The attempted grenade attack on the CherryVale Mall in suburban Chicago was similarly inspired but not directed by al-Qaida. In this instance, the conspirators numbered only two, one of whom was an FBI informant. The other guy was arrested when an undercover FBI agent accepted his offer to trade two stereo speakers for four grenades and a gun. He is now serving a life sentence.
VIII. The Time-Space Theory
The RAND Corp. is headquartered in a blindingly white temple of reason a few blocks from the Pacific Ocean in Santa Monica, Calif. It was here—or rather, next door, in the boxy international-style offices it inhabited for half a century before moving four years ago into a new $100 million structure—that America’s Cold War nuclear strategy of “mutual assured destruction” was dreamed up. Also, the Internet. Created by the Air Force in 1948, the nonprofit RAND would “invent a whole new language in [its] quest for rationality,” Slate’s Fred Kaplan wrote in his 1983 book The Wizards of Armageddon.
RAND is the cradle of rational-choice theory, a rigorously utilitarian mode of thought with applications to virtually every field of social science. Under rational-choice theory, belief systems, historical circumstances, cultural influences, and other nonrational filigree must be removed from consideration in calculating the dynamics of human behavior. There exists only the rational and orderly pursuit of self-interest. It is the religion that governs RAND. …
Darius Lakdawalla and fellow RAND economist Claude Berrebi are co-authors of a 2007 paper, “How Does Terrorism Risk Vary Across Space and Time?”
One goal inherent in the 9/11 attacks was to do harm to the United States. In “The Terrorists-Are-Dumb Theory” and “The Melting-Pot Theory,” we reviewed the considerable harm that the furious U.S. response to 9/11 caused al-Qaida. But that response harmed the United States, too. Nearly 5,000 U.S. troops have died in Iraq and Afghanistan, and more than 15,000 have come home wounded. More than 90,000 Iraqi civilians have been killed and perhaps as many as 10,000 Afghan civilians; in Afghanistan, where fighting has intensified, more than 2,000 civilians died just in the past year. “In Muslim nations, the wars in Afghanistan and particularly Iraq have driven negative ratings [of the United States] nearly off the charts,” the Pew Global Attitudes Project reported in December. Gallup polls conducted between 2006 and 2008 found approval ratings for the U.S. government at 15 percent in the Middle East, 23 percent in Europe, and 34 percent in Asia. To be sure, civilian casualties have harmed al-Qaida’s standing, too, as I noted in “The Terrorists-Are-Dumb Theory.” But to whatever extent al-Qaida hoped to reduce the United States’ standing in the world, and especially in the Middle East: Mission accomplished.
Rational-choice theory is most at home with economics, and here the costs are more straightforward. In March 2008, the Nobel Prize-winning economist Joseph Stiglitz and Linda Bilmes of Harvard’s Kennedy School of Government put the Iraq war’s cost at $3 trillion. In October 2008, the Congressional Research Service calculated, more conservatively, an additional $107 billion for the Afghanistan war and another $28 billion for enhanced homeland security since 9/11. According to CRS, for every soldier the United States deploys in Iraq or Afghanistan, the taxpayer spends $390,000. Let me put that another way: sending a single soldier to Iraq or Afghanistan costs the United States nearly as much as the estimated $500,000 it cost al-Qaida to mount the entire 9/11 operation. Not a bad return on Bin Laden’s investment, Berrebi says. President Bush left office with a budget deficit of nearly $500 billion, and that’s before the deficit spending most economists think will be required to avoid another Great Depression even begins.