
The Frugal Superpower: America's Global Leadership in a Cash-Strapped Era


  No good deed, an old saying has it, goes unpunished. The American role as the world’s government partly bears this out: the governmental services that it provides qualify as good deeds in that they confer benefits on others who do not have to pay for them. In providing them, however, the United States does not act in an entirely selfless fashion: tranquillity in East Asia and Europe, where the United States fought several wars in the course of the twentieth century, and an open international economic order in which the United States can import and export freely and invest and receive capital across national borders, bring considerable benefits to America as well. And while anti-Americanism can be unpleasant, it is generally not painful for the United States. The good deeds of global governance are not so much punished as unappreciated. For this there are several reasons.

  First and foremost, the United States does not look, or in every way act, like the world’s government. It does not have, nor, despite what critics of American foreign policy sometimes claim, does it seek to have on a global scale the defining property of an ordinary state within its domain—a monopoly of force. Moreover, the most important governmental functions that the United States carries out for the rest of the world—reassurance and enforcement—are not only not readily recognizable, they are all but invisible. They depend simply on the presence of American forces.

  Americans themselves were never asked to provide governance to the rest of the world and do not think of themselves as doing so. For the most part, the global services they underwrite represent the continuation, in the twenty-first century, of policies adopted during the Cold War as part of the political, military, economic, and ideological competition with the Soviet Union and international communism.

  Nor do the citizens of other countries see America as providing a de facto world government. They would certainly be loath to concede to the United States the special status that explicit recognition of what it does globally would bring. The governments of other countries understand far better than their citizens what American military and economic power does for the rest of the world. They vote, in effect, with their money in favor of American-supplied global governance by holding dollars, and in some cases with their sovereign territory by permitting the (occasionally secret) installation of American military facilities in Europe, Asia, and the Middle East. By one estimate, at the outset of the twenty-first century American special forces operated in no fewer than 125 countries. These other countries seldom acknowledge what they owe to the United States, however, no doubt in part to avoid encouraging the idea that since American foreign policies do so much to make them secure and prosperous, they should contribute far more than they do to pay for these policies.

  While not volunteering to support the governmental services the United States furnishes, the governments of every other major country have done nothing to oppose America’s global role. The absence of active opposition demonstrates that, whatever they say or refrain from saying publicly, other governments recognize privately how important American foreign policy is for their own countries’ well-being. Historically, when a single country became as powerful as the United States came to be after the Soviet Union collapsed, other powers banded together to restrain it. In the wake of the Cold War no such thing has happened, and the absence of such an anti-American coalition testifies eloquently to the American role as the world’s government.

  The term that hints at the special character of America’s role in the world, and that has found its way into common usage, at least among other governments and within the ranks of the American foreign policy community, is “leadership.” Although not recognized as such by those who use the term, in this context “leadership” is a synonym for “government.” It is this role, by whatever name it is known, that the coming economic challenges will place at risk.

  Because what the United States does beyond its borders is, on the whole, extremely constructive, everyone, not only Americans, has a great deal to lose from a reduction in American power. Indeed, other countries may have more to lose from a diminished American global role than does the United States, which will remain powerful enough to protect itself in a more dangerous international order and whose economy is large enough to minimize the damage from a reduction in international trade and investment. All countries, however, the United States included, would suffer from a less stable, less prosperous international system.

  The central task of American foreign policy, even as these economic challenges constrain it, is to preserve as many of the vital governmental services the United States supplies to the world as possible. The challenge for American policy in the second decade of the twenty-first century is to provide leadership on a shoestring—or at least on a much reduced budget. There are two obvious strategies for doing so.

  One is to discard some responsibilities, the better to sustain others. To govern is to choose, and in its capacity as the world’s government the United States will have to choose for continuation the policies that make the most important contributions to its own and the world’s well-being, while discontinuing others that, however worthy, do less to promote American interests and a benign world order. The other strategy is to share the burden of furnishing global services with other countries.

  The impending scarcity of foreign policy resources in the United States is, on the whole, an unfortunate development. It puts in jeopardy a variety of American-provided services that have made the world a safer and more prosperous place than it would have been without them. Scarcity does, however, have one potentially beneficial consequence. Just as losing weight can make a person healthier, the discipline that scarcity will impose can actually improve the conduct of American foreign policy by precluding the kind of errors that carelessness, itself the product of an abundance of power, produced in the first two post–Cold War decades.

  CHAPTER THREE

  ADAPTATION TO SCARCITY

  THE SEAT BELT EFFECT

  The Duchess of Windsor, distilling the lessons of a life of affluent idleness, once decreed that a person “can never be too thin or too rich.” Of this widely repeated maxim someone once commented that the life of Howard Hughes, the very eccentric billionaire who starved himself to death, provided evidence to the contrary.

  For the international system, the equivalent of the Duchess’s rule is that a country can never have too much power; but the experience of the United States in the post–Cold War era, like the case of the late Mr. Hughes, counts as an exception. During the two decades following the collapse of communism, the United States stood at the zenith of its power, with a margin of superiority over all other countries as great as, perhaps greater than, any single country had ever enjoyed in the long history of international relations. Yet in this period the United States committed two costly, foreseeable, and avoidable foreign policy blunders.

  The misguided and dangerous decision by the Clinton administration to expand the North Atlantic Treaty Organization (NATO) in the mid-1990s, over the objections of the newly non-communist Russia, and the disastrous ineptitude with which the Bush administration conducted the occupation of Iraq in 2003 and afterward, substantially weakened the American position in the world. Just as Howard Hughes’s vast wealth and peculiar dietary habits, far from improving his life, actually hastened his own death, so the unprecedented strength of the United States in comparison to every other country after the end of the Cold War, far from preventing these serious foreign policy errors, actually contributed to them. With fewer resources to devote to foreign policy, America is less likely to make mistakes like these.

  The two great post–Cold War mistakes followed the logic of what economists, in another context, have called the Peltzman Effect. Named for its discoverer, the University of Chicago economist Sam Peltzman, the Peltzman Effect refers to the occasional tendency of regulations that governments impose on the economy to have, perversely, the opposite of their intended effects. An often-cited example involves seat belts in automobiles, the use of which, according to some studies, actually increases the rate of accidents. The reason is that seat-belted drivers become more confident and less careful, and drive more recklessly. So it was with post–Cold War American foreign policy.

  This does not mean that seat belts should be eliminated. Accidents that do take place—and not all are caused by the excessive confidence of the seat-belt wearer—generally have less serious consequences for those who buckle up than for those who do not. Nor will the United States or the world be better off with a less rather than a more powerful America; the contrary is true. But as with the wearing of a seat belt, overwhelming power did lead to unnecessary post–Cold War costs for the United States, and for the same reason. It bred carelessness, and carelessness led to serious mishaps in Eastern Europe and the Middle East.

  In 1999, after a vigorous debate in the United States, the sixteen-nation NATO extended membership to three formerly communist-ruled Central European countries: Poland, Hungary, and the Czech Republic. In the next ten years nine other formerly communist countries joined, including Lithuania, Latvia, and Estonia, all three of which had once been part of the Soviet Union itself and shared borders with the new Russia.

  NATO expansion soured relations with Russia because expansion broke the promise that Soviet leaders believed, with good reason, they had received from their Western counterparts, as the Cold War wound down, that NATO would not extend its reach into what had been communist Europe. The result was to create festering doubts in the minds of Russians about the trustworthiness of the West and particularly of the United States.

  The Russians objected publicly and frequently to NATO expansion but their objections were ignored. They were ignored because the United States and its allies assumed they could afford to ignore them: Russia was too weak to stop the process. Bitter at what they saw as the exploitation of their weakness, the Russian political class and much of the Russian public turned against the United States, and opposition to American initiatives became the default position for Russian foreign policy. This proved costly for the United States. The chances of preventing Iran from acquiring nuclear weapons, for example, a major American goal, came to depend heavily on enlisting wholehearted Russian support; and Russian assistance to the American campaign against Iranian nuclear proliferation was anything but wholehearted. Overall, the Russian resentment of and opposition to American power that NATO expansion generated resulted in the weakening of the United States.

  The Russian reaction should not have come as a surprise. Few geopolitical developments have been so widely predicted. When the initial expansion was being debated, the consensus view of experts on Russia was expressed by perhaps the most eminent of them, George F. Kennan, a former diplomat, a noted historian, and one of the principal architects of the policy of containment of the Soviet Union that the United States had followed during the Cold War. In a 1997 article in The New York Times, Kennan called expansion “the most fateful error of American policy in the entire post–Cold War era.”

  The Clinton administration gave as its rationale for expansion the extension of democracy eastward. But it offered inconsistent versions of this rationale and none made sense. On some occasions Clinton administration officials described NATO membership for the formerly communist countries as a reward for becoming democracies. Why this was an appropriate basis for an invitation to join NATO was never made clear, especially since undemocratic countries during the Cold War (Greece and Turkey under military rule, for example) had been members of the alliance in good standing. On other occasions, NATO expansion was advertised by the Clinton administration as a way of promoting democracy where it had not yet fully taken hold. This made no sense because the democratic political direction of the first new members, Poland, Hungary, and the Czech Republic, was not in doubt. Moreover, if the United States had truly believed that a place in NATO would guarantee free elections and constitutional rights, the offer of membership should immediately have been extended to the largest formerly communist country, where the fate of democracy was of paramount importance and where its success still hung in the balance: Russia. Instead, the Clinton administration told the Russians that they would never be invited to join.

  The Clinton administration’s principal motive for expanding NATO was to make political gains in advance of the 1996 presidential election among American voters of Polish, Hungarian, and Czech heritage by promising NATO membership to the countries from which their forebears had immigrated. During the Cold War, such political benefits would have been outweighed by the risks involved in angering America’s most powerful rival, the Soviet Union. But by the mid-1990s the Cold War had ended and the new Russia was far less formidable than the old Soviet Union. The risks seemed to Bill Clinton and his colleagues—wrongly, as it turned out—to be negligible. The administration felt free to ignore Russia’s wishes. With the geopolitical equivalent of a seat belt—American strength and corresponding Russian weakness—seemingly securely fastened, it plowed recklessly ahead with expansion despite all the warnings it received.

  NATO expansion might be imputed to the peculiar features of the administration that launched it—the all-consuming desire of the president to please every possible domestic constituency and the susceptibility of his Democratic Party to policies that appeared on their face to support American values (as distinct from the national interest). The next administration, however, with a different presidential personality and a different approach to foreign policy, fell into a similar trap. Again ignoring ample warning, the administration of George W. Bush committed a costly blunder in Iraq.

  The debacle in Iraq resembled, in important ways, the misstep of NATO expansion. The resemblance strongly suggests that at the root of both was a feature of post–Cold War foreign policy that transcended partisan differences. What the two fiascoes had in common was expressed in F. Scott Fitzgerald’s 1925 novel, The Great Gatsby, by the book’s narrator, Nick Carraway, in describing the wealthy socialites Tom and Daisy Buchanan. “They were careless people. . . . They smashed up things and creatures and then retreated back into their money or their vast carelessness.” The carelessness at the root of the foreign policy blunders was born of the remarkably favorable geopolitical circumstances in which the end of the Cold War left the United States.

  Like NATO expansion, the American war in Iraq, which began with an invasion of the country in March 2003, proved costly. Seven years later more than 4,000 Americans had been killed and 30,000 had been wounded, 14,000 of them severely enough not to return to duty within seventy-two hours. The death toll of Iraqis, almost all of them civilians, was by a rough count around 100,000. The war had cost almost a trillion dollars, and by the estimate of economists Linda Bilmes and Joseph Stiglitz the ultimate bill might well come to three times that amount. Iraq divided the American public more sharply than had any issue since the conflict in Vietnam a generation previously. Its unpopularity beyond America’s borders cost the United States prestige and influence in the international arena.

  While the decision to invade Iraq in the first place aroused controversy, and the absence of the weapons of mass destruction that the Iraqi dictator Saddam Hussein was believed to have—the possession of which had provided the principal justification for the invasion—provoked even more, the main reason the conflict turned out to be so damaging was the failure to pacify the country quickly and easily. Removing Saddam’s regime, the initial goal, proved a relatively straightforward task, accomplished in three weeks with light casualties. The United States then found itself occupying the country, however, and that is where the trouble began. America’s difficulties in post-Saddam Iraq were catalyzed by a series of mistakes that, like NATO expansion, were at once foreseeable, costly, and avoidable.

  The American military had made no plans for the occupation. Post-conquest policies had to be improvised, leading inevitably to mistakes that careful planning might have avoided. Once it became clear that the United States would have to assume at least temporary responsibility for Iraqi stability, American officials in Washington and Baghdad had to make a series of crucial decisions concerning how to discharge that responsibility. They decided to install an American civilian administration rather than an interim Iraqi government. They decided to disband the Iraqi army. They decided to purge the government of personnel associated with Saddam Hussein’s Ba’ath Party down to the middle ranks of the bureaucracy, including clerks and schoolteachers, rather than simply dismissing the top leadership.

  These decisions turned out badly. They alienated important sectors of Iraqi society and inspired a deadly insurgency against the American occupation. To be sure, different decisions would also have had costs. Leaving intact the army, for example, which was controlled by the Sunni Muslims who had dominated Iraq under Saddam, might have turned the more numerous but historically oppressed Shia Muslims into active opponents of the United States.

  The Bush administration, however, scarcely weighed the costs and benefits of the different options it confronted. Instead, by all accounts, it made these decisions off the cuff, haphazardly, without the kind of debate and deliberation that responsible decision-making requires. It conducted the occupation, in a word, carelessly. The carelessness extended to appointing people to staff the occupation authority on the basis of their allegiance to the conservative domestic program of the Bush administration rather than their expertise on the region: sometimes disapproval of the Supreme Court’s principal abortion decision, Roe v. Wade, seemed to count for more than, say, mastery of the Arabic language.