
What the Boom Forgot

A few significant lessons that the 1990s didn't teach us.

IN AN UNCERTAIN WORLD: TOUGH CHOICES FROM WALL STREET TO WASHINGTON By Robert E. Rubin and Jacob Weisberg (Random House, 427 pp., $35)

THE CHASTENING: INSIDE THE CRISIS THAT ROCKED THE GLOBAL FINANCIAL SYSTEM AND HUMBLED THE IMF, Revised and Updated By Paul Blustein (PublicAffairs, 435 pp., $18) 

THE ROARING NINETIES: A NEW HISTORY OF THE WORLD'S MOST PROSPEROUS DECADE By Joseph E. Stiglitz (W.W. Norton, 379 pp., $25.95)

IMF ESSAYS FROM A TIME OF CRISIS: THE INTERNATIONAL MONETARY SYSTEM, STABILIZATION, AND DEVELOPMENT By Stanley Fischer (MIT Press, 535 pp., $50)

GROWING PUBLIC: SOCIAL SPENDING AND ECONOMIC GROWTH SINCE THE EIGHTEENTH CENTURY, VOLUME 1, THE STORY By Peter H. Lindert (Cambridge University Press, 384 pp., $65)

I.

WHEN ROBERT E. Rubin resigned as secretary of the treasury in May 1999, after holding the job for almost four and a half years, President Clinton suggested that he had been “the most effective treasury secretary since Alexander Hamilton.” This was one of those characteristic Clinton statements, somewhere between a charming exaggeration and a calculated fib. It's doubtful that Clinton even knew the names of most of the treasury secretaries between Hamilton and Rubin, much less compared their achievements: they included Albert Gallatin, who financed the War of 1812; Salmon P. Chase, who completely overhauled the banking system, created a new paper currency, and financed the Civil War; and Henry Morgenthau Jr., who played an important role during the Great Depression and World War II.

Still, in many ways Clinton's exaggeration was apt, because when Rubin left Washington the nation's economy was in the midst of the longest boom in its history, of which Rubin was widely seen as a prime architect. In the view of his many admirers, he helped to engineer the expansion by bringing runaway federal budget deficits under control and by helping to defuse the Asian financial crisis of 1997-1999. Rubin seemed to embody that rare combination of competencies: a clear vision of what needed to be done and the personal talent and temperament to put the vision into practice. If not exactly an economic savior, he was a clearsighted custodian whose calm and common sense were stabilizing forces.

In hindsight, though, Rubin's accomplishments seem less grand. The economic expansion of the early 1990s would have occurred even if he had not joined the Clinton administration in 1993, initially as head of the White House's National Economic Council. Similarly, the emergence of a budget surplus in 1998—the first since 1969—stemmed mainly from outside events: the end of the Cold War, which prompted a large (and probably excessive) reduction in defense spending; and the economic boom, which caused a huge and unanticipated surge in tax revenue. Good luck, more than good policy, produced the surpluses.

As important, Rubin's book shows that he is no economic seer: if we're looking for a guide to the future, he isn't it. He doesn't have much to say about two big questions that now confront American economic policy. The first involves society's aging. By 2030, more than seventy million of the baby-boomers will have passed normal retirement age. Unless the programs are modified, government spending on Social Security, Medicare, and Medicaid will explode. What, if anything, should be done about that? The second issue is globalization: namely, the fact that the American economy is increasingly interconnected with—and influenced by—forces beyond its borders. How, if at all, can the United States protect its economic sovereignty?

The man who emerges from these pages is intelligent, shrewd, sober, confident, and decent—a responder, not a prime mover. He didn't foresee the Asian financial crisis (almost no one did), but once it emerged he grappled successfully with its implications. In many large and small matters, he was a wise adviser. He suggested that the president avoid inflammatory “class warfare rhetoric.” He counseled Clinton against criticizing Alan Greenspan in public or claiming credit for the stock market's extraordinary boom. But Rubin didn't have then—and, based on this book, doesn't have now—many original economic ideas beyond the bland standbys of curbing budget deficits at home and promoting free trade and free markets abroad.

WE SHOULD NOT BE surprised. In his twenty-six-year career at Goldman Sachs, Rubin was essentially a trader: a man focused on specific transactions, not on sweeping concepts. Rubin made his mark at the firm in a specialty called risk arbitrage, which involves betting on whether announced mergers will actually be completed. This sort of speculation requires steely nerves and strong powers of analysis. The profits can be quick and large, but if you guess wrong, the losses can be much larger. “Flux and uncertainty made risk-arbitrage quite nerve-racking for some people. But somehow or other, I was able to take it in reasonable stride,” writes Rubin. “Intermittent losses—sometimes greatly in excess of your worst-case expectations—were a part of the business. I accepted that, though some in our business did seem highly stressed much of the time.”

Rubin handled the anxiety of risk-arbitrage in part by becoming almost religiously wedded to what he calls “probabilistic” thinking. “What has guided my career in both business and government,” he remarks, “is my fundamental view that nothing is provably certain. One corollary of this view is probabilistic decision making.” Translation: based on the best information, weigh the odds of things going your way, then decide whether the odds justify the risk that they might not. For Rubin, even decisions that turned out badly may not have been wrong, as long as the odds warranted the risk.

Rubin built his career by making more good judgments than bad. He rose at Goldman not only because he excelled at risk-arbitrage but also because he performed well during crises. In 1970, the bankruptcy of the Penn Central Railroad plunged the firm into a period of “immense anxiety,” as he puts it. Goldman had sold investors large amounts of Penn Central securities; the railroad had defaulted, and the infuriated creditors sued Goldman Sachs for providing misleading information. Rubin then began serving as an informal adviser to the firm's head, the legendary Gus Levy, and suggested the firm shift attorneys. Later Rubin helped turn around a faltering Goldman subsidiary that specialized in commodity trading. He ended up serving as Goldman's co-chairman, along with Stephen Friedman (who is now an economic adviser at the White House), for about two years before his departure in 1993.

Little wonder that Rubin isn't a big ideas man, a creature of grand strategy. Indeed, his obsession with “probabilistic” thinking involves a prejudice against big concepts. If nothing is certain, then sweeping theories about the way the world works, or ought to work, are to be distrusted. The economic legacy of Rubin and Clinton is usually thought to lie in what they accomplished: balancing the budget, coping with the Asian crisis. These achievements, to which Rubin devotes most of his narrative, are supposed to provide guidance for the future: just do what they did, and the economy will behave well enough.

Unfortunately, it's not that simple. We have just completed a four-decade economic era dominated by the rise and fall of inflation. On the way up from negligible levels in 1960 to double digits by 1980, inflation abetted many bad surprises: deeper recessions, higher unemployment and interest rates, a stagnant stock market, slower economic growth. On the way back down from double digits in the early 1980s to today's negligible levels, inflation fostered many good surprises: lower unemployment and interest rates, longer economic expansions, a rising stock market, faster economic growth. But now we are in a new epoch, in which inflation and its side effects no longer dominate. Instead, the increasing size of government and growing globalization loom as immense forces for good or ill. The next economics needs to clarify how these forces operate, and how they can be channeled for good. The Clinton-Rubin years barely illuminate the changes and the choices.

II.

CLINTON ROUTINELY DESCRIBED his economic strategy as having three parts: deficit reduction, free trade, and “investment in people.” But to the public the essence of “Clintonomics”—or “Rubinomics”—was the budget. The basic idea was simple. Declining deficits (and ultimately a surplus) would reduce interest rates, which would increase economic growth and, through expanded private investment, result in a more productive economy with higher living standards. Since persistent deficits seem irresponsible—“living beyond our means” and shifting the costs of today's government benefits to future generations—a policy that reduces deficits appears both morally and economically virtuous. The fact that Clinton's initial program of deficit reduction in 1993 barely passed Congress, with not one Republican vote in either the House or the Senate, only increased the administration's identification with the approach. The economy's subsequent strong growth seemed to vindicate the underlying logic.

Actually, it didn't. Over the years, budget deficits have assumed a symbolic importance out of all proportion to their true economic significance. By themselves, budget deficits—and surpluses, too—do not matter nearly as much as the public and many economists believe. The economic expansion of the early 1990s was connected only loosely, if at all, to Clinton's 1993 budget plan. After all, the deficit reduction between 1992—the last year of George H.W. Bush's term—and 1995 was about 2.5 percent of national income, or gross domestic product (GDP). It may seem strange that such a small change could determine what happened to the other 97.5 percent—and so it is. Even in 1992, the economy was recovering from the mild 1990-1991 recession; and the recovery remained mild through late 1996, when other forces—the persistence of low inflation, the gathering enthusiasm for the Internet—ignited a boom. GDP grew 3.3 percent in 1992, 2.7 percent in 1993, 4 percent in 1994, and 2.5 percent in 1995. By contrast, it averaged 4.1 percent from 1996 to 2000.

Deficit mythology reflects our political and media culture. Politicians and the press need sharp distinctions to sustain their debates. Partisan controversy must remain uncluttered by detail or ambiguity; good and evil must be obvious. In this world, budget deficits have become the scarlet letter of bad behavior: anyone who runs them is branded with the big D. For economists, the emphasis on deficits can also be self-serving. To be influential, they need identifiable levers of government action (budget, tax, and monetary policies) that decisively affect the economy. The less powerful the levers are, the less powerful are economists. It is only natural that economists—at least those interested in public policy—deemphasize the importance of anything that they cannot easily influence (technology, management, culture). The resulting deficit mythology serves everyone's political interests. But it is profoundly misleading, and it needs to be punctured.

LET US CONSIDER A FEW OF THE widespread myths.

  • Myth: President Reagan created modern budget deficits. Reality: Modern deficits originated in the 1960s with the Kennedy and Johnson administrations, owing to the advice of Keynesian economists who argued that a slavish devotion to balanced budgets needlessly slowed economic growth.

    Until Kennedy, there had been an unwritten political taboo on running a deficit, except in the cases of wars and depressions. Truman produced five budget surpluses, Eisenhower three (and some deficits were tiny). But in 1962 Kennedy gave a famous speech at Yale denouncing outworn economic “mythology,” based primarily on an obsession with balanced budgets. He later proposed tax cuts that, when passed, enlarged deficits. Once the taboo was broken, it proved almost impossible to restore, because politicians of both parties prefer to spend more or to tax less—or both. From 1960 to 1998, there was only one tiny surplus. What can be held against Reagan is that he expanded the deficits by lowering taxes and raising defense spending without a compensating decline in domestic spending. From 1980 to 1989, deficits averaged 3.9 percent of GDP, compared with 2.1 percent of GDP in the 1970s and 0.8 percent in the 1960s.

  • Myth: Budget deficits raise interest rates. Reality: Deficits are only one of many influences on interest rates, and at present a minor one.

    People believe this myth because it seems logical. The more the government borrows, the greater the demand for credit. All other things being equal, interest rates should rise. But other things are rarely equal, and actual interest rates reflect many forces: current and expected inflation (lenders want a return on their money after inflation); the state of the economy (which affects the overall demand for credit); Federal Reserve policy (which affects the supply of lendable funds); the willingness of foreigners to buy U.S. stocks and bonds (this too affects the supply of funds); the perception of risk (higher risks cause lenders to want higher rates). Studies that try to disentangle the various influences claim to find small effects of budget deficits on rates, but the real-world impact is hard to detect.

    It is true that long-term bond rates declined in 1993 after Clinton's budget plan passed Congress, but they had already been declining slowly for four years, despite rising deficits. The annual averages for ten-year Treasury bonds were 8.55 percent in 1990, 7.86 percent in 1991, 7.01 percent in 1992, and 5.87 percent in 1993. The major causes of the declines were probably a weak economy and lower inflation. By one measure, inflation in 1993 was the lowest since 1964. Worse for the theory, interest rates rose in 1994 (to 7.09 percent for ten-year Treasury bonds)—despite Clinton's budget plan. The present situation confirms the loose link between deficits and interest rates. Under President Bush, deficits have exploded. But the rate on a ten-year Treasury bond is about 4 percent; even adjusted for inflation, this is lower than the 1993 rate (inflation in 1993 was 2.7 percent; in 2003, it was 1.9 percent). Rates on home mortgages and corporate bonds were recently the lowest since the 1960s. The point is not that deficits don't matter; it is that their influence depends on circumstances and, so far, has been small.

  • Myth: Big budget deficits depress private investment and thereby hurt future wages and living standards. Reality: The adverse effect is tiny or non-existent.

    Again, the logic seems impeccable: higher federal borrowing would seem to “crowd out” some private investment. Since investment raises productivity (the source of higher wages and profits), lower investment ultimately hurts living standards. In practice, however, the logic isn't infallible. Bigger budget deficits can also crowd out consumer spending (partly financed by borrowing) or housing construction (heavily dependent on borrowing). In the 1980s, business investment averaged 12.7 percent of GDP, which was higher than in either the 1960s (9.9 percent) or the 1970s (11.1 percent). In reality, the deficits do not yet seem to have crowded out anything. Americans have enjoyed high consumption without sacrificing much, if any, investment. The reason lies in large trade deficits. We are able to spend more than we produce. The trade deficits reflect the dollar's special role as the major global trading and investment currency. Foreigners keep some of the dollars they earn from exporting to the United States for other trade and investment. This holds the dollar's exchange rate at a level—making our imports cheaper and our exports more expensive—that produces a trade deficit.

    More important, it's not just how much a country invests but how well. As the recent dot-com and telecom debacles attest, investment is not automatically productive. Generally, though, America seems to invest better than other societies. Managers face intense pressure to make new investments profitable. Inefficient and bankrupt firms are allowed to shrink or disappear, rather than being propped up by governments (as in Europe) or banks (as in Japan). Throughout the 1980s, commentators routinely predicted that Japan's living standards would soon overtake America's because Japan's investment rates—investment as a share of GDP—were twice as high. That hasn't happened, mainly because much of Japan's investment has been wasted.

  • Myth: The Clinton administration's fiscal responsibility—or, if you prefer, the Republican Congress's after 1994—led to budget surpluses in the late 1990s. Reality: The economic boom and the end of the Cold War were the basic causes of the surpluses.

    True, the White House and Congress constantly fought over deficits and negotiated periodic agreements, including Clinton's 1993 plan to reduce the deficit. But those plans confuse more than they clarify. Even in the 1990s, most spending continued to rise faster than inflation. Defense was the big exception, because the end of the Cold War led to deep cuts in ships, planes, and combat divisions. From 1992 to 1999, defense spending dropped from 4.9 percent of GDP to 3 percent; in 2000, this decline amounted to about $185 billion. Meanwhile, the economic boom produced an unpredicted surge in income taxes and capital gains taxes (mainly on stock profits). In 2000, the surprise increase easily totaled 2 percent of GDP. That was another nearly $200 billion. Together, these windfalls more than explain the surpluses of $126 billion in 1999 or $236 billion in 2000.

    It is also worth recalling that the Clinton administration did not initially favor balanced budgets. Its goal was the more modest and vaguer “deficit reduction,” and in 1995, when Newt Gingrich advocated balancing the budget by 2002, all the president's economic advisers objected. “We had successfully addressed the deficit problem in 1993 and intended to continue reducing the deficits,” Rubin writes. “Going all the way to a truly balanced budget would require additional program reductions that would serve little economic purpose.” Ultimately, Clinton overruled his economic advisers, deciding that it was important politically to match the Republicans' commitment.

  • Myth: The central budget problem is eliminating stubborn budget deficits. Reality: The true problem is federal spending, driven by higher retirement benefits for aging baby-boomers. Spending, not deficits, dominates the budget's economic and social impact. To understand why, consider two hypothetical (and extreme) budgets. One equals 5 percent of GDP and has a deficit of 2 percent of GDP. The other equals 50 percent of GDP and is balanced. In the first, the role of government is small, despite the deficit. Public services are skimpy, and a modest tax increase could easily cover the deficit. In the second, government plays a huge role. Government services and transfers are huge. Taxes are already steep. Any unanticipated spending or loss of tax revenues might create economic or social strain. In both cases, spending sets government benefits and determines the required level of financing, whether by taxes or by borrowing. If taxes or deficits get too high, they could harm the economy or cause a political backlash.

Now consider the actual U.S. budget outlook. Social Security, Medicare, and Medicaid already represent about two-fifths of federal spending. With more retirees, this spending will shoot up even if benefits aren't sweetened as they were last year, when Congress passed a Medicare drug benefit. Depending on future health costs, by 2030 federal retirement spending could easily increase by 80 percent as a share of GDP, according to Congressional Budget Office projections. If other spending is not cut dramatically, taxes will rise sharply or deficits could become so large that lenders might refuse to lend, fearing a government default.

ONCE DEFICIT MYTHOLOGY IS stripped away, the Clinton-Rubin budgetary achievement diminishes. To be fair, their focus on the deficit precluded major new domestic spending proposals (to the consternation of many liberals). But this restraint reflected practical politics as much as conscious policy. In 1992, Ross Perot, running against George H.W. Bush and Bill Clinton, won almost twenty million votes by inveighing against budget deficits. In 1993, Congress was in no mood for more spending; and the same was true after the Republican victory in 1994 (though many Republicans later turned spendthrift). As noted, the 1993 budget package had only a small effect on the economy. It was sometimes said that Greenspan pledged to keep short-term interest rates low if the White House cut the budget deficit. But Rubin doesn't mention any agreement, and the Fed actually raised short-term rates in 1994 to deter inflation.

Indeed, Clinton's budget policy failed in one critical respect: it did not address the long-term problems of the baby-boomers' retirement. Not only did the administration refuse to face the issue, it opposed anyone who tried. As a member of the party that created Social Security and Medicare, Clinton might have made a historic break with the past, much as Richard Nixon broke with Republican dogma when he went to China. Clinton might have argued that longer life expectancies and more retirees justified slowly increasing the eligibility age—with ample advance warning—and tying benefits more directly to income. It is almost impossible for one party to propose changes alone without facing withering partisan attacks. But there were many opportunities for a bipartisan approach. In 1993, Senator Bob Kerrey of Nebraska provided the decisive vote to pass Clinton's budget plan; his single condition was that the president appoint a presidential commission on entitlement spending. Clinton did, and then he ignored it. In his second term, he undermined a commission on Medicare reform. Less conspicuous opportunities were similarly discarded.

Instead, Clinton played the traditional Democratic game of using Social Security and Medicare to enhance his own popularity and to attack the Republicans. There are those who think that if Monica Lewinsky had never worked in the White House, Clinton might have acted more boldly to overhaul these programs. Rubin dismisses this speculation:

Did the Lewinsky scandal harm Clinton's second term on a substantial level? Some people argue that the administration missed opportunities as a result, particularly with regard to reform of the Social Security system. But my instinct is that Clinton could not have gotten more done, at least in this area, even if the scandal never struck. We had begun to explore Social Security reform in 1997. When we floated one relatively modest change—revising the annual cost-of-living adjustment to better reflect inflation—we basically had our heads handed to us by Democrats in Congress and interest groups.... Some might argue that Clinton should have gone to war with his own supporters on this issue or that he could have done so if he hadn't needed their support in the impeachment fight. But well before the scandal, we already felt stuck.

This is self-serving. Had Clinton tried, he might have forged a coalition of middle-of-the-road Democrats and Republicans to make major changes in Social Security and Medicare. But he wasn't interested. In The Natural, Joe Klein recounts Clinton's delight in early 1998 (just as the Lewinsky scandal was breaking) in inventing the slogan “Save Social Security First.” To be fair, Clinton's pitch was also an argument against using any budget surpluses for tax cuts—as Republicans were then unwisely urging—as opposed to reducing the federal debt. But mainly it was a plug for the status quo: preserving all future benefits of Social Security and Medicare. Eligibility ages would not be raised; benefits for wealthier retirees would not be trimmed; retirees would not be asked to pay more. Future taxpayers—children and young workers, who don't vote or don't vote in great numbers—were ignored.

And what was Rubin doing all this time? Not much. There is no evidence that he ever pushed for major spending curbs on government retirement programs. What we have is the paradox of a man who has built his reputation as a paragon of budgetary rectitude but essentially ignored the major budgetary problem of his time. It may be an uncertain world, as he says, but some things are fairly certain. One is that baby boomers will age and clamor for their federal benefits. That is the central engine driving spending, taxes, and the deficit. While he was a member of the administration, Rubin's silence might have been rationalized as pragmatic. If the president wasn't interested, why fight a futile fight? But Rubin still ignores the issue. In this book of roughly four hundred pages, he never discusses it in any detail. He focuses single-mindedly on budget deficits as if they were freestanding evils and little else mattered.

As a society, we would have been better off never abandoning the pre-Kennedy folk wisdom of generally avoiding large peacetime deficits in good times and tolerating them grudgingly in bad times. But, having created almost-permanent deficits, we need to understand them better. In political terms, this is easy: they denote a desire to spend more than we tax; they are the politics of convenience. But in economic terms, the answer is maddeningly vague: it depends. As circumstances change, so will the effects of deficits. When the economy is weak, deficits may sustain spending and employment. That is good. When deficits are modest and federal debt (the accumulation of past deficits) is low, they may not matter much, for good or ill. That is at least half true now. At the end of World War II, publicly held federal debt equaled 109 percent of GDP; in 2003, it was 36 percent of GDP. In general, the economy—our national income—has grown faster than the debt.

But there could easily be circumstances in which the dire consequences envisioned by Rubin and others might materialize: if deficits indefinitely remain large (say, in excess of 2 percent to 3 percent of GDP—now about $220 billion to $330 billion annually); if foreigners' desire to hold dollars sharply diminishes; if government spending continues to rise; if there is no increase in Americans' willingness to save. Then interest rates might rise, choking off investment and promoting stagnation. Worse, huge deficits could trigger a financial crisis, because many investors—doubting the government's ability to refinance its outstanding debt—might refuse to hold Treasury bonds. There is indeed a possibility of a distant budgetary “crack-up,” but Clinton and Rubin did less to avert it than they—and much of the public—believe.

III.

THE CRACK-UP THAT RUBIN and others did help to prevent is now fading from popular memory, precisely because it was avoided. What we call “the Asian financial crisis” posed the most serious danger to the international economy since the 1930s. The most important fact about it is that it did not become a worldwide calamity. Global stock markets did not crash, nor did the world trading system implode. To some, particularly Europeans, the threats were always overdrawn. But Rubin and a few others decided—correctly, in my opinion—that the risks weren't worth running.

The nub of the problem was simple. Some developing countries (now called “emerging markets,” because they have escaped the most extreme poverty and achieved modest affluence) were exhausting their foreign-exchange reserves of “hard” currencies, such as the dollar and the euro, acceptable in international trade. These countries had borrowed too much abroad—again, in hard currencies— and couldn't repay all their debts. The practical question was whether these debtor countries would be rescued from their predicament by massive loans from the International Monetary Fund (IMF) and other international agencies.

If a few countries experience crises, it is a big problem for them, but not for anyone else. The great danger was “contagion,” which is a fancy word for panic. If many (or all) emerging markets became suspect, capital flight—lenders calling in their loans, investors selling the debtors' stocks and bonds—might lead to economic collapse. As countries, lacking new international credit, depleted their foreign-exchange reserves, their ability to import would shrivel. Meanwhile, their desire to export and to earn scarce hard currencies would become frantic. One country's imports are necessarily another's exports. So the ultimate threat was a worldwide downward spiral. Global exports would drop—in 1997, all emerging markets accounted for 43 percent of American exports. Cheap imports would depress prices and profits in many countries, including the United States. Unemployment would rise; stock markets would fall.

We can never know what would have happened if Rubin and others had failed to act. After the crisis, Time magazine dubbed Rubin, Greenspan, and Lawrence Summers—then Rubin's principal deputy, later treasury secretary, and now president of Harvard—“The Committee to Save the World.” This is misleading. These three men did not originate many of the details of the IMF loan packages and, in some cases, only reluctantly agreed to them. But they did make the critical decision to get involved. If they had remained passive, terrified investors and desperate governments might have scrambled to save themselves in whatever ways they could. The struggle to contain the crisis, despite day-to-day setbacks, succeeded in its larger purpose of buying time and preventing a complete loss of confidence. As long as private investors and governments knew that someone was trying to prevent the worst, they were constrained. Investors had less reason to sell securities—stocks, bonds—that were otherwise profitable. Governments had less reason to go protectionist, adopting unilateral restrictions on trade or financial flows (movements of money in and out of the country). A free-for-all was averted.

The most detailed and compelling chronicle of the crisis remains Paul Blustein's The Chastening: Inside the Crisis That Rocked the Global Financial System and Humbled the IMF, which appeared a few years ago. Blustein pieced together a rich narrative from interviews with about 140 people—bankers, government officials, economists. Rubin's account of these events might have added to our understanding of what happened and why. It doesn't. One turning point of the crisis, for example, involved convincing many of South Korea's foreign bankers not to demand immediate repayment of their loans. This spared Korea from defaulting—an event that Rubin and others feared could have led to a domino effect. Rubin reportedly persuaded some recalcitrant banks to join the standstill. But he doesn't say which ones they were.

The crisis started quietly in July 1997, when Thailand devalued its currency, the baht, against the dollar. For most of the next two years, a string of countries—Indonesia, South Korea, Russia, and Brazil—experienced similar currency collapses. All had, in one way or another, become dependent on foreign capital, usually in dollars. Thai banks had borrowed heavily abroad. In Indonesia, banks and local companies had all borrowed. The same was true in Brazil; and the government also had foreign creditors. Russia had sold huge quantities of government securities to overseas lenders. The main incentives for these loans and investments were interest rates. Dollar lenders could get higher rates on emerging-market debt than on straight dollar loans in the United States; similarly, borrowers in these countries usually got lower rates than on loans made in their own currencies. Typically, the borrowed dollars were converted into national currencies, which could then be spent locally; the dollars would augment the countries' foreign exchange reserves and could be purchased—for local currency—by anyone needing to pay for imports. The catch was that hard-currency loans ultimately had to be repaid in hard currency.

For much of the post-World War II era, money (a.k.a. “capital”) had not moved so freely across borders. Countries imposed restrictions on who could bring it in and who could take it out. Hard-currency earnings resulted mostly from trade, and what happened within a country's borders involved the management of the country's national money, whether lira, yen, or pesos. But as countries grew wealthier and global trade expanded, restrictions on capital diminished. Richer countries lifted them first. Then, in the late 1980s and early 1990s, many poorer countries followed suit. What is less clear is why.

In his new book, Joseph Stiglitz, the Nobel Prize-winning economist who served on Clinton's Council of Economic Advisers from 1993 to 1997 and then was chief economist of the World Bank, blames the U.S. Treasury and the IMF, which “had pushed for rapid financial and capital market liberalization.” They are the villains; they created their own crisis. This seems too glib, and Stiglitz's account does not convincingly substantiate the accusation. But something less conspiratorial may have occurred. The end of the Cold War unleashed rampant optimism about how free markets and new technology were transforming the world. Countries wanted to join the global bonanza. They more willingly embraced prospects—whether from the IMF, the United States, foreign banks, or local businesses—of becoming more interconnected with the world economy, including being more open to foreign capital. That this might create problems first became obvious in late 1994, when massive capital flight from Mexico drained the country's foreign exchange reserves. Rubin, who had just become treasury secretary, organized a controversial $40 billion rescue package, fearing that an economic breakdown in Mexico would create a social crisis that would spill over into the United States.

WITH HINDSIGHT, WE KNOW that there were two problems underlying Asia's financial crisis. One was weak banking and financing systems in the debtor countries. In theory, borrowing abroad ought to be good. If the money is well invested, the borrowers will be enriched and will easily repay their loans, and the lenders will earn a decent return. But if the money is poorly invested, everyone can lose. Even in advanced countries, the mechanisms that transform investment capital into profitable projects are highly imperfect. Witness the Japanese and American “bubbles” of the late 1980s and 1990s. But in the emerging markets, banking and financial systems were both more primitive and less policed. In practice, much of the borrowed foreign money was squandered by the banks or companies that borrowed it.

This might not have mattered but for the second problem: pegged exchange rates. All the countries in crisis had pegged their currencies to the dollar. Before the crisis, Thailand fixed the baht's exchange rate at about twenty-five to the dollar; South Korea's exchange rate was roughly 915 won to the dollar. Pegged exchange rates have advantages: exporters and importers know how much their goods will cost, and borrowers in foreign currencies know how much local currency they will need to repay overseas loans. But there are also disadvantages. Pegged exchange rates may encourage capital inflows by (falsely) suggesting that there's no currency risk. Moreover, if a currency becomes “overvalued”—too high in relation to other currencies—the country's exports may become less competitive in world markets. That's what happened. In late 1995, the dollar began rising on foreign exchange markets. Currencies pegged to the dollar also rose. Thailand, South Korea, and other Asian countries lost competitiveness to Japan. Just as these countries were increasing their dollar debts, their ability to earn dollars through exports eroded.

In Thailand, the two problems intersected. By early 1997, banks' nonperforming loans—loans on which payments were at least six months overdue—reached 12 percent, according to Blustein. Banks and finance companies had contributed to a real estate boom by providing loans for apartments, shopping malls, and office buildings well in excess of demand. Foreign lenders to Thai banks began pulling their money out by not renewing their loans. With a pegged exchange rate, Thailand was committed to paying a dollar for every twenty-five baht presented. The trouble was that Thailand didn't have unlimited dollars. Once its foreign-exchange reserves were exhausted, the baht would fall to whatever the market dictated. If sellers of dollars demanded forty or fifty baht, that's what it would be. Thailand's ability to import, which depended on dollars, would drop sharply. Some companies would not be able to buy essential raw materials or components. Some would not be able to repay their debts. The economy would reel. Companies and banks might fold.
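A rough bit of arithmetic shows why a broken peg is so punishing for dollar debtors. The borrower and the loan size below are hypothetical; the exchange rates are simply the pre-crisis peg and the post-collapse level sketched above.

% Local-currency cost of repaying a hypothetical $10 million dollar loan,
% before and after the peg breaks:
\[
\$10\ \text{million} \times 25\ \tfrac{\text{baht}}{\$} = 250\ \text{million baht}
\qquad\qquad
\$10\ \text{million} \times 50\ \tfrac{\text{baht}}{\$} = 500\ \text{million baht}
\]

The dollar obligation never changes, but the baht cost of servicing it doubles, just as the devaluation and the slump that follows are shrinking the borrower's baht revenues.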

What was unrecognized in mid-1997 was that other countries' financial systems faced similar dangers. In South Korea, government policies for industrial development had skewed credit flows. The “chaebols”—vast conglomerates of many enterprises—had received huge quantities of cheap credit to build factories for computer chips, cars, and steel beyond what global demand required. In Indonesia, banks had often funneled loans to well-connected businessmen, including relatives of General Suharto, the strongman since 1965. The grim specter that slowly emerged was of an uncontrollable chain reaction. As more countries experienced financial problems, global investors might flee many—or most—emerging markets without waiting to see if any individual country actually had difficulties. There would be panic, a rush for the exits.

One of the great advances of practical economics, made in the nineteenth and early twentieth centuries, was the discovery that financial panics can sometimes be halted or limited. Someone could intervene to protect depositors, lenders, and investors against losses. In Britain, it was the Bank of England. In the United States during the Panic of 1907, it was J.P. Morgan and the banks that he could mobilize behind him. If some huge amalgam of collective money could reassure all depositors that they would be repaid, they would not rush to withdraw. The banking system, providing the loans essential for a healthy economy, would not be needlessly crippled by an uncontrollable outflow of funds. The same logic could be applied to panic in any market, including the stock market. The central question posed by the Asian crisis was whether to mount a similar rescue operation on a global scale.

The hastily constructed IMF rescue packages were messy and controversial. Both the IMF and the debtor countries made mistakes. Thailand and South Korea concealed the depletion of their foreign-exchange reserves. Once revealed, these lies made restoring confidence harder. In Indonesia, Suharto didn't abide by the IMF program he accepted. Also in Indonesia, the IMF erred by shutting some shaky banks and thereby triggering a general depositor panic. The IMF loans failed in more general ways. None of the countries succeeded in defending its exchange rate. All suffered huge depreciations and savage recessions bordering on depressions.

But the IMF, Rubin, and Greenspan muddled through. By 1999, Asian economies were growing again: South Korea's growth was 10.9 percent, Thailand's 4.4 percent. (Indonesia was an exception.) More important, the repercussions were limited. In the United States—contrary to more fearsome, though reasonable, expectations—the “Asian crisis” may actually have contributed to the gathering boom by holding down inflation (those “cheap” imports) and redirecting even more foreign capital toward the American stock and bond markets. Exuberant American consumers and businesses, by buying Asia's exports, provided a safety net for its economies and helped sustain a global expansion. Still, the final stages of the crisis suggested that a more timid response might have had more frightening consequences.

On August 17, 1998, Russia defaulted on its government debt, much of it held by foreign investors. This time, the IMF did not provide a rescue, because it feared feeding “moral hazard”: if people—investors, in this case—are rescued from the consequences of their bad behavior, it will abet more bad behavior. In Russia, moral hazard had gone into overdrive. Those foreign investors—including some “hedge funds” owned by the super-wealthy—were collecting exorbitant interest rates (up to 100 percent at the end) on the Russian debt on the assumption that the IMF would have to bail out a nuclear power. Investors converted dollars (or other hard currencies) into rubles, put the rubles in Russian government debt, collected the lavish interest, and then, when they were ready, reconverted rubles back into dollars. But if the rubles couldn't be re-exchanged for dollars, the profitable operation would cease—and investors would suffer massive losses on worthless Russian securities. This is what happened when the IMF refused to provide more dollars. By itself, Russia's economy didn't matter much; it was less than 3 percent of the world economy. But the specter of a major government defaulting on its debt, something considered unthinkable in the modern era, rattled investment managers from hedge funds to insurance companies.
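The round trip can be sketched with illustrative numbers. The stake and the interest rate here are hypothetical, and the exchange rates are only rough approximations of the ruble's pre-default level and its level a few months later; the point is that the profit depended entirely on reconverting at something close to the old rate.

% A hypothetical $1 million carry trade in Russian government debt:
% convert dollars to rubles, collect high ruble interest, reconvert to dollars.
\[
\$1\ \text{million} \times 6\ \tfrac{\text{rubles}}{\$} = 6\ \text{million rubles}
\;\xrightarrow{\ +60\%\ \text{interest}\ }\;
9.6\ \text{million rubles}
\]
\[
\text{reconverted at } 6\ \tfrac{\text{rubles}}{\$}:\ \$1.6\ \text{million}
\qquad\qquad
\text{reconverted at } 20\ \tfrac{\text{rubles}}{\$}:\ \$0.48\ \text{million}
\]

Even before counting the losses from the default itself, the devaluation alone turns a fat dollar gain into a deep dollar loss, which is why the end of new IMF dollars mattered so much to these investors.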

The rules had changed, but how? No one could tell. Everything seemed riskier. As a result, investors fled into cash or the safest investments they could find. By early September, the Dow Jones Industrial Average was down 19 percent from its July peak. By early October, Greenspan was saying publicly that bond markets were in disarray. “I have been looking at the American economy on a day-by-day basis for almost a half-century, but I have never seen anything like this,” he said. “[Investors] are basically saying, 'I want out. I don't want to know anything about whether a particular investment is risky or not. I just want to disengage.'” To dispel any panic, the Fed cut interest rates three times in the fall of 1998 and arranged for the rescue of Long-Term Capital Management, a gigantic hedge fund whose insolvency might have caused more losses and insecurity. But the episode vividly demonstrated how fraying confidence might easily give way to a larger financial or economic crisis.

Avoiding that stands as Rubin's greatest achievement. It seems clearer now than it did then, because—as both his and Blustein's accounts make clear—the rescue process verged on chaos. The negotiations that ultimately defused the crisis were incredibly complex: between the IMF and the debtor countries; between different segments of the IMF staff; between the IMF and the U.S. Treasury; between Rubin's Treasury Department and other parts of the U.S. government (the State and Defense Departments, the National Security Council); between the Americans and the Europeans and the Japanese. It was often touch and go, but Rubin's “probabilistic” approach served him well. It meant keeping an open mind while recognizing that the inevitable uncertainty was not a sufficient excuse for inaction and paralysis.

IV.

THERE WILL ALWAYS BE economic crises, and when they occur we will be fortunate to have leaders of Rubin's intelligence and calm. Naturally we want to minimize these moments by better understanding our economic surroundings so as to anticipate possible dangers, and in this regard Rubin offers almost no help. His devotion to the certainty of uncertainty discourages this sort of exercise, because the future is by definition unknowable. But people will always try to prepare for the future that they imagine—and if their picture is wildly unrealistic, then they may become victims of their own delusions. This is our present predicament.

A common thread connects the problems of the federal budget and of globalization: two progressive developments—the rise of the welfare state and the expansion of international commerce—may now have undesirable side effects. Both have alleviated suffering and advanced the human condition. Possibly, they will continue to do so. But they could also reduce economic growth and foster instability. What to do? We cannot reverse these changes. The welfare state is a permanent part of the national fabric, as in most advanced societies. No one is going to repeal Social Security, Medicare, or food stamps, because doing so would tear up too many lives and, not incidentally, constitute political suicide. Equally, we cannot undo globalization. Jet travel, fiber-optics, and the Internet will not disappear. Nor will worldwide production systems suddenly be dismantled and reassembled within national boundaries.

Somehow we will adjust to new realities—but better to do so mostly on our terms than to wait for events to overtake us. The present conventional wisdom about the welfare state and globalization dates to the Great Depression, World War II, and the Cold War. By discrediting raw capitalism, the Depression made the welfare state an acceptable way to combat misery and political turmoil. Free trade was another legacy of the Depression, because protectionism in the early 1930s was thought to have worsened the economic collapse and to have contributed to the onset of war. The Cold War reinforced these beliefs. Countries that trade together (we thought) would stay together—and, through greater prosperity, resist communism. We traded mainly with our military allies.

On all fronts, times have changed. The obsession with budget deficits merely masks a larger crisis of the welfare state. The impending rise in retirement spending (in the United States, Europe, and Japan) is so large that it may push taxes, deficits, or both to punishing levels. The ultimate danger is a death spiral. Faltering economic growth makes it harder for governments to pay promised benefits. This leads to higher taxes or deficits, which may further reduce economic growth; or to drastic cuts in promised benefits, which may trigger a political backlash. Higher taxes may also discourage younger workers from having children, because the costs of child-rearing seem too steep compared with after-tax income. Fewer children eventually means fewer workers, compounding the problem of paying for retirees.

Similarly, the Asian financial crisis was not just a dress rehearsal for others like it. So much else has changed. We no longer trade mainly with our closest allies—who, now that the Cold War is over, aren't so close anymore. China is our second-largest source of imports. Nor is trade limited to wheat, steel, cars, and other goods. The low cost and high speed of communications mean that many jobs (database managers, software designers, call-center operators) can be “outsourced” to countries where wages are lower. Globalization, at its best, reduces poverty and increases incomes and freedom. But its dangers include the transmission of diseases and terrorism across borders, the vulnerability of worldwide communications to breakdown, the disruption of traditional societies, and old-fashioned economic instability.

It's not just Rubin who has declined to stretch his mind on these essential subjects. Stiglitz, whose specialty is defining the proper boundary between government and private-market activity, omits from his review of the 1990s any extended discussion of the aging of the population and its budgetary, economic, and social implications. His book mostly catalogues complaints, many trivial, about the policies of the administration that he once served. Stanley Fischer, another leading economist and the number-two man at the IMF during the Asian crisis, has collected a group of essays that were first presented to academic and professional audiences. They usefully review many recent developments and concede some IMF mistakes (supporting pegged exchange rates, for instance), but they do not break much new ground.

THERE IS AN IMPRESSIVE exception in Growing Public: Social Spending and Economic Growth Since the Eighteenth Century by the economic historian Peter H. Lindert. He asks a simple question: is it true—as many economists and politicians, as well as this reviewer, assume—that bigger government usually hurts economic growth, by increasing taxes and spending? The prevailing wisdom has been that societies face a choice. They can have more social justice, but only at the cost of some economic growth. Not so, says Lindert—at least not until now. The welfare state so far has been a “free lunch,” he argues. Bigger government has not reduced economic growth.

He offers three reasons for this apparent paradox. First, some government spending (especially on education) can increase growth. Second, generous unemployment and disability benefits may raise unemployment, but because the added jobless are mostly unskilled and unproductive, their loss doesn't matter much. And finally, big-government societies, mostly in Europe, have favored taxes—particularly the value-added tax, a national sales tax—that fall mainly on consumption, not on work effort or investment. Growth does not suffer.

If Lindert is right, oversized welfare states are not much of a threat. But his case may be overdrawn. Consider per-person incomes in Europe's richer states—France, Germany, Belgium, and the Netherlands. They are roughly 15 percent to 30 percent below those of the United States. Interestingly, labor productivity (output per hour worked) in these countries and the United States is about the same. The income gap reflects shorter working hours and fewer workers (proportionately) in Europe. In part, this may be cultural: Europeans want longer vacations and more leisure. But high tax rates also discourage extra work. It doesn't pay. Taking Lindert at face value would justify a dangerous complacency. Even he thinks that retirement benefits should be scaled back to avoid overburdening younger workers with taxes.

In the United States, this means trimming Social Security and Medicare. It is dismaying that someone such as Rubin, even out of office, will not discuss the problem candidly. What needs to be acknowledged is that the nature of these programs has changed. They started as a safety net for those who could no longer work and could not pay for unanticipated health expenses. They have now also become retirement subsidies for people who could work or could afford some (or all) of their health costs. The young are being compelled to subsidize the leisure of the old—a tolerable situation when there were so many more young than old, but now increasingly unbearable. These programs should be nudged back toward their original purpose.

In contrast to the welfare state—where conflicts are obvious if unpalatable—the problems of globalization are often murky. One overlooked question is whether the global economy is stable. When trade was modest and limited mainly to goods, from steel to sugar, countries were mostly in charge of their own economies. It was national politics and culture that largely determined which societies succeeded and failed, and though trade was often vital—to Japan and many Asian nations especially—it was rarely decisive. But as trade has expanded, as production systems have become more global, and as international flows of money and information have increased, the domestic model of how economies operate has to be modified. No one would now say that the economy of Connecticut exists apart from the larger American economy, even though Connecticut's economy is different from, say, Colorado's. The parallel does not yet fully hold for the American economy and the global economy, but it is becoming more apt.

THE ASIAN FINANCIAL CRISIS showed how developments in one part of the world might threaten economic stability elsewhere. But there are other potential paths to instability. Consider the “dollar standard.” Since 1945, the dollar, replacing gold, has served as the main currency for international trade and investment. In recent years, this has involved a bargain between the United States and the rest of the world that may now be vulnerable to breakdown. We have run ever-larger trade deficits (now about 5 percent of GDP) that, until recently, satisfied almost everyone. For foreigners, the buoyant U.S. market sustained domestic employment and investment. For Americans, the trade deficits meant—as long as unemployment was fairly low—that we could spend more than we produced. This (again) minimized the adverse effects of federal budget deficits. In effect, we exchange dollars (which foreigners usually re-invest in U.S. stocks, bonds, companies, or real estate) for goods and services (which we consume).

But the bargain could unravel if the outflow of dollars exceeds what foreigners want to hold. Then dollars would be sold for euros, yen, and other currencies. There are twin dangers. One is a panicky flight from dollars, which would result in massive sales by foreigners of American stocks and bonds, sending these markets into a tailspin and triggering a broader economic slump. The other possibility is that the dollar could decline gradually against other currencies (as has occurred recently), increasing the competitiveness of American exports and decreasing the competitiveness of American imports. Our trade deficit would stabilize, diminish, or disappear. Fine, so far. The danger arises if other countries cannot compensate for the loss of exports to the United States. If they were economically addicted to the American market, the result might be prolonged global sluggishness or stagnation. Countries would compete for pieces of a relatively fixed economic pie. This might inspire protectionism or frustrate efforts to promote development in the poorest countries.

Our understanding of today's ever-changing global markets is sketchy. The latest uncertainties involve China and India, which have decisively joined the world economy in the past fifteen years. With one-third of the world's population, they represent both a huge potential source of supply (of cheap labor and products) and a huge potential source of demand (for almost everything, starting with raw materials). The bad specter is of millions of jobs being drained from the United States, Europe, and many poor countries, including many white-collar jobs that go to Indian or Chinese software engineers or accountants. The better prospect is of a global boom driven heavily by their development needs. In theory, the more China and India sell abroad, the more they ought to buy abroad. China's population is more than four times ours; its GDP is only an eighth of ours. For years we have heard that all our jobs would flee to Japan, South Korea, or Mexico. They haven't fled. Some have been lost; others have taken their place. But maybe China and India—by their sheer size—are different. To avoid unrest, China's central need is to create jobs. What better way than by selling to Americans?

Countries may automatically adjust to changes in the world economy and thereby maintain overall stability and expansion. But there is no guarantee that today's glaring imbalances, symbolized by gigantic American trade deficits, will spontaneously and painlessly correct themselves. If you ponder the possibilities, you can imagine some grim outcomes. The worst would be a merging of the problems of globalization and bigger welfare states. The most prosperous societies—the United States, Europe, and Japan—slowly stagnate. Developing countries cannot generate strong domestic economic growth to offset their loss of exports to the United States. Countries saunter or sprint toward protectionist and mercantilist policies intended to advance their own business and employment goals at someone else's expense; taken together, these policies corrode confidence and further subvert global economic growth. Whatever happens, we face some stubborn contradictions. Politics remain firmly national, driven increasingly by promises to pay more government benefits, while economics—the ultimate basis for redeeming all those promises—is growing increasingly international. Meanwhile, our global commercial relations no longer mirror our global political alliances. Oil represents one obvious problem, but there are others. The next economics ought to help us grapple with these contradictions, but it is nowhere in sight.

It is worth reverting to Alexander Hamilton for a more hopeful precedent. When he became the first secretary of the treasury, the United States was a nation of about four million people that was largely agrarian and had something of a shipping industry. But it lacked most manufacturing, had almost no financial system, was without any national taxes, and was saddled with a patchwork of Revolutionary debts that were in various states of repudiation. This disarray did not daunt Hamilton. He had firm notions about how repaying the debt, creating a central bank, and encouraging manufacturing would promote national prosperity and power. Like many of the Founders, he was a man of both ideas and action. Whom do we have now like that?

Robert J. Samuelson writes a column for Newsweek and The Washington Post Writers Group. He is the author most recently of Untruth: Why the Conventional Wisdom is (Almost Always) Wrong (Random House). 

This article originally appeared in the May 3, 2004, issue of the magazine.