WHEN GOVERNMENT ECONOMISTS, academics, and the talking heads on bubblevision speak of “modest price inflation,” they know full well that the effects of the quantitative easing policies that they have advocated and implemented have not been fully expressed in America’s consumer price index (CPI). Rather, the best evidence of runaway inflation can be found, among other areas, in the markets for commodities, foreign exchange, equities, bonds, farmland, real estate, and art. Savvy statisticians know this, of course, and many of them have impeached the U.S. government methodology used to compute the CPI. For example, using the methodology according to which CPI was computed in 1980, recent CPI inflation is estimated to have been close to 10 percent. Using the government’s methodology of 1990, CPI inflation for the same period is estimated to have been closer to 6 percent.
Whatever the true rate of inflation, one thing of which we can all be sure is that those workers earning fixed salaries and wages, and those retirees living on pensions and fixed incomes, know that their paychecks and their minuscule interest income from savings accounts do not keep up with their expenses, which must be paid at rising, true market prices. And almost every American has learned that the financial class, with access to cheap credit courtesy of the Fed and the big banks, has enriched itself not only with bailout subsidies but also through its symbiotic relationship with the very monetary system that is eating away at everyone else’s standard of living. This inequitable arrangement at the heart of the U.S. financial system is the single most fundamental cause of the rising inequality of wealth in America. Fed quantitative easing may be summed up as asset bubbles for the rich, food stamps for the poor, and declining real income for the middle class.
How did we arrive here? Is there a way out of this funhouse of distorted prices and perverse incentives in which we seem to be trapped? Let us look back at the series of twists and turns that brought us from pre-World War I peace and prosperity to the present calamitous situation, and see what answers might be revealed.
During the long period of European economic growth, the unique commercial and industrial revolution that lasted from approximately 1700 to 1914, the universally accepted monetary standard was a defined weight of a real article of wealth—precious metal money inherited from centuries of past trading experience.
The English word money originates in the Latin word moneta, literally meaning coin or mint. Further, it is no accident that standard British money was called the pound. The pound sterling was a standard weight unit of precious metal, originally one pound of silver refined into coined money. (Though it is true that medieval coins were not always precisely minted; weights varied from locality to locality, and coins were arbitraged in marketplaces and river-valley trade fairs by moneychangers with accurate scales.)
Such a system of hard currency—or of paper money convertible to a defined weight of precious metal—had prevailed against all competitors for almost two millennia among both primitive and modern trading communities. It is easy to see why. Intuitively, virtually anyone can perceive the value of a weight unit of money, the measurable amounts of labor, capital, and natural resources resting behind it, and the comparative value of one’s own real labor, capital, and resources that one must expend in exchange for it. The accuracy of these perceptions was validated in the only reliable laboratory of economic research available to us: namely, the evidence of economic history.
WHY DID THE civilized world abandon a monetary standard that had allowed for the Industrial Revolution, the growth of middle-class prosperity, and the rise to pre-eminence of Europe (and specifically Great Britain) in world trade and finance? The answer to this, as with so many of history’s questions, is war. On the eve of World War I, convertibility and the international gold standard that had emerged were suspended by the various belligerents. Total war on the scale of World War I destroyed major institutions of civilization, and hard money was no exception. Inflation began to creep up on the West during the war, and had caught up with it by the war’s conclusion. Between 1914 and 1924, the expansionary monetary policies deployed by European nations to finance both the war and the resulting deficits gave rise to the great paper and credit inflations in France (1924-1926), Germany (1920-1923), and Russia (1916-1918), among other European countries. The ensuing convulsions of the social order and the virtual obliteration of the savings of its middle classes led directly to the rise of Bolshevism in Russia and Nazism in Germany. Revolution and counter-revolution—and the subsequent end of the established social orders of many nations—were at least in part a consequence of the rise of inconvertible paper currencies.
One cannot acquit our ancestors on the grounds that they did not appreciate the dangers of inflation. After World War I, at the time of the Paris Peace Conference of 1919, John Maynard Keynes argued that there was no surer means of “overturning the existing basis of society than to debauch the currency.” The process of inflation, Keynes warned, “engages all the hidden forces of economic law on the side of destruction, and does it in a manner which not one man in a million is able to diagnose.” Keynes was a shrewd and experienced political economist, and in this single phrase he summed up the evil released by the destruction of the gold standard and its replacement by government-managed, nominal, paper money. Keynes understood inflation. He knew its effects destroyed the wellsprings of the future by consuming their source. He had observed firsthand the catastrophic devastation of World War I. In 1922 he wrote:
If gold standards could be introduced throughout Europe, we all agree that this would promote, as nothing else can, the revival not only of trade and production, but of international credit and the movement of capital to where it is needed most. One of the greatest elements of uncertainty would be lifted…and one of the most subtle temptations to improvident national finance would be removed; for if a national currency had once been stabilized on a gold basis, it would be harder (because so much more openly disgraceful) for a Finance Minister so to act as to destroy this gold basis.
Later, as England approached insolvency—and when he found it in her interest—Keynes himself dismissed the gold standard as “a barbarous relic.”
World War I destroyed the existing worldwide hard money consensus, but the inauguration in 1913 of the U.S. Federal Reserve System, followed by the 1922 international monetary conference at Genoa, also changed the financial history of the world. Genoa paved the way for the reserve currency system we know today, one based on the dollar, the pound, and the discretion of central bankers. What followed, of course, was the central bank-caused credit and equity bubble, the economic boom of the 1920s, and the subsequent collapse into worldwide depression. The idea of the Federal Reserve System was born in the aftermath of the severe banking panic of 1907—one century before the panic of 2007-08. It was created under the legal fiction of a private corporation, subject to federal government control, with a statutory monopoly over the currency issue and near exclusive authority to regulate the banking system. The Fed could create new credit and money primarily by purchasing gold and by advancing money or credit against secured, short-term promissory notes of merchants and producers—which is to say, the Fed was limited by the workings of the gold standard.
After the early deflationary phase of the Great Depression (1929-1933), the process of worldwide inflation got underway again, punctuated by brief periods of disinflation. The moral, legal, and financial signal for the great American inflation was first lighted in 1933, when Franklin D. Roosevelt expropriated all private gold and gold coins owned by American citizens, paying their owners $20 per ounce. Then in 1934, following the lead of Great Britain, he reduced the value of the monetary standard by reducing the gold weight of the dollar; or, as it is more generally expressed, by raising the gold price from $20 to $35 per ounce. The effect of this devaluation was, overnight, to collect for government spending the new, higher value of the gold that had originally belonged to its dispossessed American owners.
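The arithmetic of the 1934 devaluation is worth making explicit. A minimal sketch, using the rounded $20 figure from the text (the precise statutory price had been $20.67 per ounce):

```python
# Illustrative arithmetic of the 1934 devaluation described above.
old_price = 20.0   # dollars per troy ounce before 1934 (rounded, as in the text)
new_price = 35.0   # dollars per troy ounce after the Gold Reserve Act of 1934

# The dollar's gold weight fell in proportion to the price increase:
weight_reduction = 1 - old_price / new_price   # ~0.43: a dollar about 43% lighter in gold

# Each expropriated ounce, bought at $20 and revalued at $35, yielded the Treasury:
profit_per_ounce = new_price - old_price       # $15 per ounce

print(f"Gold weight of the dollar reduced by {weight_reduction:.0%}")
print(f"Revaluation gain: ${profit_per_ounce:.0f} per ounce")
```

The calculation shows why the text describes the devaluation as collecting the higher value of the gold for government spending: the revaluation gain accrued entirely to the Treasury, not to the dispossessed former owners.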
ROOSEVELT’S MONETARY DECISIONS stirred great controversy. Constitutional questions arose during the 1930s over the authority of the president to abrogate ex post facto lawful contracts stipulating payment in gold dollars. On the face of it, ex post facto laws are unconstitutional. After hearing cases from damaged plaintiffs, the Supreme Court ruled in favor of President Roosevelt and the legislature he effectively controlled. Existing gold contracts were pronounced dead: They were declared by congressional resolution to be “against public policy.” American citizens were also prohibited by law from owning gold—a right only restored by statute in January 1975 after many years of public debate. It was clear to all that the dollar after 1934 was no longer “as good as gold.” Americans no longer had the unrestricted right to exchange their paper and bank deposit money for a specified weight of gold, as they had under the classical gold standard—even though in law the dollar was still nominally defined as a certain weight of gold. Only foreigners were still permitted to exchange their unwanted paper dollars for American gold. The door had thus been opened for the dollar to become, in time, a nominal paper currency whose value would be substantially determined and regulated by the opinions of politicians and the Board of Governors of the Federal Reserve System.
Near the end of World War II, 30 years after the founding of the Federal Reserve System, the Bretton Woods Agreements of 1944 elaborated a new international monetary system, establishing the dollar as the official post-war reserve currency. The pound continued to serve until 1975 as an unofficial reserve currency for some nations tied closely to the so-called sterling bloc. But the dollar had become the numéraire of world currencies. Under Bretton Woods, the fixed values of foreign currencies were to be determined by their relationship to the dollar. In turn, the paper dollar derived a definite value, under the Bretton Woods agreement, by virtue of its convertibility into a defined weight of gold, $35 per ounce. (Convertible, that is, for foreigners but by law not for American citizens.) Today, even under floating exchange rates, the world dollar standard persists.
For nearly two decades, the gold-linked dollar of the post-war Bretton Woods system remained a reasonably stable center around which other fluctuating currency systems orbited unsteadily. From 1945 to 1958, it dominated global trade and exchange. This period lasted until Western European governments under the European Payments Union restored the mutual convertibility of their currencies on current account, abolished most exchange controls, and sought to establish budgetary equilibrium at home; at the same time, the United States began to experience “near-permanent” overall balance-of-payments and budget deficits.
Throughout the 1960s, under Presidents Kennedy and Johnson, inflation and the external deficit of the dollar, generated by expansive U.S. monetary policies and budget deficits, led to perennial foreign-exchange crises and ultimately to foreign-exchange controls in the United States. Until 1965, the Federal Reserve had been required by statute to hold gold reserves equal to 25 percent of Federal Reserve notes and deposits (the so-called monetary base). But when President Johnson decided simultaneously to escalate the Vietnam War and create his Great Society welfare system, he moved to void the statutes that, by virtue of the stipulated gold cover, limited the amount of money and credit the Federal Reserve System could create. The full inflationary potential inherent in the Federal Reserve Act of 1913—and in the monopoly central bank it had created—was about to be realized. Predictably, as the legally required gold cover was gradually brushed aside, budget deficits, credit expansion, inflation, and balance-of-payments crises intensified. The Bretton Woods system groaned under the flood-weight of excess U.S. dollars going abroad, where perforce they accumulated in the official foreign-exchange reserves of our trading partners.
To make a long story short, what a few farseeing statesmen predicted as early as 1960 eventually came to pass on August 15, 1971, when President Richard Nixon abolished by executive order the remaining (and by then weak) link of the dollar to gold, thus removing all restraint on federal spending. Nixon’s decision set off a chain reaction: One by one, the world’s nations unlinked their currencies from the dollar and gold, giving rise to floating exchange rates, protectionism, and worldwide inconvertible paper money managed by central banks.
These dramatic changes were welcomed by most in the academic and policymaking communities, and by most politicians. The Bretton Woods agreement was an unnecessary discipline, they said. Professional economists cavalierly dismissed the Bretton Woods fixed-exchange-rate regime not because it was a flawed reserve currency system like the interwar monetary regime before it, as it surely was, but because it was the last vestige of monetary restraint remaining from the discipline of the pre-World War I classical gold standard. Even monetarists such as Milton Friedman promoted the idea of a steady increase in the money supply—say, 3 percent per year. The academics supposed that the Fed had the tools, the all-seeing computer, and the perfect foresight to attain this goal.
Nevertheless, once the gyroscope that had guided the world economy for centuries was destroyed, financial disorder followed. After 1971 the United States experienced its highest interest rates since the birth of the republic, and the worst inflation since the War for Independence. Stagflation and slow growth in the 1970s made the world poorer, but OPEC, the oil cartel, richer.
Like workers, businessmen, and consumers, we must look at the real-world consequences from the bottom up, not the top down. Since the end of convertibility in 1971, average real wages per hour of work in the United States have been stagnant. Average annual American economic growth since 2000 has been about half the average annual real growth of the previous two American centuries. The real purchasing power of a 1971 dollar saved in the bank, adjusted by the CPI, has declined to about 15 cents. That is to say, the price level rose from 1971 to 2013 about sixfold, a rise unparalleled in the history of the American Republic. American economic and political world leadership is under siege. And so, too, is the American middle class.
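The figures above can be cross-checked with simple compounding. A sketch of the arithmetic, assuming the 15-cent figure and the 1971-2013 span given in the text:

```python
# Cross-checking the purchasing-power arithmetic in the paragraph above (illustrative).
remaining_value = 0.15       # a 1971 dollar worth about 15 cents in 2013, per the text
years = 2013 - 1971          # 42 years

# A dollar worth 15 cents implies the price level rose by the reciprocal:
price_level_multiple = 1 / remaining_value                      # ~6.7x, "about sixfold"

# The average annual inflation rate that compounds to that multiple over 42 years:
avg_annual_inflation = price_level_multiple ** (1 / years) - 1  # ~4.6% per year

print(f"Price level multiple: {price_level_multiple:.1f}x")
print(f"Implied average annual inflation: {avg_annual_inflation:.1%}")
```

A seemingly moderate average rate of under 5 percent per year is enough, compounded over four decades, to erase some 85 percent of a saved dollar's purchasing power.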
TODAY, A CENTURY after the Great War of 1914, one observes—at home and abroad—the depreciation and fluctuation of the value of all paper and credit monies. The scourge of inflation—either consumer price or asset price inflation—has gradually undermined the harmony of the social order not only in developed but also in emerging countries, because inflation represents a decline in the value of the preeminent economic institution of civilization: money.
Inflation means the gradual impoverishment of the working and middle classes, who are paid with lagging wages and salaries, and of pensioners who subsist on fixed incomes. There is no better symbol of this inflationary process than the astronomical rise of the price of gold: from $20 per ounce in 1930 to $35 in 1934, from $35 in 1971 to $500 in March of 1981, to approximately $1,400 in 2013. The market traces, by means of the rising price of gold, the decline of the value of the dollar and other world currencies. Compare that to the world under the gold standard: the price of gold was stable in Anglo-America for nearly two centuries, from 1717 to 1914. The world general price level in 1913 was almost exactly the same as in 1879.
It is true that, under the true gold standard, the general price level gradually rose in the short term when rare, large, new sources of gold were found; for example in the 16th century, when vast lodes of precious metal were discovered in the New World, or in the 19th century due to discoveries in California, Australia, and South Africa. But the average annual rise in the price level, contrary to conventional wisdom, was little more than 2 percent. Real money, even from new mines, requires labor and capital to be produced, imposing a strict limit on the growth of the quantity of gold over any decennial or century-long period. In a word, real money cannot be created at a marginal cost of zero.
Conversely, the general price level gradually declined during periods of diminished rates of discovery, thereby causing real wages and salaries to rise relative to average market prices of basic goods and services. Such a period was the late 19th century in the United States, when the average annual decline of the general price level was a bit more than 1 percent. And this fall in the price level was associated with one of America’s greatest periods of economic growth, 3 to 4 percent annually, with rising real wages for lower- and middle-income families.
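The compounding effect of that mild deflation can be illustrated with a short calculation. A sketch, assuming a steady 1 percent annual price decline and constant nominal wages over a hypothetical 20-year span (both simplifications of the figures in the text):

```python
# Illustrative compounding of the late-19th-century deflation described above.
annual_price_decline = 0.01  # "a bit more than 1 percent" per year, per the text
years = 20                   # a hypothetical span within the late-19th-century period

# The general price level after two decades of steady 1% annual decline:
price_level = (1 - annual_price_decline) ** years  # ~0.82 of its starting value

# With nominal wages merely held constant, real wages rise as prices fall:
real_wage_gain = 1 / price_level - 1               # ~22% more purchasing power

print(f"Price level after {years} years: {price_level:.2f}")
print(f"Real wage gain from constant nominal wages: {real_wage_gain:.0%}")
```

This is the mechanism behind the paragraph's claim: even without any nominal raise, a wage earner's purchasing power grew steadily as the price level drifted down.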
The corrosive process in the 20th century of long-term inflation coincided with the founding of the Federal Reserve System in 1913, whereas under the classical gold standard the general price level remained stable over the long run.
It is clear that the worldwide collapse of hard money has had both economic and moral consequences. But the long-term effects of fiat money are still unfolding. Only the first century of the post-World War I financial disorder has been written. The tale is not fully told. Economic historians of the future may well write that the modern age of central banking, now in its fourth century, saw “the rise and fall of real money.”