Since the early days of humanity, we have strived to obtain the goods and services we desire by trading our surpluses to fulfill our deficits. Throughout history, a society that could produce an excess of sable furs (for example) would trade with neighbouring societies that were especially efficient at producing wagons.
Early trade simply entailed a direct exchange of goods, known as the barter system. This method of trade is very cumbersome because it requires that both participants have coincident needs, appropriately divisible tradable assets and agreed-upon measures of value. These three conditions often prove elusive, leaving many potential barter trades incomplete and many others unfair. As an alternative to the barter system, universally accepted measures of value developed in different cultures around the world in many different ways.
The first universally accepted measures of value were items with widespread appeal, easy divisibility and widely known worth. Commodities were often used as a medium of exchange because their value tended to be stable and recognized by most people in society. In many societies, commodities were the first form of money, and gold was often the commodity of choice.
In Britain, goldsmiths helped to develop the modern banknote. During the English Civil War of the 17th century, citizens deposited valuables (gold, jewellery) into the safes of various goldsmiths for safekeeping. In return, the depositor received a receipt that provided proof of ownership when he or she later wished to withdraw.
Originally, gold was withdrawn in order to make payments for goods and services. Some merchants, however, were willing to accept the gold receipts themselves as payment, knowing the receipts were ‘as good as gold’ and could be converted into actual gold at any time. The exchange of gold receipts for goods eventually became commonplace and, in effect, these receipts became an early gold-backed currency.
Once they discovered that gold was rarely withdrawn from their safes while gold receipts were being readily traded, some enterprising early goldsmith ‘bankers’ decided to start issuing and lending more receipts than they had gold to back them. They did this knowing that most customers never actually withdrew their gold, so the chance of having to honour all the receipts at the same time was minuscule. This was an early incarnation of fractional reserve banking, with a portion of the monetary base tied to a physical commodity such as gold.
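The arithmetic of fractional reserve issuance can be sketched as follows. This is a simplified illustration with invented numbers, not a description of any particular goldsmith's or bank's practice:

```python
def max_receipts(gold_ounces: float, reserve_ratio: float) -> float:
    """Total receipts that can circulate if only `reserve_ratio`
    of the receipts outstanding must be backed by physical gold.
    (Hypothetical illustration; numbers are invented.)"""
    return gold_ounces / reserve_ratio

# With full backing, 100 ounces supports exactly 100 ounce-receipts:
print(max_receipts(100, 1.0))   # 100.0

# If the goldsmith keeps only 10% in reserve, ten times as many
# receipts can circulate against the same gold:
print(max_receipts(100, 0.10))  # 1000.0
```

The lower the reserve ratio, the larger the gap between receipts in circulation and gold available for redemption, and thus the larger the exposure if many holders redeem at once.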
The ability to convert to gold is the basis from which paper money was derived. Paper money wasn’t created out of thin air – it was a contractual ownership stake in a certain amount of gold held in a goldsmith’s safe. As long as the public was confident that an appropriate amount of gold was readily available for convertibility, they maintained confidence in the paper receipts that represented those claims.
Of course, some goldsmiths got greedy and lent out far too many receipts, creating the risk that gold would not be available if many receipt-holders redeemed at the same time. Merchants began to question whether the receipts could easily be converted into gold. If it appeared that not enough gold was kept at the goldsmith to back the receipts, merchants would no longer accept them at face value; instead, they demanded more receipts for the same amount of goods. In effect, the value of the receipts went down (and therefore the prices of goods went up). This illustrates the basic monetary force that creates inflation.
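The repricing mechanism can be made concrete with a small sketch. Assume, purely for illustration, that merchants discount receipts in proportion to the fraction of receipts they believe is actually backed by gold:

```python
def receipts_per_good(base_price: float, receipts_outstanding: float,
                      gold_backed: float) -> float:
    """Receipt price of a good if merchants value each receipt at its
    perceived gold backing. Hypothetical model for illustration only."""
    backed_fraction = gold_backed / receipts_outstanding
    return base_price / backed_fraction

# Fully backed receipts: a 10-receipt good still costs 10 receipts.
print(receipts_per_good(10, 1000, 1000))  # 10.0

# If only half the receipts are believed to be backed, merchants
# demand twice as many receipts for the same good:
print(receipts_per_good(10, 1000, 500))   # 20.0
```

Prices quoted in receipts rise exactly as confidence in backing falls – the same inflationary force at work when any currency's perceived backing erodes.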
Like the gold receipts of 17th-century Britain, the US dollar was at times convertible into gold. The history of US dollar convertibility is mixed – the dollar has been taken on and off the gold standard several times. The last time the US dollar (and most other world currencies) was tied to gold was after World War 2 under the Bretton Woods system.
During World War 2, many central banks around the world shipped their gold to the United States for safekeeping and as payment for armaments. By the time the war ended, the US held by far the largest gold reserves on the planet. In an effort to stabilize the global economy and create confidence in war-torn European economies as they rebuilt, the Bretton Woods exchange rate system was created. Essentially, Bretton Woods tied global currencies to the US dollar at fixed rates, while the US dollar, in turn, was tied to gold at a specified convertibility. Therefore (whether or not they actually held gold in domestic vaults), all currencies were indirectly convertible into gold in US vaults.
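The two-step peg implies a gold parity for every member currency. A minimal sketch, using the official Bretton Woods gold price of $35 per ounce and, as an illustrative assumption, a pound pegged near $2.80:

```python
USD_PER_OUNCE = 35.0  # official Bretton Woods gold price of the US dollar

def implied_gold_parity(currency_per_usd: float) -> float:
    """Gold price in a local currency pegged to the US dollar:
    the currency's dollar peg times the dollar's gold peg."""
    return USD_PER_OUNCE * currency_per_usd

# Example: with the pound pegged near $2.80, pounds per dollar = 1/2.80,
# so gold is implicitly priced at 35 / 2.80 = 12.5 pounds per ounce.
print(round(implied_gold_parity(1 / 2.80), 2))  # 12.5
```

Because every peg chained back to the dollar's gold price, a change in confidence in the dollar's convertibility propagated to every currency in the system at once.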
During the late 1960s and early 1970s, the US was running a fiscal deficit to pay for the Vietnam war, and for the first time in the 20th century was running a trade deficit with the rest of the world. Interest rates started to rise, and it is widely believed that the US Federal Reserve began printing money to buy US Treasuries, thereby increasing the money in circulation as a percentage of available gold reserves. As the market grew more suspicious of the lack of gold reserves backing US dollars in circulation, confidence in the US dollar began to wane, and Germany and Switzerland left the Bretton Woods system in 1971.
Foreign holders of US dollars started demanding gold in exchange for their US dollars. Growing conversions put pressure on gold reserves and, as the proportion of gold available for conversion declined, it was only a matter of time before all US gold was used up in the conversion process, leaving the remaining US dollars worthless. To prevent this, US President Richard Nixon suspended convertibility in August 1971.
The suspension of convertibility effectively freed the US money supply from the anchor of the gold standard, allowing the US Federal Reserve to print money within far less restrictive limits. Monetary policy’s only anchor became the ‘full faith and credit’ of the US Treasury and the US Federal Reserve. Of course, central bank and treasury credibility becomes far more subjective without a gold standard.
During the 1970s, a growing money supply, combined with declining productivity, a slowdown in post-war disinflationary forces (as post-war economic capacity in Europe and Asia tightened) and the oil supply shocks, produced the ingredients for high inflation and stagnant economic growth – stagflation.
After a decade of haphazard economic initiatives (e.g. price controls) and ambivalent US monetary policy, Paul Volcker – who became chairman of the US Federal Reserve in 1979 – significantly raised short-term US interest rates, triggering one of the deepest post-war recessions. It was this dramatic change in interest rates that crushed inflation, helping the US Federal Reserve regain credibility.
Why did the US Federal Reserve wait so long to combat inflation? With the memory of the Great Depression still fresh in the minds of many policy-makers, US economic policy was targeted at maximizing employment, and inflation was not seen as a primary economic threat. It was widely felt that aggressively combating inflation would tip a teetering US economy into another depression. Meanwhile, countries like Germany that were familiar with the pain of hyperinflation were quicker to combat inflationary pressures. (This highlights how the collective memories of a society shape political willpower and can lead governments to create erroneous economic policies.)
For the United States, combating inflation early in the 1970s by slowing economic activity would have been political suicide. It took a decade of inflationary pain before policy-makers and the public were willing to accept that inflation was as much a threat to the economy as deflation and unemployment.
The 2008 collapse of the global financial system has parallels to the inflationary experience of the 1970s. Throughout the late 1990s and early 2000s, many policy-makers were aware of the growing threat that concentrated financial intermediaries, leverage, derivatives exposure and skyrocketing real estate values posed to the financial system. It was no secret that these elements posed massive systemic risks. However, the political willpower did not exist to do anything. As these elements of the economy had yet to cause severe economic pain, it was very difficult to get politicians, businesses and consumers to accept the preventative measures that needed to be taken. Preventative measures would have slowed economic growth and prosperity – all to safeguard the economy from a threat that, at the time, was theoretical and intangible.
Similar circumstances exist with homeland security, cancer prevention, driving behavior, etc. It is extremely difficult to mobilize a population to willingly experience current pain (financial, lifestyle, effort, etc.) in exchange for reducing a potentially larger theoretical future pain.
Today, the US dollar remains a free-floating currency not backed by gold or any other commodity. Instead of being backed by gold, US banknotes are backed by the full faith and credit of the US government. The currency’s value is predicated on the faith that governments won’t print more money than is necessary to keep up with real economic growth. However, with the largest fiscal and monetary expansion in US history currently underway, combined with the collective global memory of an extremely painful recession/depression, the risk of inflation over the medium to long term is very high.