By Imre Gams, Editor, Money Trends

Cowrie shells. Beaver pelts. A pound of tobacco. At one time or another, each of these was a form of commodity money. And each had its own relationship to a particular commodity…

That commodity has played a key role in monetary history since 4,000 B.C. And, until recently, it even served as the linchpin of the U.S. economy…

As you probably already know, I’m talking about gold. But what you might not know is how gold’s supremacy gave way to the U.S. dollar as the world’s reserve currency – a move that redefined the nature of our money.

Let me explain…

From Barter to Fiat

All money, whether commodity-backed or fiat, must have a value agreed upon by all parties involved in a transaction. That’s one of the key insights underpinning our money system.

One of the qualities that gives currencies an agreed-upon value is fungibility. Fungibility means that all the basic units of a good or commodity are interchangeable with, and indistinguishable from, each other. In other words, whatever form of money the members of a society agree upon, each unit of that money has to have the same value as every other unit.

Think of it this way. If you’re buying a $2 chocolate bar at the store and hand the clerk two $1 bills as payment, the clerk won’t inspect each bill to see how it differs from the others (other than to check for counterfeits). Similarly, if you pay with a $10 bill, the clerk will hand you $8 in change, and you won’t inspect each coin or bill you get back for unique flaws. You trust that the money is fungible and can be freely interchanged with any other dollar bill or coin.

Before we had fiat currencies, civilizations around the world developed their own forms of commodity-based currency. Some societies in ancient India, China, and Africa even used cowrie shells as money.

As human civilization became increasingly complex, however, the need for a more sophisticated form of currency became apparent.
Enter gold…
Establishing the Gold Standard

As societies experimented with different commodities to use as money, gold came to the forefront. Gold’s advantage was its durability: it didn’t tarnish or change shape over time, so it didn’t lose its value.

In modern history, the first region to adopt a gold standard was the British West Indies in the Caribbean. These British territories adopted their gold standard in 1704. Other regions and countries followed suit. The U.S., for example, established a fixed ratio of gold to the dollar for the first time in 1792. (If you’re a rare coin collector, you may know the United States began issuing the original gold "Eagle" coin just three years later, in 1795.)

This version of the gold standard lasted until World War I. The terrifying prospect of war made people want to withdraw the physical gold they had on deposit with their banks. Banks, of course, were terrified that these withdrawals would deplete their gold reserves. The Bank of England eventually began issuing Treasury notes to replace the circulation of gold coins, effectively ending the gold standard in the United Kingdom and the rest of the British Empire.

In its place, Britain passed the Gold Standard Act of 1925. It allowed people to redeem paper notes for gold bullion – bars of around 400 troy ounces of gold each. Many other countries followed Britain’s example and introduced their own versions of the 1925 gold standard. And the most notable was the one the United States implemented after World War II, to deal with the financial fallout from the war…

The Rise of the U.S. Dollar

In 1944, representatives from all 44 Allied nations met in New Hampshire to sign a series of agreements now known as Bretton Woods. The Bretton Woods agreement was meant to help finance the rebuilding of much of the rest of the world. Europe lay in ruins, and Japan would soon be devastated by two atomic bombs.
This gave America a clear path toward establishing a new kind of world order… and becoming the world’s economic superpower.

The U.S. controlled more than two-thirds of the official global supply of gold. So it established a fixed relationship between gold and the dollar: $35 was equal to 1 troy ounce of gold. Other currencies were then pegged to the U.S. dollar at fixed rates. This meant the U.S. had an obligation to redeem dollars in gold for the central banks of other nations.

This system held together for several decades… until the U.S. embarked on its “guns and butter” program in the 1960s, which expanded both the country’s defense budget and its social programs. The combination of social welfare spending and the Vietnam War frightened the U.S.’s trading partners. They felt the deficit spending was dangerous and unsustainable.

The French started demanding gold in exchange for their dollars, as they’d been promised… but the U.S. refused to give up its gold reserves. And so, America’s gold-backed regime collapsed. In August 1971, the U.S. suspended the dollar’s convertibility into gold, and in December 1971, the Smithsonian Agreement effectively ended the old fixed-rate exchange system. The free-floating exchange system took hold several years later, and it persists today.

Regards,

Imre Gams
Editor, Money Trends

Like what you’re reading? Send your thoughts to feedback@andykriegertrading.com.