The gold standard is a monetary system in which the value of a domestic currency is fixed to a specific amount of gold. National money, including bank deposits and bank notes, is convertible to gold at a fixed price. Gold is used as the standard because of its durability, rarity, and universal acceptance. As part of a hard-money system, it reduces currency volatility.
In 1933, legislators in the United States passed one of the most draconian economic regulations in the country's history: the Emergency Banking Act. Americans were required to convert their gold bullion, gold coins, and gold certificates into US dollars. While the Act stopped gold outflows during the Great Depression, the allure of gold did not end there. If you are interested in investing in gold today, it is important to understand the history of the gold standard.
Early History of Gold
For more than 5,000 years, gold was the one commodity universally accepted among individuals, groups, and civilizations. Its density, malleability, and luster made it valuable. Around 700 BC, gold coins were introduced, enhancing gold's usability as money.
In 1696, the Great Recoinage in England standardized the production of coins. This effectively eliminated the common practice of clipping gold coins to accumulate enough metal to melt down into bullion. However, since a fixed stream of additional supply from mining could not be relied upon, the gold supply expanded mainly through trade, deflation, and pillage. Europe was first introduced to paper money in the 16th century, but it was not until the 18th century that paper dominated the market. The difficult choice between paper money and gold resulted in the gold standard.
Rise and Fall of the Gold Standard
England became the first country to adopt the gold standard, in 1821. A dramatic increase in production and trade, along with new discoveries of gold, cemented the gold standard over the following century. Trade imbalances between countries were settled in the precious metal. By 1900, most developed nations were on the gold standard, and the system was at its peak from 1871 to 1914. By 1913, however, only France and the US still held large gold reserves, and everything changed in 1914 with the outbreak of the Great War, when shifting alliances and deteriorating government finances exposed the gold standard's weaknesses.
In 1934, the US devalued the dollar against gold, raising the official price from $20.67/oz to $35/oz, in an effort to improve its economy. As a result, other nations converted their gold reserves into US dollars, which allowed the United States to corner the gold market. As World War II drew to a close, the leading Western powers drafted the Bretton Woods Agreement, which served as the framework for international currency markets until 1971.
By the end of the war, the United States held 75% of the world's monetary gold, and its currency was the only one still directly backed by gold. However, its reserves dropped as the precious metal flowed out of the US to rebuild war-torn nations. The inflationary environment that followed signaled the end of the gold standard. From 1968 to 1971, only central banks could trade gold with the United States at $35/oz. By 1971, the last remaining vestiges of the gold standard were shed, but the appeal of gold endures to this day. Both developed and developing nations now hold significant gold reserves to demonstrate their creditworthiness, with the United States holding the largest total reserves.
We hope you enjoyed reading the above definition. If you'd like to learn more about gold, and in particular about its most recent price swings and their implications, we invite you to sign up for our gold newsletter. It's free, and if you don't like it, you can easily unsubscribe.