Deregulation found an enthusiastic advocate in President Ronald Reagan. In many ways, Reagan is the intellectual father of the modern radical deregulatory movement. At first, expensive, onerous provisions were targeted. But it wasn't long before all regulation came to be viewed as inherently evil. In this worldview, government was the source of problems, not of possible solutions. Reagan famously quipped: “The nine most terrifying words in the English language are, ‘I'm from the government and I'm here to help.’” Reducing not just regulation but the overall size of government became a goal of this ideology.
Of course, that was before the entire banking system fell apart, and Uncle Sam began writing checks for trillions of dollars. Today, the most terrifying words might be more along the lines of “I run a highly leveraged, unregulated financial institution that owns lots of derivatives.”
But that was yet to occur. Adherents of a strict free market philosophy wanted to remove the decision-making oversight from the bureaucracy and replace it with the “relentless efficiency” of the markets. This was a hopelessly idealistic view of markets, naively premised on false assumptions of market efficiency and human rationality.
Eventually, deregulation became an end unto itself, rather than a means to an end. Along with the costly, unnecessary regulations that were targeted for elimination, effective and necessary safeguards were also removed. Pragmatic decision making was replaced with rigid ideology.
In the ensuing decades, the United States morphed from an overly regulated economy to an absurdly deregulated one. What started as a reasonable pushback against excessive government regulations was soon taken to all sorts of irrational extremes. Any supervision was soon viewed as suspect.
This was especially true when it came to the regulation of commercial and investment banks. Under President Bill Clinton, several key bills were passed that reduced oversight and supervision. Key Depression-era legislation was overturned.
While Clinton was a Southern Democrat who believed in both government and markets, President George W. Bush took this a huge step further. Like his predecessor, he believed in markets; but when it came to government, he was far less enthusiastic. His appointments to key administrative positions (SEC chairman, Treasury secretary, and, most infamously, head of the Federal Emergency Management Agency, or FEMA) were lackluster at best.
Thus, the United States moved from a state of aggressive, post-Depression financial oversight to one of negligent supervision. The potential for disaster increased rapidly. It was this massive philosophical and regulatory shift that set the table for the current financial crisis.
The first major regulatory changes came about in 1999. That's when a significant Depression-era banking regulation, the Glass-Steagall Act, was repealed. The Gramm-Leach-Bliley Act reversed the rules that prohibited bank holding companies from owning other financial firms. This allowed insurers, banks, and brokerage firms to merge into giant financial centers. Had it not been for Gramm-Leach-Bliley, Citibank could not have evolved into the unruly beast it became.
Freed from Glass-Steagall's strictures, money center banks entered into all manner of underwriting: not just initial public offerings (IPOs) and bond issuance, but structured financial products, including collateralized debt obligations (CDOs) and credit default swaps (CDSs). These derivatives, which grew out of the housing debacle, are among the prime villains of the credit crisis.
The key factor was size. In the new, deregulated environment, banks and brokers were allowed to scale up to become behemoths. What was big became huge; what was huge became enormous. With so many moving parts, too much leverage, and too much risk taking, banks became too big to be effectively managed. In bailout terms, they were too big to fail, but in actual operation they were too big to succeed. They had become so massive that managing all the moving parts, and controlling for risk, was all but impossible. Once conservative, risk-averse banks had become giant unregulated hedge funds, disaster was all but inevitable.
More important, banks started adopting “eat what you kill” compensation systems. The bonus structure, replete with short-term financial incentives, began to dominate banks. Throw in monthly performance fees and annual stock option incentives, and you end up with a skewed business model that suddenly embraced quick trading profits.
This had an enormous impact upon the ways investment banks approached business generation and risk management. Like many public companies, they became increasingly short-term focused. “Making the quarter,” in Street parlance, meant pulling out all the stops to hit your quarterly profit figures, by any means necessary. Incentives became misaligned with shareholders' interests, as risky short-term performance was rewarded with huge bonuses. Not surprisingly, this worked to the detriment of long-term sustainability.
But short-termism was only part of the equation. Of greater concern was how these firms' internal risk management changed once they converted from private partnerships into public corporations. Unlike shareholders in public corporations, partners are personally liable for the acts of any of the members of the partnership. If any one of a firm's partners or employees loses a trillion dollars, every last partner is on the hook for that money. As you would imagine, this creates enormous incentives to make sure that risk is managed very, very carefully. Nothing focuses the mind like the real possibility that any partner could bankrupt all the rest. It's no coincidence that partnerships like Lazard Freres and Kohlberg Kravis Roberts did not suffer the same kind of risk management failures as Bear Stearns and Lehman Brothers, among others. (Lazard went public in 2005, but too late in the credit cycle for it to get into much trouble.)
Depository banks are supposed to be managed in ways that limit risk. They hold the public's deposit accounts (such as checking and savings accounts), which they then use to provide credit to businesses and individuals. The Federal Deposit Insurance Corporation (FDIC) insures all of these deposits against loss, and insists (quite reasonably, I might add) that the monies not be handled recklessly. In contrast, investment banks by definition embrace risk; their business model focuses on more speculative activities that are inherently riskier. Activities such as trading and mergers, bringing companies public, and managing investments involve a greater possibility, even likelihood, of losses.
The repeal of Glass-Steagall didn't cause the crisis; it only made the collapse worse, deeper, and more expensive. The skyrocketing costs of the bailouts are in part due to disasters in the riskier investment banking sector spilling over into the risk-averse commercial (depository) banking sector. Glass-Steagall was adopted in 1933 expressly to prevent this from occurring.
Repealing the act allowed the commercial banks to operate investment bank units. Both ends of the risk spectrum ended up festooned with all manner of junk paper. If Glass-Steagall were still in effect, the banks would have had little incentive to buy the junk from the brokers.
It also meant financial carnage at investment banks was no longer quarantined from commercial banks. Hence, the ugly financial impact at Citibank and others can be traced directly to the Gramm-Leach-Bliley Act. The repeal of Glass-Steagall could very well end up being the single most costly legislative repeal in the nation's history.
Gramm-Leach-Bliley may not have been the proximate cause of the disaster, but it is precisely why the overall problem was not “contained.”
The rapidly growing trade in derivatives poses a “mega-catastrophic risk.” . . . [F]or the economy, derivatives are financial weapons of mass destruction that could harm not only their buyers and sellers, but the whole economic system.
—Warren Buffett, Berkshire Hathaway 2002 Annual Report
Allowing banks and brokers to merge was only one factor that led to the credit crisis. The year after repealing Glass-Steagall, Congress passed the Commodity Futures Modernization Act of 2000 (CFMA). This legislation allowed derivatives such as credit default swaps (CDSs) to become an enormous, unregulated shadow insurance industry. Many of the horrific losses suffered by AIG, Lehman Brothers, and Bear Stearns trace their paternity to this act.
Glass-Steagall may have set the table, but the Commodity Futures Modernization Act was the poison in the wine.
The CFMA removed derivatives and credit default swaps from any and all state and federal regulatory oversight. There were no reserve requirements, as are required for insurance policies. There were no audit mandates, so parties had no assurance that their counterparties could make payment when a CDS was supposed to pay off. And there was no central clearing firm, so nobody knew precisely how many CDSs there were or who owned them. Until recently, even the total dollar value of these derivatives was unknown.
Since the crisis broke into the open, we have gained a few reasonable, if imprecise, estimates of the value of derivatives contracts at various bailout banks. Prior to the passage of the CFMA, the market for unregulated credit default swaps was under $100 billion, a sizable if manageable amount of derivatives contracts. By 2008, they had grown to over $50 trillion. To put this in context, that's roughly four times the annual gross domestic product (GDP) of the United States.
Alan Greenspan had “fiercely objected whenever derivatives have come under scrutiny”1 either in Congress or on Wall Street. In 2003, Greenspan told the Senate Banking Committee:
What we have found over the years in the marketplace is that derivatives have been an extraordinarily useful vehicle to transfer risk from those who shouldn't be taking it to those who are willing to and are capable of doing so. We think it would be a mistake to more deeply regulate the contracts.2
Critics had been warning about derivatives for years, but they made no headway against Greenspan. In 2008, Peter Goodman took a hard new look at the Greenspan legacy. Writing in the New York Times, he observed, “Time and again, Mr. Greenspan—a revered figure affectionately nicknamed the Oracle—proclaimed that risks could be handled by the markets themselves.”3
Former Federal Reserve board member and Princeton economist Alan S. Blinder was less generous: “I think of him as consistently cheerleading on derivatives.”4
Why was Greenie so opposed to any oversight of derivatives trading? Former SEC chair Arthur Levitt said it was a fundamental disdain for government. It was part of an ideological shift away from government and toward markets, running from Reagan to Greenspan to Clinton to Bush.
Kenny Boy and the Gramms
The Commodity Futures Modernization Act is one of the most egregious examples of private enterprise dictating public policy via a combination of sophisticated lobbying and old-fashioned nepotism. The bill was introduced on December 15, 2000, the last day before the Christmas recess. The bill was never debated in either the House or the Senate and was discreetly attached as a rider to the 11,000-page-long omnibus budget bill signed into law by (then) lame-duck President Bill Clinton on December 21, 2000.
But that's just the half of it.
Among the over-the-counter derivatives freed from any federal jurisdiction by the CFMA were energy futures. The bill also included “language advocated by Enron that largely exempted the company from regulation of its energy trading on electronic commodity markets, like its once-popular Enron Online.”5
This became known as the “Enron loophole.” It was designed to help the Houston-based firm pursue its goal of becoming the dominant player in the trading of energy futures, which it saw as having far more profit potential than actually producing energy.
A key sponsor of the CFMA was Texas Senator Phil Gramm, whose wife, Dr. Wendy Gramm, was a member of Enron's board, which she joined in 1992 after providing the company with some regulatory relief in her prior role as chair of the Commodity Futures Trading Commission.
According to Public Citizen, a national nonprofit consumer advocacy organization:6
⢠Enron paid Dr. Gramm between $915,000 and $1.85 million in salary, attendance fees, stock option sales, and dividends from 1993 to 2001. The value of Wendy Gramm's Enron stock options swelled from no more than $15,000 in 1995 to as much as $500,000 by 2000.
⢠Days before her attorneys informed Enron in December 1998 that Wendy Gramm's control of Enron stock might pose a conflict of interest with her husband's work, she sold $276,912 worth of Enron stock.
⢠Enron spent $3.45 million in lobbying expenses in 1999 and 2000 to deregulate the trading of energy futures, among other issues.
In 2002, internal Enron documents revealed the company helped write the Commodity Futures Modernization Act. Senator Gramm later claimed he was not responsible for inserting the “Enron loophole” into the legislation. “But once the Commodity Futures Modernization measure—with this provision included—reached the Senate floor, Mr. Gramm led the debate, urging his fellow senators to pass it into law,” the New York Times reported.7
After failed efforts in 2002, 2003, and 2006, Congress finally closed the “Enron loophole” in 2008 with the passage of the Farm Bill that included an amendment to have the energy futures contracts regulated by the CFTC “with the same key standards (‘core principles’) that apply to futures exchanges, like NYMEX, to prevent price manipulation and excessive speculation.”8
Enron's scam had long since come unraveled by 2008, but the loophole still had powerful friends in Washington: Congress had to override President Bush's veto to pass the bill and close it.