Michael Lind
The growth of financial-market control over corporations probably would not have occurred but for the growth of institutional investors seeking high yields. As late as 1952, households—mostly rich families—held 90.4 percent of all corporate stock. By 1994, households owned only 48 percent. Meanwhile, in the same period, the pension fund share of corporate stock ownership grew to 25 percent and that of mutual funds increased to 10 percent.
39
The share of corporate equities owned by public and private pension funds grew from 6 percent in 1965 to 23 percent in 2007, while the share owned by mutual funds increased from 5 percent in 1985 to 26 percent in 2007.
40
The shift from defined-benefit plans to defined-contribution plans like 401(k)s and individual retirement accounts (IRAs) further swelled the assets under institutional management. The share of IRA savings invested in mutual funds expanded from 17 percent in 1985 to 49 percent in 1999.
41
By 2008, the brokerage firm Fidelity was the largest shareholder in approximately one in ten publicly traded corporations listed on Nasdaq and the NYSE, exercising potential power far beyond that of J. P. Morgan in the days of interlocking directorates.
42
This phenomenon in turn was driven by a peculiarity of America’s midcentury social contract: the role of employer-provided defined-benefit pension plans and, later, tax-favored defined-contribution plans like IRAs and 401(k)s. Social Security’s low rate of preretirement income replacement made a growing number of Americans desperate to see high rates of return from the retirement savings they or their employers had entrusted to mutual funds. By 2005, institutional investors like mutual funds accounted for three-fourths of the ownership of the typical large corporation.
43
The rapid expansion of stock ownership on the part of mutual funds increased the pressure on corporations to maximize their short-term earnings, at the expense if necessary of long-term investment and growth. In a 2005 survey, 80 percent of more than four hundred chief financial officers said that “they would decrease discretionary spending on such areas as research and development, advertising, maintenance, and hiring in order to meet short-term earnings targets.”
44
Short-termism was reflected in the retreat of institutional investors and others from long-term holdings in companies: the annualized turnover of all stocks on the NYSE rose from 36 percent in 1980 to 118 percent in 2006.
45
In the new age of financial-market capitalism, as in the era of finance capitalism in the early 1900s, American industry was subordinate to finance. But there was a profound difference. Finance capitalists like J. P. Morgan were long-term investors who sought to profit over many years from the industrial corporations and utilities that they owned. The new financial-market capitalism, by contrast, was marked by short-term ownership of stocks by the agents of millions of anonymous investors, most of whom had no idea what company stocks were being bought and sold on their behalf.
While some corporate managers lobbied state legislatures to create laws protecting their companies against takeovers, most CEOs in the United States were reconciled to financial-market capitalism by means of stock options. In 1993, Congress changed the tax code to encourage corporations to reward their executives with stock options. By the beginning of the twenty-first century, more than half of the compensation of the average Fortune 500 executive took the form of stock options.
46
Federal Reserve chairman Alan Greenspan observed that large stock options “perversely created incentives to artificially inflate reported earnings in order to keep stock prices high and rising.”
47
The goal was no longer to build a productive company that would make useful products and last for generations, but to maximize short-term profits in time for the next quarterly earnings report. “Wall Street can wipe you out,” the CEO of Sara Lee observed. “They are the rule-setters. They do have their fads, but to a large extent there is an evolution in how they judge companies, and they have decided to give premiums to companies that harbor the most profits for the least assets. I can’t argue with that.”
48
MASTERS OF THE UNIVERSE
Stock options created fortunes for many CEOs—particularly when their companies drove up the share price by buying back stock. In the 1990s, stock buybacks became the major method of distributing corporate earnings to shareholders, while also increasing the wealth of corporate executives who were compensated in stock options. Even so, rising CEO compensation could not keep up with the fortunes being made in the swelling American financial sector. From 1948 to 1980, pay in finance was comparable to that in other lines of business; from 1980 onward, financial-industry compensation on average was twice the pay in other American industries.
49
Beginning in the 1980s, compensation in banking rose much faster than compensation in the rest of the private sector. By 2007, the average financial-sector employee earned twice as much money as the average worker in the rest of the private sector.
50
In 2004, the top twenty-five hedge-fund managers made more money than all the CEOs in the corporations of the S&P 500 put together.
51
Prestige followed wealth and power. In an article in Harper’s in 1949, the management theorist Peter Drucker observed: “Where only twenty years ago the bright graduate of the Harvard Business School aimed at a job with a New York Stock Exchange house, he now seeks employment with a steel, oil, or automobile company.”
52
Richard Fisher, the chairman of Morgan Stanley from 1991 to 1997, recollected that after he graduated in the 1960s from Harvard Business School “investment banking was about the worst-paying job available to us. I started at Morgan Stanley at $5,800 a year. It was the lowest offer I had. . . . I’m sure my classmates who went to Procter & Gamble started at $9,000 a year.”
53
By the end of the twentieth century, business students who preferred Procter & Gamble to Morgan Stanley would have been ridiculed. Between 1950 and 1980, graduates of Harvard who went into finance were paid no more than those who went into law, engineering, and medicine. By the 2000s, those who worked in finance made nearly twice as much as their colleagues in other professions.
54
Between 1980 and 2007, the financial sector’s share of US corporate profits grew from around 10 percent to a peak of around 40 percent. In Britain, where a similar process of financialization produced cancerous growth of the City of London’s financial industry, finance’s share of corporate profits grew by more than 10 percentage points from 1990 to 2006. In the same period, finance expanded its corresponding share in Germany and France by no more than 6 percentage points.
55
FROM THE GREAT COMPRESSION TO THE GREAT REGRESSION
Between the 1970s and the early twenty-first century, what some scholars have called “the Great Compression” of incomes in the United States was reversed. Between 1913 and the beginning of the New Deal in 1933, the share of income in the US going to the top 10 percent was between 40 and 45 percent, only to plunge and level off at between 31 and 32 percent from World War II until the 1970s. Beginning in the Reagan years, inequality grew until, on the eve of the crash of 2008, it approached its pre-1929 level.
In fact, not one but two forms of inequality revived—earnings inequality and asset inequality. As the economist James K. Galbraith demonstrated, wealth inequality was accounted for largely by the disproportionate gains to investors and the stock options received by Silicon Valley entrepreneurs during the tech bubble of the 1990s.
56
Attributing rising inequality within the United States to allegedly unstoppable forces of globalization or technology appealed to the American elite by implying that their disproportionate gains had nothing to do with the power of some economic classes relative to others. But the most plausible explanations for late-twentieth-century and early-twenty-first-century inequality in the United States attribute it to changes in the bargaining power of capital, labor, and professionals, not to long-run forces beyond human control.
The ability of the working-class majority in the United States to bargain for higher wages was undermined after the 1960s by several purely domestic phenomena: the declining real value of the minimum wage as a result of inflation, low-end labor markets flooded with unskilled, low-wage immigrants, and the decline of labor unions.
REVOKING THE SOCIAL CONTRACT
While wages stagnated in the neoliberal era, economic security declined for many American workers. The welfare-state components of America’s New Deal social contract, such as Social Security and Medicare, remained robust, despite long-run challenges to their funding. But the welfare-capitalist elements, such as company pensions and employer-provided health care, crumbled rapidly in the late twentieth century. Bankruptcies in the airline, automobile, and other industries forced the Pension Benefit Guaranty Corporation to take over many corporate pensions.
57
Some companies, including Ford and GM, replaced health coverage of retired workers with health retirement accounts.
58
Between 1981 and 2003, the share of employees with corporate pension plans who had defined-benefit (DB) plans declined from 81 percent to 38 percent.
59
Under pressure to cut costs, employers increasingly replaced DB pension plans with defined-contribution (DC) plans like 401(k)s for the minority of Americans who had any pensions at all.
Conservatives and libertarians sought to replace the post–New Deal hybrid of welfare capitalism and welfare statism with tax-favored private accounts. Another welfare-market technique was using tax credits rather than direct public spending to achieve social goals. In the last decades of the twentieth century, the tax code was riddled with new tax subsidies for individuals—the child tax credit and the child-care tax credit among them—joining older ones like the home-mortgage-interest deduction. While the child tax credit was refundable—that is, paid to individuals who made too little income to pay federal income taxes—most federal income-tax credits were not available to the bottom half of American wage earners, who had been effectively removed from the federal income tax rolls by 2000. This meant that what the social scientist Christopher Howard called “the hidden welfare state” consisted largely of means-tested subsidies of a novel kind—subsidies available only to the affluent, not the poor.
60
The oligarchic nature of the evolving American political system was symbolized by the lack of protest that greeted the “child-care tax credit,” which in essence was a subsidy from Americans who could not afford nannies or other private child care to the affluent minority who could.
AN AMERICAN PLUTONOMY?
In the late nineteenth century, the American elite was identified with the Four Hundred—the number of guests who could be accommodated in Mrs. Astor’s ballroom in New York. The ratio of the average income of the top four hundred households to that of the median household grew from 1,124 to 1 in 1992 to 6,900 to 1 in 2007.
61
For the top four hundred households, pretax income, adjusted for inflation, ballooned by 409 percent between 1992 and 2007, while for the median American family of four it increased by only 13.2 percent.
62
The top four hundred households in the United States enjoyed a decline in their effective tax rates (the percentage of income paid in taxes) from 26.4 percent in 1992 to 16.6 percent in 2007.
63
Hedge-fund managers benefited from a tax provision, the so-called carried-interest rule, that allowed much of the income they were paid for managing their hedge funds to be taxed at the low capital gains rate rather than at the top income tax rate.
While conservatives and libertarians promoted visions of “the ownership society” in which every American would be an investor, capital income grew much more concentrated. In 1979, the top 10 percent of Americans by income received 67 percent of the income from capital, while 33 percent went to the bottom 90 percent. By 2006, the share of capital income going to the top 10 percent had increased to 81.3 percent, while that of the bottom 90 percent had declined to 18.7 percent.
64
The share of disposable income that went to the top 1 percent of US households fell from 22–23 percent in 1929 to a low of 8–9 percent in the 1970s; during the two terms of George W. Bush, a remarkable 73 percent of the increase in disposable income went to the top 1 percent.
65
Because capital gains were taxed at lower rates than income from labor, Warren Buffett observed that he and other billionaires were taxed at lower rates than their secretaries.
In 2005, three Citigroup analysts—Ajay Kapur, Niall MacLeod, and Narendra Singh—described the United States as a “plutonomy.” They explained, “Plutonomies have occurred before in sixteenth century Spain, in seventeenth century Holland, the Gilded Age and the Roaring Twenties in the U.S. What are the common drivers of Plutonomy? Disruptive technology-driven productivity gains, creative financial innovation, capitalist-friendly cooperative governments, an international dimension of immigrants and overseas conquests invigorating wealth creation, the rule of law, and patenting inventions. Often these wealth waves involve great complexity, exploited best by the rich and educated of the time.” In a plutonomy, the economy is driven by the consumption of the classes, not the masses: “In a plutonomy there is no such animal as ‘the U.S. consumer’ or ‘the UK consumer,’ or indeed the ‘Russian consumer.’ There are rich consumers, few in number, but disproportionate in the gigantic slice of income and consumption they take. There are the rest, the ‘non-rich,’ the multitudinous many, but only accounting for surprisingly small bites of the national pie.” The Citigroup analysts speculated that a plutonomic world economy could be driven by the spending of the world’s rich minority, whose ranks are “swelling from globalized enclaves in the emerging world.”
66