The Great Capitalist Heist: How Paris Hilton’s Dogs Ended Up Better Off Than You
Summer 2009. Unemployment is soaring. Across America, millions of terrified people are facing foreclosure and getting kicked to the curb. Meanwhile in sunny California, the hotel heiress Paris Hilton is investing $350,000 of her $100 million fortune in a two-story house for her dogs. A Pepto-Bismol-colored replica of Paris’ own Beverly Hills home, the backyard doghouse provides her precious pooches with two floors of luxury living, complete with abundant closet space and central air.
By the standards of America’s rich these days, Paris’ dogs are roughing it. In a 2006 article, Vanity Fair’s Nina Munk described the luxe residences of America’s new financial elite. Compared with the 2,405 square feet of the average new American home, the abodes of Greenwich, Connecticut, hedge-fund managers clock in at 15,000 square feet, about the size of a typical industrial warehouse. Many come with pool houses of over 3,000 square feet.
Steven Cohen of SAC Capital is a typical product of the New Gilded Age. He paid $14.8 million for his Greenwich home, which he stuffed with a personal art collection that boasts Van Gogh’s Peasant Woman Against a Background of Wheat (priced at $100 million); Gauguin’s Bathers ($50 million); a Jackson Pollock drip painting (also $50 million); and Andy Warhol’s Superman ($75 million). Not satisfied, Cohen spent millions renovating and expanding, adding a massage room, exercise and media rooms, a full-size indoor basketball court, an enclosed swimming pool, a hairdressing salon, and a 6,734-square-foot ice-skating rink. The rink, of course, needs a Zamboni ice resurfacer, which Cohen houses in a 720-square-foot shingle cottage. Munk quotes a visitor to the estate who assured her, “You’d be happy to live in the Zamboni house.”
So would some of the over 650,000 Americans sleeping in shelters or under highway overpasses.
By the time it was finished, Cohen’s house had swelled to 32,000 square feet, the size of the Taj Mahal. Even at Taj prices, cost mattered little to a man whose net worth is estimated by the Wall Street Journal at $8 billion, with an income in 2010 of over $1 billion. Cohen’s payday is impressive, but by no means unique. In 2005, the 25 highest-paid hedge-fund managers averaged $363 million. In cash. Paul Krugman observes that these 25 were paid three times as much as New York City’s 80,000 public school teachers combined. And because their pay is taxed as capital gains rather than salary, the teachers paid a higher tax rate!
Back in the 19th century, Alexis de Tocqueville called America the “best poor man’s country.” He believed that “equality of conditions” was the basic fact of life for Americans. How far we’ve come! Since then, the main benefits of economic growth have gone to the wealthy, including the Robber Barons of the Gilded Age, whom Theodore Roosevelt condemned as “malefactors of great wealth” living at the expense of working people. By the 1920s, a fifth of American income and wealth went to the richest 1 percent, whose Newport mansions were that period’s Greenwich homes. President Franklin Roosevelt blamed these “economic royalists” for the crash of ’29. Their recklessness had undermined the stability of banks and other financial institutions, and the gross misdistribution of income reduced effective demand for products and employment by limiting the purchasing power of the great bulk of the population.
Roosevelt’s New Deal sought to address these concerns with measures to restrain financial speculation and to redistribute wealth down the economic ladder. The Glass-Steagall Act and the Securities Act restricted the activities of banks and securities traders. The National Labor Relations Act (the “Wagner Act”) helped prevent business depression by strengthening unions to raise wages and increase purchasing power. Other measures sought to spread the wealth in order to promote purchasing power, including the Social Security Act, with retirement pensions, aid to families with dependent children, and unemployment insurance; the Fair Labor Standards Act, setting a national minimum wage and maximum hours; and tax reforms that lowered taxes on workers while raising them on estates, corporations and the wealthy. And the kicker: Through official pronouncement and the Employment Act of 1946, the New Deal committed the U.S. to maintaining full employment.
The New Deal reversed the flow of income and wealth to the rich. For 25 years after World War II, strong labor unions and government policy committed to raising the income of the great majority ensured that all Americans benefited from our country’s rising productivity and increasing income.
Advocates of laissez-faire economics warned that we would pay for egalitarian policies with slower economic growth because we need inequality to encourage the rich to invest and the creative to invent. But the high costs of inequality in reduced social cooperation and wasted human capital point to the giant flaws in this view. A more egalitarian income distribution provides better incentives for investment, and our economy functions much better when people can afford to buy goods and services.
The New Deal ushered in a period of unusually rapid and steady economic growth, with the greatest gains going to the poor and the middle class. Strong unions ensured that wages rose with productivity, while government tax and spending policies helped to share the benefits of growth with the poor, the retired and the disabled. From 1947 to 1973, the bottom 90 percent received over two-thirds of economic growth.
Then, the political coalition behind the New Deal fragmented in the 1960s. Opponents seized the moment and reversed its policies. They began to funnel income toward the rich. With a policy agenda loosely characterized as “neoliberalism,” conservatives (including much of the economics profession) have swept away the New Deal’s focus on employment and economic equity to concentrate economic policy on fighting inflation by strengthening capital against labor. That has worked out very badly for most of America.
The GOP has led the attack on Roosevelt’s legacy, but there has been surprising bipartisan support. President Carter got the ball rolling with his endorsement of supply-side taxation and his commitment to fight inflation by promoting labor market competition and raising unemployment. Carter’s policies worked to reverse the New Deal’s tilt toward labor and higher wages. Under his watch, transportation and telecommunications were deregulated, which undermined unions and the practice of industry-wide solidarity bargaining. Carter also campaigned to lower trade barriers and to open our markets to foreign trade. These policies were presented as curbs on monopolistic behavior, but the effect was to weaken labor unions and drive down wages by allowing business to relocate production to employ lower-wage foreign workers while still selling in the American market.
Carter also began a fatal reversal of economic policy by refusing to support the Humphrey-Hawkins Full Employment Act. Instead of pushing for full employment, Carter appointed Paul Volcker to chair the Federal Reserve with the charge to use monetary policy to restrain inflation without regard for the effect on unemployment. Since then, inflation rates have been brought down dramatically, but unemployment has been higher, and growth in national income and in wages has slowed dramatically compared with the New Deal era.
Already in the 1970s, a rising tide of anti-union activity by employers led Douglas Fraser, the head of the United Auto Workers, to accuse employers of waging a “one-sided class war against working people, the unemployed, the poor, the minorities, the very young and the very old, and even many in the middle class of our society.” Organized labor’s attempt to fight back with labor reform legislation amending the Wagner Act found little support in the Carter White House and went down to defeat in the Democratic-controlled Senate.
Any residual commitment to collective bargaining under the Wagner Act was abandoned during the administration of Ronald Reagan, ironically the only former union president ever elected to the White House. Reagan, of course, is known as the president who fired striking air traffic controllers in 1981. He is also known for the devastating regulatory changes made during his presidency and those of his Republican successors (the two Presidents Bush). Their appointments to the National Labor Relations Board helped turn the agency from one charged with promoting union organization and collective bargaining into one charged with ensuring that employers were free to avoid unions. Under this new regime, private-sector unionism, the unionism covered by the Wagner Act, has almost disappeared.
The 1970s also saw a shift in tax policy away from the principles of ability-to-pay and income redistribution and toward those of supply-side economists, who argued for lower taxes on the rich to provide incentives to accumulate wealth. After campaigning for tax reform, Carter signed the Revenue Act of 1978, which gave small tax benefits to working people while dramatically cutting capital gains taxes, corporate taxes, and the top marginal rates. Since then, major tax cuts enacted under Presidents Reagan and George W. Bush have further reduced the burden on the richest Americans.
Government spending policies have also turned away from ordinary Americans. In 1996, under President Bill Clinton, a vital piece of the New Deal safety net was repealed with the “Personal Responsibility and Work Opportunity Reconciliation Act.” Abolishing the provisions of the Social Security Act that established the program of Aid to Families with Dependent Children, the 1996 law ended the national right to relief. Along with restrictions on unemployment insurance, the abolition of programs of public jobs for the unemployed and gradual reductions in the real value of Social Security benefits, this act was another blow to working people.
The New Deal showed us how to combine rapid economic growth with low unemployment. But the widening gap between rich and poor since the 1970s has been accompanied by higher unemployment and slower economic growth. Had the economy continued to grow after 1978 at the same rate as in the preceding decades, average income would have been more than $14,000 higher than it actually was in 2008.
The slowdown in growth since the abandonment of egalitarian New Deal policies has cost Americans about 30 percent of their income. And the massive redistribution of income away from average Americans and toward the rich has destroyed the sense that America is a land of opportunity for all. Quality of life has plunged because the shredding of social protections has exposed average Americans to much higher levels of risk. The substitution of defined contribution pensions, such as Individual Retirement Accounts or 401(k) plans, for defined benefit pensions has reduced retirement security for individuals while shifting risk away from employers and other social institutions. Just as important as declining income for many Americans, the stress and anxiety associated with this risk shift have contributed to rising levels of depression and morbidity and a decline in life expectancy for Americans relative to residents of other countries.
Workers’ security has been abandoned, even as the government has let financial markets run wild. In 1982, Congress deregulated the thrift industry, freeing thrifts to engage in reckless and fraudulent behavior. In 1994, it removed restrictions on interstate banking. In 1998, it allowed Citigroup to merge with Travelers Insurance to create the world’s largest financial services company. And in the Gramm-Leach-Bliley Act of 1999, it repealed the remaining Glass-Steagall barriers between commercial and investment banking. Acting with the virtual consent of Congress and the president, the Securities and Exchange Commission in 2004 established a system of voluntary regulation that in essence allowed investment banks to set their own capital and leverage standards.
By then our financial regulatory system had largely returned to the pre-New Deal situation in which we trusted financial institutions to police themselves. Advocates of deregulation, like Federal Reserve chair Alan Greenspan, were unconcerned because they expected banks and other financial firms to limit their risk for fear of failure. Either they misunderstood the incentives facing company managers, or they did not care. In practice, financiers are playing with other people’s money (ours). When they do well, their compensation is tied to profits and they can earn huge sums. But when their investments fail, they are protected because monetary authorities and the United States Treasury cannot allow “too big to fail” financial companies to go bust. So long as risky investments have periods of high returns, the managers of deregulated financial firms have an incentive to increase their risk, profiting from success while passing the costs of failure to the public. We have all been suffering from the consequences of their failures since the financial crisis of 2007-’08.
The share of income going to the top 1 percent has doubled since the 1970s, returning to the levels of the 1920s. The greatest gains have gone to the very wealthiest and to executives and managers, especially of financial firms. From 1973 to 2008, the average income of the bottom 90 percent of American households fell even while the rich gained. The wealthiest 1 percent gained 144 percent, or over $600,000 per household, and the richest 1 percent of the 1 percent, barely 30,000 people, gained over 455 percent, or over $19,000,000.
That’s enough to buy a nice doghouse. Or a mansion in Greenwich.