to support workers’ bargaining positions and restrain perceived excess at the top. Workers’ productivity was substantially lower in the 1960s than it is today, but the minimum wage, adjusted for inflation, was considerably higher. Labor laws were interpreted and enforced in a way that favored unions. And there was often direct political pressure on large companies and top executives who were seen as stepping over the line. John F. Kennedy famously demanded that steel companies, which had just negotiated a modest wage settlement, rescind a price increase.

To see how different labor relations were under the Treaty of Detroit from their state today, compare two iconic corporations, one of the past, one of the present.

In the final years of the postwar boom General Motors was America’s largest private employer aside from the regulated telephone monopoly. Its CEO was, correspondingly, among America’s highest paid executives: Charles Johnson’s 1969 salary was $795,000, about $4.3 million in today’s dollars—and that salary excited considerable comment. But ordinary GM workers were also paid well. In 1969 auto industry production workers earned on average almost $9,000, the equivalent of more than $40,000 today. GM workers, who also received excellent health and retirement benefits, were considered solidly in the middle class.

Today Wal-Mart is America’s largest corporation, with 800,000 employees. In 2005 Lee Scott, its chief executive, was paid almost $23 million. That’s more than five times Charles Johnson’s inflation-adjusted salary, but Mr. Scott’s compensation excited relatively little comment, since it wasn’t exceptional for the CEO of a large corporation these days. The wages paid to Wal-Mart’s workers, on the other hand, do attract attention, because they are low even by current standards. On average Wal-Mart’s nonsupervisory employees are paid about $18,000 a year, less than half what GM workers were paid thirty-five years ago, adjusted for inflation. Wal-Mart is also notorious both for the low percentage of its workers who receive health benefits and for the stinginess of those scarce benefits.[6]

What Piketty and Saez, Levy and Temin, and a growing number of other economists argue is that the contrast between GM then and Wal-Mart now is representative of what has happened in the economy at large—that in the 1970s and after, the Treaty of Detroit was rescinded, the institutions and norms that had limited inequality after World War II went away, and inequality surged back to Gilded Age levels. In other words, the great divergence of incomes since the seventies is basically the Great Compression in reverse. In the 1930s and 1940s institutions were created and norms established that limited inequality; starting in the 1970s those institutions and norms were torn down, leading to rising inequality. The institutions-and-norms explanation integrates the rise and fall of middle-class America into a single story.

The institutions-and-norms explanation also correctly predicts how trends in inequality should differ among countries. Bear in mind that the forces of technological change and globalization have affected every advanced country: Europe has applied information technology almost as rapidly as we have, and cheap clothing in Europe is just as likely to be made in China as is cheap clothing in America. If technology and globalization are the driving forces behind rising inequality, then Europe should be experiencing the same rise in inequality as the United States. In terms of institutions and norms, however, things are very different among advanced nations: In Europe, for example, unions remain strong, and old norms condemning very high pay and emphasizing the entitlements of workers haven’t faded away. So if institutions are the story, we’d expect the U.S. experience of rising inequality to be exceptional, not echoed in Europe.

And on that comparison, an institutions-and-norms explanation wins: America is unique. The clearest evidence comes from income tax data, which allow a comparison of the share of income accruing to the economic elite. These data show that during World War II and its aftermath all advanced countries experienced a Great Compression, a sharp drop in inequality. In the United States this leveling was reversed beginning in the 1970s, and the effects of the Great Compression have now been fully eliminated. In Canada, which is closely linked to the U.S. economy, and in Britain, which had its own period of conservative dominance under Margaret Thatcher, there has been a more limited trend toward renewed inequality. But in Japan and France there has been very little change in inequality since 1980.[7]

There’s also spottier and less consistent information from surveys of household incomes. The picture there is fuzzier, but again the United States and, to a lesser extent, Britain stand out as countries where inequality sharply increased, while other advanced countries experienced either minor increases or no change at all.[8]

There is, in short, a strong circumstantial case for believing that institutions and norms, rather than technology or globalization, are the big sources of rising inequality in the United States. The obvious example of changing institutions is the collapse of the U.S. union movement. But what do I mean when I talk about changing norms?

Norms and Inequality: The Case of the Runaway CEOs

When economists talk about how changing norms have led to rising inequality, they often have one concrete example in mind: the runaway growth of executive pay. Although executives at major corporations aren’t the only big winners from rising inequality, their visibility makes them a good illustration of what is happening more broadly throughout the economy.

According to a Federal Reserve study, in the 1970s the chief executives at 102 major companies (those that were in the top 50 as measured by sales at some point over the period 1940–1990) were paid on average about $1.2 million in today’s dollars. That wasn’t hardship pay, to say the least. But it was only a bit more than CEOs were paid in the 1930s, and “only” 40 times what the average full-time worker in the U.S. economy as a whole was paid at the time. By the early years of this decade, however, CEO pay averaged more than $9 million a year, 367 times the pay of the average worker. Other top executives also saw huge increases in pay, though not as large as that of CEOs: The next two highest officers in major companies made 31 times the average worker’s salary in the seventies, but 169 times as much by the early 2000s.[9]

To make some sense of this extraordinary development, let’s start with an idealized story about the determinants of executive pay.[10] Imagine that the profitability of each company depends on the quality of its CEO, and that the bigger the company, the larger the CEO’s impact on profit. Imagine also that the quality of potential CEOs is observable: everyone knows who is the 100th best executive in America, the 99th best, and so on. In that case, there will be a competition for executives that ends up assigning the best executives to the biggest companies, where their contribution matters the most. And as a result of that competition, each executive’s pay will reflect his or her quality.

An immediate implication of this story is that at the top, even small differences in perceived executive quality will translate into big differences in salaries. The reason is competition: For a giant company the difference in profitability between having the 10th best executive and the 11th best executive may easily be tens of millions of dollars each year. In that sense the idealized model suggests that top executives might be earning their pay. And the idealized model also says that if executives are paid far more today than they were a generation ago, it must be because for some reason—more intense competition, higher stock prices, whatever—it matters more than it once did to have the best man running a company.
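To put rough numbers on this logic, consider a minimal back-of-the-envelope sketch; the symbols and figures below are purely illustrative assumptions, not drawn from the studies cited in this chapter. Let $S_{10}$ be the size of the tenth biggest company, and let executive quality $q_n$ be measured as the fraction of that size the $n$th best executive adds to annual profit. Competition then pushes the tenth best executive’s pay above the eleventh best executive’s by roughly the extra profit the better executive would generate at that firm:

\[
w_{10} - w_{11} \;\approx\; S_{10}\,(q_{10} - q_{11}) \;=\; \$50\ \text{billion} \times 0.001 \;=\; \$50\ \text{million per year}.
\]

A quality edge of just a tenth of a percentage point, applied to a fifty-billion-dollar company, is worth tens of millions of dollars a year; that is the sense in which tiny differences in perceived quality can translate into enormous differences in pay within the idealized model.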

But once we relax the idealized premises of the story, it’s not hard to see why executive pay is a lot less tied down by fundamental forces of supply and demand, and a lot more subject to changes in social norms and political power, than this story implies.

First, neither the quality of executives nor the extent to which that quality matters is a hard number. Assessing the productivity of corporate leaders isn’t like measuring how many bricks a worker can lay in an hour. You can’t even reliably evaluate managers by looking at the profitability of the companies they run, because profits depend on a lot of factors outside the chief executive’s control. Moreover, profitability can, for extended periods, be in the eye of the beholder: Enron looked like a fabulously successful company to most of the world; Toll Brothers, the McMansion king, looked like a great success as long as the housing bubble was still inflating. So the question of how much to pay a top executive has a strong element of subjectivity, even fashion, to it. In the fifties and sixties big companies didn’t think it was important to have a famous, charismatic leader: CEOs rarely made the covers of business magazines, and companies tended to promote from within, stressing the virtues of being a team player. By contrast, in the eighties and thereafter CEOs became rock stars—they defined their companies as much as their companies defined them. Are corporate boards wiser now than they were when they chose solid insiders to run companies, or have they just been caught up in the culture of celebrity?

Second, even to the extent that corporate boards correctly judge both the quality of executives and the extent to which quality matters for profitability, the actual amount they end up paying their top executives depends
