worse, and so on.
Maybe the last word should be given to the public. According to a 2006 survey by the Pew Research Center, most working Americans believe that the average worker “has to work harder to earn a decent living” today than he did twenty or thirty years earlier.[1] Is this just nostalgia for a remembered golden age? Maybe, but there was no such nostalgia a generation ago about the way America was a generation before that.
As I’ve suggested with my Bill-Gates-in-a-bar analogy, ordinary American workers have failed to reap the gains from rising productivity because of rising inequality. But who were the winners and losers from this upward redistribution of income? It wasn’t just Bill Gates—but it was a surprisingly narrow group.
If gains in productivity had been evenly shared across the workforce, the typical worker’s income would be about 35 percent higher now than it was in the early seventies.[2] But the upward redistribution of income meant that the typical worker saw a far smaller gain. Indeed, everyone below roughly the 90th percentile of the wage distribution—the bottom of the top 10 percent—saw his or her income grow more slowly than average, while only those above the 90th percentile saw above-average gains. So the limited gains of the typical American worker were the flip side of above-average gains for the top 10 percent.
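To make that arithmetic concrete, here is a minimal sketch in Python. The growth rates are placeholder assumptions chosen only to illustrate how the counterfactual is constructed, not the actual historical series behind the estimate in the text:

```python
# Counterfactual described in the text: what the typical worker's income
# would be if it had grown in step with economy-wide productivity.
# The numbers below are illustrative placeholders, NOT historical data.

median_income_1973 = 1.00      # normalize the early-seventies median income to 1
productivity_growth = 0.35     # assumed cumulative productivity growth since then
observed_income_growth = 0.10  # assumed cumulative growth of the actual median

# Under even sharing, the typical income tracks productivity:
counterfactual = median_income_1973 * (1 + productivity_growth)   # 35% higher
observed = median_income_1973 * (1 + observed_income_growth)      # far smaller gain

print(f"counterfactual: {counterfactual:.2f}x the early-seventies level")
print(f"observed:       {observed:.2f}x the early-seventies level")
```

The gap between the two printed numbers is the upward redistribution the text describes: the difference between what even sharing would have delivered and what the typical worker actually received.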
And the really big gains went to the really, really rich. In Oliver Stone’s 1987 movie Wall Street, the corporate raider Gordon Gekko scoffs at the idea of settling for life as “a $400,000-a-year working Wall Street stiff.”
At the time an income of $400,000 a year would have put someone at about the 99.9th percentile of the wage distribution—pretty good, you might think. But as Stone realized, by the late 1980s something astonishing was happening in the upper reaches of the income distribution: The rich were pulling away from the merely affluent, and the super-rich were pulling away from the merely rich. People in the bottom half of the top 10 percent, corresponding roughly to incomes in the $100,000 to $150,000 range, though they did better than Americans further down the scale, didn’t do all that well—in fact, in the period after 1973 they didn’t gain nearly as much, in percentage terms, as they did during the postwar boom. Only the top 1 percent has done better since the 1970s than it did in the generation after World War II. Once you get way up the scale, however, the gains have been spectacular—the top tenth of a percent saw its income rise fivefold, and the top 0.01 percent is seven times richer than it was in 1973.
Who are these people, and why are they doing so much better than everyone else? In the original Gilded Age, people with very high incomes generally derived those incomes from the assets they owned: The economic elite owned valuable land and mineral resources or highly profitable companies. Even now capital income—income from assets such as stocks, bonds, and property—is much more concentrated in the hands of a few than earned income is. So is “entrepreneurial income”—income from ownership of companies. But ownership is no longer the main source of elite status. These days even multimillionaires get most of their income in the form of paid compensation for their labor.
Needless to say we’re not talking about wage slaves toiling for an hourly rate. If the quintessential high-income American circa 1905 was an industrial baron who owned factories, his counterpart a hundred years later is a top executive, lavishly rewarded for his labors with bonuses and stock options. Even at the very top—among the highest-income 0.01 percent of the population, the richest one in ten thousand—almost half of income comes in the form of compensation. A rough estimate is that about half of the wage income of this superelite comes from the earnings of top executives—not just CEOs but those a few ranks below—at major companies. Much of the rest of the wage income of the top 0.01 percent appears to represent the incomes of sports and entertainment celebrities.
So a large part of the overall increase in inequality is, in a direct sense, the result of a change in the way society pays its allegedly best and brightest. They were always paid well, but now they’re paid incredibly well.
The question, of course, is what caused that to happen. Broadly speaking there are two competing explanations for the great divergence in incomes that has taken place since the 1970s. The first explanation, favored by people who want to sound reasonable and judicious, is that a rising demand for skill, due mainly to technological change with an assist from globalization, is responsible. The alternative explanation stresses changes in institutions, norms, and political power.
The standard explanation of rising inequality—I’m tempted to call it the safe explanation, since it’s favored by people who don’t want to make waves—says that rising inequality is mainly caused by a rising demand for skilled labor, which in turn is driven largely by technological change. For example, Edward Lazear, chairman of the Council of Economic Advisers in 2006, had this to say:
Most of the inequality reflects an increase in returns to “investing in skills”—workers completing more school, getting more training, and acquiring new capabilities…. What accounts for this divergence of earnings for the skilled and earnings for the unskilled? Most economists believe that fundamentally this is traceable to technological change that has occurred over the past two or three decades. In our technologically-advanced society, skill has higher value than it does in a less technologically-advanced society…with the growing importance of computers, the types of skills that are required in school and through investment in learning on the job become almost essential in making a worker productive. The typical job that individuals do today requires a much higher level of technical skills than the kinds of jobs that workers did in 1900 or in 1970. [3]
To enlarge on Lazear’s remarks: Information technology, in the form of personal computers, cell phones, local area networks, the Internet, and so on, increases the demand for people with enough formal training to build, program, operate, and repair the new gadgets. At the same time it reduces the need for workers who do routine tasks. For example, there are far fewer secretaries in modern offices than there were in 1970, because word processing has largely eliminated the need for typists, and networks have greatly reduced the need for physical filing and retrieval; but there are as many managers as ever. Bar-code scanners tied to local networks have reduced the number of people needed to man cash registers and the number required to keep track of inventory, but there are more marketing consultants than ever. And so on throughout the economy.
The hypothesis that technological change, by raising the demand for skill, has led to growing inequality is so widespread that at conferences economists often use the abbreviation SBTC—skill-biased technical change—without explanation, assuming that their listeners know what they’re talking about. It’s an appealing hypothesis for three main reasons. First, the timing works: The upward trend in inequality began about the same time that computing power and its applications began their great explosion. True, mainframe computers—large machines that sat in a room by themselves, crunching payrolls and other business data—were in widespread use in the sixties. But they had little impact on how most workers did their jobs. Modern information technology didn’t come into its own until Intel introduced the first microprocessor—the chip that powers modern computing—in 1971. Only then could the technology become pervasive. Second, SBTC is the kind of hypothesis economists feel comfortable with: It’s just supply and demand, with no need to bring in the kinds of things sociologists talk about but economists find hard to incorporate in their models, things like institutions, norms, and political power. Finally, SBTC says that the rise in inequality isn’t anybody’s fault: It’s just technology, working through the invisible hand.
That said, there’s remarkably little direct evidence for the proposition that technological change has caused rising inequality. The truth is that there’s no easy way to measure the effect of technology on markets; on this issue and others, economists mainly invoke technology to explain whatever they can’t explain with other, measurable forces. The procedure goes something like this: First, assume that rising inequality is caused by technology, growing international trade, and immigration. Then, estimate the effects of trade and immigration—a contentious exercise in itself, but we do at least have data on the volume of imports and the number of immigrants. Finally, attribute whatever isn’t explained by these measurable factors to technology. That is, economists who assert that technological change is the main cause of rising inequality arrive at that conclusion by a process of exclusion: They’ve concluded that trade and immigration aren’t big enough to explain what has happened, so technology must be the culprit.
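For readers who want the accounting spelled out, that “process of exclusion” can be written as a few lines of Python. Every number below is an invented placeholder; the point is only the logic, in which technology’s contribution is never measured directly:

```python
# Stylized version of the residual-attribution procedure described in the text.
# All numbers are invented placeholders, not estimates from any study.

total_rise_in_inequality = 10.0   # e.g., percentage-point rise in some wage gap

# Step 1: assume the rise is caused by technology, trade, and immigration.
# Step 2: estimate the two factors for which data exist (import volumes,
# immigrant counts) -- itself a contested exercise.
effect_of_trade = 2.0
effect_of_immigration = 1.0

# Step 3: attribute whatever is left over to technology (SBTC).
attributed_to_technology = (
    total_rise_in_inequality - effect_of_trade - effect_of_immigration
)

print(f"residual attributed to technology: {attributed_to_technology}")
# Technology enters only as the leftover term, which is why the text
# calls the evidence for the technology story indirect.
```

Note that nothing in this calculation tests the technology hypothesis itself; a larger residual simply produces a larger number labeled “technology.”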
As I’ve just suggested, the main factors economists have considered as alternative explanations for rising