Some economists and journalists—I’ll call them "the real-wage pessimists"—have claimed that average real wages have fallen during substantial time periods over the last 30 or so years. Others have claimed that average real wages for the majority of the labor force have fallen during substantial periods over the last 30 years. These are claims that Alan Reynolds takes on in his new book, Income and Wealth.
As with the other parts of his book, Reynolds accomplishes his task by looking carefully at exactly what the data cited by the real-wage pessimists actually measure. If you want to know whether someone’s real wage has increased, you need to know that person’s hourly wage rate, adjusted for inflation, plus the value to the person of any non-wage benefits. Right away, that suggests three things you need to get right: the hourly wage rate, the inflation rate, and the value of the non-wage benefits. Reynolds shows that the real-wage pessimists make mistakes on some or all of these three.
Start with the person’s hourly wage rate. Many of the real-wage pessimists don’t carefully estimate that but, instead, settle for looking at average weekly wages. But comparing average weekly wages over time will give a much more pessimistic view than is justified. Why? Part-time jobs as a percentage of total jobs have increased over time.
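The composition effect here is worth making concrete. The following sketch uses purely hypothetical numbers (the wages, hours, and part-time shares are mine, not from the book or the BLS): everyone's hourly wage rises 10 percent, yet average weekly earnings fall once part-time jobs make up a larger share of all jobs.

```python
# Hypothetical illustration of the composition effect: hourly wages rise
# for everyone, yet average weekly earnings can fall once part-time jobs
# become a larger share of total jobs. All numbers are invented.

def avg_weekly_earnings(hourly_wage, ft_hours, pt_hours, pt_share):
    """Average weekly earnings across full-time and part-time workers,
    assuming (for simplicity) a single economy-wide hourly wage."""
    return ((1 - pt_share) * hourly_wage * ft_hours
            + pt_share * hourly_wage * pt_hours)

# Period 1: $10/hour, 10% of jobs part-time (20 hours vs. 40 hours)
then = avg_weekly_earnings(10.00, 40, 20, 0.10)
# Period 2: hourly wage up 10%, but part-time share up to 30%
now = avg_weekly_earnings(11.00, 40, 20, 0.30)

print(round(then, 2))  # 380.0
print(round(now, 2))   # 374.0 -- lower, despite a 10% hourly raise
```

A pessimist reading only the weekly series would report falling earnings even though every worker's hourly wage rose.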
Reynolds quotes the Bureau of Labor Statistics’ statement that "persistent long-term increases in the proportion of part-time workers in retail trade and many of the service industries have reduced average workweeks in these industries." Although the BLS is scrapping the average earnings figures because the categories into which they sort workers are less and less relevant, Reynolds notes that the increase in part-time work is, in itself, enough of a reason to scrap these figures.
Moreover, although Reynolds doesn’t mention this, there is another reason that average weekly earnings understate the growth in hourly wages: the average workweek, even for full-time workers, has fallen steadily. As Michael Cox and Richard Alm, economists at the Dallas Federal Reserve Bank, have written:
Even at work, Americans aren’t always doing the boss’s bidding. According to University of Michigan time diary studies, the average worker spends more than an hour a day engaged in something other than assigned work while on the job. Employees run errands, socialize with colleagues, make personal telephone calls, send e-mail, and surf the Internet. More than a third of American workers, a total of 42 million, access the Internet during working hours. The peak hours for submitting bids on eBay, the popular online auction site, come between noon and 6 p.m., when most Americans are supposedly hard at work.
This "leisure on the job" appears to have risen over time. If one divides even a full-time worker’s weekly earnings by the actual number of hours of work rather than by the number of hours in the workplace, and tracks this number over time, one gets both a higher wage rate and a higher growth of the wage rate (assuming, that is, that actual hours of work have fallen over time). In short, the growth in wages per actual hour of work is higher than numbers that assume that all hours on the job are work hours.
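The point above can be put in numbers. This sketch assumes hypothetical figures of my own (the pay levels and the half-hour versus hour-and-a-half of daily on-the-job leisure are illustrative, not measured): when leisure on the job rises, the wage per hour actually worked grows faster than the conventional pay-per-hour-present statistic.

```python
# Hypothetical numbers: how rising "leisure on the job" makes the usual
# statistic (weekly pay / hours at the workplace) understate the growth
# of the wage per hour of actual work.

def wage_per_hour(weekly_pay, hours_present, weekly_leisure_hours):
    measured = weekly_pay / hours_present                       # usual statistic
    actual = weekly_pay / (hours_present - weekly_leisure_hours)  # per hour truly worked
    return measured, actual

# Earlier period: 0.5 hours/day of on-the-job leisure over a 5-day week
m_then, a_then = wage_per_hour(400.0, 40, 2.5)
# Later period: pay has doubled, but leisure is up to 1.5 hours/day
m_now, a_now = wage_per_hour(800.0, 40, 7.5)

print(f"measured growth: {m_now / m_then - 1:.0%}")  # 100%
print(f"actual growth:   {a_now / a_then - 1:.0%}")  # 131%
```

The measured series shows pay per hour present doubling; pay per hour of actual work grows by nearly a third more.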
How about the inflation rate? If our data overstate inflation, then, again, the growth in real wages is understated. Reynolds notes that many of the real-wage pessimists point to 1973 as the end of an alleged golden era when real wages were rising quickly. But Reynolds points out an obvious problem with data on real wages from that year and from the years before and after: price controls. On this issue, Reynolds’s reasoning is correct but much too terse, so let me fill in the gaps.
President Nixon’s price controls, which began on August 15, 1971, froze prices for three months and then allowed small increases, so that actual legal prices fell further and further behind the prices that would have existed without price controls. With the price controls increasingly binding, there were widespread shortages, the shortages of gasoline and, at times, beef being the most memorable. But probably more important than the shortages were the other adjustments producers made, given that they could not legally raise prices as much as they would have.
The main such adjustment was a decrease in quality. So, for example, a producer of matzo ball soup would put in three matzo balls instead of the usual four. If memory serves, this is an actual example that Nixon’s chief economist, Herb Stein (incidentally, one of the best bosses I ever had), gave at the time. Just as the Consumer Price Index is not good at accounting for quality improvements, it does not do well at picking up quality reductions either. Thus a given money wage would appear to have more purchasing power in 1973—when the price controls were relatively "mature"—than it really had. Then in late 1973 and early 1974, when the price controls were repealed in sector after sector, prices shot up. But the shortages disappeared also, except for gasoline, one of the few sectors in which price controls were kept. Presumably the quality of various goods began to improve as well. Again, the CPI would not have picked up these subtle but widespread improvements in quality. So the data would show real wage rates falling even if real wages were actually increasing.
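The soup example can be worked through with invented numbers (the prices and wage below are mine, used only to show the direction of the bias): the CPI sees an unchanged sticker price, but the true price per unit of quality rose when the fourth matzo ball disappeared.

```python
# A sketch of the quality-mismeasurement point, using invented numbers
# for the matzo-ball example: the CPI sees a frozen sticker price, but
# the price per ball rose by a third when 4 balls became 3.

sticker_price = 1.00              # price-controlled, unchanged
balls_before, balls_after = 4, 3

cpi_inflation = 0.0               # CPI sees no price change
true_inflation = ((sticker_price / balls_after)
                  / (sticker_price / balls_before) - 1)

wage = 5.00                                        # nominal hourly wage, unchanged
measured_real_wage = wage / (1 + cpi_inflation)    # looks flat
true_real_wage = wage / (1 + true_inflation)       # actually fell

print(f"true inflation per ball: {true_inflation:.1%}")  # 33.3%
print(round(true_real_wage, 2))                          # 3.75
```

The measured real wage is flat while the true real wage has fallen by a quarter, which is exactly the bias that makes 1973 look like a peak.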
Moreover, notes Reynolds, "Attempting to compare today’s price index with one from 1973 would require comparing computers with typewriters, digital TiVo with rooftop antennas, and contemporary cars with Chevy Vegas and Ford Pintos." In other words, when one looks at real wage rates over 30 years, one must contend with new goods that are so superior that they are beyond the level of simple quality improvement. The growth in the CPI will overstate the true increase in the cost of living.
The third factor typically left out by the real-wage pessimists is the growing fraction of compensation paid in the form of benefits. Reynolds points out that between 1973 and 2005, total compensation per hour (including health insurance, retirement, and other benefits as part of compensation) rose by almost 40 percent. Reynolds should have noted that this probably overstates slightly the growth of real hourly compensation because some of the shift in compensation towards non-wage benefits is due to rising marginal tax rates for middle-income workers.
From 1973 to 1981, marginal federal income tax rates for virtually all workers rose substantially because a brisk inflation pushed them into higher and higher tax brackets over time. Although most workers’ marginal income tax rates fell a few percentage points between 1981 and 1984, due to the Reagan tax cut, and then again after 1986, due to the 1986 Tax Reform Act, their tax rates for Social Security and Medicare rose throughout the 1980s due to the phased-in increases in tax rates legislated in 1977. When employers and employees get around an increase in marginal tax rates by paying more in the form of untaxed benefits, employees are better off, but they are not better off dollar for dollar. So, for example, if my employer and I can avoid a 40 percent hit on my marginal thousand dollars of income by his giving me that thousand dollars in the form of a slightly more generous health insurance plan, then as long as I value the increased benefits at more than $600, I am better off getting the benefit rather than the cash. That is why the growth in benefits somewhat overstates the growth in real compensation.
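The arithmetic in that example can be laid out explicitly. The $1,000 and the 40 percent rate come from the text; the $700 valuation is an assumption of mine, chosen only to satisfy the "more than $600" condition.

```python
# The benefits-vs-cash arithmetic from the text's example. The $1,000 of
# marginal compensation and the 40% marginal tax rate are from the text;
# the $700 valuation of the richer health plan is an assumed number.

marginal_comp = 1000.00
marginal_tax_rate = 0.40

cash_after_tax = marginal_comp * (1 - marginal_tax_rate)  # $600 in the pocket
benefit_value_to_worker = 700.00  # assumed: worker values the plan at $700

# Taking the benefit beats taking the cash ($700 > $600)...
assert benefit_value_to_worker > cash_after_tax
# ...but measured compensation ($1,000) still overstates the true gain:
overstatement = marginal_comp - benefit_value_to_worker
print(round(overstatement, 2))  # 300.0
```

The worker gains $700 of value, yet the compensation statistics record $1,000, which is why benefit growth somewhat overstates real compensation growth.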
Beyond these three factors—hourly wages, inflation adjustment, and value of benefits—that the wage pessimists tend to get wrong are two other factors, both of which Reynolds discusses. The first is that because of demographic changes, the growth of median wages, even if measured correctly, will not necessarily describe what happens to "typical" workers. The big demographic factor Reynolds points to is the addition of millions of low-wage immigrants. Their addition to the labor force will bring down median wages. Yet, notes Reynolds, that change "probably had no effect on those considered ‘typical’ (middle-income) except to hold down their cost of fast food or home and lawn care." And of course those low-wage immigrants are typically much better off at those low wages than they were in the situations they left in their native countries.
The final and most important factor is the growth in consumption at all income levels. Reynolds writes:
Unless the top 10-20 percent could somehow consume unlimited numbers of houses, cars, shirts, and steaks, it is difficult to imagine how each American’s [he should have said "the average American’s"] real consumption could have doubled if real wages and salaries had really been unchanged. The average size of new homes rose from 1,500 square feet in 1970 to 2,349 square feet in 2004, and the national home ownership rate rose from 62.9 percent in 1970 to 69.2 percent by the end of 2004. How could so many people be living in so much larger houses if only 10-20 percent had significant increases in income?
The above quote is not in itself a slam-dunk argument. Even though the average size of new homes increased a great deal, the average size of all homes would presumably have increased less. But Reynolds makes it a slam-dunk by citing data from the aforementioned Cox and Alm and from Kirk Johnson showing that the average poor family in 2001 did as well as or better than the average family in 1971 in ownership of motor vehicles, air conditioners, color TVs, refrigerators, VCRs, personal computers, and cell phones. Of course, the last three didn’t exist in 1971, but that’s part of the point. When poor families can afford what even middle-income families couldn’t imagine having 30 years earlier, aren’t things working out pretty well?