Smith writes:
I don't understand the idea that "households have had to" compensate for lower weekly wages (also the choice of weekly over hourly wages continues to mystify me, since long workweeks suck, but OK).

Of course, no one literally had to compensate for the fact that their wages were falling, but people do prefer to maintain (if not improve!) their current level of consumption. So, if wages are falling, and you want to maintain your standard of living, you have to adjust something.
Make no mistake, real wages were falling (see Table B-15), and even today they remain below their all-time peak. The Bureau of Labor Statistics (BLS) uses 1982-84 as its base period for inflation calculations, so the numbers that follow are in 1982-84 dollars. Smith likes to talk about 1980-2000, working from one business cycle peak to another, but he ignores the previous business cycle peak, 1973, which is an especially interesting and important one: it was the year of peak real hourly wages (tied with 1972), and real weekly wages that year were just 37 cents below the 1972 figure. It seems to me more interesting to ask whether middle class workers are as well off as they were at their peak than to ask whether they are as well off as they were in 1979 or 1980.
What happened to real wages? In inflation-adjusted dollars, the hourly wage peak of 1972-73 was $9.26 per hour; in 1980 it was $8.26 per hour, in 2000 it was $8.30 per hour, and in 2013 it had increased to $8.78 per hour.
But it's actually worse than that. Unlike professors, whose working time is pretty much their own as long as they teach well enough and publish enough to get tenure, most people cannot choose how much they work: their employer decides that for them. Your paycheck is your hourly wage times your hours worked, and the average workweek of production and non-supervisory workers has fallen from 36.9 hours in 1972 and 1973 to a low of 33.1 hours in 2009; in 2013 it was 33.7 hours. That is why we can't look at just the hourly wage, but need to use the weekly wage (hours per year would be an even better metric, but the BLS does not publish the data that way). By the way, this category of workers is no small slice: it makes up about 62% of the entire non-farm workforce and 80% of the non-government workforce.
Because both real hourly wages and hours per week fell, real weekly earnings fell even more: from $341.73 (again, in 1982-84 dollars) in 1972 to $290.80 in 1980, $284.78 in 2000, and $295.51 in 2013. If you're keeping track at home, that's a 15% fall from 1972 to 1980, a 16.67% fall from 1972 to 2000, and a 2013 level still 13.5% below the 1972 peak.
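For anyone who wants to verify the arithmetic, here is a minimal sketch in Python using only the figures quoted above (the small gaps between the hourly-times-hours products and the published weekly numbers presumably reflect rounding in the underlying BLS series):

```python
# A back-of-the-envelope check of the figures above, all in 1982-84 dollars.
hourly = {1972: 9.26, 2013: 8.78}           # real hourly wage ($/hour), from the text
hours  = {1972: 36.9, 2013: 33.7}           # average weekly hours, from the text
weekly = {1972: 341.73, 1980: 290.80,       # published real weekly earnings ($/week)
          2000: 284.78, 2013: 295.51}

# Weekly pay is the hourly wage times hours worked, so the two years
# where both figures appear above can be cross-checked directly.
for year in (1972, 2013):
    approx = hourly[year] * hours[year]
    print(f"{year}: {hourly[year]} x {hours[year]} = {approx:.2f} "
          f"(published weekly figure: {weekly[year]})")

# Percentage declines from the 1972 peak, using the weekly series itself.
peak = weekly[1972]
for year in (1980, 2000, 2013):
    drop = (peak - weekly[year]) / peak * 100
    print(f"{year}: {drop:.2f}% below the 1972 peak")
```

Running this reproduces the declines cited above: about 14.9% by 1980, 16.67% by 2000, and 13.5% as of 2013.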
But wait! After directly quoting and discussing what I said about real weekly wages, Smith suddenly, with no documentation, rejects that premise: "So, median real wages in America stayed roughly flat in America in 1980-2000, and people worked more - actually, what happened is that many women stopped being housewives and began working." Nothing I said in my post was about median real wages; I was writing about the weekly real wages of production and non-supervisory workers. And he knows this is my view, because he clicked through to, and even linked to, my much longer post, "The best data on middle class decline." He suddenly introduces an entirely different measure, and gives no argument for why it's better.
That's not to say there's no argument one could make for that measure. In fact, Dean Baker pointed out to me in an email a few years back that the real weekly wage measure I have used does not include McDonald's supervisors, who are certainly not well paid by any stretch of the imagination. But likewise, it does not include everyone from the CEO on down. To clarify the difference: my preferred measure is the average (mean) wage of roughly the bottom 62% of workers, while the median is, of course, the median of everyone. Which better captures the situation of the middle class?
I submit that my measure is better. Smith is right that real median wages stayed fairly flat from 1980 to 2000, and in fact they have also varied little from 2000 to 2013. But that does not seem consistent with increasing cries of economic distress, as people have lost jobs, homes, and, too often, their pensions. I have always thought that declining real wages were fairly invisible at first: people might have made less than the previous generation, but for any given individual that effect was offset by increasing experience over time, leading to a slightly higher real income for that person. It was only when large numbers of people began to lose jobs, and could not find anything that paid as much as their old jobs, that the issue of middle class decline rose more fully into public consciousness.
Because Smith shifted measures in midstream, his final comments are rather less compelling. This shift precludes the answer that women entered the workforce in large numbers from 1980 to 2000 to help maintain consumption, a position buttressed by the finding of Elizabeth Warren and her co-researchers that private debt increased sharply. I would also point out that women entered the workforce more rapidly in the 1973-79 period (when incomes were falling most consistently) than in 1980-2000, as a close inspection of this FRED chart will show. Finally, going back to Warren's work, she argues that the rapid rise of home prices swallowed up the nominal income increase from women entering the workforce, and that the average house grew in size by less than half a room over more than 20 years, even while new single-family houses were growing by the much larger percentages Smith gave in his initial article.
Hence, Smith's claim that I'm saying increased women's labor force participation "represents a deterioration in the living standards of the average American" is mistaken; it rests on his using a different measure of middle class income than I do. I have tried to indicate why I think my measure is better, and that measure would make women's labor force participation indeed, at least in part, a response to the falling incomes his measure doesn't show.