
Tuesday, January 27, 2015

What is Noah thinking? Part 2

Noah Smith has replied to my recent post criticizing his use of median household income to measure middle class living standards. He raises some interesting questions, but some of them still leave me scratching my head.

Smith writes:
I don't understand the idea that "households have had to" compensate for lower weekly wages (also the choice of weekly over hourly wages continues to mystify me, since long workweeks suck, but OK).
Of course, no one literally had to compensate for the fact that their wages were falling, but people do prefer to maintain (if not improve!) their current level of consumption. So, if wages are falling, and you want to maintain your standard of living, you have to adjust something.

Make no mistake: real wages were falling (see Table B-15), and even today they remain below their all-time peak. The Bureau of Labor Statistics (BLS) uses 1982-84 as its base period for inflation calculations, so the numbers that follow are in 1982-84 dollars. Smith likes to talk about 1980-2000, working from one business cycle peak to another, but he ignores the previous business cycle peak, 1973, which is especially important because it is the year of peak real hourly wages (tied with 1972), and real weekly wages that year were just 37 cents below the 1972 level. It seems to me more interesting to ask whether middle class workers are as well off as they were at their peak than whether they are as well off as they were in 1979 or 1980.

What happened to real wages? In inflation-adjusted dollars, the hourly wage peak of 1972-73 was $9.26 per hour; in 1980 it was $8.26 per hour, in 2000 it was $8.30 per hour, and in 2013 it had increased to $8.78 per hour.

But it's actually worse than that. Unlike professors, whose working time is pretty much their own as long as they teach well enough and publish enough to get tenure, most people cannot choose how much they work: their employer decides that for them. Your paycheck is your hourly wage times the hours you work, and average weekly hours for production and non-supervisory workers have fallen from 36.9 in 1972 and 1973 to a low of 33.1 in 2009, recovering only to 33.7 in 2013. That is why we can't look at just the hourly wage, but need to use the weekly wage (hours per year would be an even better metric, but the BLS does not publish the data that way). By the way, this category of workers is no small slice: it makes up about 62% of the entire non-farm workforce and 80% of the non-government workforce.
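
To make that arithmetic concrete, here is a minimal sketch in Python using only the hourly-wage and weekly-hours figures quoted in this post; the published weekly series differs slightly because the published averages are themselves rounded.

    # A quick check that weekly pay is the hourly wage times weekly hours,
    # using only the figures quoted above (1982-84 dollars).
    hourly = {1972: 9.26, 2013: 8.78}   # real hourly wage
    hours = {1972: 36.9, 2013: 33.7}    # average weekly hours
    for year in sorted(hourly):
        weekly = hourly[year] * hours[year]
        print(f"{year}: {hourly[year]:.2f}/hr x {hours[year]:.1f} hrs = ${weekly:.2f}/week")
    # 1972: 9.26/hr x 36.9 hrs = $341.69/week   (published weekly figure: $341.73)
    # 2013: 8.78/hr x 33.7 hrs = $295.89/week   (published weekly figure: $295.51)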

Because both real hourly wages and hours per week fell, real weekly earnings fell even more: from $341.73 (again, 1982-84 dollars) in 1972 to $290.80 in 1980, $284.78 in 2000, and $295.51 in 2013. If you're keeping track at home, that's a fall of 15% from 1972 to 1980, a 16.67% fall from 1972 to 2000, and still 13.5% below the 1972 peak in 2013.
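
The percentage falls follow directly from those weekly figures; here is the calculation spelled out:

    # Percentage declines in real weekly earnings from the 1972 peak,
    # computed from the figures quoted above (1982-84 dollars).
    weekly = {1972: 341.73, 1980: 290.80, 2000: 284.78, 2013: 295.51}
    peak = weekly[1972]
    for year in (1980, 2000, 2013):
        drop = (peak - weekly[year]) / peak * 100
        print(f"{year}: {drop:.1f}% below the 1972 peak")
    # 1980: 14.9% below the 1972 peak
    # 2000: 16.7% below the 1972 peak
    # 2013: 13.5% below the 1972 peak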

But wait! After directly quoting and discussing what I said about real weekly wages, Smith suddenly, with no documentation, rejects that premise: "So, median real wages in America stayed roughly flat in America in 1980-2000, and people worked more - actually, what happened is that many women stopped being housewives and began working." Nothing I said in my post was about median real wages; I was writing about the weekly real wages of production and non-supervisory workers. And he knows this is my view, because he clicked through, and even linked to, my much longer post, "The best data on middle class decline." He suddenly introduces a different measure entirely and gives no argument for why it's better.

That's not to say there's no argument one could make for that measure. In fact, Dean Baker pointed out to me in an email a few years back that the real weekly wage measure I use leaves out McDonald's supervisors, who are certainly not well paid by any stretch of the imagination. But by the same token, it also leaves out everyone from the CEO on down. To clarify the difference: my preferred measure is the average (mean) wage of a group that is roughly the bottom 62% of workers, while the median is of course the median wage of everyone. Which better captures the situation of the middle class?
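
To see how the two measures can point in different directions, here is a toy example with made-up wages (purely hypothetical numbers, chosen only to illustrate the distinction):

    # Toy example: mean of roughly the bottom 62% of workers versus the
    # median of everyone, using a small set of hypothetical weekly wages.
    wages = sorted([8, 9, 10, 11, 12, 14, 18, 25, 40, 120])
    bottom_62 = wages[:int(len(wages) * 0.62)]

    mean_bottom_62 = sum(bottom_62) / len(bottom_62)
    median_all = (wages[len(wages) // 2 - 1] + wages[len(wages) // 2]) / 2  # even count

    print(f"mean of bottom 62%: {mean_bottom_62:.1f}")   # 10.7
    print(f"median of everyone: {median_all:.1f}")       # 13.0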

I submit that my measure is better. Smith is right that real median wages stayed fairly flat from 1980 to 2000, and in fact they have also varied little from 2000 to 2013. But that does not seem consistent with increasing cries of economic distress, as people have lost jobs, homes, and, too often, their pensions. I have always thought that declining real wages were fairly invisible at first: people might have made less than the previous generation, but for any given individual that effect was offset by increasing experience over time, leading to a slightly higher real income for that person. It was only when large numbers of people began to lose jobs, and could not find anything that paid as much as their old job, that middle class decline rose to public consciousness.

Because he shifted measures in midstream, Smith's final comments are rather less compelling. The shift precludes the answer that women entered the workforce in large numbers from 1980 to 2000 to help maintain consumption, a position buttressed by the finding of Elizabeth Warren and her co-researchers that private debt has increased sharply. I would also point out that women entered the workforce more rapidly in the 1973-79 period (when incomes were falling most consistently) than in 1980-2000, as a close inspection of this FRED chart will show. Finally, going back to Warren's work, she argues that the rapid rise of home prices swallowed up the nominal income increase from women entering the workforce, and that the average house grew in size by less than half a room in over 20 years, even while new single-family houses were growing by the much larger percentages Smith gave in his initial article.
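
For readers who want to check the FRED comparison themselves, here is a rough sketch using pandas_datareader; the series ID LNS11300002 (which I believe is the women's labor force participation rate) and the January-to-January averaging are my own assumptions, not something taken from the chart itself.

    # Rough sketch: average annual rise (percentage points per year) in women's
    # labor force participation, 1973-79 versus 1980-2000, pulled from FRED.
    # Assumes pandas_datareader is installed; LNS11300002 is assumed to be the
    # women's labor force participation rate series.
    from pandas_datareader import data as pdr

    lfp = pdr.DataReader("LNS11300002", "fred", "1973-01-01", "2000-12-31")["LNS11300002"]

    def pts_per_year(series, start_year, end_year):
        """Percentage-point change per year between two January observations."""
        first = series.loc[f"{start_year}-01-01"]
        last = series.loc[f"{end_year}-01-01"]
        return (last - first) / (end_year - start_year)

    print("1973-1979:", round(pts_per_year(lfp, 1973, 1979), 2), "pts/year")
    print("1980-2000:", round(pts_per_year(lfp, 1980, 2000), 2), "pts/year")

If the reading of the chart above is right, the first number should come out larger.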

Hence, Smith's claim that I'm saying increased women's labor force participation "represents a deterioration in the living standards of the average American" is mistaken; it rests on his using a different measure of middle class income than I do. I have tried to indicate why I think my measure is better, and on my measure, women's increased labor force participation was indeed, at least in part, a response to the falling incomes his measure doesn't show.

7 comments:

  1. There is a serious conflation of cause and effect in this analysis.

    The Equal Credit Opportunity Act of 1974, if I recall correctly, required banks to consider the wife's income when determining eligibility for a mortgage. This flood of money into the market drove up real estate prices to the point that two incomes are almost NECESSARY to buy. Rent was also bid up.

    So women in the workforce, and in the housing market, drove up inflation (w/o an increase in wage) which in turn got more women into the labor market, which drove wages down at the same time they were bidding prices up.

    Result: stagflation

    We are now seeing similar effects with the illegal immigrants taking jobs. Wages go down relative to prices, because they bid up the prices and bid down the wages.

    1. Horace Boothroyd III, February 1, 2015 at 6:37 PM

      Exactly: every problem in our society is caused by government screwups, and everything would be perfect if only they would disband and allow us to go about our business unmolested. Now if only we could get people to stop blurting out simpleminded one dimensional pseudo analyses of complicated problems then we all could be cooking with gas.

    2. Ah, yes, the "All-or-Nothing" argument. Either ALL problems are due to government, or none are. Brilliant.

  2. Average Hourly Earnings (AHE) deflated by the Personal Consumption Expenditures Price Index (PCEPI, blue line) and by the Consumer Price Index-Urban (CPI-U, red line), normalized to January 1970.

    The PCEPI, not the CPI-U, is the deflator used in the National Income and Product Accounts and shows that AHE is 25% higher in 2014 relative to 1970.

    FRED chart.
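
    A rough sketch of how such a chart can be rebuilt, assuming pandas_datareader and guessing at the FRED series IDs (AHETPI for production and non-supervisory hourly earnings, PCEPI, and CPIAUCSL for the CPI-U):

        # Average hourly earnings deflated by the PCE price index and by the
        # CPI-U, each rebased so that January 1970 = 100.
        # FRED series IDs are guesses: AHETPI, PCEPI, CPIAUCSL.
        import pandas as pd
        from pandas_datareader import data as pdr

        raw = pdr.DataReader(["AHETPI", "PCEPI", "CPIAUCSL"], "fred",
                             "1970-01-01", "2014-12-31").dropna()

        real_pce = raw["AHETPI"] / raw["PCEPI"]     # AHE deflated by PCEPI
        real_cpi = raw["AHETPI"] / raw["CPIAUCSL"]  # AHE deflated by CPI-U

        rebased = pd.DataFrame({
            "AHE/PCEPI": 100 * real_pce / real_pce.iloc[0],
            "AHE/CPI-U": 100 * real_cpi / real_cpi.iloc[0],
        })
        print(rebased.loc["2014-01-01"])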

    1. Hi marmico, the problem is that the PCE price index is not intended to represent the prices individual consumers pay. Moreover, the supposed greater accuracy of chained indices in reflecting substitution actually obscures the compromises consumers make to deal with rising prices. I think Katha Pollitt said it best: "...switching to cheaper products is what having a lowered standard of living *is*." (Pollitt)

      Moreover, in a paper by two BEA staffers and one BLS staffer, they write this about the difference between the CPI-U and the PCEPI: "The CPI measures the change in prices paid by urban consumers for a market basket of consumer goods and services; it is primarily used as an economic indicator and as a means of adjusting current-period data for inflation. The PCE price index measures the change in prices paid for goods and services by the personal sector in the U.S. national income and product accounts; it is primarily used for macroeconomic analysis and forecasting." BEA/BLS paper (footnote 1)

      So I continue to think that CPI-U is the better measure for inflation as experienced by individuals.

    2. It seems to me that the reconciliation is that the 2014 AHE deflated by the CPI-U has 25% more PCE purchasing power relative to the 1970 AHE.

    3. marmico, if I understand you correctly, it would seem the answer is that individuals don't spend their money in PCE world, but in CPI world. And part of the gain in PCE world is fictitious, anyway, since it involves doing things like downsizing from a one-bedroom apartment to a studio if you can no longer afford the one-bedroom (example is from Pollitt above).

      But I'm not 100% sure I'm understanding your point about "reconciliation." There was a discussion of reconciling the CPI-U and PCE in the BEA/BLS paper I linked, so perhaps it is coloring my views.
