Donald Trump, having discovered that raising the minimum wage is popular, has suddenly jumped on the bandwagon. He now claims to favor raising the federal minimum wage to $10 per hour. I will leave it to you to decide if you believe him.
Trump's flip-flopping aside, James Pethokoukis has a few points to make. Here's the first one:
As Scott Winship has argued, using the proper inflation adjustment would mean a roughly $8.50 modern minimum to match its 1968 level. And the current minimum is pretty much what the average minimum was from 1960 to 1980 before its steady decline during the 1980s. So a jump to $10, much less the $15 Democrats want, is a pretty big jump. What's more, government-mandated wage floors are particularly problematic in a big country like America where living costs vary greatly by region.
In 1968 the minimum wage was $1.60. If you adjust for inflation, that comes to $11.08. So why does Scott Winship say it only comes to $8.50?
The answer has to do with which inflation measure you use. If you use the usual CPI indicator that gets reported in the news every month, prices have risen 6.9x since 1968. If you use the PCE indicator, they've risen only 5.3x. So which is correct?
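The arithmetic behind the two figures is simple enough to check directly. A quick sketch, using the multipliers cited above (these are the post's round numbers, not fresh computations from BLS or BEA data):

```python
# Adjust the 1968 federal minimum wage to modern dollars under
# the two inflation measures discussed in the post.

MIN_WAGE_1968 = 1.60   # federal minimum wage in 1968, dollars/hour

CPI_MULTIPLIER = 6.9   # CPI-based price growth since 1968 (post's figure)
PCE_MULTIPLIER = 5.3   # PCE-based price growth since 1968 (post's figure)

cpi_adjusted = MIN_WAGE_1968 * CPI_MULTIPLIER
pce_adjusted = MIN_WAGE_1968 * PCE_MULTIPLIER

print(f"CPI-adjusted 1968 minimum: ${cpi_adjusted:.2f}")
print(f"PCE-adjusted 1968 minimum: ${pce_adjusted:.2f}")
```

With these rounded multipliers you get roughly $11 under CPI and roughly $8.50 under PCE, which is the whole gap between the two camps.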
I don't have the chops to adjudicate this, and anyway, the real answer is: it depends. They both have advantages and disadvantages depending on what you're interested in. However, without getting into all the gory details, I want to make a couple of points.
First, CPI measures only money that consumers spend out of pocket. PCE also counts money spent on consumers' behalf by businesses and the government. The place where this makes the biggest difference is healthcare spending. Consumers generally spend a fairly small amount on medical care themselves (copays, deductibles, etc.), with the vast bulk being covered by insurance or the government. As a result, medical expenses account for about 6 percent of the CPI index, but a whopping 20 percent of the PCE index.
But if medical spending accounts for a bigger percentage of the PCE index, something else must be lower. It all has to add up to 100 percent, after all. As it turns out, there are several differences in weighting, but the biggest by far is housing. Primary shelter accounts for only 15 percent of the PCE index, but 33 percent of the CPI index.
So which is more accurate? Again, it depends on what you're interested in. But without making any sweeping statements on one side or the other, I'll say this: for the poor, CPI is almost certainly more accurate. I can't prove this with the BLS survey numbers used to construct the CPI, since the accuracy of those numbers is precisely what we're arguing about. But consider two things:
- The poor do, in fact, say that they spend about 40 percent of their income on housing (compared to about 30 percent for the middle class and above).
- Common sense suggests that this is right. Do you really think that a family earning $25,000 spends only $300 per month on rent? Likewise, do you think they spend $5,000 per year on medical care?
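That common-sense check can be made concrete. A minimal sketch, applying the PCE weights quoted above (housing ~15 percent, medical ~20 percent) to a hypothetical family earning $25,000 a year:

```python
# What the PCE weights would imply about a low-income family's
# budget, using the shares quoted in the post. The $25,000 income
# is the post's hypothetical example.

INCOME = 25_000            # hypothetical annual family income

PCE_HOUSING_SHARE = 0.15   # primary shelter share of the PCE index
PCE_MEDICAL_SHARE = 0.20   # medical share of the PCE index

implied_monthly_rent = INCOME * PCE_HOUSING_SHARE / 12
implied_annual_medical = INCOME * PCE_MEDICAL_SHARE

print(f"Implied rent: ${implied_monthly_rent:,.0f}/month")
print(f"Implied medical spending: ${implied_annual_medical:,.0f}/year")
```

The weights imply roughly $300 a month in rent and $5,000 a year in medical spending, which is exactly the implausible pair of numbers in the bullet above.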
It's hardly conceivable that the PCE weights are anywhere near representative of the real-life expenditures of the poor, and these are the people who are affected by the minimum wage. In particular, housing prices are a big expense for the poor, and housing costs have increased 7.4x since 1968.
I'm generally loath to play too many games with inflation measures, since you can very quickly get into a quagmire of cherry-picking just the bits and pieces that help your argument. But in this case it really does seem clear: when it comes to the minimum wage, the lived experience of the poor over the long term is much closer to the CPI than to the PCE. A minimum wage of $10 would get us back to roughly where we were in the late 60s and early 70s. Is there really a good reason we shouldn't do that?