Kevin Drum - August 2011

Is It 1931 Again?

| Wed Aug. 10, 2011 11:56 PM EDT

Dan Drezner is feeling very gloomy tonight:

The start of the Great Depression is commonly assumed to be the October 1929 stock market crash in the United States. It didn't really become the Great Depression, however, until 1931, when Austria's Creditanstalt bank desperately needed injections of capital. Essentially, neither France nor England was willing to help unless Germany honored its reparations payments, and the United States refused to help unless France and the UK repaid their World War One debts. Neither of these demands was terribly reasonable, and the result was a wave of bank failures that spread across Europe and the United States.

The particulars of the current sovereign debt crisis are somewhat different from Creditanstalt, and yet it's fascinating how smart people keep referring back to that ignoble moment. The big commonality is that while governments might recognize the virtues of a coordinated response to big crises, they are sufficiently constrained by domestic discontent to not do all that much.

So... is this 1931 all over again?

Read the rest to see why Dan thinks it might be. My own take is that it's probably not, partly because 1931 already happened and we've learned at least a little something since then. As senseless as a lot of our recent political behavior has been, so far national governments have been willing to respond to prevent imminent catastrophes from getting out of hand. It might not happen until the last second, but in the end, they finally do the right thing — or at least enough of the right thing to keep things puttering along.

What national governments haven't shown the will to do is address issues that are slightly less than catastrophic. As a result, our recovery is going to be much slower than it needs to be, and the economy might even tip into a second recession. More than likely we'll avoid 1931, but I'm increasingly unsure we'll avoid 1981.


Liberals Have Been Played for Chumps

| Wed Aug. 10, 2011 4:25 PM EDT

Centrist moderate balanced nonpartisan independent columnist Matt Miller on his irritation with President Obama right now:

Here’s the thing. I know Tea Party Republicans were behind the debt-ceiling standoff that wreaked needless damage on confidence in the United States. I wrote weeks ago of Standard & Poor’s outrageous nerve in threatening a downgrade when America’s ability to pay its debts can’t possibly be in doubt. In short, I know who the real villains are at this volatile moment.

So why am I so mad at Barack Obama?

Feel free to click the link if you want to read more. I didn't much feel like bothering, myself.

Honest to God, Republicans must all be sitting in their back rooms and just cackling like hell right now. Think about it. They developed a strategy to hamstring the president completely — a strategy that's bulletproof thanks to our country's Constitution — knowing that it would rally their base but also hoping that it would cause moderates and lefties alike to become disgusted with Obama's weakness even though we all know who's really responsible for what's going on. And it worked! In fact, it's worked better than they could possibly have imagined. They can probably barely keep from spitting up their beers right now.

We are such chumps.

Raising Taxes May Be Popular, But Not Popular Enough

| Wed Aug. 10, 2011 3:31 PM EDT

Bruce Bartlett rounds up public sentiment on tax increases and finds that in poll after poll the public is strongly in favor of tackling the deficit not with spending cuts alone, but with spending cuts plus at least some tax increases. The majorities in favor of raising taxes range from 60% to 70%.

Which is all fine. Unfortunately, as with nearly all polls, these don't measure intensity of feeling. And I don't think anyone will be surprised if I suggest that the one-third of Americans opposed to tax increases feel really strongly about it, while the two-thirds who support them don't really care all that much. They're certainly nowhere near ready to kick politicians out of office for declining to vote for a tax increase.

This is, of course, the story of politics everywhere. A motivated minority trumps an apathetic majority every time. It always has and it always will. Until we can get people out in the streets with torches and pitchforks in favor of raising taxes on the rich, these polls just don't mean much.

The Obama Administration's Weird Home Rental Plan

| Wed Aug. 10, 2011 1:13 PM EDT

Fannie Mae and Freddie Mac — now owned by you, the taxpayer! — have foreclosed on lots of homes. Those foreclosed homes act as a drag on the housing market, but Fannie and Freddie are reluctant to just get rid of them once and for all by offering them in bulk at rock bottom prices. Why? Because that would cause us, the taxpayers, to lose even more money than we already have on Fannie and Freddie.

So now the Obama administration is "seeking investors' ideas" on a new proposal to rent out the homes instead of keeping them on the market:

One proposal would sell packages of hundreds or thousands of foreclosed properties in bulk to investors that agree to rent them out. That approach is preferred by the Department of Housing and Urban Development, which is taking back properties as defaults mount on loans backed by the FHA.

Another approach would let investors enter joint ventures with Fannie or Freddie to invest in a pool of converted rental homes. A national property-management business would handle day-to-day landlord responsibilities. Investors would pay for rehabbing and maintaining properties and would share revenue from monthly rental income and the ultimate sale of the property. Such a joint venture would be modeled on the Resolution Trust Corp., which sold failed banks' assets in the early 1990s.

Jared Bernstein thinks this is an idea worth trying, and that makes me loath to admit that I don't get this. But....I don't get it. Right now, investors are free to buy packages of foreclosed properties any time they want and then do whatever they like with them. Sell them, rent them, demolish them, whatever. The problem is that Fannie and Freddie are asking too high a price so no one is interested.

So what changes under this new proposal? Well, we're going to put a new restriction on what investors can do with their foreclosed properties: they'll only be allowed to rent them. What's more, apparently there will be some additional regulations to make sure that investors who participate in this plan will be good landlords. But restrictions and regulations make the properties less valuable, no? So investors will not only remain unwilling to pay Fannie and Freddie's asking price, they'll be even less willing than before because the properties now have additional encumbrances on them.
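To make that logic concrete, here's a minimal sketch in Python with entirely made-up numbers. The point is just that a rent-only restriction removes the resale option from the investor's menu, and removing options can never raise a rational bid:

```python
# A minimal sketch of the argument above, with entirely made-up numbers.
# An investor's maximum rational bid is the better of two exit strategies
# (resell now, or rent it out), net of rehab costs. A rent-only restriction
# deletes the resale option, so the bid can only stay the same or fall.

def max_bid(resale_value, annual_rent, years, discount_rate,
            rehab_cost, rent_only=False):
    """The most an investor would rationally pay for a foreclosed home."""
    # Present value of the rent stream (standard annuity formula).
    pv_rents = annual_rent * (1 - (1 + discount_rate) ** -years) / discount_rate
    if rent_only:
        options = [pv_rents]                # renting is the only allowed exit
    else:
        options = [resale_value, pv_rents]  # take whichever exit is worth more
    return max(options) - rehab_cost

# Hypothetical property: resellable for $120k, or rentable at $10k/year.
unrestricted = max_bid(120_000, 10_000, 15, 0.06, 20_000)
restricted = max_bid(120_000, 10_000, 15, 0.06, 20_000, rent_only=True)
print(f"Unrestricted bid: ${unrestricted:,.0f}")  # $100,000
print(f"Rent-only bid:    ${restricted:,.0f}")    # ~$77,000
```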

I just don't get this. This plan would presumably require F&F to offer their foreclosed homes at fire sale prices. But if we're willing to do that, why not just offer them at fire sale prices and be done with it? Why waste time with the rental plan? What am I missing here?

Optimal Healthcare Not As Easy As It Seems

| Wed Aug. 10, 2011 11:21 AM EDT

This chart, via Austin Frakt, is one of the weirdest I've seen in a while, so I'm going to inflict it on you just as an object lesson. (In what? you ask. I'm not sure. Ask me again some other time.)

The question at hand is: do doctors provide care closer to the optimal level if they're paid on a fee-for-service basis or on a capitation basis? On an FFS plan, they get reimbursed for every test and procedure they order, so you'd expect that they might provide too much care. On a capitation plan, they get a flat payment for each patient, so you'd expect that they might provide too little care. According to a clever recent experiment, both of these things are true, but in a surprising way.

In the chart below, Type 1 patients (1A through 1E) are average, Type 2 are healthier than average, and Type 3 are sicker than average. The black dots show "optimal" care levels for each type of patient, the red line shows the actual care provided under FFS, and the blue line shows the actual care provided under capitation. (Well, "actual" in an experimental sense, anyway. See note below.)

So what's the result? For average patients and healthy patients, capitation is great. Doctors working under this system provide almost exactly optimal levels of care, while doctors working under FFS strongly overtreat.

But what about patients who are sicker than average? For them, FFS doctors provide an almost exactly optimal level of treatment while capitation docs severely undertreat.

Apparently, as long as the level of treatment required (say, $3,000 worth) is less than the capitation payment (say, $5,000), doctors will provide all the care that's appropriate even if it means earning less money on each patient. But if the level of treatment goes above the capitation payment, they don't. They aren't willing to actually lose money on a patient.
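To see the logic, here's a toy model in Python. The dollar figures and per-unit costs are mine, purely for illustration (see the update below), not numbers from the experiment:

```python
# A toy model of the incentive story above, using my illustrative dollar
# figures rather than anything from the actual experiment (see update below).

def ffs_income(units_of_care, fee_per_unit=1_000, cost_per_unit=600):
    # Fee-for-service: every unit of care nets the doctor a margin,
    # so income rises with quantity -- the incentive is to overtreat.
    return units_of_care * (fee_per_unit - cost_per_unit)

def capitation_income(units_of_care, flat_payment=5_000, cost_per_unit=1_000):
    # Capitation: one flat payment no matter how much care is delivered,
    # so every unit of care is pure cost to the doctor.
    return flat_payment - units_of_care * cost_per_unit

# Assumed optimal quantities of care for each patient type.
optimal_units = {"healthy": 2, "average": 4, "sick": 7}

for patient, units in optimal_units.items():
    income = capitation_income(units)
    # The pattern in the chart: capitation doctors deliver optimal care as
    # long as it leaves them in the black, and stop short when it wouldn't.
    verdict = "treats optimally" if income >= 0 else "undertreats"
    print(f"{patient:>7}: capitation income at optimal care = {income:>6,} -> {verdict}")

# Under FFS, by contrast, income grows with every unit delivered:
print("FFS income at 2, 4, 7 units:", [ffs_income(u) for u in (2, 4, 7)])
```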

Ideally, then, we'd like to compensate doctors on a capitation basis for most patients, but on an FFS basis for the 5-10% of the sickest patients. Figuring out just how to do that, however, is a considerable challenge.

NOTE: This was an experiment with medical students, not a real-life study with actual doctors. So obviously take it with a big grain of salt. Still, there was actual money at stake, and the design of the experiment was pretty good. It's not a slam-dunk case by any means, but it's highly suggestive of how things might work in real life.

UPDATE: An email exchange with Austin suggests that I should explain a couple of things. First, by "weird" I only meant that the chart is fairly complex and hard to understand at a glance, not that the results themselves are weird. Actually, the results seem pretty straightforward.

Second, those dollar figures I used have nothing to do with the chart or the experiment. They were just illustrative, on the assumption that for average and healthy patients the cost of proper care is less than the capitation payment, while for sick patients the cost of proper care is higher than the capitation payment. Apologies if this was unclear.

Yes, Virginia, a Double-Dip Recession is Possible

| Wed Aug. 10, 2011 10:23 AM EDT

Karl Smith says his internal models strongly suggest a double-dip recession, but he just can't bring himself to believe it:

I look at a lot of fundamentals but at the end of the day the money markets drive my forecasts. The money markets are telling me in every possible way that recession is coming. Liquidity demand is rising, inflation expectations are falling, nominal interest rates are collapsing.

However, like Leamer in 2007, I am hard-pressed to see what is left to recess. At the time Leamer doubted a recession because he didn’t think there were enough manufacturing jobs left to lose.

This time, I look at construction and local government and think the same thing. The cyclical employment sectors are already so far down. Are we going to start losing jobs in Health Care and Education at this point?

I don't know that I can bring myself to believe it either. Then again, in 1931, guess what? It hardly seemed possible, but things got worse! The truth is that as long as insane conservatives continue to drive our national economic policy, a double-dip recession is not only possible, it's likely. They simply show no signs of stopping their madness, and most of the mainstream press and punditocracy aren't numerate enough to recognize what's really going on. We are trapped in a cycle of insanity that's truly Kafkaesque.


Restaurant Tipping: 15 or 20 Percent?

| Tue Aug. 9, 2011 9:20 PM EDT

LA Weekly's Jonathan Gold has some advice about tipping:

Tip 20 percent. Every time. Pre-tax? Post-tax? In practice the difference is no more than a buck or two....Yes, I know your parents still talk about when the recommended percentage used to be 15 percent, and that the practice is considered barbaric in Japan. But it's not 1973, and you're probably not in Osaka at the moment. 20 percent.

I figure this is something readers might actually know about, so: When did this change? After 1973, apparently, but that's a little vague. And why? Do food servers make less in ordinary wages than they used to? I don't think that's the case, though I might be wrong. And since a tip is a percentage of the bill, tipping seems perfectly designed to keep up with the cost of living. And it has: to geek out about this a bit, the chart on the right shows headline inflation vs. the inflation rate for "food away from home." There's a slight divergence during the recent recession, but that's it. Overall, the two rates have been pretty much the same: restaurant bills have gone up as much as everything else.
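For what it's worth, here's the quick arithmetic behind Gold's "no more than a buck or two" claim. The sales tax rate and check size are assumptions I made up for illustration:

```python
# The arithmetic behind "no more than a buck or two," assuming an 8.75%
# sales tax (a made-up rate -- yours will vary) and a $60 dinner check.

def tip(check_pretax, rate, sales_tax=0.0875, on_posttax=False):
    base = check_pretax * (1 + sales_tax) if on_posttax else check_pretax
    return base * rate

check = 60.00
print(f"20% of pre-tax total:  ${tip(check, 0.20):.2f}")                   # $12.00
print(f"20% of post-tax total: ${tip(check, 0.20, on_posttax=True):.2f}")  # $13.05
print(f"pre/post difference:   ${tip(check, 0.20, on_posttax=True) - tip(check, 0.20):.2f}")
print(f"15% vs. 20% (pre-tax): ${tip(check, 0.20) - tip(check, 0.15):.2f}")  # $3.00
```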

Anyway, I'm not trying to campaign for stingier tips for food servers. I'm fine with 20 percent, and it's certainly easier to calculate. But I'm trying to distract myself from the grim political news of late, and I'm just sort of curious about when and why the recommended practice changed. Or is Jonathan Gold wrong?

The Economy: A Play in Four Short Acts

| Tue Aug. 9, 2011 3:05 PM EDT

Shorter Fed: The economy sucks really badly, but we're not going to do anything about it. Have a nice day!

Shorter Republicans: Pain is good for you, so we're not going to do anything either. Or allow anyone else to do anything. See you next November!

Shorter Democrats: We'd like to do something but there's nothing we can do. Sorry, folks!

Shorter Obama: Prosperity is just around the corner. This time for sure. Clap louder!

Healthcare and the Free Market

| Tue Aug. 9, 2011 12:16 PM EDT

There have been a lot of shortages of generic cancer drugs lately, and Ezekiel Emanuel says part of the reason is related to reforms that were part of the Medicare prescription drug legislation of 2003. The nickel explanation is that the act required Medicare to pay physicians based on a drug's actual average selling price, with price increases limited to 6% every six months:

The act had an unintended consequence. In the first two or three years after a cancer drug goes generic, its price can drop by as much as 90 percent as manufacturers compete for market share. But if a shortage develops, the drug's price should be able to increase again to attract more manufacturers. Because the 2003 act effectively limits drug price increases, it prevents this from happening. The low profit margins mean that manufacturers face a hard choice: lose money producing a lifesaving drug or switch limited production capacity to a more lucrative drug.
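Just to put rough numbers on why the cap bites so hard, here's a back-of-the-envelope calculation. The 90% drop and the 6% cap come from Emanuel; the recovery target is my own assumption:

```python
import math

# Back-of-the-envelope math using the figures quoted above: a generic's
# price can fall ~90%, and the 2003 act caps reimbursement growth at 6%
# per six months. Suppose (my assumption, purely for illustration) that
# price needs to recover to half its original level to attract new
# manufacturers during a shortage. How long would that take?

price_after_drop = 0.10   # 90% decline, as a fraction of the original price
growth_cap = 1.06         # maximum allowed increase per six-month period
recovery_target = 0.50    # assumed level needed to draw manufacturers back

periods = math.ceil(math.log(recovery_target / price_after_drop)
                    / math.log(growth_cap))
print(f"{periods} six-month periods, i.e. about {periods / 2:.0f} years")
# -> 28 periods, roughly 14 years: far too slow to relieve a shortage,
#    which is exactly the mechanism Emanuel describes.
```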

Megan McArdle says this is a good example of why it's a bad idea for bureaucrats to think they can control market forces:

Things like this are the root of my skepticism about technocratic rule-making. I have no doubt that the earnest people who drafted this rule spent a lot of time thinking about whether the allowed price increase should be 5% or 7%. But they somehow overlooked a rather significant feature of the market they were regulating, and the effect that their rule would have when it interacted with market reality. The more complex your system of rules is, the harder it is to keep track of these potential unwanted side effects.

Maybe. But I have one question: how do other countries handle this? That is, other countries like France and Germany and Sweden and Japan where price controls for pharmaceuticals are stricter than they are here? Emanuel answers this question later in his op-ed: "Most of Europe, where brand-name drugs are cheaper than in the United States, while generics are slightly more expensive, has no shortage of these cancer drugs." Then this: "A more radical approach would be to take Medicare out of the generic cancer drug business entirely. Once a drug becomes generic, Medicare should stop paying, and it should be covered by a private pharmacy plan."

In other words, making this particular segment of the pharmaceutical business more market-friendly might well be a good idea. At the same time, it's also obvious that every other country in the world seems to have addressed this problem without any of the difficulties we faced. Writing decent regs isn't impossible, and it's especially not impossible if you're willing to look at what other countries do and learn from them. This was, to put it gently, not something that the Republicans who designed Medicare Part D were willing to consider.

Oh, That Old-Timey Movie Accent!

| Tue Aug. 9, 2011 11:06 AM EDT

James Fallows was watching some old movies recently and has a question:

The language that the narrator, one Gayne Whitman, uses is florid enough. But his accent! It's instantly familiar to anyone who's seen old movies and newsreels from the 1930s and 1940s. But you cannot imagine a present-day American using it with a straight face. It's not faux-British, but it's a particular kind of lah-dee-dah American diction that at one time was very familiar and now has vanished.

Even without watching the clip, you probably know exactly what he's talking about. I always assumed that this was an artificial construction, and one of Fallows's readers confirms this:

The accent you are wondering about is the Transatlantic accent, also called the Midatlantic accent. This was not a regional accent. Rather, it's an accent that was taught to actors and announcers. I learned about this accent from Amy Walker's "21 Accents" video on YouTube. She starts using the Transatlantic accent at the 2:12 mark.

I assume that this accent was an artifact of live theater that got transplanted into movies during their first few decades, and it's the main thing that makes old movies nearly unwatchable to me. Obviously everyone has their own idea of what makes acting great, but for me it's first, foremost, and almost exclusively voice: the ability to precisely control tone, pace, pitch, timbre, tempo, modulation, resonance, accent, and so forth. Actors who can do this are great even if they have limited proficiency in all the other arts of acting; actors who can't are terrible no matter what else they do well.

That's how I respond to acting, anyway, and the Transatlantic accent in movies of the 30s and 40s almost entirely ruins them for me. Anyone else feel this way?