Kevin Drum

Why Do We Give Medical Treatment That Increases Patients' Chances of Dying?

Tue May 12, 2015 9:00 AM EDT

While Kevin Drum is focused on getting better, we've invited some of the remarkable writers and thinkers who have traded links and ideas with him from Blogosphere 1.0 to this day to contribute posts and keep the conversation going. Today we're honored to present a post from Aaron E. Carroll.

I saw this study a few weeks ago on blood pressure treatment for nursing home residents, and I almost ignored it. There are so many like it. But it's just ridiculous that this kind of stuff continues, and that we can't seem to do anything about it.

We know that in many people, high blood pressure is bad. We therefore try to lower it. But then we go ahead and decide that if lowering blood pressure in some people is good, it must be good for everyone. In frail, elderly people, however, there's no evidence for this—and there may be evidence that lowering blood pressure is a bad idea. But that runs counter to what we've always been told, so many ignore it.

This was a longitudinal study of elderly people living in nursing homes, meaning that the authors recruited people there and then followed them for about two years. They were interested in seeing how different aspects of care were related to the subjects' chance of dying. Almost 80 percent of them were being treated for high blood pressure (in spite of the above). A previous analysis of this study had shown that blood pressure was inversely related to all-cause mortality "even after adjusting for several confounders, such as age, sex, history of previous cardiovascular (CV) disease, Charlson Comorbidity Index score, cognitive function (Mini-Mental State Examination), and autonomy status (activities of daily living)." This study went further, to look at whether being on lots of drugs for high blood pressure was bad—even after controlling for the blood pressure relationship.

Patients in this study were taking an average of seven drugs, including at least two for high blood pressure.

What the study found, to no one's real surprise, is that the people on two or more blood pressure medications who had a systolic blood pressure of less than 130 mm Hg had a significantly higher all-cause mortality. This held true even after additionally adjusting for propensity score–matched subsets, other cardiovascular issues, and the exclusion of patients without a history of hypertension who were receiving BP-lowering agents.
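For readers who want to see the mechanics, here's a toy illustration of what an "adjusted" analysis like this looks like: a minimal sketch on simulated data, where the cohort size, variables, and effect sizes are all invented for illustration rather than taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cohort -- NOT the study's data. All numbers are made up
# purely to show the structure of an adjusted mortality analysis.
rng = np.random.default_rng(0)
n = 1100                                    # hypothetical cohort size
age = rng.normal(88, 5, n)
comorbidity = rng.poisson(2, n)             # Charlson-like score
sbp = rng.normal(135, 18, n)                # systolic BP, mm Hg
low_bp_multi_rx = ((sbp < 130) & (rng.random(n) < 0.5)).astype(int)

# Assume (for illustration) higher mortality in the low-SBP,
# multiple-drug group, plus independent age/comorbidity effects.
logit = -6 + 0.04 * age + 0.25 * comorbidity + 0.8 * low_bp_multi_rx
died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame(dict(died=died, age=age, comorbidity=comorbidity,
                       sbp=sbp, low_bp_multi_rx=low_bp_multi_rx))

# "Controlling for" confounders means the exposure coefficient is
# estimated with age, comorbidity, and BP itself held fixed.
fit = smf.logit("died ~ low_bp_multi_rx + age + comorbidity + sbp", df).fit()
print(fit.summary())   # the low_bp_multi_rx term stays positive
```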

We know that there’s evidence that keeping blood pressure lower in this population might be bad. Yet, many of these patients were not only being treated for "high" blood pressure—many were on multiple medications for it. Those on more medications (i.e. more treatment) were more likely to die.

Here's the kicker: This wasn't a study done in the United States. It was done in France and Italy—so this isn't me bashing on the US health care system. It's a problem that's writ large. We find something that is bad. We find that lessening it is better. We then start to lessen it even more. Soon we're trying to lessen it for everyone. We're saying it's too high in all populations, even when we don't have evidence that's true. We say it even as evidence builds that less is bad for lots of people.

Better clinical decision support might help, but we can't seem to get that in electronic health records, and doctors hate those anyway. Many are still unaware that guidelines even exist.

And then when things get really bad, we act as if we weren't to blame. From an editorial in JAMA:

It is surprising that among frail elderly patients with a systolic blood pressure less than 130 mm Hg (20 percent of the studied group), the use of multiple antihypertensive drugs was continued, because few evidence-based data support this approach.

Really? It's surprising?

Getting doctors to change their behavior is hard, and getting them to stop doing something may be even harder. But all of this is important, and it's part of why health services research is so critical.

A final note: Even when I'm upset about some aspects of medicine, I'm grateful for so many others—like the ones helping Kevin right now. I'm crazy about health care. I'll keep poking it with a stick. That's how I show my love.

Progressives Are Getting Clobbered in Europe. Here's Why Their Chances Are Better in America.

Mon May 11, 2015 10:00 AM EDT
Britain's Labour Party leader Ed Miliband delivering his resignation speech.

While Kevin Drum is focused on getting better, we've invited some of the remarkable writers and thinkers who have traded links and ideas with him from Blogosphere 1.0 to this day to contribute posts and keep the conversation going. Today we're honored to present a post from Ruy Teixeira.

The United Kingdom voted on May 7 to determine its next government. Despite predictions of a hung parliament, with Labour holding the advantage in forming a coalition government, that's not how it turned out. Instead, the Conservatives won an outright majority, meaning that David Cameron will remain Prime Minister rather than ceding the job to Labour leader Ed Miliband, as many had expected.

Naturally, Labour Party supporters, and progressives in general, are aghast at this outcome. And certainly a Labour government would have governed differently than the Tories, who have been ruling the UK since 2010 and have famously adopted budget austerity as their main economic policy. But how differently? Oddly, the ascension of "Red Ed," as the British tabloid press likes to call him, may not have made as big a difference as one might think. That's because Labour did not propose to break decisively from the pro-austerity policies of the Tory government. Indeed, the Labour election manifesto promised to "cut the deficit every year" regardless of the state of the economy.

Could the torchbearer for social democratic progress be the Democrats in 2016?

This "Budget Responsibility Lock", as the manifesto jauntily called it, may seem bonkers given everything we have learned about the negative economic effects of austerity policies since 2010, including in the UK, and the rapidly declining intellectual credibility of austerity as an economic doctrine. Well, that's because it is bonkers, as Paul Krugman explains in a lengthy article for The Guardian with the somewhat despairing title: "The austerity delusion: The case for cuts was a lie—Why does Britain still believe it?"

The bulk of Krugman's article is a detailed and very convincing analysis of how nutty austerity was as a policy and how poorly it has worked. However, I'm not sure he really clears up the question of why British economic discourse is still dominated by this mythology. But this is a tough one. And it's not as if the British Labour Party is alone in its attempts to reconcile social democracy with austerity; most continental social democratic parties are having similar difficulties breaking out of the austerity framework.

In fact, the center-left party that's most ostentatiously stepped out of this framework is that wild-eyed band of Bolsheviks, the American Democratic Party, which has moved steadily away from deficit mania since 2011. This raises an interesting question. Given the macroeconomic straitjacket European social democrats seem determined to keep themselves in, is the Democratic Party really the torchbearer now for social democratic progress?

In this regard, it's interesting to turn to a recent book by political scientist Lane Kenworthy, Social Democratic America, that makes the case (summarized here and here) that, over the long term, the US is, in fact, on a social democratic course.

By social democracy, Kenworthy means an economic system featuring "a commitment to the extensive use of government policy to promote economic security, expand opportunity, and ensure rising living standards for all… [I]t aims to do so while also safeguarding economic freedom, economic flexibility, and market dynamism, all of which have long been hallmarks of the U.S. economy." He calls this "modern" social democracy, contrasting with "traditional" social democracy in that it goes beyond merely helping people survive without employment to also providing "services aimed at boosting employment and enhancing productivity: publicly funded child care and preschool, job-training and job-placement programs, significant infrastructure projects, and government support for private-sector research and development."

Kenworthy anticipates that, as we move toward this kind of social democracy, we will do most of the following:

1) Increase the minimum wage and index it to inflation.

2) Increase the Earned Income Tax Credit while making it available to middle-income families and indexing it to GDP per capita.

3) Increase benefit levels and loosen eligibility requirements for Temporary Assistance for Needy Families, general assistance, food stamps, housing assistance, and energy assistance.

4) Mandate paid parental leave.

5) Expand access to unemployment insurance.

6) Increase the Child Care Tax Credit.

7) Universalize access to pre-K.

8) Institute a supplemental defined contribution plan with automatic enrollment.

9) Increase federal spending on public child care, roads and bridges, and health care; and mandate more holidays and vacation time for workers.

It's interesting to note that most of this list is consistent with the mainstream policy commitments of the Democratic Party and that a good chunk of it will probably find its way into the platform of the 2016 Democratic Presidential candidate. Maybe Kenworthy's prediction is not so far-fetched.

One other reason to see the US as a potential beacon for social democratic progress stems from the nature of political coalitions in an era of demographic change. In the United States, the Democratic Party has largely succeeded in capturing the current wave of modernizing demographic change (immigrants, minorities, professionals, seculars, unmarried women, the highly educated, the Millennial generation, etc.). Emerging demographic groups generally favor the Democrats by wide margins, which, combined with residual strength among traditional constituencies, gives them a formidable electoral coalition. The challenge for American progressives is therefore mostly about keeping their demographically enhanced coalition together in the face of conservative attacks and getting it to turn out in midterm elections.

The situation is different in Europe, where modernizing demographic change has, so far, not done social democratic parties much good. One reason is that some of these demographic changes do not loom as large in most European countries as they do in the United States. The immigrant/minority population starts from a smaller base so the impact of growth, even where rapid, is more limited. And the younger generation, while progressive, does not have the population weight it does in America.

Beyond that, however, is a factor that has prevented social democrats from harnessing the still-considerable power of modernizing demographic change in Europe: the nature of European party systems. Unlike in the United States, where the center-left party, the Democrats, has no meaningful electoral competition for the progressive vote, European social democrats typically do have such competition, from three different parts of the political spectrum: greens, left socialists, and liberal centrists. And not only do they have competition; these other parties, on aggregate, typically overperform among emerging demographics, while social democrats generally underperform. Thus it would appear that social democrats, who have also hemorrhaged support from traditional working-class voters, will be increasingly unable to build viable progressive coalitions by themselves.

Bringing progressive constituencies together across parties is of course difficult to do and so far European social democrats seem completely at sea on how to handle this challenge. Much easier to have all those constituencies together in one party—like we do in the United States.

The road to progress isn't clear anywhere but, defying national stereotypes, it's starting to look a bit clearer in the US than in Europe.

Here's the Reason Cable News Is Going Down the Tubes

Mon May 11, 2015 6:00 AM EDT

While Kevin Drum is focused on getting better, we've invited some of the remarkable writers and thinkers who have traded links and ideas with him from Blogosphere 1.0 to this day to contribute posts and keep the conversation going. Today we're honored to present a post from Ezra Klein, editor-in-chief of Vox.

Cable news is in trouble. The Pew Research Center reports that the median daily audience for Fox, CNN, and MSNBC is down about 11 percent since 2008.

The Washington Post's Paul Farhi sees a grim future for the industry. He argues that cable news is pretty much where newspapers were a decade ago: Their audience is aging, their medium is being disrupted by new technologies, and the next generation of viewers is developing habits and preferences that they're poorly placed to serve. (This is probably a good moment to note that I'm a contributor to MSNBC.)

The networks may still be making money—in 2014, Fox News managed $1.2 billion in profits, while CNN cleared $300 million and MSNBC made a bit more than $200 million—but Farhi suggests that "the cable news networks will face bankruptcy the same way Ernest Hemingway once described a character's financial demise: 'Gradually and then suddenly.'"

Perhaps that's right. But while Farhi's account of cable news' woes focuses mainly on the cable part of the equation, it's also worth considering the problems all three networks are having with the news itself.

The rise of each of the three major cable news networks was driven by a story it dominated. CNN was made by the 1991 Gulf War. It wasn't just the first time CNN passed the broadcast networks in ratings. It was the first time it showed it could beat them in coverage. You can still feel the surprise in this New York Times article from 1991:

The shooting in the Persian Gulf began tonight with the three broadcast networks committed to covering the war on a 24-hour basis, although their image as news leaders was damaged by the Cable News Network's early dominance of the coverage…the networks' image was certainly not helped when Defense Secretary Dick Cheney said he was following the attacks on Baghdad on CNN. At least one network station, an NBC station in Detroit, decided to quit its network's coverage to run CNN's. And NBC finally was compelled to interview CNN reporters on the air to get information out of Baghdad.

Fox News, for its part, saw basically exponential growth around 9/11, and then again around the 2008 campaign and Obama's election. MSNBC's rise was driven by the backlash to the Bush administration, and particularly to the Iraq War.

The network held those gains in the first half of the Obama era as liberals went from terrified to triumphant. But as liberals have gone from triumphant to a bit depressed and checked out, viewership has begun to decline.

The recent rise of cable news, particularly Fox and MSNBC, came in a period when the news—particularly political news—was unusually interesting.

Between 2000 and 2012, we saw a contested US presidential election, the largest terrorist attack ever on US soil, wars in Afghanistan and Iraq, repeated wave elections, a global financial crisis, the first black president, the rise of the tea party, the fight over Obamacare, and the first states to legalize gay marriage and marijuana—and much more. It's been a weirdly interesting, consequential period in American politics. And so the cable news networks, which could devote 24 hours a day to covering these stories, benefited.

But now it's an unusually dull period in American politics. Congress is gridlocked, and is likely to stay that way for the foreseeable future. The United States, thankfully, isn't reeling from a terrorist attack or a financial crisis. We haven't invaded Iran, at least not yet. And it's not just cable news that's losing viewers because of it. Turnout in the 2014 election was the lowest it's been in 70 years.

You see this, I think, in the specific fortunes of the cable networks. Farhi reports that MSNBC lost 14 percent of its audience in 2014, and Fox lost 2 percent. But CNN prime time—which swung away from politics towards covering plane crashes and airing documentaries—is up 10 percent in 2015.

Which is all to say that Farhi may be right about the long-term decline of cable news—over some extended period of time, both network and cable channels are going to be diminished by whatever it is the internet creates in their place.

But year to year, a lot of the ups and downs might just be the appeal of what's actually in the news. If President Scott Walker goes to war with Iran, MSNBC's ratings are going to go up. If President Hillary Clinton takes away everyone's guns, Fox is going to boom. But for now, relative peace and stability are bad news for cable news.

And One Chart to Rule Them All

Sat May 9, 2015 8:34 PM EDT

It feels like it's been weeks since I last created a chart for this blog. I suppose this is because it has been weeks. Today that changes.

Over on the right is the chart that's controlled my life for the past couple of weeks. That's not to say there weren't plenty of others. My potassium level seemed to be of particular concern, for example, but that would make an especially boring chart since it just bounced around between 3.3 and 3.9 the entire time. (They added a bag of IV potassium to my usual daily hydration whenever it fell below 3.6.) Now that I'm home and my IV line is gone, I'm eating more bananas than usual, just to be on the safe side, but that's about it.

But that was nothing. What really mattered was my white blood count. You can see it on the right. For some reason, the two days of actual chemotherapy are called Day -2 and Day -1, and the day of the stem cell transplant is Day 0. On that day, as you can see, my count was around 6500, which is quite normal. Then, as the Melphalan steamrolled everything in its path, it plummeted to ~0 on days 7 and 8. Bye bye, immune system. Finally, on Day 9, as the transplanted stem cells started to morph into various blood products, my count skyrocketed. By the time I was discharged on Day 14, it was back to normal levels.

Fascinating, no? Especially when it's in chart form!
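For anyone who can't see the image, here's a quick matplotlib sketch that reproduces the general shape of the curve. The data points are eyeballed from the description above, not actual lab values:

```python
import matplotlib.pyplot as plt

# Days relative to the stem cell transplant (Day 0); counts are rough
# reconstructions from the text, in white blood cells per microliter.
days   = [-2, -1, 0, 2, 4, 6, 7, 8, 9, 11, 14]
counts = [6500, 6400, 6500, 4000, 1500, 300, 0, 0, 800, 3500, 6000]

plt.plot(days, counts, marker="o")
plt.axvline(0, linestyle="--", color="gray", label="transplant (Day 0)")
plt.xlabel("Day relative to transplant")
plt.ylabel("WBC count (per µL)")
plt.title("White blood count: chemo wipeout and recovery")
plt.legend()
plt.show()
```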

Lessee. Any other news? My fatigue is still pretty heavy, and will stay that way for 2-3 weeks. I didn't realize it would last so long, partly because my doctor waited literally until my discharge date to tell me. But it's for real. It took me two tries to create this post: one session to create the chart, after which I crashed, and a second session to write the text. Not exactly speed demon blogging. What else? I have a nasty metallic taste in my mouth all the time. It sucks. And I think my hair is finally getting ready to fall out completely. This morning my pillow was covered with tiny little pieces of hair, and it's pretty obvious where they came from. On the bright side, my appetite is improving. I'm not yet at the stage where I really want to eat, but I'm mostly willing to eat, which is good enough for now. This may be partly due to the fact that I'm wearing one of those seasickness patches behind my ear to fight nausea. It seems to be working.

Oh, and I can now take a nice, normal shower without first having to spend ten minutes trying to bundle up my catheter so it doesn't get wet. Woohoo!

Friday Cat Blogging - May 8 2015

Fri May 8, 2015 3:44 PM EDT
VZ and CC

While Kevin is taking a break and getting better, we're rounding out the usual Friday Cat Blogging routine with some special Mother Jones-affiliated guests.

Today, I'm happy to present CC and VZ. These handsome brothers were adopted from a Berkeley shelter by Ian Gordon, our copy editor. Named Sacco and Vanzetti at birth (I did mention the shelter was in Berkeley, right?), the pair quickly acquired nicknames that would be less of a mouthful. Below you'll find CC on the left, and VZ on the right.


These fellas are intrepid neighborhood explorers. Ian reports that they have indoor visitation rights at no fewer than three nearby houses. Don't you wish they'd stop by and class up your joint sometime?

If they did, they just might come bearing gifts. Their phase of hunting, gathering, and gifting mysterious objects to their caregivers is well cataloged on Ian's Look What the Cats Dragged In Tumblr, where you'll find alternately hilarious and discomfiting documentation of undergarments, empty food packages, and decades-old newspapers.

Where do they get this stuff? How do they make their selections? What are they trying to communicate?

The only ones who know aren't talking.

Bonus Friday Cat Blogging - 8 May 2015

Fri May 8, 2015 12:00 PM EDT

Well, I'm home. I slept in my own bed last night for the first time in two weeks. No cats to greet me, though, since we first have to wait for all my shiny new cells to mature a bit—enough to handle a couple of cats, anyway. The furballs will be back home in three weeks, but in the meantime here are Hilbert and Hopper lounging on my sister's magazine pile. Sadly, the New York magazine on the far left met with a gory death a few days after this picture was taken. It is the price of cuteness.

Why Would an Economic Analysis Want to Ignore American Slavery?

Fri May 8, 2015 7:00 AM EDT

While Kevin Drum is focused on getting better, we've invited some of the remarkable writers and thinkers who have traded links and ideas with him from Blogosphere 1.0 to this day to contribute posts and keep the conversation going. Today we're honored to present a post from Ryan Cooper, national correspondent for The Week.

The next several years will see a rolling 150th anniversary of Reconstruction, my favorite period in American history. From about 1865 to 1877, American society as a whole tried reasonably hard to do right by the freed slaves, before getting tired of the effort and abandoning them to the depredations of racist terrorism. For the next nine decades, black Americans had few if any political rights under the boot heel of Jim Crow.

It's both a shining example of what can happen when a society really tries to right a past wrong, and a tragic, infuriating failure of will. But most of all it's very interesting. Things were changing, social orders were being overthrown, historical ground was being broken. At a time when few nations had any suffrage at all, roughly 4 million freed slaves got the vote in a single stroke, perhaps the single starkest act of democratic radicalism in world history.

So it's weirdly fascinating to read conservative historiography of the 19th century, such as this piece by Robert Tracinski at the Federalist, as an example of how Darryl Worley-style historiography irons all the best parts out of American history.

He's interested in trying to prove that a "non-coercive" economy is possible, by which he means that taxes and spending could be dramatically lower than they are today. Thus he charts government spending as a percentage of GDP, finds that it was pretty low for most of the 19th century, and claims victory:

What the left wants is not just to make America’s economic history disappear. It needs to make America’s political system disappear: to make truly small, truly limited government seem like a utopian fantasy that can safely be dismissed. Please bear in mind that this latest example came up in the context of a discussion about the justification for government force. So what they want to describe as an unrealistic fantasy is a society not dominated by coercion.

One might think that, when writing a paean to a noncoercive century, it would be a good idea to address the fact that for 60 percent of that century, it was government policy that human beings could be owned and sold like beasts, or that half or more of the national economy was based on that institution. But no, the word "slavery" does not appear in the piece. Neither does "Civil War" or "Reconstruction," which as a literal war against and military occupation of the South would seem fairly coercive.

So speaking of the 19th century as one notably free of coercion is not just utterly risible, it's also a cockeyed way to look at what was good or bad about it. The economy of the antebellum South was founded on the labor of owned human beings, extracted through torture. Slave masters set steadily increasing quotas for cotton picking, for instance, and would flog slaves according to the number of "missing" pounds. As Edward Baptist writes, they thus increased the productivity of slave cotton-picking by nearly 400 percent from 1800 to 1860.

It was akin to the Gulag system of Soviet Russia, except that it had all the power of the red-hot Industrial Revolution, including cutting-edge financial technology, behind it. That combination of slavery plus explosive economic growth and innovation made the antebellum South one of the most profoundly evil places that have ever existed—one that was an absolutely critical part of early industrial growth in both Britain and the North.

But on the other hand, the war that ended slavery, despite involving coercion in the form of organized mass killing, was therefore good! And so was Reconstruction, even though that involved extremely harsh measures against the likes of the KKK. Whether coercion is good or bad depends on just who is being coerced and why.

And that, in turn, puts the lie to conservative complaints that liberals always "blame America first." On the contrary, grappling with the pitch-black periods of history makes the positive notes shine all the brighter. As Ta-Nehisi Coates has written, the "epoch of slavery is…the quintessential romance of American history." It's just a romance difficult to detect in the GDP statistics.

Attention Parents: Your Neighborhood Matters More Than You Do

Thu May 7, 2015 12:34 PM EDT

A few days ago Justin Wolfers passed along some new research showing that growing up in a good neighborhood has immensely positive effects on future success:

I will start with the smaller of their two studies....The findings are remarkable....The children who moved when they were young enjoyed much greater economic success than similarly aged children who had not won the lottery....The sharpest test comes from those who won an experimental housing voucher that could be used only if they moved to low-poverty areas. Here the findings are striking, as those who moved as a result of winning this voucher before their teens went on to earn 31 percent more than those who did not win the lottery. They are also more likely to attend college.

....It is rare to see social science overturn old beliefs so drastically. It happened because these scholars returned to an old experiment with a fresh perspective, based on the idea that what matters is how long children are exposed to good or bad neighborhoods. But is this the right perspective?

Here’s where the second study is critical. While the conclusions of the Moving to Opportunity project are based on following only a few thousand families, Mr. Chetty and Mr. Hendren use earnings records to effectively track the careers and neighborhoods of five million people over 17 years.

Instead of contrasting the outcomes of families in different areas — which may simply reflect different families choosing to live in different areas — they can track what happens to families when they move....Their findings are clear: The earlier a family moved to a good neighborhood, the better the children’s long-run outcomes. The effects are symmetric, too, with each extra year in a worse neighborhood leading to worse long-run outcomes. Most important, they find that each extra year of childhood exposure yields roughly the same change in longer-run outcomes, but that beyond age 23, further exposure has no effect. That is, what matters is not just the quality of your neighborhood, but also the number of childhood years that you are exposed to it.

A crucial advantage of this analysis is that it follows the children through to early adulthood. This matters because a number of recent studies have shown that interventions have effects that might be hard to discern in test scores or behavioral problems, but that become evident in adulthood. The same pattern of years of exposure to good neighborhoods shaping outcomes is also apparent for college attendance, teenage births, teenage employment and marriage.
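The design Wolfers describes amounts to a simple linear exposure model: each additional childhood year in a better neighborhood shifts adult outcomes by roughly a constant amount, with no further effect after age 23. Here's a toy simulation of that idea (all numbers invented, not Chetty and Hendren's data or code), showing how a regression on movers recovers the per-year exposure effect:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

# Hypothetical movers: the age at which each family moved to a
# better neighborhood. Exposure is years lived there before age 23.
move_age = rng.integers(1, 30, n)
exposure = np.clip(23 - move_age, 0, None)

# Assume each exposure year adds ~1.5 points to adult earnings
# (an invented effect size), plus noise from everything else.
earnings_gain = 1.5 * exposure + rng.normal(0, 10, n)

fit = sm.OLS(earnings_gain, sm.add_constant(exposure)).fit()
print(fit.params)   # slope close to 1.5: the per-year exposure effect
```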

This may all seem obvious to you—of course good schools and good playmates matter a lot—but professionals in this field have long believed that quality of parenting is by far the most important factor in a child's success. This is a popular and comforting notion that Judith Rich Harris effectively demolished more than a decade ago in The Nurture Assumption, but it hangs on tenaciously anyway. Nor do you have to buy Harris's theories hook, line, and sinker to believe she has the basic shape of the river correct. For example, I happen to think she underplays the evidence that good parenting matters. But not by much. The simple fact is that kids pick up cues about how to act far more from the collective influence of friends, siblings, teachers, TV, babysitters, and others than they do from their parents. It's hardly even a fair contest. As I put it a few weeks ago: 

This means that the single biggest difference you can make is to be rich enough to afford to live in a nice neighborhood that provides nice playmates and good schools.

This, unfortunately, doesn't make things any easier for policymakers. Teaching good parenting skills may be a monumental challenge, but it's no less monumental than somehow conquering poverty and making sure every child grows up in a good neighborhood. There are no easy answers. But at a minimum, it's always better to at least make sure we're pointed in the right direction.

Do Small Businesses Deserve Exemptions From the Minimum Wage?

Thu May 7, 2015 9:00 AM EDT
Brian Hibbs (far right) and his employees, some of whom may lose shifts or even their jobs if San Francisco's minimum wage goes to $15.

Brian Hibbs, a Mother Jones reader and owner of Comix Experience, wrote in to object to San Francisco's plan to raise its minimum wage. Conservatives who argue against the minimum wage often point to jobs lost and heavy burdens on small businesses, and progressives largely brush off those arguments as so much Chamber of Commerce propaganda. And then you have guys like Hibbs. Read what he has to say, and then we'll discuss.

I own two comic book stores in SF, and while we're a profitable business and have been for 26 years, we're only modestly profitable, y'know? When you calculate my own salary on a per-hour basis, given that 70-hour weeks are not at all uncommon for me, I don't make much more than the high-end of SF's new minimum wage law.

Raising the minimum wage by 43 percent (from $11.05 today to $15 in 2018) means that we need to generate at least another 80 grand in revenue. Eighty grand. I don't personally make eighty grand in a year. I'm not some kind of fat cat getting rich off the exploitation of my workers or something. And look, if I did manage to increase sales by that amount, I'd sure be hoping that I got to keep a tiny little percentage of it myself.

Just so we're clear: The hole I find myself soon facing isn't one created by escalating San Francisco rents (my landlord is awesome!), or because of competition from the internet (in fact, our sales consistently grow year-over-year, and sales growth has accelerated since the introduction of digital comics), but one solely and entirely created by the increase of the minimum wage.

I'm a progressive; I support fair labor practices, and I try, above all else, to give the folks who work for me absolute agency in their jobs. I have multiple employees who quit higher-paying jobs for corporate owners to come work for me, because I actively valued their passions. I don't own a comic book store to make money as my primary goal, right? The primary goal is to wake up in the morning and be excited by what you do, to feel like you're spreading your passion, that you're promoting art, and creators and joy—and my staff feels much the same way.

I have staff who are supported by a spouse and are working for me to essentially make pocket money; I have staff who want to be full-time artists, and this helps them get closer to their goal by exposing them to the form and helping them make contacts. I have staff who are actively working toward having their own stores, and I'm basically paying them to get a master's class (though I am fine with that!). I have staff who are full-time students living at home.

I'm not exploiting any of them, I don't think. They all have options, and they all work for me because they want to.

If I can't increase sales by $80,000—which is not something that seems likely, given historical year-over-year gains—then I have to start firing people, or trimming hours of operation. We don't run extravagant overlaps—nearly 60 percent of the hours the stores are open we only have one person on deck; nor do we have a lot of waste or absurd inventory or anything like that. I've survived in a kind of marginal business for 26 years by being a savvy businessperson, and a relatively nimble and predictive one. But firing people, cutting hours…how does that help the employees? How does that help the business expand so I can eventually hire more people?

I have the largest staff of any SF comics business (because I have two locations), and, in point of fact, my two closest competitors have zero employees. Not being impacted by this mandate, they'd have no reason to raise prices in tandem…and really, every reason to not do so. If I raised prices by, let's say, 10 percent to meet this mandate, I'm absolutely positive we'd lose at least 20 percent of our business to stores that didn't raise their prices—thereby putting us at a net negative.

We’re trying to solve this problem by growing our way out of it with a new national, curated Graphic-Novel-of-the-Month Club, but I think that if we’re able to succeed from that (and I am not at all sure we will) it will be because of years of building our exceptional reputation. As a result, I do not at all think that this type of solution is scalable for the average small business. The City of San Francisco’s own Office of Economic Analysis believes the minimum wage hike will cost 15,270 jobs, or 2 percent of the private workforce!

Honestly, if San Francisco had voted for "Minimum Wage must be at least equal to X percent of your net profit" or "Every person in America gets a guaranteed income of $20,000/year paid for by progressive taxes" or some other scheme where you know that people being asked to contribute more can afford it, then maybe we'd be on sounder ideological ground...But I think that the higher minimum wage, the higher you're making the barriers for low-income people and marginal-but-promising businesses to even have a chance to enter the marketplace and to survive in the first place, let alone legacy businesses like ours.
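Before weighing in, it's worth running Hibbs' numbers. Here's a back-of-envelope sketch; only the wage endpoints and the 10-percent-price/20-percent-sales scenario come from the letter, while the staffing hours and margin are guesses for illustration:

```python
# Back-of-envelope math on the letter's numbers. The wage endpoints and
# the price/sales scenario are from the text; everything else is a guess.
wage_now, wage_2018 = 11.05, 15.00
pct_raise = (wage_2018 - wage_now) / wage_now
print(f"Wage increase: {pct_raise:.1%}")        # ~35.7% from these endpoints
# (The letter says 43 percent, presumably from an earlier, lower base wage.)

# Hypothetical payroll: two stores, mostly one clerk on duty.
staff_hours = 2 * 70 * 52                        # guess: 70 open hours/week/store
added_payroll = staff_hours * (wage_2018 - wage_now)
print(f"Added payroll: ${added_payroll:,.0f}")   # ~$29k before payroll taxes

# Retail covers costs out of gross margin, so each new dollar of cost
# needs several dollars of sales. That's how a payroll bump on this
# order of magnitude can become an $80k revenue target.
gross_margin = 0.40                              # assumed margin, for illustration
revenue_needed = added_payroll / gross_margin
print(f"Revenue needed at {gross_margin:.0%} margin: ${revenue_needed:,.0f}")

# The letter's price-increase scenario: +10% prices, -20% sales.
print(f"Revenue multiplier: {1.10 * 0.80:.2f}")  # 0.88, i.e., a net loss
```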

Here's my personal take: It's hard not to feel sympathy for Hibbs, yet it would be a mistake to take his situation as a case for abolishing or making exceptions to the city's minimum wage law. As I've noted elsewhere, raising the minimum wage doesn't tend to decrease overall employment; in general, businesses find new efficiencies and their workers find themselves with more disposable income to spend on things like comics.

Of course, that's probably little comfort to Hibbs, who faces competition from smaller comics stores whose sole proprietors are the ones manning the cash registers. Hibbs may well be able to keep his doors open by downsizing, bringing in volunteers, or drumming up donations from devoted customers (as one local bookstore has done), but when it comes down to it, there simply may not be much of a future for bricks-and-mortar comics stores in a city with astronomical real estate prices.

"I super commiserate with him because we are in almost the identical situation," says Lew Prince, a member of the group Business for a Fair Minimum Wage and the owner of Vintage Vinyl, a record store in St. Louis. Dwindling sales and rising labor costs forced Prince to consolidate his two Vintage Vinyl locations into one. He nonetheless supports increasing Missouri's minimum wage from $7.65 to $12 an hour because he thinks it's the right thing to do. "The job of the business owner is to prepare for the future," he told me. "I have great empathy and sympathy for [Hibbs], but you have to do the job every day, and sometimes the marketplace defeats you."

But maybe that point of view is too harsh. I'd love to hear, in the comments, what Kevin's readers think about all of this.

Dear Marvel and Sony: We Love Movies for Their Kick-Ass Female Heroes, Too, You Jerks

Wed May 6, 2015 5:21 PM EDT

While Kevin Drum is focused on getting better, we've invited some of the remarkable writers and thinkers who have traded links and ideas with him from Blogosphere 1.0 to this day to contribute posts and keep the conversation going. Today we're honored to present a post from Shakesville founder Melissa McEwan.

Each time WikiLeaks posts another round of emails from the Sony hack, there is a garbage trove of misogyny: unequal pay, gendered and racist harassment, Aaron Sorkin waxing sexist, Angelina Jolie dismissed as a spoiled brat. Found among the latest collection was a dispatch from Marvel CEO Ike Perlmutter to Sony CEO Michael Lynton on the subject of female-centered superhero films, and if it's not exactly as awful as you're already imagining, that's possibly because it's even worse. Sent under the simple subject line "Female Movies," Perlmutter writes:

Michael,

As we discussed on the phone, below are just a few examples. There are more.

Thanks,

Ike

1. Electra (Marvel) – Very bad idea and the end result was very, very bad. http://www.boxofficemojo.com/movies/?id=elektra.htm

2. Catwoman (WB/DC) - Catwoman was one of the most important female character within the Batmanfranchise. This film was a disaster. http://www.boxofficemojo.com/movies/?id=catwoman.htm

3. Supergirl – (DC) Supergirl was one of the most important female super hero in Superman franchise. This Movie came out in 1984 and did $14 million total domestic with opening weekend of $5.5 million. Again, another disaster.

Best, Ike

Case closed, your honor! At Women and Hollywood, Laura Berger quite rightly notes that Perlmutter's list is highly selective and narrowly defined. "It seems fair to assume," writes Berger, "that Perlmutter is referring specifically to female superhero movies. If that's the case, why is something like 'The Hunger Games' omitted from this list? The extremely lucrative franchise is led by a woman, and while Katniss isn't technically a superheroine, she's certainly marketed as one. Isn't 'The Hunger Games' a more relevant example of how female-led films fare at the box office today than, say, 'Supergirl,' which was released over 30 years ago?" Emphasis original.

At ThinkProgress, Jessica Goldstein shows how easily one could selectively compile a list of male-centered superhero flops if one were inclined to make the incredulous assertion, based exclusively on box office returns and not on the inherent quality of the films, that male-centered superhero films don't work.

The three films on Perlmutter's list frankly just weren't very good. Which has to do with their female heroes only insomuch as studios don't generally dedicate equivalent creative and financial resources to female-centered superhero films, because they don't want to "waste" them on films they fear won't succeed at the box office. Thus the vicious cycle continues: Many female-centered superhero films are set up to fail, and then when one fails, the blame is directed at the woman at its center, rather than the misogyny at her back.

This is a conversation that happens around every genre of "hero" film: superhero films, action films, fantasy films, adventure films. The wildly successful male-centered flicks get rattled off as evidence of what "works" and as implicit condemnation of what (allegedly) doesn't.

Many of the wildly successful male-centered franchises have, however, a token female character—carefully segregated from other women and girls, lest they get any ideas about taking over the world, I suppose.

And we are ever meant to understand that all of the dedicated superfans of these films watched them because of the men, always the men. What Perlmutter and his cohort don't understand, don't consider, or simply don't care about is that there are plenty of us who watched those films for the women.

When I watched the Superman series, I wasn't watching those films for Christopher Reeve; I was watching them for Margot Kidder's Lois Lane, who I was certain was the coolest woman with the most amazing voice who had ever lived. When I watched the Star Wars trilogy, I had zero interest in Luke; I showed up for Leia. When I watched Raiders of the Lost Ark, I was watching it as much for Marion as I was for Indy. When I watched Dragonslayer (which admittedly was a commercial flop, but later became a cult classic) over and over until I could say every line, I was all about Valerian. When I watched Romancing the Stone, I was cheering for THE JOAN WILDER.

There were female heroes in my favorite films, and they were the reason I watched them. I imagine there are plenty of little girls (and little boys) who watch The Avengers not because of the guys, but because of the one, remarkable, exceptional (in every sense of the word) female hero in their midst. That doesn't show up in the numbers—nor, apparently, in the imaginations of the men who make creative decisions based on numbers.

The thing about many of the films I mentioned is that they're generally regarded as good movies. They were made with monumental investments of care and attention. And they didn't have to be male-centered, but they got that care and attention because they were.

What would happen if a female-centered hero were given the same mighty powers? Welp.