Kevin Drum

We Require Affirmative Consent For Most Things. Why Not Sex?

| Thu Oct. 16, 2014 12:21 PM EDT

Ezra Klein has taken a lot of heat for his defense of California's new "Yes Means Yes" law, which puts in place an "affirmative consent standard" on university campuses to decide whether a sexual assault has taken place. In other words, the mere lack of a clear "no" is no longer a defense against sexual assault charges. Instead, you have to make sure that your partner has given you a clear "yes."

Klein defends himself here in exhausting detail. Most of it you've probably heard before, but perhaps the most interesting part is this: "More than anything, what changed my mind on Yes Means Yes was this article by Amanda Taub, and some subsequent conversations with women in my life." Here's Taub:

When our society treats consent as "everything other than sustained, active, uninterrupted resistance," that misclassifies a whole range of behavior as sexually inviting. That, in turn, pressures women to avoid such behavior in order to protect themselves from assault.

As a result, certain opportunities are left unavailable to women, while still others are subject to expensive safety precautions, such as not traveling for professional networking unless you can afford your own hotel room. It amounts, essentially, to a tax that is levied exclusively on women. And it sucks.

And here's Klein:

Every woman I spoke to talked about this tax in the same way: as utterly constant, completely unrelenting. It's so pervasive that it often goes unmentioned, like gravity. But it colors everything. What you wear. Who you have lunch with. When you can hug a friend. Whether you can invite someone back to your house. How you speak in meetings. Whether you can ask male colleagues out for a drink to talk about work. How long you can chat with someone at a party. Whether you can go on a date without having a friend who knows to be ready for a call in case things go wrong. Whether you can accept seemingly professional invitations from older men in your field. Whether you can say yes when someone wants to pick up the tab for drinks. For men, this is like ultraviolet light: it's everywhere, but we can't see it.

I have some hesitations about this new law, but it's hardly the apocalypse that some of its detractors have made it out to be. It doesn't change the standard of proof required in sexual assault cases and it doesn't change the nature of the proceedings that govern these cases. These may both be problematic, as some critics think, but they're separate issues. "Yes Means Yes" changes only the standard of consent, and does so in a pretty clear and unambiguous way.

Beyond that, keep in mind that this is just an ordinary law. If it were a ballot initiative, I'd be adamantly opposed. But it's not: if it turns out to work badly or produce unintended consequences, it can be repealed or modified. And it's not as if the current situation is some kind of utopia that should be defended at all costs. We'll know soon enough if the law's benefits are worth the costs. In the meantime, it seems like a worthwhile experiment in changing a culture that's pretty seriously broken.


Let Us Now Praise Placebos

| Thu Oct. 16, 2014 11:02 AM EDT

Placebos are fascinating things. They shouldn't work, but they do. And it's not just pills, either. In certain cases, it turns out, fake knee surgery can relieve pain just as effectively as real knee surgery. Austin Frakt writes about the placebo effect today:

Given the strength and ubiquity of placebo effects, many physicians prescribe them. In fact, doing so was common practice before World War II, with supportive publications in the medical literature as late as the mid-1950s. This practice faded away after the rise of placebo-controlled trials that yielded treatments that were shown to be better than placebos, but it has resurfaced in new forms.

Today, the widespread use of antibiotics for conditions that don’t require them is a form of placebo prescribing, for example. Acetaminophen for back pain appears to be a placebo as well. These may help patients feel better, but only because they believe they will do so. The active ingredient adds nothing. To the extent some doctors trick patients in an effort to achieve a placebo effect, most patients don’t seem to mind. Nevertheless, deliberately harnessing just the placebo effect by prescribing a treatment that does not have any additional direct physical effect is an ethical gray area.

I didn't know that placebo prescriptions were common before World War II. Interesting! I've also lately been trying to figure out whether acetaminophen is actually doing anything for the back pain I've been suffering since an injury a few months ago, aggravated more recently by some cat-related idiocy. It kinda seems like it might, but I can't really tell. But now I know: if there was an effect, it was a placebo effect.

Still, I'm disappointed that the placebo effect wasn't more significant for me. Maybe this is why I've never had a lot of luck with medication in the first place. It's not that it never works, but that most of it doesn't seem to work very well. Perhaps it's because I rarely have much confidence in the stuff, so I only get half the effect. It would probably help if I were more gullible.

The only recent exception I can think of is prednisone, which miraculously and instantly cured my breathing problems a few months ago. It only lasted a couple of days, unfortunately, though even after that my breathing was vastly improved, if not back to normal. But in the end it did me no good: placebo or not, my doctors had no clue why it worked and were therefore unwilling to try more of it. Nor did it lead to any subsequent treatments, since they had no idea what was going on and essentially decided to pretend the whole thing was just a coincidence. And people wonder why I'm skeptical of the medical profession.

What's the Point of an Unenforceable Noncompete Agreement?

| Thu Oct. 16, 2014 10:06 AM EDT

We all learned recently that sandwich shop Jimmy John's forces its workers to sign a noncompete agreement before they're hired. This has prompted a lengthy round of blogospheric mockery, and rightfully so. But here's the most interesting question about this whole affair: What's the point?

Laws vary from state to state, but generally speaking a noncompete agreement can't be required just for the hell of it. It has to protect trade secrets or critical business interests. The former makes them common in the software business, and the latter makes them common in businesses where clients become attached to specific employees (doctors, lawyers, agents) who are likely to take them with them if they move to a new practice. But none of this seems to apply to a sandwich shop. Clare O'Connor of Forbes asked an attorney about this:

“There’s never a guarantee, but I can’t see any court in the world upholding this,” said Sherrie Voyles, a partner at Chicago firm Jacobs, Burns, Orlove & Hernandez. “Every state law is different on this issue, but the general idea is that it’d only be upheld if it’s reasonable. The test would be, is there a near-permanent customer base? No. Customers at Jimmy John’s are probably also customers at Subway.”

Voyles said she can’t imagine any Jimmy John’s outlet actually enforcing this non-compete clause (indeed, there’s no evidence any have tried), but can’t see any reason it’d hold up in court. “It’s not the kind of interest protected by law,” she said.

The Wall Street Journal nonetheless reports that litigation over noncompete clauses has risen over the past few years, but this appears to be mostly in places like the software industry, where trade secrets are important. Not so much in the fast food business.

So again: what's the point? I haven't heard of a single case of Jimmy John's actually taking someone to court over this, and it seems vanishingly unlikely that they would. That leaves a couple of possibilities. The first is that it's just boilerplate language they don't really care about but leave in just in case. The second is that they find it useful as a coercive threat. Sure, they'll never bother going to court, but maybe their workers don't know that—which means they're less likely to move across the street to take a higher-paying job. In other words, it's a handy tool for keeping workers scared and wages low.

So it's either stupid or scummy. Take your pick.

UPDATE: Stephen Bainbridge, who actually knows what he's talking about, agrees that a noncompete clause like this is pretty much legally useless. But he quotes Cynthia Estlund explaining why it might have value anyway:

Even a manifestly invalid non-compete may have in terrorem value against an employee without counsel.

In terrorem? Lawyers actually use this phrase? I guess it gets the point across, doesn't it?

Rick Scott Takes Late Lead In Southeast Division of Jackass Competition

| Thu Oct. 16, 2014 12:24 AM EDT

WTF?

In one of the weirdest, and most Floridian moments in debate history, Wednesday night's gubernatorial debate was delayed because Republican Governor Rick Scott refused to take the stage with Democratic challenger Charlie Crist and his small electric fan....Rather than waiting for the governor to emerge, the debate started with just Crist onstage. "We have been told that Governor Scott will not be participating in this debate," said the moderator. The crowd booed as he explained the fan situation, and the camera cut to a shot of the offending cooling device.

"That's the ultimate pleading the fifth I have ever heard in my life," quipped Crist, annoying the moderators, who seemed intent on debating fan rules and regulations. After a few more awkward minutes, Scott emerged, and the debate proceeded, with only one more electronics dispute. When asked why he brought the fan, Christ answered, "Why not? Is there anything wrong with being comfortable? I don't think there is."

There are plenty of Republicans who I find more extreme, or more moronic, or more panderific than Rick Scott. But for sheer pigheaded dickishness, he's a hard act to beat. Jeebus.

Is Clean, Green Fusion Power In Our Near Future?

| Wed Oct. 15, 2014 5:50 PM EDT

Fusion is the energy source of the future—and it always will be. That used to be a Unix joke, but in various forms Unix has actually become pretty widespread these days. It runs the server that hosts the web page you're reading; it's the underlying guts of Apple's Mac operating system; and Linux is—well, not really "popular" by any fair definition of the word, but no longer just a fringe OS either.

So maybe fusion is about to break through too:

Lockheed Martin Corp said on Wednesday it had made a technological breakthrough in developing a power source based on nuclear fusion, and the first reactors, small enough to fit on the back of a truck, could be ready in a decade.

Tom McGuire, who heads the project, said he and a small team had been working on fusion energy at Lockheed's secretive Skunk Works for about four years, but were now going public to find potential partners in industry and government for their work.

....Initial work demonstrated the feasibility of building a 100-megawatt reactor measuring seven feet by 10 feet....Lockheed said it had shown it could complete a design, build and test it in as little as a year, which should produce an operational reactor in 10 years, McGuire said.

Over at Climate Progress, Jeff Spross is containing his enthusiasm:

At this point, keeping the world under 2°C of global warming will require global greenhouse gas emissions to peak in 2020 and fall rapidly after that....So by Lockheed Martin’s own timeline, their first operational CFR won’t come online until after the peak deadline. To play any meaningful role in decarbonization — either here in America or abroad — they’d have to go from one operational CFR to mass production on a gargantuan scale effectively overnight. More traditional forms of nuclear power face versions of the same problem.

A WW2-style government mobilization might be able to pull off such a feat in the United States. But if the political will was there for such a move, the practical question is why wait for nuclear? Wind and solar are mature technologies in the here and now — as is energy efficiency, which could supply up to 40 percent of the effort to stay below 2°C all by itself.

Jeez. I get where Spross is coming from, but come on. If Lockheed Martin can actually pull this off, it would mean huge amounts of baseload power using existing grid technology. It would mean cheap power from centralized sites. It would mean not having to replace every building in the world with high-efficiency designs. It would mean not having to install wind farms on millions of acres of land. It would mean not having to spend all our political efforts on forcing people to make do with less energy.

More generally, it would mean gobs of green power at no political cost. That's huge.

The big question is whether Lockheed Martin can actually pull this off. Lots of people before them have thought they were on the right track, after all. But if they can, it's a game changer. Given the obvious difficulties of selling a green agenda to the world—and the extreme unlikelihood of making that 2020 deadline with existing technologies—I'll be rooting for Lockheed Martin to pull this off. Cynicism can be overdone.

Editors' note: Over on the Blue Marble blog Climate Desk Producer James West spoke with a thermonuclear plasma physicist who doubts the significance of this breakthrough and called Lockheed's announcement "poppycock." So there's that.

Lots of People May Misunderstand Thomas Piketty, But That Doesn't Mean They're Wrong

| Wed Oct. 15, 2014 3:32 PM EDT

Thomas Piketty is having another moment in the blogosphere. As you may remember, he's famous for the equation r > g, which states that the rate of return on investments is historically higher than economic growth. This means that rich people with lots of investments get richer faster than the rest of us wage slaves, and this in turn produces growing levels of income inequality.
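To see why a persistent gap between r and g matters, here's a back-of-the-envelope sketch in Python. The 5 percent return and 2 percent growth rate are purely illustrative assumptions, not Piketty's estimates, and the sketch ignores taxes, consumption, and inheritance; the point is only that a small annual gap compounds into a big one over a few decades.

```python
# Illustrative only: a 5% return on capital vs. 2% economic growth.
# These numbers are assumptions for the example, not Piketty's estimates.
r = 0.05  # annual rate of return on invested wealth
g = 0.02  # annual growth rate of the overall economy (and of wages, roughly)

wealth = 100.0   # index of a large, fully reinvested fortune at year 0
economy = 100.0  # index of total economic output at year 0

for year in range(50):
    wealth *= 1 + r
    economy *= 1 + g

print(f"After 50 years: wealth index = {wealth:.0f}, economy index = {economy:.0f}")
print(f"Relative to the economy, the fortune has grown {wealth / economy:.1f}-fold")
```

Whether that compounding actually shows up as rising wealth concentration depends on how much of the return gets reinvested rather than consumed or taxed away, which is part of why the historical record looks different from the raw arithmetic.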

Is Piketty right? In one of its quarterly polls of economists, the University of Chicago's Initiative on Global Markets asked whether r > g has been the most powerful force pushing toward greater income inequality since the 1970s. Pretty much everyone said no. Take that, Piketty!

But wait. As Matt Yglesias says, this isn't evidence that Piketty is wrong. It's evidence that hardly anyone has actually read his book. You see, Piketty doesn't say that r > g has been a big driver of income inequality in recent years. He says only that he thinks it will be a big driver in the future.

This is good clean fun as a gotcha. But liberals should understand that it also exposes one of the biggest weaknesses of Piketty's argument: r > g has been true for centuries, but the rich have not gotten steadily richer over that time. Wealth concentration has stayed roughly the same. Piketty argues that this is likely to change starting around 2050 or so, but this is an inherently iffy forecast since it's several decades in the future. What's worse, he bases it mostly on a projection that economic growth (g) is shortly going to suffer an unprecedented fall. This makes his forecast even iffier. Piketty may be right, but projecting growth rates for the second half of the century isn't something he has any particular expertise in. His guess is no better than anyone else's.

Beyond that, there are also serious suggestions that Piketty has mismeasured r, apparently by conflating gross and net returns. What this all means is that (a) his measure of r is questionable and (b) his projection of g is questionable because it reaches so far into the future. Other than that, r > g is great.

Making fun of misreadings of Piketty's book may be good sport, but those misreadings unwittingly raise serious questions. A proper reading suggests that—for now, anyway—r > g as a driver of income inequality should continue to be taken with a grain of salt.


No, There's Still No Evidence There Was an Active WMD Program in Iraq

| Wed Oct. 15, 2014 12:35 PM EDT

C.J. Chivers of the New York Times has a long piece today about chemical weapons found in Iraq after the 2003 invasion. A few dead-enders are now gleefully claiming that Bush was right after all. Iraq did have WMD!

This is ridiculous enough that—so far, at least—the savvier wing of the conservative movement is staying mum about the whole thing. There are three main reasons for this. First, most of these weapons were rotting remnants of artillery shells used during the Iraq-Iran war in the 80s and stored at Iraq's Muthanna State Establishment as well as other nearby sites. Murtaza Hussain of the Intercept explains what this means:

The U.S. was aware of the existence of such weapons at the Al Muthanna site as far back as 1991. Why? Because Al Muthanna was the site where the UN ordered Saddam Hussein to dispose of his declared chemical munitions in the first place. Those weapons that could not safely be destroyed were sealed and left to decay on their own, which they did. The site was neither “active” nor “clandestine” — it was a declared munitions dump being used to hold the corroded weapons which Western powers themselves had in most cases helped Saddam procure.

In other words, these shells weren't evidence of an active WMD program, which had been George Bush's justification for the war. They were simply old munitions that everyone knew about already and that were being left to degrade on their own.

Second, the Bush administration kept its discoveries secret. If any of this were truly evidence for an active WMD program, surely Bush and Dick Cheney would have been the first to trumpet the news. The fact that they didn't is pretty plain evidence that there was nothing here to back up their prewar contentions of an Iraqi WMD program.

Third, there's the specific reason these discoveries were kept secret. Chivers tells the story:

Participants in the chemical weapons discoveries said the United States suppressed knowledge of finds for multiple reasons, including that the government bristled at further acknowledgment it had been wrong....Others pointed to another embarrassment. In five of six incidents in which troops were wounded by chemical agents, the munitions appeared to have been designed in the United States, manufactured in Europe and filled in chemical agent production lines built in Iraq by Western companies.

Far from being a smoking gun of Saddam Hussein's continuing quest for illegal WMDs, these discoveries were evidence that Western powers in the 80s were perfectly happy to supply illegal WMDs to an ally as long as they were destined for use against Iran. This was not something Bush was eager to acknowledge.

Iraq had no active WMD program, and it was an embarrassment to the Bush administration that all they could find were old, rotting chemical weapons originally manufactured by the West. So they kept it a secret, even from troops in the field and military doctors. But lies beget lies, and American troops are the ones who paid the price. According to Chivers, "The government’s secrecy, victims and participants said, prevented troops in some of the war’s most dangerous jobs from receiving proper medical care and official recognition of their wounds."

Today, the consequences of our lies continue to haunt us as the rotting carcasses of these weapons are apparently falling into the hands of ISIS. Unfortunately, no mere summary can do justice to this entire shameful episode. Read Chivers' entire piece to get the whole story.

Please Rescue Us. Now Go Away.

| Wed Oct. 15, 2014 10:47 AM EDT

Ed Kilgore brings the snark:

I realize the remarks of politicians should not be imputed to the entire populations they govern or represent. But still, it's hard to avoid noting that Texas—the very sovereign State of Texas, I should clarify, where the federal government is generally not welcome—was at a loss in dealing with a single Ebola case until the feds stepped in.

Sure, this is just a cheap gotcha. But sometimes there's a real lesson even in the simplest gibe, and Kilgore offers it: "It would be helpful to see some after-the-fact reflection on why the resources of a central government are sometimes necessary to avoid catastrophe."

That won't happen, of course. Instead, conservatives are already using this as an excuse to trash the federal government for not coming to their rescue sooner. This will undoubtedly be only a brief preface to yet another round of across-the-board budget cutting because everyone knows there's far too much waste and fat in the system. The irony of it all will, I'm sure, go entirely unnoticed.

Tom Cotton Is Upset That Democrats Ended a Free Money Stream for Banks

| Wed Oct. 15, 2014 10:21 AM EDT

The latest from the campaign trail:

Republican Tom Cotton said during an Arkansas U.S. Senate debate on Tuesday that "Obamacare nationalized the student loan industry." The first-term congressman added, "That's right, Obamacare grabbed money to pay for its own programs and took that choice away from you."

Huh. Does Cotton really think this is a winning issue? I mean, it has the virtue of being kinda sorta semi-true, which is a step up for Cotton, but why would his constituents care? Does Cotton think they're deeply invested in the old system, where their tax dollars would go to big banks, who would then make tidy profits by doling out risk-free student loans that the federal government guaranteed?

That never made any sense. It would be like paying banks to distribute Social Security checks. What's the point? The new student loan system saves a lot of money by making the loans directly, and that's something that fiscal conservatives should appreciate. Instead, they've spent the past four years tearing their hair out over the prospect of Wall Street banks being shut out of the free money business. Yeesh.

The Kids These Days Know More Than You Probably Think

| Tue Oct. 14, 2014 4:35 PM EDT

When I write about American education, the background implication is usually simple: kids these days are dumber than they used to be. Schools are bad; the children are slackers; and the Chinese are going to destroy us. Frankly, I doubt this. It may be true that most 17-year-olds can't locate France on a map, but I'll bet most adults can't either. They just never get tested to find out.

The Boston Review ran a fascinating blog post on this theme a few days ago. It seems that one of the pieces of evidence on the side of the doomsayers is the declining vocabularies of our youth. This has been measured regularly since 1974 by the General Social Survey, and it turns out that scores on its multiple-choice Wordsum vocabulary test rose steadily for generations born between 1900 and 1950 but declined after that.

It's easy to understand why test scores rose for generations born between 1900 and 1950: schooling became far more widespread during the first half of the 20th century. But why did it decline after that? Is it because kids born after 1950 have gotten successively dumber? Claude Fischer summarizes some new research that takes advantage of Google's Ngram viewer to measure how frequently words have been used over the past century:

[The researchers] took the ten test words—most of which became relatively less common over the century—and also the words that appeared in the answers respondents were given to choose from....Each of over 20,000 respondents in the cumulative GSS survey, 1974 to 2012, got a score for how common the words were in the years between the respondent’s birth and the year he or she turned fifteen.

Dorius and colleagues found that, other things being equal, the rise in test scores from the earliest cohorts to the mid-century cohorts is largely explained by the schooling those cohorts got. And importantly, the decline in test scores from the latter cohorts to the latest ones can be explained by the declining use of certain words, especially “advanced” ones. Once both factors are taken into account, there is little difference among generations in vocabulary scores.

In other words, it's not that kids have gotten dumber. It's just that GSS has been using the same words for 40 years, and these words have become less common. The words themselves are kept secret, but apparently they aren't too hard to suss out. In case you're curious, here they are: space, broaden, emanate, edible, animosity, pact, cloistered, caprice, accustom, and allusion.
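As a rough illustration of the adjustment the researchers describe, here's a minimal Python sketch. The frequency numbers are made-up placeholders (the actual study used per-year word frequencies from Google's Ngram corpus), and the function is my own invention; the idea is simply that each respondent gets a score for how common the test words were in print between their birth year and age 15, and that score is then controlled for alongside schooling.

```python
# Hypothetical sketch of a cohort "word exposure" score.
# The frequencies below are placeholder values, not real Ngram data.

# word -> {year: relative frequency of that word in print that year}
ngram_freq = {
    "broaden":    {1950: 4.1e-6, 1960: 3.8e-6, 1970: 3.5e-6},
    "cloistered": {1950: 3.0e-7, 1960: 2.7e-7, 1970: 2.4e-7},
}

def exposure_score(birth_year: int, word_freqs: dict) -> float:
    """Average frequency of the test words over the respondent's
    formative years (birth through age 15)."""
    formative_years = range(birth_year, birth_year + 16)
    samples = [
        freqs[year]
        for freqs in word_freqs.values()
        for year in formative_years
        if year in freqs
    ]
    return sum(samples) / len(samples) if samples else 0.0

# A respondent born in 1950 is scored on word frequencies from 1950 through 1965.
print(exposure_score(1950, ngram_freq))
```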

It turns out that once you adjust for how common these words have been at various points in time, the apparent drop in vocabulary scores vanishes. In fact, vocabulary scores have actually gotten higher, as the chart on the right shows. This is hardly the last word on the subject, which appears to be the topic of a vast literature, but it does go to show how careful you have to be with this kind of stuff. It's safe to say that kids these days are less knowledgeable than their parents about some things. But it's also true that they're more knowledgeable about certain other things. This should probably even out, but it doesn't. It's adults who get to form judgments about which things matter, and they naturally assume that their knowledge is important, while all the stuff the kids know that they don't is trivial and ephemeral.

That's comforting to the oldsters, but not necessarily true. Kids probably don't know less than their parents. They just know different things.