Here's something to take our minds off politics for the next few hours as we await word from Florida about just how badly Mitt Romney and his George-Soros-Goldman-Sachs-New-York-Washington-establishment-money-power have crushed Newt Gingrich's people power in today's primary. It comes from a biography of Frances Perkins, FDR's secretary of labor, and it's a reporter's description of her eyes:

It is her eyes that tell her story. Large and dark and vivid, they take their expression from her mood. If she is amused, they scintillate with little points of light. If moved to sympathy or compassion, they cloud over. At the slightest suspicion of insincerity or injustice, they can become keen and searching.

I'm pretty much oblivious to people's eyes. I could sit across from you for an hour in deep conversation and come away not even knowing the color of your eyes, let alone whether they scintillate or cloud over from time to time. So I am, sort of literally, a blind man when it comes to stuff like this.

So I turn to you, my faithful readers. Are descriptions like this for real? It's part of the whole "eyes are the window to the soul" schtick, which has always seemed more poetic than verifiably factual to me, but what do I know? And another thing: if this is real, how does it happen? That is, what physiological mechanism makes eyes scintillate or cloud over?

Help me out, those of you with normal human perceptions. What's the deal here?

POSTSCRIPT: And here's a fascinating historical tidbit that I learned today. In 1938, suspecting that Perkins, the first female cabinet member, was a communist sympathizer, conservatives concocted a story that she wasn't really American at all. Instead, she was supposedly a Russian Jewish immigrant who had lied about her real identity. Perkins eventually set the record straight in a letter outlining her genealogy, but there's no mention of whether she also had to release a copy of her long-form birth certificate to quell the rumors.

It's remarkable how history repeats itself, isn't it?

Adam Skaggs writes that Congress needs to do something about the tsunami of money coming into campaigns via supposedly independent Super PACs:

Super PACs make a mockery of the idea of independence. As Elizabeth Drew wrote recently in the New York Review of Books, today, the “connections between . . . candidates and the Super PACs supporting them aren’t very well hidden.”....The candidate Super PACs were all established by former campaign advisors to the candidates. They are funded by friends and associates with close ties to the candidates (or, in the case of former candidate Jon Huntsman, by the candidate’s father). As election law expert Rick Hasen explained, Super PACs can do a lot that sure sounds like coordination, including soliciting funds, attending fundraisers, appearing in ads, and using the same lawyers — all without coordinating, and still legally claiming to be independent.

....There are countless ways the existing system of campaign finance should be reformed, but cleaning up Super PACs is an obvious first step. Congress should adopt common-sense rules that make terms like independence and coordination mean something. Super PACs that function as adjunct campaigns should be treated like what they are — and they should be subject to the same contribution limits as candidates. Putting candidates in charge of their own campaigns is the first step toward putting the public back in charge of democracy.

I would really like to hear more about this from someone steeped in — something. I'm not sure what, actually. Election law? Insider trading law? Maybe both. In any case, I'd like to hear in some detail how, exactly, rules could be written that would guarantee genuine independence. Even if some of the most obvious loopholes were closed, it still sounds close to impossible to do this without creating a lot of unintended consequences that could end up being worse than the disease we're trying to cure. Here's Drew, for example, on various proposals to cure the plague of Super PACs:

Another route would be through new legislation to assure the independence of the Super PACs. But even if this could be achieved another serious problem would arise: political consultants could be making their own decisions about what would help their candidates, who could lose control of their own campaigns.

Would true independence be better or worse than what we have now? That's as unclear to me as it is to Drew. And it's unclear to me if we could really police independence effectively anyway. After all, how many successful prosecutions for insider trading have we seen recently? Not many. It's a similar principle, and it's really, really hard to prove even though financial records often make a prima facie case that's even stronger than suspicions of collusion in electioneering.

So: suggestions welcome. But I suspect this is a very, very hard problem.

Apropos of my post this morning suggesting that Republicans unhappy with the current presidential field will come around, here is Rick Perlstein:

I've never been impressed with the argument that Mitt Romney makes for a weak Republican nominee because conservatives don't like him. That's not how that party works. Like they say, "Democrats fall in love, Republicans fall in line." Don't believe me? Think back four years. When the race was still up in the air, the venom aimed at McCain was ten times worse than anything being suffered by Mitt. I collected the stuff back then: Rush Limbaugh said McCain threatened "the American way of life as we've always known it"; Ann Coulter said he was actually "a Democrat" (oof!); an article in the conservative magazine Human Events called him "the new Axis of Evil"; and Michael Reagan, talk radio host and the 40th president's son, said "he has contempt for conservatives, who he thinks can be duped into thinking he's one of them."

Then McCain wrapped up the nomination, and Mike Reagan suddenly said, "You can bet my father would be itching to get out on the campaign trail working to elect him." One thing Republicans understand: In American elections you have to choose from among only two people — not between the perfect and the good.

Roger that. However, Rick then goes on to argue that Romney's Mormon faith won't hurt him among evangelicals. After all, evangelicals hated Catholics 50 years ago ("Mother of Harlots," "Whore of Babylon," etc.) but they eventually came around when they decided they needed to make common cause to fight abortion. "When the siren song of cobelligerency beckons," he says, "theological qualms tend to fall away. That's the way it's always been."

I'll buy that. But the real question is: When? Rick suggests that conservatives usually abandon their cultural prejudices "in the fullness of time," but Romney doesn't have the fullness of time. He's got nine months. And it's not clear to me that evangelical suspicion of Mormonism as a cult is going to disappear in nine months.

This won't hurt Romney a lot. The alternative is our current Muslim president, after all, so most evangelicals will come around. But if even a few percent don't, and if a larger number vote but don't actively campaign, that could be enough to sink him. In a 50-50 nation, even a few percent can spell the difference between victory and defeat.

Steve Benen, posting from his new home at Rachel Maddow's online presence, suggests that at this point in the primary process Republican voters ought to be getting happier with their candidates:

And yet, as the Pew Research Center found, rank-and-file Republicans are finding themselves less satisfied with their presidential choices, not more. As the Pew report, released yesterday, explained, "In fact, more Republican and Republican-leaning registered voters say the GOP field is only fair or poor (52%) than did so in early January (44%)."

In other words, this field of candidates isn't just unappealing to the party's own voters; it's increasingly unappealing.

But how unusual is this, really? Maybe someone with a vast collection of past polling data can weigh in on this, but I'm not sure that we're seeing anything all that out of the ordinary. Campaigns usually get nastier as they get closer to their endgames, and that nastiness often translates into increased voter dissatisfaction. This year's Republican primary only entered its nuclear phase after New Hampshire, and it's not too surprising that this has driven up everyone's negatives.

Now, this is the point at which I'd normally remind everyone that it's only January (hard to believe, I know, after the debate marathon of the past five months) and there's plenty of time for everyone to cool down before summer. And I think that's exactly what's going to happen. Still, there's that little niggling voice in my head saying "Newt, Newt, Newt....." Will Newt Gingrich, even after he's obviously lost, continue his scorched-earth campaign against Romney? Will Sheldon Adelson fund this doomed effort? I'd guess no. But it's a soft, unconvincing no. He just might, after all.

David Brooks glosses Charles Murray's new book, Coming Apart:

His story starts in 1963. There was a gap between rich and poor then, but it wasn’t that big. A house in an upper-crust suburb cost only twice as much as the average new American home. The tippy-top luxury car, the Cadillac Eldorado Biarritz, cost about $47,000 in 2010 dollars. That’s pricy, but nowhere near the price of the top luxury cars today.

More important, the income gaps did not lead to big behavior gaps. Roughly 98 percent of men between the ages of 30 and 49 were in the labor force, upper class and lower class alike. Only about 3 percent of white kids were born outside of marriage. The rates were similar, upper class and lower class.

Since then, America has polarized. The word “class” doesn’t even capture the divide Murray describes. You might say the country has bifurcated into different social tribes, with a tenuous common culture linking them.

....Roughly 7 percent of the white kids in the upper tribe are born out of wedlock, compared with roughly 45 percent of the kids in the lower tribe. In the upper tribe, nearly every man aged 30 to 49 is in the labor force. In the lower tribe, men in their prime working ages have been steadily dropping out of the labor force, in good times and bad.

People in the lower tribe are much less likely to get married, less likely to go to church, less likely to be active in their communities, more likely to watch TV excessively, more likely to be obese.

I haven't read Murray's book, and probably won't. But I'm not unwilling to take his thesis seriously. The part that keeps pushing back at me, though, is the idea that this is something new. I don't doubt that Murray can show that there's a much larger group of very well-off people today than there was in 1963: these are the folks buying the McMansions and the $100,000 cars. That's not news. And the behavioral changes in the bottom third are real too.

But is it really true that back in 1963 the "upper tribe" and the "lower tribe" were more similar than they are today? It might seem that way in retrospect, but it sure didn't at the time. It didn't seem that way to Gunnar Myrdal or Michael Harrington, anyway. Overall, I can pretty easily buy the "Apart" piece of the title, but I'm a lot less sure about the "Coming" piece. For every example of a way in which top and bottom have diverged over the past 50 years, I suspect that you could also find an example of ways in which they've converged. It's just that Murray wasn't looking for any of those.

But as I said, I haven't read the book. Perhaps someone over at Crooked Timber, or someplace like that, would like to read it and do us all the public service of commenting on it? Thanks.

The Wesleyan Media Project has a new study out today that compares ad spending in the 2008 Republican primary vs. the 2012 primary. Overall spending is down, primarily because Mitt Romney spent a ton of money in Iowa in 2007 but very little in 2011. The big takeaway, however, is the rise of outside interest group spending. In 2008, nearly all spending came from ad buys by the campaigns themselves. In 2012, more than half the spending has come from outside groups, mostly super PACs formed in the wake of Citizens United.

The campaigns all claim they hate this trend, but I'd take that with a grain of salt. Sure, campaigns lose some control when outside groups are spending so much money, but they also gain deniability. Outside groups have more freedom to air genuinely vicious ads — something we're likely to get a big snootful of in the general election — and I'd be surprised if most campaign poobahs didn't secretly think that's a pretty good tradeoff for the loss of message control.

Here's another interesting tidbit from the report: Although overall spending is down, the number of ads purchased is about the same. This means that the average cost of an ad has gone down from $700 in 2008 to $400 this year. I really have no idea why this is. Recession or not, I'm sure ad rates haven't fallen that much, which must mean that both campaigns and outside groups have changed their ad buying strategies. Maybe shorter ads. Maybe ads in cheaper time slots. Beats me. But the data is only for national cable and broadcast buys, not local cable buys, so it's not due to a sudden surge of super-targeted local ads.
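The arithmetic behind that tidbit is worth making explicit. A quick back-of-the-envelope sketch (the ad count below is a hypothetical placeholder, not the Wesleyan Media Project's actual figure; only the $700 and $400 averages come from the report):

```python
# If the number of ads is roughly flat but total spending falls,
# the average price per ad must fall by the same proportion.
# The ad count here is made up for illustration.

def avg_cost_per_ad(total_spend, num_ads):
    return total_spend / num_ads

ads = 100_000                 # hypothetical: assume the same ad count both cycles
spend_2008 = ads * 700        # implies the reported $700 average in 2008
spend_2012 = ads * 400        # implies the reported $400 average in 2012

assert avg_cost_per_ad(spend_2008, ads) == 700
assert avg_cost_per_ad(spend_2012, ads) == 400

# With ad counts flat, total spending must have fallen about 43 percent.
print(1 - spend_2012 / spend_2008)   # ~0.43
```

The point of the sketch is just that "same number of ads, lower total spend" and "cheaper average ad" are the same fact stated two ways; the interesting question is which lever (ad length, time slot) moved the price.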

Also, Romney is absolutely swamping Gingrich in Florida, buying 60 times as many ads as Gingrich. No, that's not a typo. 60 times. More data at the link.

The Congressional Budget Office (CBO) weighed in today on the fraught subject of whether federal employees are paid more than comparable workers in the private sector (full report here). Their analysis attempted to control for occupation, years of work experience, geographic location (region of the country and urban or rural location), size of employer, and various demographic characteristics (age, sex, race, ethnicity, marital status, immigration status, and citizenship). Their conclusion should come as no surprise: When you account for both wages and benefits, Uncle Sam is generous toward those with less than a college degree and stingy toward those with PhDs or professional degrees.

According to the CBO, the federal government employs a lot more workers with doctorates or professional degrees than private sector companies do (7 percent of the workforce vs. 3 percent of the workforce). Nonetheless, when you look at the overall number, they figure that the federal government's payroll is 16 percent higher than it would be if it paid its workers private sector scales.

Another interesting result: If you look at the range between the lowest and highest paid workers, it's about the same in the public and private sectors for both high school grads and college grads. But the private sector has a way higher range for those with doctorates. The federal government tops out at about $70/hour in wages for the top decile of workers while the private sector tops out at about $140. If you were to look at the top 1 percent instead of the top 10 percent, the difference would probably be even starker.

None of this should come as a big surprise. Federal jobs have always been plum positions for blue-collar workers, while for highly educated professionals a federal job is something you take if you either want a lot of job security or are really dedicated to public service. If you're a doctor or a lawyer, you can almost certainly do better in private practice than you can working for the government.

Would the quality of the federal bureaucracy improve if we paid less for low-level jobs and used the money we saved to compete better for top-level managers and other professionals? Maybe! But the CBO punts on this: "A key issue in compensation policy is the ability to recruit and retain a highly qualified workforce. But assessing how changes in compensation would affect the government’s ability to recruit and retain the personnel it needs is beyond the scope of this analysis." Maybe next time.

POSTSCRIPT: Just to be clear, this study is for federal workers only, not all government workers. Other studies I've seen suggest that state and local governments show similar dynamics (high school grads paid more than private-sector workers, professionals paid less), but the difference isn't as large and the overall impact on payroll is close to zero.

Jesse Eisinger and Chris Arnold report on Freddie Mac's latest wheeze:

Freddie Mac, the taxpayer-owned mortgage giant, has placed multibillion-dollar bets that pay off if homeowners stay trapped in expensive mortgages with interest rates well above current rates. Freddie began increasing these bets dramatically in late 2010, the same time that the company was making it harder for homeowners to get out of such high-interest mortgages.

No evidence has emerged that these decisions were coordinated. The company is a key gatekeeper for home loans but says its traders are “walled off” from the officials who have restricted homeowners from taking advantage of historically low interest rates by imposing higher fees and new rules.

Ah, the old Chinese wall. I remember it well from the dotcom bubble days, when investment banks supposedly erected an impenetrable barrier between the bankers who helped companies go public and the analysts who told clients whether those companies were a good investment. After it all blew up, of course, it turned out the wall wasn't quite as impenetrable as everyone thought.

So how is Freddie doing this? After telling us the story of the Silversteins, who want to get a refi on their current high-interest loan but can't because of Freddie Mac policies, the authors explain:

Here's how Freddie Mac’s trades profit from the Silversteins staying in “financial jail.” The couple’s mortgage is sitting in a big pile of other mortgages, most of which are also guaranteed by Freddie and have high interest rates. Those mortgages underpin securities that get divided into two basic categories.

One portion is backed mainly by principal, pays a low return, and was sold to investors who wanted a safe place to park their money. The other part, the inverse floater, is backed mainly by the interest payments on the mortgages, such as the high rate that the Silversteins pay. So this portion of the security can pay a much higher return, and this is what Freddie retained.

In 2010 and '11, Freddie purchased $3.4 billion worth of inverse floater portions — their value based mostly on interest payments on $19.5 billion in mortgage-backed securities, according to prospectuses for the deals. They covered tens of thousands of homeowners. Most of the mortgages backing these transactions have high rates of about 6.5 percent to 7 percent, according to the deal documents.

So as long as homeowners have to keep paying high interest rates on their loans, Freddie's investment is gold. If they refi into a lower-interest loan, the value of the inverse floater goes down and Freddie is in trouble. Naturally, it's all just a big coincidence that Freddie is simultaneously making it hard for families to refi into lower-interest loans. Chinese wall, you know.
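The mechanism is easy to see with a toy calculation. The sketch below uses the article's $19.5 billion pool and 6.5 percent mortgage rate, but the 2 percent floater rate and the refi scenarios are hypothetical numbers I've made up; this is an illustration of how an interest-backed inverse floater behaves, not Freddie's actual deal terms:

```python
# Toy model of an inverse floater: the piece of the deal backed mainly by
# interest payments. When homeowners refinance out of the pool, the
# interest stream shrinks and so does the inverse floater's cash flow.
# The floater_rate and refi scenarios are hypothetical.

def inverse_floater_cashflow(pool_principal, pool_rate, floater_rate, refi_fraction):
    """Annual interest left over for the inverse floater holder.

    pool_principal: face value of the mortgage pool
    pool_rate:      average mortgage rate paid by homeowners (e.g. 0.065)
    floater_rate:   low rate owed to the principal-backed piece (assumed)
    refi_fraction:  share of homeowners who refinance out of the pool
    """
    remaining = pool_principal * (1 - refi_fraction)   # loans still paying the high rate
    total_interest = remaining * pool_rate             # interest thrown off by the pool
    owed_to_floater = remaining * floater_rate         # promised to the safe tranche
    return total_interest - owed_to_floater            # spread kept by the inverse floater

# No refis: the inverse floater collects the fat spread on the whole pool.
print(inverse_floater_cashflow(19.5e9, 0.065, 0.02, 0.0))   # ~ $877.5M/year
# Half the pool refinances into cheaper loans: the cash flow is cut in half.
print(inverse_floater_cashflow(19.5e9, 0.065, 0.02, 0.5))   # ~ $438.75M/year
```

Every borrower who escapes into a cheaper loan takes their high-rate interest payments with them, which is exactly why the holder of the inverse floater wants refis to be hard.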

Read the whole thing for more. It's good to see that the American finance industry hasn't lost a step just because of that whole financial collapse thing a couple of years ago.

Aaron Carroll flags a study suggesting that spending a lot of time in front of a screen (TV or computer) doesn't actually have any effect on your life span:

On the whole, I think consuming amounts of technology that would stagger mere mortals has not hurt me too much; I think I've turned out OK…It may be that there are other factors that are correlated with lots of TV time that may make kids or people worse off. Perhaps parents who let their kids watch enormous amounts of TV are more likely to be bad parents. Perhaps parents who let their kids watch enormous amounts of TV are working three jobs, struggling to make ends meet, and can’t play with their kids as much as they would like.

…Many of the studies account for that as best they can. But the media likes to run around extrapolating a small statistically significant correlation into headlines like "TV WILL KILL YOU!" The sensationalism is pretty staggering. This leads to a publication bias, where results that are likely to shock and garner headlines are more likely to get accepted and printed.

So I'm glad to see a negative study get published. I bet you didn’t know about this study, though. It was published last week with almost no fanfare, and I doubt you will see any news stories on it. When it comes to science, I fear the media isn't nearly as fair and balanced as many think they are.

Well, yeah. But this seems to be part of a bigger problem linked to the actual effects of lifestyle choices, not just the reporting on them: Surprisingly few seem to have much impact on mortality. Just in the past few years, new studies have raised pretty serious doubts about the supposed effect on mortality of obesity, salt, saturated fat, routine mammograms under 50, colonoscopies, prostate screening, LDL levels, and lots of other things. Now even a sedentary lifestyle is under attack. I would have expected that to be the last holdout.

One problem, of course, is our focus on mortality in the first place. Obesity may or may not kill you, for example, but it does make diabetes more likely and it does make your joints wear out faster. Modern medicine may be able to control the diabetes and replace your knees, allowing you to live as long as you otherwise would have, but you're still stuck taking lots of medication, paying for joint replacements, and being less mobile.

It turns out that there's just a helluva lot of uncertainty around a lot of things we once thought we had a pretty good handle on. On a broader note, this is one of the reasons that I'm skeptical of studies about health care policies that focus on mortality, even though many of them provide evidence for policy positions that I support. It's just too narrow a lens, because too few things have a major impact on mortality. We'd be better off, I think, spending less time on crude measures of death rates and more time on other good/ill effects of various policies. That would create problems of its own, but at least we'd be looking at things that are more sensitive indicators of whether our policies are working or not.

E.J. Dionne makes the liberal case today that the Obama administration screwed up when it issued rules requiring insurance companies to cover contraceptives:

Speaking as a Catholic, I wish the Church would be more open on the contraception question. But speaking as an American liberal who believes that religious pluralism imposes certain obligations on government, I think the Church’s leaders had a right to ask for broader relief from a contraception mandate that would require it to act against its own teachings. The administration should have done more to balance the competing liberty interests here.

I'm just a big ol' secular lefty, so I guess it's natural that I'd disagree. And I do. I guess I'm tired of religious groups operating secular enterprises (hospitals, schools), hiring people of multiple faiths, serving the general public, taking taxpayer dollars — and then claiming that deeply held religious beliefs should exempt them from public policy. Contra Dionne, it's precisely religious pluralism that makes this impractical. There are simply too many religions with too many religious beliefs to make this a reasonable approach. If we'd been talking about, say, an Islamic hospital insisting that its employees bind themselves to sharia law, I imagine the "religious community" in the United States would be a wee bit more understanding if the Obama administration refused to condone the practice.

I can understand compromising over a very limited number of hot button issues. Abortion is the obvious one. But in general, if Catholic hospitals don't want to follow reasonable, 21st century secular rules, they need to make themselves into truly religious enterprises. In particular, they need to stop taking secular taxpayer money. As long as they do, though, they should follow the same rules as anyone else.