Here's Why All the Polling Models Are Probably Right
Paul Waldman describes the Bizarro-world war on polling guru Nate Silver that's gained steam over the past week:
In the last few days, we've seen a couple of different Silver narratives emerge as attention to him has increased. First, you have stories about how liberals are obsessing over Silver, "clinging" to him like a raft in a roiling sea of ambiguous poll data. Then you have the backlash, with conservatives criticizing him not because they have a specific critique of the techniques he uses, but basically because they disagree with his conclusions.... Then you've got the reporter backlash. At Politico, Dylan Byers raised the possibility that Silver would be completely discredited if Mitt Romney won, because "it's difficult to see how people can continue to put faith in the predictions of someone who has never given that candidate anything higher than a 41 percent chance of winning."
This whole thing is deeply weird. Could Nate be wrong? Sure. Ditto for Sam Wang and Drew Linzer and all the rest of the poll modelers out there. But if they're wrong, it probably won't be because of their models. After all, with minor differences they all do the same thing: average the publicly available state polls, figure out who's ahead in each state, and then add up the electoral votes they get for each candidate. Sure, they all toss in a little bit of mathematical secret sauce, but not really all that much. You could do the same thing if you felt like it. Want to know who's ahead in Ohio? Go add up the five latest polls and then divide by five. Voila. You are now your own Nate Silver.
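The do-it-yourself version described above really is that simple. Here's a minimal sketch: the poll margins are made up for illustration (not real 2012 data), though the electoral-vote counts match the actual 2012 allocations for those states.

```python
def average(polls):
    """Simple mean across a state's most recent polls."""
    return sum(polls) / len(polls)

# Hypothetical Obama-minus-Romney margins from the five latest polls
# in each state -- invented numbers, purely for illustration.
latest_margins = {
    "Ohio":     [3, 2, 5, 1, 4],
    "Florida":  [-1, 1, -2, 0, -1],
    "Virginia": [2, 0, 1, 3, -1],
}

# Actual 2012 electoral-vote allocations for these three states.
electoral_votes = {"Ohio": 18, "Florida": 29, "Virginia": 13}

# Whoever leads the polling average is credited with the state's
# electoral votes -- that's the whole trick.
obama_ev = 0
for state, margins in latest_margins.items():
    if average(margins) > 0:
        obama_ev += electoral_votes[state]

print(obama_ev)  # Ohio (18) and Virginia (13) lean Obama here -> 31
```

The modelers' "secret sauce" mostly amounts to refinements on this loop: weighting polls by recency and pollster quality, and simulating many elections to turn the averages into win probabilities.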
Needless to say, though, the poll modelers are only as good as the polls they use. If the pollsters are systematically wrong, then the models will be wrong. And while there are a few small sources of potentially systematic bias (not calling cell phones, demographic weighting, etc.), by far the biggest is the pollsters' likely voter screens. But even here, with one or two exceptions, this is pretty simple stuff. Most pollsters just ask a question or two that go something like this:
- Are you planning to vote?
- How sure are you that you'll vote?
- Really? Honest and truly?
That's about it. If you tell them you're highly likely to vote, they mark you down as a likely voter. If not, they don't. There's no rocket science here.
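In code, a likely voter screen of this sort is nothing more than a filter on self-reported intent. The field names and responses below are invented for illustration; real pollsters each have their own question wording and cutoffs.

```python
# Toy likely-voter screen: keep only respondents who say they are
# very likely to vote. All data here is hypothetical.
respondents = [
    {"choice": "Obama",  "likely_to_vote": "very likely"},
    {"choice": "Romney", "likely_to_vote": "very likely"},
    {"choice": "Obama",  "likely_to_vote": "probably not"},
]

likely_voters = [r for r in respondents
                 if r["likely_to_vote"] == "very likely"]

print(len(likely_voters))  # -> 2
```

The point is that the screen is only as good as the answers: if respondents start misreporting their intention to vote, the filter passes the wrong people through, and every model built on top of it inherits the error.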
So if the modelers are wrong, it will probably be because the pollsters were systematically wrong. And if the pollsters are systematically wrong, it will probably be because this year, for some reason, people started lying about their likelihood of voting. And while anything's possible, I sure can't think of any reason why this year there would be a sudden change in how truthful people are about their intention to vote.
That's what this whole controversy comes down to. Conservatives seem to be convinced that Democrats simply won't turn out in high enough numbers to reelect Obama. A fair number of liberals fear the same thing. But there's no analytic reason to believe this. The Obama campaign's ground game seems to be as good as any in the business, and Obama voters are telling pollsters that they're likely to vote in big enough numbers to give him the key swing states he needs to win. That's the current state of our knowledge. It might be wrong, but if it is, the question isn't going to be why Nate Silver went astray. The question is going to be, why was 2012 the year when people suddenly started lying to telephone pollsters?
UPDATE: Asawin Suebsaeng has a roundup of all the prognosticators here. It's a nice, CliffsNotes version of who's who and what they're saying.