Wrong Track

If Frank Newport at Gallup is right, then that LA Times/Bloomberg poll that showed Barack Obama with a 12-point lead may have had a flaw after all.

As is always the case, there are some slight differences in the way the polls are conducted. The Los Angeles Times/Bloomberg poll asks a “right direction/wrong direction” question before the ballot. Our Gallup Poll Daily tracking asks a registered voter screen before the ballot. The Los Angeles Times/Bloomberg poll includes the phrase “or would you vote for a candidate from some other party?” Our Gallup poll does not include this phrase. It is unclear how the order of these questions may affect the polling results.

That’s a pretty big no-no in my book. Question order definitely matters — the later you ask a question, the more it’s going to be influenced by the implicit messaging triggered by previous questions. In this case, an overwhelming majority of Americans think the country is on the wrong track, and most of them also associate that with the incumbent Republican administration. So it isn’t surprising that a Republican presidential candidate performs worse if you ask about presidential preference immediately afterward, or that Republican party ID was lower in this survey.

In fact, there happens to be some direct evidence of this. In its new New Jersey poll, Fairleigh Dickinson split its sample into two halves. The first half received a battery of national questions, including a "wrong track" question, before being asked their presidential preference. The second half got the presidential question first.

Among those respondents who got the presidential question first, Obama had a 47-34 lead. But among those who got the wrong track question first and were then asked for their presidential pick, Obama's lead expanded to 51-33. The difference was particularly large among independent voters, who split 24-24 with huge numbers of undecideds when asked the presidential question first, but went 41-14 for Obama if they had been prompted by the national mood questions.

One shouldn't read too much into this: when we're looking at subsamples, and particularly subsamples of subsamples, the margins of error are very high. Nevertheless, those splits fall into line with generally accepted theory. I understand that outfits like LA Times/Bloomberg take a lot of pride in their national mood questions, which in some cases they have asked for years on end. But it's the horse race questions that generate the earned media for a polling organization and confer prestige upon it. That's why most of the media-savvy pollsters that we tend to reference frequently understand that you need to give the horse race questions top billing.
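To put a rough number on that caveat: the margin of error for a polled proportion scales with 1/√n, so halving a sample and then carving out the independents inflates the error bands quickly. A minimal sketch, using purely illustrative sample sizes (the actual Fairleigh Dickinson n's aren't given here):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sizes: a 1,000-person poll split into halves, with
# independents making up roughly a third of each half-sample.
for n in (1000, 500, 165):
    moe = margin_of_error(0.5, n)
    print(f"n = {n:4d}: +/-{moe * 100:.1f} points")
# n = 1000: +/-3.1 points
# n =  500: +/-4.4 points
# n =  165: +/-7.6 points
```

At those sizes, an independent-only subsample carries a margin of error well north of seven points on each candidate's number, which is why the 41-14 vs. 24-24 split is suggestive rather than conclusive.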

Nate Silver is the founder and editor in chief of FiveThirtyEight.