Estimating the Cellphone Effect: 2.2 Points

Mark Blumenthal has a rundown of the pollsters that are including cellphone numbers in their samples. Apparently, Pew, Gallup, USA Today/Gallup (which I consider a separate survey), CBS/NYT and Time/SRBI have been polling cellphones all year. NBC/WSJ, ABC/Washington Post and the AP/GfK poll have also recently initiated the practice. So too do the Field Poll in California, PPIC (also based in California), and Ann Selzer. There may be some others too, but those are the ones that I am aware of. (EDIT: A representative of the PPIC survey in California has kindly written to let me know that, while they use a cellphone supplement for some of their public policy surveys, they have not done so thus far this year for most of their Presidential trial heats (they did do so in July). The remainder of this article has been corrected accordingly.)

Let’s look at the house effects for these polls — that is, how much the polls have tended to lean toward one candidate or another. These are fairly straightforward to calculate, via the process described here. Essentially, we take the average result from the poll and compare it to other polls of that state (treating the US as a ‘state’) after adjusting the result based on the national trendline.
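In code, the calculation might look something like the sketch below. This is a simplified illustration rather than the actual FiveThirtyEight code; the poll record format and the trend_adjust function are assumptions made for the example.

# Simplified sketch of a house-effect calculation (illustrative only).
# Each poll is assumed to be a dict with "pollster", "state", "date" and an
# Obama-minus-McCain "margin"; trend_adjust stands in for the national
# trendline adjustment described above.
from collections import defaultdict

def house_effects(polls, trend_adjust):
    """Average lean of each pollster versus other polls of the same state."""
    by_state = defaultdict(list)
    for p in polls:
        adj_margin = p["margin"] + trend_adjust(p["state"], p["date"])
        by_state[p["state"]].append((p["pollster"], adj_margin))

    leans = defaultdict(list)
    for results in by_state.values():
        for pollster, margin in results:
            others = [m for name, m in results if name != pollster]
            if others:  # compare to the average of everyone else in that state
                leans[pollster].append(margin - sum(others) / len(others))

    return {name: sum(d) / len(d) for name, d in leans.items()}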

Since ABC, NBC/WSJ and AP/GfK all just recently began using cellphones, we will ignore their data for now. We will also throw out the data from three Internet-based pollsters: Zogby Interactive, Economist/YouGov, and Harris Interactive. This leaves us with a control group of 37 pollsters that have conducted at least three general election polls this year, either at the state or national level.

Pollster             n    Lean
===================  ===  ======
Selzer               5    D +7.8
CBS/NYT              14   D +3.7
Pew                  7    D +3.4
Field Poll           4    D +2.8
Time/SRBI            3    D +2.4
USA Today/Gallup     11   D +0.4

Gallup               184  R +0.6
PPIC                 4    R +1.3

AVERAGE                   D +2.3

CONTROL GROUP (37 Pollsters)   D +0.1

Six of the eight cellphone-friendly pollsters have had a Democratic (Obama) lean, and in several cases it has been substantial. On average, they had a house effect of Obama +2.3. By comparison, the control group had a house effect of Obama +0.1 (**), so this would imply that including a cellphone sample improves Obama’s numbers by roughly 2.2 points. (Or, framed more properly, failing to include cellphones hurts Obama’s numbers by approximately 2-3 points.)
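For what it’s worth, the arithmetic behind that figure is just the difference between the two group averages. Here is a quick sketch using the leans from the table above, taking a simple unweighted average of the cellphone group and the control group’s +0.1 lean as given rather than recomputing it:

# House-effect leans from the table above, on an Obama-minus-McCain scale.
cell_leans = [7.8, 3.7, 3.4, 2.8, 2.4, 0.4, -0.6, -1.3]   # eight cellphone pollsters
cell_avg = sum(cell_leans) / len(cell_leans)               # works out to about +2.3
control_avg = 0.1                                          # control group's average lean

print(f"implied cellphone effect: {cell_avg - control_avg:+.1f} points")  # about +2.2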

The difference is statistically significant at the 95 percent confidence level. Perhaps not coincidentally, Gallup, Pew and ABC/WaPo have each found a cellphone effect of between 1 and 3 points when they have conducted experiments polling with and without a cellphone supplement.

A difference of 2-3 points may not matter much in certain survey applications, such as market research, but in polling a tight presidential race it makes a big difference. If I re-run today’s numbers but add 2.2 points to Obama’s margin in each non-cellphone poll, his win percentage shoots up from 71.5 percent to 78.5 percent, and he goes from 303.1 electoral votes to 318.5. (EDIT: I have not changed this part of the analysis to reflect the new numbers, as it should still get the general point across.) The difference would be more pronounced still if Obama hadn’t already moved ahead of McCain by a decent margin in our projections.
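The adjustment step itself is easy to express in code. The sketch below is hypothetical: the poll record format and the pollster list are assumptions, and the win-percentage and electoral-vote figures above come from the full simulation model, which is not reproduced here.

# Hypothetical what-if adjustment: shift Obama's margin in every poll taken
# without a cellphone sample, then feed the adjusted polls to the simulation.
CELLPHONE_POLLSTERS = {"Selzer", "CBS/NYT", "Pew", "Field Poll",
                       "Time/SRBI", "USA Today/Gallup", "Gallup", "PPIC"}
CELLPHONE_EFFECT = 2.2  # points, from the comparison above

def adjust_for_cellphones(polls):
    """Return a copy of the polls with non-cellphone samples shifted toward Obama."""
    return [
        {**p, "margin": p["margin"] + CELLPHONE_EFFECT}
        if p["pollster"] not in CELLPHONE_POLLSTERS else p
        for p in polls
    ]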

So this is my plea to pollsters: let’s get it right. Perhaps the cellphone effect will prove to be a mirage after all, but that’s something for the data to determine on its own, rather than the pollster.

(**) Keen observers will wonder why the average house effect is greater than zero. This is because in determining our house effect coefficients, we weight based on how many polls each pollster has conducted. A couple of pollsters that account for a large proportion of our data, like Rasmussen and ARG, have had slight (very slight, but enough to skew the numbers) GOP leans.
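To illustrate the footnote’s point with made-up numbers, and assuming (as the footnote implies) that the house-effect coefficients are centered so that the poll-weighted average is zero:

# Toy example with invented numbers. If the coefficients are centered on a
# poll-weighted basis and the most prolific firms lean slightly GOP, the
# unweighted per-pollster average lands on the Democratic side of zero.
leans  = [-0.3, -0.2, 0.4, 0.3, -0.1, 0.2]   # hypothetical house effects, D positive
counts = [300, 150, 10, 12, 20, 8]           # polls conducted by each firm

w_mean = sum(l * c for l, c in zip(leans, counts)) / sum(counts)
centered = [l - w_mean for l in leans]       # forces the weighted mean to zero
print(sum(centered) / len(centered))         # simple average comes out slightly above zero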

Nate Silver founded and was the editor in chief of FiveThirtyEight.
