Have you ever seen a poll disappear? Over the past few years, we’ve seen a pollster revise a survey once it was made public, and another re-poll a race after claiming there was a “skew” in the respondents.
But what we saw from Rasmussen Reports this month was different: the withdrawal of a poll in a key Senate race without any satisfactory explanation.
On the morning of Oct. 9, Rasmussen Reports released a survey on its premium website that showed independent Greg Orman leading Republican Pat Roberts 52 percent to 40 percent. The poll, conducted Oct. 7 and Oct. 8, was seen by poll watchers and put into our polling database. It was quite different from two polls released the day prior from CNN/Opinion Research Corp. and Fox News giving Roberts leads of 1 percentage point and 5 percentage points, respectively. The CNN and Fox News polls were the first since Democrat Chad Taylor’s name was withdrawn from the Kansas ballot to show Roberts ahead.
But within a few hours of publication, the Rasmussen Reports poll was no longer on its website. We asked Rasmussen why the poll was removed and received this response from Rasmussen Reports general manager Beth Chunn:
Those numbers were basically raw data that we were still processing and were not anywhere near ready for publication. They were inadvertently posted for our subscribers only and not on our public site. They were taken down as soon as we realized the mistake. We don’t hide any of our polling, so you can be assured that when we have numbers that we’re ready to release publicly, they’ll be available right away at rasmussenreports.com.
It’s rare for a poll to be published and then taken offline. We spoke with Jay Leve at SurveyUSA (a pollster rated highly by FiveThirtyEight), which like Rasmussen uses interactive voice response polling technology, about how often his company has published a poll before identifying a mistake in the sample. He told us SurveyUSA has published only one poll in its 22-year history that it later found had a methodological mistake.
Still, FiveThirtyEight was willing to give Rasmussen Reports the benefit of the doubt. Most pollsters agree that if an error is made while commissioning or publishing a survey, it ought to be corrected. We’ve made mistakes of our own: We’ve inadvertently hit the “publish” button on articles that were still being edited or entered polls incorrectly into our database. These things happen; the important thing is fixing them, explaining the mistake and correcting the record.
But then a funny thing occurred: Days passed, and Rasmussen Reports never released its Kansas survey. So we emailed Rasmussen again to see what was going on.
This time, Chunn said:
When reviewing the Kansas sample, we realized that an error was made in the programming of the survey that may have skewed the data. Instead of releasing that data, we elected to scrap the survey and have rescheduled it for later in the month.
We wanted to find out what exactly the programming error was. Leve told us that the term was ambiguous. It could be “anything from a mispronounced word to a question that is skipped-over inadvertently during programming,” he wrote. “Such things occur rarely.”
So we emailed Rasmussen Reports twice more. Chunn did not respond to our inquiries asking for more information.
We also emailed Ted Carroll, a partner at Noson Lawen Partners, Rasmussen Reports’ majority investor, for comment. (Rasmussen Reports founder Scott Rasmussen left the company in August 2013 after a dispute with its board of directors.) “I don’t have an answer on this one beyond what Beth noted,” Carroll wrote, “except that if something is found inaccurate with a survey post-publication I’d want it immediately removed also.”
Rasmussen Reports has never been strong on disclosure. It isn’t a member of the National Council on Public Polls and doesn’t participate in the AAPOR Transparency Initiative. But polling firms that have had to rescind or revise polling results have typically made some effort to describe the nature of the error.
In our view, a highly plausible explanation is that this wasn’t an error per se so much as an example of “herding.” Polls such as Rasmussen Reports’ that take methodological shortcuts reach only a tiny fraction of the voter population, resulting in poor raw data. However, as we’ve described, and as other researchers have found, these polls tend to produce more accurate results when there are other polls with stronger methodological standards surveying the same races. Essentially, the cheap polls may be copying or “herding” off their neighbors. This can take the form of a sin of commission: manipulating assumptions like those involving turnout models and demographic weighting to match the stronger firms’ results. Or it may be a sin of omission: suppressing the publication of polls that differ from the consensus.
In the past, Rasmussen Reports would not have been the first firm you’d accuse of striving to match the polling consensus. Instead, its surveys had a strong Republican “house effect,” meaning that they showed more favorable results for Republicans than other surveys of the same races. On Election Day, this house effect often translated into a Republican bias. Since Rasmussen Reports began publishing polls in 2000, its surveys have been 2.3 percentage points more favorable to Republicans, on average, than the outcomes of the elections they polled.
This year, however, Rasmussen Reports polls have shown little house effect, instead closely matching the average of other surveys. This could reflect methodological changes. For instance, Rasmussen has begun to use somewhat larger sample sizes, and it’s begun to conduct polls over two- or three-day windows rather than surveying voters only over one evening.
But the closer match between Rasmussen polls and the polling averages could also be a product of herding. In the case of Kansas, it may have been that Rasmussen Reports was herding off of a Marist College survey, published four days before the Rasmussen poll, which had Orman ahead by 10 points in the race — very close to the 12-point lead that Rasmussen’s disappearing poll showed for Orman. Perhaps Rasmussen got cold feet after CNN and Fox News published polls showing Roberts slightly ahead instead — and after the Rasmussen result came under criticism from poll watchers on Twitter.
Rasmussen Reports did finally publish a Kansas poll Thursday, which had Orman 5 points ahead. We’ve included it in the FiveThirtyEight database, while also restoring Rasmussen’s earlier (Oct. 9) Kansas poll to our database.
We’ve developed a detailed set of rules over the years that cover just about any odd polling situation. But we’ve never been confronted with the case of a disappearing poll before. In the absence of a better explanation, it looks as though Rasmussen Reports didn’t trust its own poll enough to stand behind it.