ABC News
Who Are The People Who Don’t Respond To Polls?

“Can we trust election polls?” is a question that has reached a fever pitch in political junkie circles dating back to the 2016 election. One popular theory about why election polls missed in 2016 and 2020 is that Trump-friendly voters refused to respond to surveys, making Trump’s support among the population appear lower than it actually was. People give many anecdotal reasons for why this happened, but the big takeaway from this theory is that election surveys are undercounting Republicans and Trump voters. 

Of course, many proponents of this theory lack data when making this assertion. The FiveThirtyEight/Ipsos poll, conducted using Ipsos’s KnowledgePanel, can shed some light on whether this is happening in 2022. Back in April, we launched a panel study with a group of about 2,000 Americans whose demographic makeup was representative of the U.S. The goal of the project was to see how their fears and beliefs changed in the six months leading up to the midterm elections. However, we’ve found this project has additional, unintended value. Following that initial survey, there was a large drop-off in participation in the second wave and a smaller drop-off between the second and third wave, after which participation largely plateaued. In total, there was a 23 percent decline in the number of participants between the first wave and the sixth and most recent wave (the results of this wave are forthcoming).1 This allows us, in a limited way, to examine something called “nonresponse bias” — that is, who is not answering surveys — and how it impacts polling data.

Our data indicates that some respondents who lean toward the Republican Party are less likely to take part in follow-up surveys. But we didn’t find ourselves in a situation where all Republicans were not answering, and we were able to find a few clues as to who exactly these Republican non-respondents could be.

The charts below show the share of various types of respondents from our initial survey who then took part in subsequent waves.2 By looking at each individual wave, we can get a sense of response rates for the different demographic groups we’ve surveyed: 

People who said they voted for Trump in 2020 and that they plan to vote for Republicans in the midterms this year have very high response rates relative to the overall sample. (It’s worth noting, however, that recalled vote choice is not a perfect measure. We’re using it here as a proxy for partisanship and political engagement.)

By contrast, for most of the time series, we see a dramatic drop-off in response rates among 2020 Trump voters who say they are not likely to vote for Republicans this year, as well as among people who say they view Trump “very” favorably (as opposed to “somewhat” favorably). 

Meanwhile, people who said they get most of their news from Fox News were more likely than the average respondent to continue in the survey, while Americans who primarily get their news from social media or who do not consume political news at all were among the most likely to drop out.3

Taken together, we have a picture of a specific slice of the Republican electorate that might not be responding to surveys: the Trump-supporting, social media news consumer.
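As a rough illustration of the kind of unweighted retention calculation behind these charts, the sketch below computes the share of each subgroup’s wave-1 respondents who completed a later wave. The data, group labels and retention patterns here are invented for illustration, not the actual panel:

```python
# Hypothetical panel records: each wave-1 respondent has a (made-up)
# main news source and the set of survey waves they completed.
respondents = [
    {"news_source": "Fox News", "waves": {1, 2, 3, 4, 5, 6}},
    {"news_source": "Fox News", "waves": {1, 2, 3, 4, 5, 6}},
    {"news_source": "social media", "waves": {1, 2}},
    {"news_source": "social media", "waves": {1}},
    {"news_source": "mainstream", "waves": {1, 2, 3, 5, 6}},
    {"news_source": "mainstream", "waves": {1, 2, 3, 4, 5, 6}},
]

def retention_by_group(respondents, wave):
    """Share of each group's wave-1 respondents who completed `wave`."""
    totals, stayed = {}, {}
    for r in respondents:
        g = r["news_source"]
        totals[g] = totals.get(g, 0) + 1
        if wave in r["waves"]:
            stayed[g] = stayed.get(g, 0) + 1
    return {g: stayed.get(g, 0) / n for g, n in totals.items()}

print(retention_by_group(respondents, wave=6))
# In this toy data, Fox News viewers are fully retained by wave 6
# while social-media news consumers have all dropped out.
```

In the real analysis this calculation would run over roughly 2,000 panelists and many demographic groupings, but the per-group arithmetic is the same.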

Despite these differences in response rates, there is another feature of how election surveys are conducted that we must take into account: “weighting” the sample to match the demographics of the electorate. Since the 2020 election, weighting to vote choice or other political characteristics has become much more widespread. Our FiveThirtyEight/Ipsos survey weights for participation and vote preference in the 2020 election. These political weights can go a long way toward repairing any gaps in the sample. 
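One common, simple form of such weighting is cell weighting on recalled 2020 vote: each respondent’s weight is their cell’s benchmark (population) share divided by that cell’s share of the sample. The benchmark and sample figures below are made up for illustration and are not the survey’s actual targets:

```python
# Hypothetical benchmark shares for recalled 2020 vote (invented numbers).
benchmark = {"Biden": 0.45, "Trump": 0.41, "other/nonvoter": 0.14}

# Hypothetical sample that overrepresents Biden voters.
sample = ["Biden"] * 50 + ["Trump"] * 30 + ["other/nonvoter"] * 20

# Weight for each respondent: benchmark share / sample share of their cell.
sample_share = {cell: sample.count(cell) / len(sample) for cell in benchmark}
weights = [benchmark[v] / sample_share[v] for v in sample]

# After weighting, each cell's weighted share matches the benchmark.
weighted_trump = sum(w for v, w in zip(sample, weights) if v == "Trump") / sum(weights)
print(round(weighted_trump, 2))  # 0.41
```

Real survey weighting typically rakes over several variables at once (age, race, education, vote choice and so on) rather than a single cell variable, but the correction works on the same principle.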

One way to visualize the partisan impact of the sample is with the generic congressional ballot, which asks respondents whether they plan to support an unnamed Democrat or Republican in an upcoming election. When we look at how respondents answered the generic ballot in the first wave using our unweighted data, we can see that the people who participated in both our first wave and our most recent wave skewed Republican (33 percent) and independent (36 percent). The remaining sample in our most recent wave continues to show a slight skew toward Republicans, with 32 percent choosing the Democrat compared with 34 percent the Republican, suggesting the share of the sample planning to vote Democratic has increased. Meanwhile, independents were the largest group that failed to respond to the latest wave. If we left the data unweighted, we could be overstating Democrats’ potential performance.

However, when we adjust the data with weighting that incorporates 2020 vote preferences, we see there is no such skew. If anything, our latest wave leans slightly more Republican than it was before we weighted it. This suggests that weighting for vote preference can slightly overcorrect for missing Republican or Trump-leaning voters. 
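The weighted-versus-unweighted comparison above amounts to replacing a simple head count with a weighted tally. A toy sketch, with invented ballots and weights:

```python
# Hypothetical generic-ballot responses and (made-up) survey weights,
# e.g. weights produced by adjusting to recalled 2020 vote.
ballots = ["D", "R", "R", "I", "D", "R"]
weights = [1.4, 0.8, 0.8, 1.0, 1.4, 0.8]

def share(choice, ballots, weights=None):
    """Unweighted (weights=None) or weighted share choosing `choice`."""
    if weights is None:
        weights = [1.0] * len(ballots)
    return sum(w for b, w in zip(ballots, weights) if b == choice) / sum(weights)

print(share("R", ballots))           # unweighted: 3/6 = 0.5
print(share("R", ballots, weights))  # weighted: down-weighted R responses count for less
```

Whether the weighted estimate moves up or down relative to the raw tally depends entirely on which groups the weights inflate or shrink, which is why weighting can correct a Republican shortfall in one sample and slightly overcorrect in another.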

Again, this analysis is based on a single series of surveys that revisits the same people. And while we try to account for some margin of error in our polling, there is likely still some level of nonresponse bias in our initial pool of respondents that goes beyond what we can measure in this analytic exercise. However, these nonresponse patterns are indicative of the groups that are challenging for most survey research to reach and, as a result, may help reveal the types of people missing in contemporary public polling. At the same time, it’s also clear that much of that skew can be accounted for by using appropriate weighting techniques to bring estimates back in line with benchmark information about the population.

Returning to whether nonresponse bias causes pollsters to underestimate Republican support, we are left with a definite “it depends.” On one hand, our research provides some evidence that particular Trump-leaning voters are less likely to participate in surveys over time. If this is true more widely, the polls could be understating GOP support. On the other hand, we also show that there are certainly Republicans in these election polls, and that survey weighting can correct for this handful of missing respondents. 


  1. We also saw typical declines in the shares of responses by age and race, among other demographic groupings, such that younger, Black and Hispanic respondents participated at lower rates relative to other age groups and races and ethnicities.

  2. The shares calculated for this analysis are not weighted. We’re focusing on unweighted data here to examine the raw number of people participating in surveys, their demographic breakdown and how that changes over time.

  3. Respondents were asked what their main source of news is. “Mainstream” sources include the answer choices “ABC / CBS / NBC News,” “The New York Times, Washington Post or Wall Street Journal,” “Telemundo,” “Univision,” “public television or radio” or “your local newspaper.” “Social” includes “YouTube” or “social media.” “Online” refers to “digital or online news.” Other options included “FOX News,” “MSNBC,” “CNN,” “Other” or “None of these” and skipping the question.

Sarah Feldman is a senior data journalist at Ipsos.

Bernard Mendez is a data journalist at Ipsos.

