Study: Poll Respondents More Likely to Lie About Voter Registration, Receive Welfare Benefits

October 9th, 2012 3:25 PM

This presidential election, the reliability and fairness of pollsters have become a hot topic, with both conservatives and liberals casting doubt on the accuracy of various polling firms. But what if the real problem with polling is more attributable to the people who respond to surveys than to the polling companies themselves?

Thanks to a study examining the accuracy of polling, we now know that in some areas, surveys can be disturbingly inaccurate, in large part because people are willing to outright lie to a pollster. According to a report issued by the Pew Research Center for the People and the Press, 60 percent of people who aren’t registered to vote will falsely claim to be registered.

Many people also seem inclined to misinform pollsters about their voting habits. Of those who were verified as not having cast a ballot in 2010, 65 percent told Pew that they “always” or “nearly always” vote. (There was no data provided for the 2008 election.)

Both of these statistics mean that pollsters touting surveys of “registered voters” and “likely voters” need to make some corrections to their questionnaires to detect and remove people lying about their voting status. Couple people’s willingness to lie about their registration and their actual voting with the fact that so-called “unlikely voters” favor President Obama by a two-to-one margin, and you may have part of the explanation for why the polling industry has historically overestimated Democratic support in presidential elections.
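
To get a sense of the scale involved, consider a rough back-of-the-envelope sketch in Python. The 60 percent false-claim rate is the Pew figure cited above; the 70 percent true registration rate, and the simplifying assumption that genuinely registered respondents all answer truthfully, are hypothetical round numbers for illustration only, not figures from the study.

```python
# Back-of-the-envelope sketch: how false registration claims could
# contaminate a "registered voter" sample. The 60% false-claim rate is
# the Pew finding cited above; the 70% true registration rate is a
# HYPOTHETICAL round number, and we assume genuinely registered
# respondents all answer truthfully.

true_registered_share = 0.70  # assumed share of adults actually registered
false_claim_rate = 0.60       # Pew: 60% of non-registrants claim to be registered

# Share of all respondents who falsely claim to be registered.
false_claimants = (1 - true_registered_share) * false_claim_rate

# Everyone who *says* they are registered: the truly registered plus the liars.
self_reported_registered = true_registered_share + false_claimants

# Fraction of the self-reported "registered voter" pool that is not registered.
contamination = false_claimants / self_reported_registered

print(f"False claimants as share of all respondents: {false_claimants:.0%}")          # 18%
print(f"Self-reported 'registered voters':           {self_reported_registered:.0%}")  # 88%
print(f"Unregistered share of that pool:             {contamination:.0%}")             # 20%
```

Under those assumptions, roughly one in five self-described “registered voters” would not actually be registered, which is why screening questions that rely on self-reports alone can mislead.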

The data cited above come from a comprehensive study conducted by the Pew Research Center for the People and the Press to gauge the accuracy of standard polling techniques by cross-referencing them with several independent means of measuring many of the same opinions and demographic variables. The analysis was released in May of this year but seems to have been little noticed. (My thanks to PJ Media for bringing it to my attention.)

To my knowledge, this Pew study was the first comprehensive attempt to compare the results of standard polling techniques with those from other surveys. As Pew explains, several datasets were used:

The study is based on two new national telephone surveys conducted by the Pew Research Center for the People & the Press. One survey was conducted January 4-8, 2012 among 1,507 adults using Pew Research’s standard methodology and achieved an overall response rate of 9%. The other survey, conducted January 5-March 15 among 2,226 adults, used a much longer field period as well as other efforts intended to increase participation; it achieved a 22% response rate.

The analysis draws on three types of comparisons. First, survey questions are compared with similar or identical benchmark questions asked in large federal government surveys that achieve response rates of 75% or higher and thus have minimal non-response bias. Second, comparisons are made between the results of identical questions asked in the standard and high-effort surveys. Third, survey respondents and non-respondents are compared on a wide range of political, social, economic and lifestyle measures using information from two national databases that include nearly all U.S. households.

Overall, Pew’s researchers conclude that the different datasets do not contradict each other, but there are several important areas of discrepancy. The impetus for the study appears to be the astoundingly high rate of non-response to pollsters: nationwide, it is 91 percent. That means that of every 100 attempts to contact someone and ask him or her to participate in a poll, 91 will fail to produce a completed survey.

The study repeatedly suggests that in most respects, there is not much error in industry-standard polling techniques. Republicans and Democrats, for instance, appear equally likely to ignore a call from a survey company. A few key data points should give political observers pause, however.

Perhaps the most significant disparity is that those who receive government money to help pay for food are overrepresented in the five-day, industry-standard survey that Pew was double-checking. According to the study, 17 percent of respondents to the poll indicated that they had received food stamps. In a much larger survey conducted by the government itself in March of 2011, just 10 percent of those polled said they had received food assistance.

This could be a function of Obama Administration efforts since 2011 to enroll more people in food stamp programs, a function of the current economic climate, or a sign that people who receive such benefits are more apt to answer a pollster because they work fewer hours than those who do not receive food assistance.

It is worth noting that in the industry-standard survey, Pew excluded those receiving Women, Infants, and Children (WIC) benefits; see page 41 of the “topline” report for details. It is unclear why Pew did this, since counting WIC recipients might have yielded an even higher percentage of respondents receiving government food assistance.

White respondents were also overrepresented in the industry-standard poll compared to official statistics: 73 percent of Pew respondents reported themselves as white, compared to 68 percent in the government’s Current Population Survey (CPS) data. Having fewer racial minorities in a sample makes it harder to extrapolate to the broader minority population, especially since sampling errors can be compounded when pollsters use weighting techniques to raise the minority share of the sample to what they expect it to be in the electorate.
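
To illustrate how weighting can compound sampling error, here is a toy sketch using Kish’s design-effect approximation. Every number in it (a 1,000-person sample, a 20 percent raw minority share weighted up to an expected 32 percent) is invented for illustration; the point is only the mechanism, not the magnitudes in any real poll.

```python
import math

# Toy sketch: weighting a small minority subsample up to its expected
# population share inflates sampling error. All figures are HYPOTHETICAL.

n_total = 1000
n_minority = 200               # minorities contacted: 20% of the raw sample
target_minority_share = 0.32   # share the pollster expects in the electorate

# Weights that make the weighted sample match the target composition.
w_minority = target_minority_share / (n_minority / n_total)            # 1.60
w_majority = (1 - target_minority_share) / (1 - n_minority / n_total)  # 0.85

weights = [w_minority] * n_minority + [w_majority] * (n_total - n_minority)

# Kish design effect: 1 + (coefficient of variation of the weights)^2.
mean_w = sum(weights) / n_total
var_w = sum((w - mean_w) ** 2 for w in weights) / n_total
deff = 1 + var_w / mean_w ** 2

# The effective sample size shrinks, and the margin of error grows.
n_effective = n_total / deff
moe_unweighted = 1.96 * math.sqrt(0.25 / n_total)
moe_weighted = moe_unweighted * math.sqrt(deff)

print(f"Design effect:         {deff:.2f}")            # ~1.09
print(f"Effective sample size: {n_effective:.0f}")     # ~917
print(f"MOE before weighting:  {moe_unweighted:.1%}")  # ~3.1%
print(f"MOE after weighting:   {moe_weighted:.1%}")    # ~3.2%
```

With these toy numbers the penalty is modest, but it grows quickly the further the raw sample sits from the composition the pollster is weighting toward.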

Because pollsters do not release the real, unadjusted demographics of their surveys, we do not know how many black, Asian, or Hispanic voters were actually contacted. And because minority subsamples in most general-public surveys are small, support for Republicans among racial minorities may be underestimated due to statistical error; minority support for Democrats could just as easily be underestimated. Unfortunately, most polling companies do not release confidence levels and error margins for their estimates of the voting preferences of racial minorities.
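
For a feel for the error margins involved, here is a minimal sketch assuming simple random sampling (real survey designs are more complicated, so treat these as lower bounds). The 1,507 sample size and the 73 percent white share come from the Pew standard survey discussed above; the 50/50 preference split is a worst-case assumption.

```python
import math

# Minimal sketch: the 95% margin of error for a proportion, assuming
# simple random sampling. Real designs (and weighting) widen these further.

def moe(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion p estimated from n interviews."""
    return z * math.sqrt(p * (1 - p) / n)

n_total = 1507                            # Pew standard survey sample size
n_minority = round(n_total * (1 - 0.73))  # roughly 407 non-white respondents

print(f"Full-sample MOE:        +/- {moe(n_total):.1%}")     # ~2.5 points
print(f"Minority-subsample MOE: +/- {moe(n_minority):.1%}")  # ~4.9 points
```

A roughly five-point margin of error on a 400-person subsample is wide enough to swallow many of the subgroup shifts that campaign coverage treats as news, which is why the missing confidence levels matter.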

The five-day poll shows few other significant differences from the alternative sampling methods, but there are several more points of divergence for those who want to delve into the numbers:

  • Registered Democrats are slightly more likely than registered Republicans to call themselves “independents.” In the five-day, industry-standard poll, 80 percent of people who identified as Republicans were verified as registered Republicans in a national voter-information database, while among those verified as registered Democrats, only 76 percent identified as Democrats.

  • Internet users were more common in the industry-standard poll, 80 percent versus 74 percent.

  • People aged 30-49 were underrepresented in the industry-standard poll. The CPS reports that 35 percent of Americans fall in that age range, but in the smaller Pew sample only 29 percent did. Presumably this is because this age group is the most likely to hold a full-time job.

  • Older Americans were overrepresented in Pew’s industry-standard sample: 31 percent of respondents said they were aged 50-64 and 20 percent said they were 65 or older, compared to just 25 percent and 17 percent, respectively, in the CPS. Social Security recipients were also more prevalent in the industry-standard survey than in the CPS, 32 percent versus 27 percent.

  • Homeowners seem slightly less likely to be represented in polls: 63 percent of respondents in the industry-standard poll said they owned their home, compared to 69 percent in the CPS report. The CPS data are from 2011, so the gap could partly reflect a general decline in homeownership.

  • People who engage in civic activism and volunteering are drastically overrepresented in the Pew industry-standard survey: 31 percent of respondents said they had contacted a government official in the past year, and 55 percent said they had done some form of volunteer work, compared to just 10 percent and 27 percent, respectively, in the CPS. Of course, respondents in the five-day survey may also have lied about volunteering; unfortunately, there is no way to verify their answers on this question.

  • College-educated individuals seem easier for pollsters to reach: 39 percent of respondents in the industry-standard survey reported having a college degree, versus 28 percent in the government data, while 34 percent of Pew respondents reported a high school education or less, versus 43 percent of the general population.

Given the accuracy problems this Pew study has raised, there ought to be many more studies like it cross-checking the polls. Unfortunately, that has not been the case. The American people deserve better.

We need more information on the voting preferences of racial minorities. We need to know if it is really true that people on food stamps are more likely to respond to pollsters. If it is true, are these individuals distorting the poll results in a particular direction? We need to know if pollsters are correcting their datasets to account for lying respondents. We also need to know if Democrats really are more likely to claim “independence” despite their own registration. Does that have an effect on polling data?

These are just a few of the questions that people in the polling industry need to be asking themselves, and answering for the rest of us.
