Pros and Cons of Including Partial Responses to Surveys

  • Pros
    • Respectful of all respondents’ contributions
    • Increases the sample size for early questions
    • Increases the reportable sample size for newsmaker surveys
    • Reduces the bias of topic salience
  • Cons
    • Incompatible with quota sampling or weighting
    • Confusing to readers when the sample size declines from question to question

Our standard practice for business-to-business surveys and surveys of our clients’ house lists is to include the responses from incomplete surveys. Depending on the topic and the length of the survey, 10% to 30% of respondents may not complete the entire questionnaire. A common reason respondents abandon surveys is topic salience: they simply find the subject of a survey uninteresting. Including their answers for the questions they did answer improves the representativeness of results, which would otherwise skew toward respondents more engaged with the topic of the study.
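One practical consequence of keeping partial responses is that each question carries its own sample size. As a minimal sketch (the question names and data below are purely illustrative, not from any actual survey), the per-question n can be computed by counting non-missing answers:

```python
# Hypothetical example: per-question sample sizes when partial
# responses are retained. Question names and answers are illustrative.
responses = [
    {"q1": "Yes", "q2": "No",  "q3": "Maybe"},   # complete response
    {"q1": "No",  "q2": "Yes", "q3": None},      # abandoned before q3
    {"q1": "Yes", "q2": None,  "q3": None},      # abandoned before q2
]

def per_question_n(rows, questions):
    """Count non-missing answers for each question."""
    return {q: sum(1 for r in rows if r.get(q) is not None)
            for q in questions}

print(per_question_n(responses, ["q1", "q2", "q3"]))
# → {'q1': 3, 'q2': 2, 'q3': 1}
```

The declining n from q1 to q3 is exactly the pattern that can confuse readers, which is why each reported question should carry its own base size.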

Of course, another reason that respondents abandon surveys is simply that they’ve been interrupted while answering. We will often see examples of this in the completion data; for instance, in a survey we data-cleansed this morning, one respondent was recorded as having taken 28 hours to complete a 7-minute questionnaire. Clearly, they were interrupted, then returned to that tab in their browser and finished the survey the next day.
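Interrupted sessions like this one tend to stand out in the timing data. A hedged sketch of how such outliers might be flagged (the durations and the 10x-median threshold are assumptions for illustration, not our actual cleaning rule):

```python
# Hypothetical sketch: flag respondents whose recorded duration suggests
# an interruption (e.g. a 28-hour completion of a 7-minute survey).
import statistics

durations_min = [7, 6, 9, 8, 1680, 7]  # 1680 minutes = 28 hours

def flag_interrupted(durations, threshold_factor=10):
    """Flag any duration more than threshold_factor times the median."""
    median = statistics.median(durations)
    return [d > threshold_factor * median for d in durations]

print(flag_interrupted(durations_min))
# → [False, False, False, False, True, False]
```

The median is used rather than the mean so that the extreme duration being flagged does not distort the baseline it is compared against.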

That said, some partial responses are too trivial to be included: for instance, responses from those who screened into a survey by answering the up-front demographic questions but then abandoned it before answering the first topic-specific question. Such partial responses might provide some clues as to how nonresponse bias varied demographically, and could be analyzed in the context of screen-outs, but aren’t worth including in the main analysis.
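This cutoff rule can be expressed simply: keep a partial response only if it answered at least one topic-specific question. A minimal sketch, assuming hypothetical screener and topic columns:

```python
# Hypothetical sketch: drop partial responses that never reached the
# first topic-specific question. Column names are illustrative.
TOPIC_QUESTIONS = ["q1", "q2", "q3"]

responses = [
    {"age": "25-34", "gender": "F", "q1": "Yes", "q2": None, "q3": None},
    {"age": "35-44", "gender": "M", "q1": None, "q2": None, "q3": None},  # screener only
]

def answered_any_topic(row, topic_questions=TOPIC_QUESTIONS):
    """True if the respondent answered at least one topic question."""
    return any(row.get(q) is not None for q in topic_questions)

kept = [r for r in responses if answered_any_topic(r)]
print(len(kept))
# → 1
```

The screener-only record is excluded from the main analysis but could still be set aside for a separate look at demographic patterns in abandonment.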

The standard practice for many organizations is to remove all partial responses, but this has always struck me as disrespectful of the time respondents spent answering as much of the survey as they did. (For surveys with incentives, respondents only earn the incentive if they complete the survey, so there’s no added cost to including partial responses.)

For nationally representative surveys – which use quota sampling to approximate the proportions of the population by age, gender, region, ethnicity, education, and other attributes – partial responses may skew the results away from this representativeness. For instance, respondents with only a high school degree have higher abandonment rates than college-educated respondents. Because partial responses undermine quota sampling in this way, for such surveys we include only complete responses.
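The differential abandonment described above can be checked directly by comparing abandonment rates across segments. A hedged sketch (the segments and rates below are invented for illustration, not measured figures):

```python
# Hypothetical sketch: compare abandonment rates by education segment to
# see whether keeping partials would skew quota proportions.
from collections import Counter

# (segment, abandoned?) pairs -- illustrative data only
respondents = [
    ("high_school", True), ("high_school", False), ("high_school", True),
    ("college", False), ("college", False), ("college", True),
]

def abandonment_rate(rows):
    """Fraction of respondents in each segment who abandoned the survey."""
    totals, abandoned = Counter(), Counter()
    for segment, did_abandon in rows:
        totals[segment] += 1
        if did_abandon:
            abandoned[segment] += 1
    return {s: abandoned[s] / totals[s] for s in totals}

print(abandonment_rate(respondents))
```

If one segment abandons at a much higher rate, keeping partials would leave that segment over-represented in early questions and under-represented in later ones, breaking the quota proportions.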

