At the Fall 2015 NEMRA event, Kathryn Korostoff of Research Rockstar and Namika Sagara, Ph.D., discussed how different treatments of survey instruments affect the honesty of replies.

Namika began with traditional economics, which assumes that people are rational, conduct perfect research, and know what is best for them. In other words, it treats people as rational actors like Spock or Sheldon from The Big Bang Theory, who evaluates where to sit based on air vents, the viewing angle to the television, and so on.

Behavioral economics recognizes that we are sometimes irrational, that our cognitive resources and willpower are bounded, that we use heuristics (mental shortcuts) to make decisions, and that we don't always know what's best for us or even what we want. We're more like Homer than Sheldon. Behavioral economics is best understood through experiments.

Surveys yield descriptive data about what people say, not what they actually do. A/B testing compares two alternatives to see which performs better. Behavioral economics experiments, by contrast, can have anywhere from 2 to 20 groups, changing one dimension at a time across experimental manipulations, or treatments. These experiments help identify the drivers of behavior.

Sometimes people need to be nudged. To encourage use of the stairs, one facility turned the staircase next to the escalator into a working piano keyboard and saw a significant increase in stair use.

Social desirability bias prompts people to describe their behaviors in a way that has greater social appeal, while self-interest bias has people change their story for their own benefit. In one experiment, when participants signed an honesty pledge at the end of a form, 79% cheated; moving the signature to the beginning reduced cheating to 37%. Signing before rather than after also improved compliance on a tax form, on reported car mileage, and on online sales reporting.

One study of college students found that an anonymity treatment increased reporting of socially undesirable behaviors, but the anonymous responses were less accurate about actual online behaviors than responses given on the record. In one online survey, 46% of respondents misreported personal information such as age, educational status, and gender. One theory is that privacy breaches at online sites have left people skeptical of promises of true anonymity. In another study, participants were more willing to lie in an email than on paper.

The team conducted an online survey of 1,914 participants, using screening and quota controls to ensure consistent demographic profiles across the control group and the three test groups. Group 1 had to enter their initials at the beginning of the survey, group 2 was told why it was important to provide accurate information, group 3 received both treatments, and group 4, the control, received no treatment.
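To gauge whether a treatment group's misreporting rate differs meaningfully from the control's, a standard two-proportion z-test is one common approach. The sketch below uses made-up counts purely for illustration; the talk did not report raw group sizes or cell counts, and this is not the researchers' actual analysis.

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test comparing two sample proportions, e.g. the
    misreporting rate in a treatment group vs. the control group."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # pooled std. error
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts (NOT from the study): 120 of 480 control
# respondents vs. 90 of 478 treated respondents flagged as misreporting.
z, p = two_proportion_z_test(120, 480, 90, 478)
```

A p-value below a chosen threshold (commonly 0.05) would suggest the treatment's effect on misreporting is unlikely to be chance, though a four-group design like this one would typically also warrant a correction for multiple comparisons.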

When it came to reporting whether they smoked, better-educated respondents were more likely to lie. The control group did worst, followed by the group that had to initial at the beginning. When it came to the amount of smoking, less educated respondents in the control group lied the most.

In short, better-educated respondents lied more when asked to be honest, while less educated respondents lied less.

Alcohol consumption was underreported most by the group that received both treatments. Better-educated respondents are more concerned about perceptions of their drinking than less educated respondents are.

The treatments work, but inconsistently: something is better than nothing, yet there was no clear winner. Some variations appear to correlate with education level. Less educated participants show distinct social desirability bias around exercise, while more educated participants show it around alcohol consumption.

Author Notes:

Jeffrey Henning

Jeffrey Henning, IPC is a professionally certified researcher and has personally conducted over 1,400 survey research projects. Jeffrey is a member of the Insights Association and the American Association of Public Opinion Researchers. In 2012, he was the inaugural winner of the MRA’s Impact award, which “recognizes an industry professional, team or organization that has demonstrated tremendous vision, leadership, and innovation, within the past year, that has led to advances in the marketing research profession.” In 2022, the Insights Association named him an IPC Laureate. Before founding Researchscape in 2012, Jeffrey co-founded Perseus Development Corporation in 1993, which introduced the first web-survey software, and Vovici in 2006, which pioneered the enterprise-feedback management category. A 35-year veteran of the research industry, he began his career as an industry analyst for an Inc. 500 research firm.