Survey questions are deceptively difficult to write. This post from Jeff Sauro at MeasuringU discusses several common sources of survey research bias, and provides tips on how to avoid them.
There is an art and science to writing survey questions. A number of books can help you with the process. But even the best-written questions can be susceptible to biases that creep into your results and affect the quality of your responses and conclusions.
- How many hours per year do you spend volunteering?
- How much do you agree that online articles provide valuable information for businesses to succeed?
Survey questions are the most effective way to characterize a sample of prospective or current users (who they are, what they think, and what they do). This data is often a main source of input for segmentation, personas, market feasibility, and decisions on prioritizing product functionality.
Biases can be particularly pernicious because they’re harder to spot than more glaring problems (like non-mutual exclusivity or double-barreled questions). In fact, there are not always clear remedies to the many biases that can affect your results. However, often just being aware of them is enough to help mitigate unwanted effects.
Here are nine common biases I’ve documented from the literature [pdf] and our experience conducting surveys, and in some cases ideas on how to fix them.
1. Social Desirability & Conformity
- Don’t you agree that recycling is an important initiative for companies to embrace?
- Approximately how much time do you spend reading to your children each night?
- On average, how much time do you spend planning meals for your family?
If it’s socially acceptable (recycling, reading to kids, or caring for your family), respondents are much more likely to endorse and exaggerate. In addition to social desirability, a number of studies show people will conform to group norms [pdf] both offline and online.
In fact, it’s hard to convince respondents to go against what’s acceptable even when things are clearly bizarre [pdf]. This means respondents will have a propensity to provide the socially acceptable response over the true response.
2. Yea Saying and Acquiescing
- Do you want your coffee machine to have different profiles?
- Do you use the mail merge feature of Word?
Respondents tend to be agreeable (acquiesce) and will respond positively to just about any question you ask them in a survey. One of the best ways to minimize this yea-saying is to minimize simple yes/no answers and instead have respondents select from alternatives or use some type of forced choice or ranking.
Note: While a common solution to minimize acquiescent bias is to reverse the tone of items in rating scales, we’ve found, along with other research, that reversing the item wording can actually cause more harm than good in rating scales.
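One way to act on this after the fact is to screen response data for respondents who agree with nearly everything, regardless of item content. The sketch below is a minimal, hypothetical illustration: the response data, the 5-point-scale coding, and the 80% cutoff are all assumptions for the example, not a standard.

```python
def acquiescence_rate(responses, agree_values={4, 5}):
    """Fraction of a respondent's ratings in the 'agree' range of a
    hypothetical 5-point scale (4 = agree, 5 = strongly agree)."""
    return sum(1 for r in responses if r in agree_values) / len(responses)

def flag_yea_sayers(survey, threshold=0.8):
    """Return IDs of respondents who agreed on at least `threshold`
    of all items, regardless of what the items asked."""
    return [rid for rid, ratings in survey.items()
            if acquiescence_rate(ratings) >= threshold]

# Illustrative data: respondent ID -> ratings across six items.
survey = {
    "r1": [5, 4, 5, 5, 4, 5],   # agrees with everything
    "r2": [2, 4, 1, 3, 5, 2],   # mixed responses
    "r3": [4, 5, 4, 4, 5, 4],   # agrees with everything
}
print(flag_yea_sayers(survey))  # ['r1', 'r3']
```

Flagged respondents shouldn't automatically be dropped; uniform agreement can be genuine. The flag just marks cases worth inspecting alongside other quality checks.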
3. Order Effects
The order you ask questions matters. Mentioning products, brands, or events can affect how people rate their familiarity and attitudes on subsequent questions. This can be especially harmful in branding and awareness surveys as the mere exposure of a brand name first can influence later questions and findings. Response options also matter. A respondent might remember a choice that appeared in an earlier question and be more likely to select the response on later questions. You can often manage many order effects through properly sequenced questions and randomization.
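Randomization of this kind is straightforward to implement in most survey platforms; the sketch below shows the idea in plain Python, assuming hypothetical brand-familiarity questions. Seeding a per-respondent random generator keeps each respondent's order stable if they resume the survey, while spreading order effects evenly across the sample.

```python
import random

# Illustrative question set; in practice this would come from the survey tool.
QUESTIONS = [
    "How familiar are you with Brand A?",
    "How familiar are you with Brand B?",
    "How familiar are you with Brand C?",
]

def questions_for(respondent_id, questions=QUESTIONS):
    """Return a randomized but reproducible question order for one respondent."""
    rng = random.Random(respondent_id)  # independent generator seeded per respondent
    shuffled = questions[:]             # copy so the shared list isn't mutated
    rng.shuffle(shuffled)
    return shuffled

# Every respondent sees the same questions, but no single brand
# systematically benefits from always appearing first.
print(questions_for("respondent-17"))
```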
4. Prestige Bias
- How much influence do you have on IT purchase decisions at your company?
- What’s your income and highest level of education?
Respondents will likely round up on income (especially men), education, and their reported power and prestige when making decisions. This is different from outright lying or cheating on a survey. If a question asks about prestige, assume the responses are inflated to present the respondent in a more favorable light. Exactly how much they are inflated will depend on the question, context, and respondents.
5. Threat & Hostility
- Think about the last time you were in a car accident. Did you access the insurance company’s mobile app?
Getting people to think about unpleasant things and events can get them in the wrong state of mind, which can cast a negative shadow on subsequent questions. Studies have shown getting people in a hostile mindset will affect their attitudes [pdf], and consequently survey responses.
Even rather benign questions (like asking people their marital status) may prime respondents with negative thoughts as participants recall bad past experiences (like a divorce or death in the family). Moving more sensitive demographic questions and anything that could potentially elicit negative thoughts to the end of a survey when possible may help.
6. Sponsorship Bias
When respondents know where the survey is coming from (the sponsor), it will likely influence responses. If you know the questions about your online social media experience are coming from Facebook, your thoughts and feelings about Facebook will likely impact responses.
This can be especially the case for more ethereal measures like brand attitude and awareness, which can be affected by the mere reminder of a brand in the email invitation or the name and logo on the welcome page. One of the best ways to minimize sponsorship bias is to obfuscate the sponsor as much as possible and/or use a third-party research firm (shameless self-promotion).
7. Stereotype Priming
Asking about gender, race, technical ability, education, or other socio-economic topics may reinforce stereotypes in the mind of the respondents and may even lead them to act in more stereotypical ways. For example, reminding people that stereotypes exist around technical aversion (age), math ability (gender), or intelligence (education level) may affect later responses as the questions prime respondents with the stereotype.
8. Mindset (Carry-Over Effects)
- Thinking about the last time you moved, how many items did you list on craigslist?
- Overall, how satisfied are you with the craigslist website?
It’s likely the response to the second question above is affected by the mindset of the initial question (moving). Rating the experience of moving, and of using craigslist during the move, will likely carry over when respondents switch to the second, broader question about craigslist in general. Respondents may still be thinking of how they used craigslist during their move even if you don’t intend them to.
These mindset biases can sometimes be offset by managing the order, but often can’t be avoided entirely if you’re getting respondents to consider multiple mindsets in the same survey.
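One common way to manage (rather than remove) carry-over effects is counterbalancing: alternate which block of questions comes first across respondents, so any carry-over is balanced across the sample instead of systematically contaminating one question. The sketch below uses the craigslist example above; the block contents and the even/odd assignment rule are illustrative assumptions.

```python
# Two question blocks from the craigslist example: a specific "moving"
# block and a general-satisfaction block.
MOVING_BLOCK = ["Thinking about the last time you moved, how many items did "
                "you list on craigslist?"]
GENERAL_BLOCK = ["Overall, how satisfied are you with the craigslist website?"]

def counterbalanced_survey(respondent_index):
    """Alternate block order: even-numbered respondents see the general
    question first; odd-numbered respondents see the moving question first."""
    if respondent_index % 2 == 0:
        return GENERAL_BLOCK + MOVING_BLOCK
    return MOVING_BLOCK + GENERAL_BLOCK

for i in range(2):
    print(i, counterbalanced_survey(i))
```

Counterbalancing also lets you check for the effect afterward: if the two order groups give noticeably different answers to the same question, you have direct evidence of a carry-over effect.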
9. Motivated Forgetting
- After you saw the iPhone commercial for the first time, how long was it until you purchased one?
Memories are malleable and, in general, we’re not terribly good at remembering events accurately. People tend to distort their memories to match current beliefs, and they also misdate events, a phenomenon called telescoping. Respondents may recall an event but report that it happened earlier than it actually did (backward telescoping) or more recently than it did (forward telescoping).
Many research questions rely on participants to recall specific events or behavior. There can be a tendency to recall events that didn’t happen or forget the specifics of an event.
Bias Doesn’t Necessarily Mean Garbage
Just because a survey has bias doesn’t mean the results are meaningless. It does mean you should be able to understand how each may impact your results. This is especially important when you’re attempting to identify the percentage of a population (e.g., the actual percent that agree to statements, have certain demographics like higher income, or their actual influence on purchase decisions).
While there’s not a magic cure for finding and removing all biases, being aware of them helps limit their negative impact. A future article will discuss some ideas for how to identify and reduce the effects of biases and other common pitfalls in survey design.