Susan Chambers

Seven Common Survey Design Errors and How to Avoid Them

Illustration of a smiling person standing in front of a clipboard, archery target, magnifying glass, pencil and gears. (Copyright: liravega258)

Not so many years ago, administering a survey to get feedback from stakeholders (e.g., clients, members) was a time-consuming and expensive venture. These days, conducting a survey study has become much simpler with the advent of web-based survey tools. But perhaps the process has become a bit too quick and easy.

It’s now easy to generate and administer a questionnaire on platforms such as SurveyMonkey. But if you are not familiar with some basic survey design principles, it’s just as easy to end up with poorly designed questions that introduce response biases and measurement errors, skewing your results and casting doubt on the validity and reliability of your data. (Validity means your survey is measuring what you think it’s measuring, and reliability means the questions consistently elicit the same responses over time. For a more detailed explanation, see Yeona Jang’s post “Survey Data: Reliability and Validity. Are they Interchangeable?”)

In addition to yielding poor-quality data that can’t be used for planning and decision making, poorly designed questions confuse and annoy respondents, making them less likely to participate in future surveys.

The following is an overview of some common survey question design errors and some tips for correcting them.

1. Not being transparent

Be clear in your own mind about what you want to know, and make that purpose transparent in your questions. For example, if you want to know how useful a certain workshop was, ask participants to rate the usefulness of the workshop, not how much they valued the workshop.

2. Asking for too much personal or private information/not respecting privacy regulations

Many individuals are cautious about sharing personal information (e.g., age, income, ethnicity) if they don’t know how it will be used. If you need general demographic information and don’t need to calculate an average or median figure, consider using age or income ranges as response categories. Also, be up-front with participants about how you will use the information and how you will safeguard their privacy and anonymity.

3. Asking double-barreled questions

Double-barreled questions often look like compound sentences: two main ideas are joined with a co-ordinating conjunction such as “and” or “or.” Here’s an example of a double-barreled question:

Do you think that politicians should be allowed to accept gifts from lobbyists, and that politicians should be recalled if they don’t follow their constituents’ wishes?   Yes / No

Not only would a question like this perplex your survey participants, it would also perplex you when it came time to analyze the data. How would you know whether the responses referred to the issue of accepting gifts from lobbyists or recalling politicians? A better way to frame your question would be as follows:

For each of the following items, please tell us whether you agree or disagree:

Politicians should be allowed to accept gifts from lobbyists.   Agree / Disagree

Politicians should be recalled if they don’t act on behalf of their constituents.   Agree / Disagree

4. Asking leading questions

If you are trying to nudge your participants’ answers in a particular direction, then you have written a leading question. Leading questions often start with phrases such as “Don’t you think…” or “Wouldn’t you like….”

Reword leading questions in order to avoid introducing biased or false responses that negatively affect the quality of your data. Here’s an obvious example of a leading question:

Don’t you think this is the most helpful blog post you’ve ever read?   Yes / No

A better way to ask the question is to rephrase it like this:

On a scale of 1 to 5 where 1=Not at all helpful and 5=Very helpful, how helpful was this article for learning how to design survey questions? 1 / 2 / 3 / 4 / 5

5. Using value-laden words in your questions

Avoid using highly emotional or judgmental words in survey questions. If you attach a negative label to a behaviour or belief that you’re asking about, you’re not likely to get truthful answers to your questions. For example, respondents are not likely to give an honest reply to the following question:

How much time per day do you waste on ridiculous social media activities instead of working on important projects?

Most likely, they’d be more willing to answer truthfully if the question were phrased neutrally:

How much time per day do you spend on social media platforms?

6. Not using mutually exclusive and exhaustive response categories

Avoid overlapping response categories. For example: Are you 20–25 years of age? 25–30 years of age? Because these categories overlap, a 25-year-old doesn’t know which category to select and may skip the question. Alternatively, you may end up with under-counts or over-counts, depending on which of the two categories 25-year-old participants choose. Be sure to create mutually exclusive categories (e.g., less than 20, 20–24, 25–29, and so on).

You can also run into difficulty if you don’t have enough response categories to cover all legitimate potential responses to the question (i.e., if the response categories are not exhaustive). There are two strategies for making sure your response categories are exhaustive. One is to include open-ended categories such as “Under 20 years of age” or “More than $100,000” for quantitative items. The other is to add an “Other” category and provide a space for respondents to describe answers that don’t fit the suggested categories.
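If you (or a colleague) will be analyzing the results programmatically, it can also help to set up your categories in code before the survey goes out, because gaps and overlaps become obvious right away. Here is a minimal sketch, assuming Python with the pandas library and using made-up respondent ages, showing age bins that are mutually exclusive (each age falls in exactly one category) and exhaustive (every plausible age is covered):

import pandas as pd

# Hypothetical respondent ages; in practice these would come from your survey export.
ages = pd.Series([18, 22, 25, 29, 30, 47, 61, 83])

# Right-inclusive bins: (0, 19], (19, 24], (24, 29], and so on, so a
# 25-year-old falls only in the "25-29" category: no overlaps, no gaps.
bins = [0, 19, 24, 29, 39, 49, 64, 120]
labels = ["Under 20", "20-24", "25-29", "30-39", "40-49", "50-64", "65 or older"]

age_groups = pd.cut(ages, bins=bins, labels=labels)
print(age_groups.value_counts().sort_index())

Every respondent lands in exactly one category, and any age outside the bins would show up as a missing value, flagging a gap in your response categories.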

7. Failing to include “Don’t know,” “Not applicable,” and “Neutral” response options

Most participants want to answer survey questions as truthfully as possible. However, if a survey item is not applicable to a respondent, or a respondent truly is neutral on a topic, and there is no response category to capture that answer accurately, you are going to end up with either missing data or measurement errors. It’s better to risk having a few individuals take the path of least resistance and select all of the middle (neutral) values on a series of scaled items than to artificially inflate the overall number of positive (or negative) responses on a scaled item. (See Michaela Mora’s article “Is It Right to Include a Neutral Point in Rating Questions?”)
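Including these options also pays off at the analysis stage, because opt-out answers can be reported separately instead of being forced onto the rating scale. The following is a minimal sketch, again assuming Python with pandas and invented responses to the 1-to-5 helpfulness item from error 4, of one way to keep “Don’t know” and “Not applicable” answers out of the scale average:

import pandas as pd

# Hypothetical answers to a 1-5 helpfulness item, including opt-out options.
responses = pd.Series([5, 4, "Don't know", 3, "Not applicable", 4, 2])

# Convert the scale answers to numbers; "Don't know" and "Not applicable"
# become missing values rather than being treated as points on the scale.
scores = pd.to_numeric(responses, errors="coerce")

print("Mean of the scaled answers:", scores.mean())    # missing values are ignored
print("Number of opt-out answers:", scores.isna().sum())

Reporting the number of opt-out answers alongside the average gives a truer picture than pretending those respondents had an opinion.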

Of course, there are a few other methodological principles to consider at different stages of a survey study; these topics will be covered in a survey guide that I’m currently developing for Editors Canada committees. In the meantime, learning how to identify and fix the seven errors covered here will go a long way toward strengthening the quality of your surveys and data.

___

The Editors’ Weekly is the official blog of Editors Canada. Contact us.



4 Comments on “Seven Common Survey Design Errors and How to Avoid Them”

  • Very valuable tips, Susan. I’m looking forward to reading your survey guide.

    • Hi Merel,

      I’m glad you found the tips to be useful. The survey guide is coming along slowly, but it is coming along. I hope the guide will be helpful to our members and volunteers.

  • Gael Spivak says:

    That’s a great summary, Susan. The two things I especially dislike in surveys are leading questions and not having the “Don’t know,” “Not applicable,” and “Neutral” response options. I’m glad you explained those so well.

    And it’s wonderful that you are doing a survey guide!

    • Hi Gael,

      Thanks for your feedback. As both a researcher and a survey participant, I prefer to see a neutral point, when possible, for agree/disagree items. Unfortunately, a neutral point on a satisfaction scale is not always so helpful from a research perspective, so it’s best to be a bit flexible about what kind of scale to choose. The other thing I dislike in surveys is double-barreled questions.
