Avoid survey-writing pitfalls
There are many guidelines for writing good survey questions (you can see a full Coursera/University of Michigan course on survey design). Here are some common pitfalls that Google's People Analytics team tries to avoid:
1: The double-barreled question
Double-barreled questions combine two questions into one question. This can sometimes be subtle; beware of conjunctions (e.g. "and," "or") in your questions.
Poor: "My VP and director encourage innovation throughout our organization."
What if your VP encourages innovation but your director doesn't? How should employees respond? Break the question up.
Better: "My VP encourages innovation throughout our organization." and "My director encourages innovation throughout our organization."
2: The leading question
Leading questions encourage respondents to answer in a particular way. Agreeing with the leading statement typically requires less effort, so such questions bias your data.
Poor: "Don't you think we should spend more time on product reviews?"
Better: "More time spent on product reviews would improve our launches."
3: The non-specific question
Unclear or vague questions will confuse respondents and make answers unreliable. Your goal should be to have every respondent understand the question in exactly the same way.
Poor: "What do you think about performance reviews?"
Is the question about the logistics? The workload? The effectiveness of the review process?
Better: "Please rate your overall satisfaction with the amount of work required of you in the last performance review cycle."
4: The overly broad question
Sometimes you avoid getting too specific so you don't constrain respondents' answers. However, questions that aren't clear about what they're asking just confuse respondents.
Poor: "How well do you know this person?"
What does it mean "to know"? Think about what you're trying to measure and ask a question that tells you that and only that. With surveys, precision beats generality every time.
Better: "In the past quarter, about how many times did you interact with this person (e.g., via email, phone calls, meetings together)?"
5: Missing response options
When asking structured questions, your list of response options should be exhaustive: include any possible answer a respondent might want to give. Testing your survey can reveal response options you missed. Also include an "Other," "Don't Know," or "Not Applicable or N/A" option.
6: Overlapping response options
In addition to being exhaustive, you need response options that are mutually exclusive. If you're asking people to choose just one option, no choice should overlap with another.
Poor: How long have you worked here? A) Less than a year; B) 1-2 years; C) 2-3 years; D) 3-4 years; E) 5+ years
If you’ve been an employee for 3 years, should you select C or D?
Better: How long have you worked here? A) less than a year; B) one year or more, but less than two years; C) two years or more, but less than three years; D) three years or more, but less than four years; E) four or more years
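If your response options are numeric ranges like the tenure scale above, the exhaustive and mutually-exclusive rules can be checked mechanically. Below is a minimal sketch; the `check_options` helper and the whole-year encoding of each option are illustrative assumptions, not part of any survey tool:

```python
def check_options(options):
    """Scan numeric response options for overlaps and gaps.

    options: list of (label, lo, hi) closed integer intervals,
    with hi=None for an open-ended final option like "5+ years".
    Returns a list of human-readable problems; empty means the
    options are mutually exclusive with no gaps between them.
    """
    problems = []
    opts = sorted(options, key=lambda o: o[1])
    for (label_a, lo_a, hi_a), (label_b, lo_b, hi_b) in zip(opts, opts[1:]):
        if hi_a is not None and lo_b <= hi_a:
            # A value like "3 years" would match two options at once.
            problems.append(f"overlap between {label_a!r} and {label_b!r} at {lo_b}")
        if hi_a is not None and lo_b > hi_a + 1:
            # Some values fall into no option at all.
            problems.append(f"gap between {label_a!r} and {label_b!r}")
    return problems

# The "Poor" scale, encoded in whole years of tenure:
poor = [("A: <1", 0, 0), ("B: 1-2", 1, 2), ("C: 2-3", 2, 3),
        ("D: 3-4", 3, 4), ("E: 5+", 5, None)]
print(check_options(poor))  # flags the overlaps at 2 and at 3 years

# The "Better" scale: each whole year maps to exactly one option.
better = [("A", 0, 0), ("B", 1, 1), ("C", 2, 2), ("D", 3, 3), ("E", 4, None)]
print(check_options(better))  # no problems found
```

Running the same check while drafting any range-based question (age, salary band, team size) catches both pitfall 5 and pitfall 6 before the survey ships.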
7: Forcing responses with “required” questions
By not including required questions, you can allow the survey taker to choose whether they feel comfortable and equipped to answer each question. This allows you to build user trust and avoids collecting skewed data from respondents being forced to fill something in. Including an "Not Applicable or N/A" option is also a good idea. If you must require a response to a given question, explain why to the respondent in the survey text.