Run an employee survey

Introduction

Back in 2004, Larry Page and Sergey Brin asked Stacy Sullivan, then Head of HR and now Google’s Chief Culture Officer, to find out how Google employees, or Googlers, were feeling about work by interviewing them and reporting back. By then the company had thousands of Googlers, and Stacy realized that face-to-face interviews (as she had done in the past) were no longer the most efficient or rigorous way to gauge employee sentiment. So Google launched the first version of its employee survey.

Surveys are a great way to gather data about your organization. Whether it’s an annual employee survey, a form for employees to give feedback to their managers, or a post-training assessment, collecting data directly from your employees can inform decisions and guide organizational action.

"A good survey is a lot of work; a bad survey isn't worth doing." ~Googler mantra

Survey writing is a science; many research institutions, including The University of Michigan, Harvard University, and Duke University, have rigorous programs that share scientifically validated methodologies.

State your goals before surveying

At the start, you want to think about the end. What actions will come out of the survey? What decisions are you prepared to make, and what might stay the same, based on the results? Before you write the first survey question, consider:

What organizational question(s) will the survey help answer?

  • Can you clearly state your goals for the survey?
  • Can you state your main questions concisely?
  • Will your leader(s) support and advocate for this survey?
  • Do you have hypotheses that the survey will help you test?

If you can't answer "Yes!" to all of these questions, gather more information first from other data sources, such as employee interviews or focus groups. It's also possible that a survey is not the right approach.

Can you ensure action as a result of your survey?

  • Do you have the time and resources to not only conduct the survey, but also to analyze and share results?
  • Do you know who your target respondents are? Can you get enough responses to inform decisions?
  • Is your organization willing to take action based on the findings?
  • Will your organization hear the results objectively, and trust any findings, positive or negative, enough to act?

Understand structured vs. open-ended questions

At Google, surveys often use two types of questions:

1) Structured questions provide a predetermined set of responses. These responses help you compare response patterns across populations, convert responses into numeric data to conduct statistical analysis, or easily track changes in individuals or groups over time. Responses to structured questions come in several varieties. Two of the most common are:

Categorical scales

  • Yes/No, True/False
  • Department/Function (Sales, Operations, Engineering)
  • Region (North, South, East, West)

Ordered scales

  • Questions with numerical ranges (1-5; Never, Once, Twice, More than 3 times)
  • Likert-type scales (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree)

2) Open-ended questions ask respondents to answer without a set of constrained choices. These types of questions add context to structured questions; they’re also helpful to use when you aren’t sure of, or do not want to constrain, the set of choices respondents will be using to answer the question. Examples include open-ended items, like “What do you like best about working here?” or “Give 1-2 suggestions for how we can give employees more autonomy” or “How many days per month do you work from home, on average?”

Write quality questions

Survey writing seems easy from the outside. But remember: your respondents can’t ask clarifying questions, so you need clear, concise, unambiguous questions from the start. Coursera and the University of Michigan have teamed up to provide a short online course on the basics of survey design, and below are some tips:

Use clear, simple language.

Simple sentences and commonly used words make the job of the respondent much easier, especially for survey takers who are non-native speakers of the language in which the survey is written. Define any terms that will not be understood; consider adding parentheticals to clarify them (e.g., define “overall well-being” as “your emotional, physical, and financial health”).

Don't get too cute.

Clever questions might be fun, but they often don't give you reliable data. Save the humor for your survey invitation or introductory text.

Keep the survey as short as possible.

It's sure to feel a lot longer to your respondents than it does to you.

Consider the expectations you might create with your questions.

If you ask about potential wage increases or the possibility of staffing cuts, your survey respondents could reasonably hope for raises or fear layoffs. This could influence responses as well as create unrealistic expectations. For each question, ask yourself: What will respondents assume about your intent, based on this question?

Don't ask questions that your respondents can't answer.

"Do you use RQFP or FLEEM?" would be a poor question for anyone who doesn't know what either of those things is. Even if respondents can select a "Don't know" option, confusing your respondents or making them feel uninformed is not a good idea.

Don't rely too heavily on open-ended questions.

Open-ended questions can require a lot of time to analyze. They can also be time-consuming to answer. If you find yourself needing lots of open-ended questions, you might be better served by interviews or focus groups instead of a survey. People sometimes think that surveys are a good way to crowd-source ideas, but remember that the quality of ideas is often related to the amount of time people spend on them. Short, open-ended questions rarely result in breakthrough ideas.

The Federal Employee Viewpoint Survey (FEVS) is one resource for sample questions. Each year, the U.S. Office of Personnel Management (OPM) measures employee attitudes across all Executive Branch agencies and publishes its results along with survey items.

Avoid survey-writing pitfalls

There are many guidelines for writing good survey questions (you can see a full Coursera/University of Michigan course on survey design). Here are some common pitfalls that Google's People Analytics team tries to avoid:

1: The double-barreled question

Double-barreled questions combine two questions into one question. This can sometimes be subtle; beware of conjunctions (e.g. "and," "or") in your questions.

Poor: "My VP and director encourage innovation throughout our organization."

What if your VP encourages innovation but your director doesn't? How should employees respond? Break the question up into one item per leader.

Better: "My director encourages innovation throughout our organization." (with a parallel item for the VP)

2: The leading question

Leading questions encourage respondents to answer in a particular way. Agreeing with the leading statement typically requires less effort. As a result, such questions typically bias your data.

Poor: "Don't you think we should spend more time on product reviews?"

Better: "More time spent on product reviews would improve our launches."

3: The non-specific question

Unclear or vague questions will confuse respondents and make answers unreliable. Your goal should be to have every respondent understand the question in exactly the same way.

Poor: "What do you think about performance reviews?"

Is the question about the logistics? The workload? The effectiveness of the review process?

Better: "Please rate your overall satisfaction with the amount of work required of you in the last performance review cycle."

4: The overly broad question

Sometimes you don't want to get too specific, so as not to constrain respondents' answers. However, questions that aren’t clear about what they’re asking just confuse respondents.

Poor: "How well do you know this person?"

What does it mean "to know?" Think about what you're trying to measure and ask a question that tells you that and only that. With surveys, precision beats generality every time.

Better: "In the past quarter, about how many times did you interact with this person (e.g., via email, phone calls, meetings together)?"

5: Missing response options

When asking structured questions, your list of response options should be exhaustive: include any possible answer a respondent might want to give. Testing your survey can reveal response options you missed. When in doubt, include an “Other,” “Don’t know,” or “Not applicable (N/A)” option.

6: Overlapping response options

In addition to being exhaustive, you need response options that are mutually exclusive. If you're asking people to choose just one option, no choice should overlap with another.

Poor: How long have you worked here? A) Less than a year; B) 1-2 years; C) 2-3 years; D) 3-4 years; E) 5+ years

If you’ve been an employee for 3 years, should you select C or D?

Better: How long have you worked here? A) less than a year; B) one year or more, but less than two years; C) two years or more, but less than three years; D) three years or more, but less than four years; E) four or more years
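One way to guarantee that options like the “Better” list above are both exhaustive and mutually exclusive is to define them as half-open intervals, so every value falls into exactly one bucket. A minimal sketch (the `tenure_bucket` function and its labels are illustrative, not from any real survey tool):

```python
def tenure_bucket(years):
    """Map tenure in years to exactly one response option.

    Half-open intervals [lo, hi) make the options mutually
    exclusive and, with the final catch-all, exhaustive --
    the fix for the overlapping "Poor" example above.
    """
    buckets = [
        (0, 1, "A) less than a year"),
        (1, 2, "B) one year or more, but less than two years"),
        (2, 3, "C) two years or more, but less than three years"),
        (3, 4, "D) three years or more, but less than four years"),
    ]
    for lo, hi, label in buckets:
        if lo <= years < hi:
            return label
    return "E) four or more years"

# An employee with exactly 3 years of tenure now matches one and only one option.
print(tenure_bucket(3))
```

The same boundary discipline applies whether the buckets live in code, a spreadsheet, or a survey platform's option list.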

7: Forcing responses with “required” questions

By not marking questions as required, you let each survey taker decide whether they feel comfortable and equipped to answer. This builds trust and avoids collecting skewed data from respondents forced to fill something in. Including a "Not applicable (N/A)" option is also a good idea. If you must require a response to a given question, explain why to the respondent in the survey text.

Test your survey

Testing your survey serves several purposes, including:

  • Eliminating confusing or poorly worded questions and response options
  • Estimating how long the survey takes to complete, which you should convey to respondents in the survey invitation

You may want to start by trying out some questions or instructions very informally. Review the questions with stakeholders to ensure their support, gather ideas, and address questions. Once you have a full survey, bring some testers physically together to test it, as there can be a lot to learn from being in a room with a group of people while they attempt to complete the survey.

Decide whether the survey is anonymous, confidential, or identified

Before building the survey, you will want to decide whether it will be anonymous, confidential, or identified.

In an anonymous survey, no background information is tied to the data, so you have no way to identify an individual respondent.

  • Pros: Respondents may feel safer providing honest feedback since their data can't be tied back to them.
  • Cons: If you want to cut your results by another variable (e.g., department, location), you'll have to ask respondents to provide this information during the survey. This makes the survey longer and potentially leaves you with incomplete or less reliable data.

In a confidential survey, some background information is tied to each response, but it is not revealed with the respondent’s answers. Only those analyzing the data can use this background information to make better sense of the results.

  • Pros: Your survey is shorter because you don't have to ask for demographic information. Your data is also far more useful, since you can cut it in a variety of ways.
  • Cons: You must ensure that proper data confidentiality rules have been put in place and communicate those to the survey taker.

In an identified survey, respondents’ identities are explicitly tied to their responses and shared with the results. This approach is typically used only for highly innocuous topics, or when you will need to follow up with respondents based on their data.

  • Pros: You can easily follow up with people, or ask clarifying questions about responses.
  • Cons: You must make very certain that respondents are fully aware that their responses will be known, or you’ll risk compromising trust with employees (and, worse, employees’ responses may get them in hot water!).

Survey the right people

Google's People Analytics team constantly thinks about survey fatigue and the target population. Does everyone in the target group need to receive the survey? Sometimes it makes sense to do this, like when the group is fairly small, or if you are explicitly seeking high participation (as in an annual company survey that goes out to all employees). However, many times surveying a sample of respondents yields data that is just as powerful and informative as surveying an entire population.

Simple random sampling is a low-effort approach that we sometimes use as it gives everyone in the target population an equal, non-zero chance of being asked to participate. A very basic approach to simple random sampling is:

  • Get a list of the people in your target population.
  • Use a random number generator to assign a number to each name on the list.
  • Sort the list by those random numbers and pick the top "n" people.
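The steps above can be sketched in a few lines of Python. The function name and the sample names are illustrative; in practice Python's `random.sample` does the same job directly.

```python
import random

def simple_random_sample(population, n, seed=None):
    """Draw a simple random sample of size n from a list of names.

    Mirrors the steps above: assign each person a random number,
    sort by that number, and take the top n.
    """
    rng = random.Random(seed)
    keyed = [(rng.random(), person) for person in population]
    keyed.sort()  # sorting by the random key shuffles the list
    return [person for _, person in keyed[:n]]

# Example: sample 3 respondents from a target population of 8.
people = ["Ana", "Bo", "Chen", "Dee", "Eli", "Fay", "Gus", "Hana"]
print(simple_random_sample(people, 3, seed=42))
```

Fixing the seed makes the draw reproducible, which is useful when you need to document exactly who was invited.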

The main drawback of the simple random approach is the risk of drawing too few participants from a small part of the company, leaving you unable to draw conclusions about that group. In a global survey, for example, you could instead use a more advanced technique called stratified random sampling. In this form of sampling, you draw a random-but-proportionate number of people from each group in your target population (e.g., 10% of Sales in North America, 10% of Sales in Europe). It takes more effort, and you may have trouble identifying the appropriate groups, but this approach ensures that your sample includes people from all the important sub-groups of your target population.
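As a sketch of the proportionate draw described above (group names and sizes are made up for illustration): group people by stratum, then sample the same rate from each group, with a floor of one person so no small group is skipped entirely.

```python
import math
import random
from collections import defaultdict

def stratified_sample(population, rate, seed=None):
    """Draw the same sampling rate from every group (stratum).

    `population` is a list of (name, group) pairs; `rate` is the
    fraction to sample from each group (0.10 for 10%).
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for name, group in population:
        strata[group].append(name)

    sample = []
    for members in strata.values():
        # At least one person per stratum, even for tiny groups.
        n = max(1, math.ceil(rate * len(members)))
        sample.extend(rng.sample(members, n))
    return sample

# Example: 10% of each sales region, whatever the region's size.
population = ([(f"na_{i}", "NA Sales") for i in range(10)]
              + [(f"eu_{i}", "EU Sales") for i in range(20)])
print(len(stratified_sample(population, 0.10, seed=7)))  # 1 + 2 = 3 people
```

Rounding up with `math.ceil` plus the floor of one is a design choice: it slightly over-samples small strata, which is usually preferable to having no voice from them at all.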

You can learn more about survey sampling methodologies from the Harvard University Program on Survey Research.

Write an effective survey invite

Now that you've poured a lot of effort into survey creation, you want to make sure people actually complete it. Here are a few tips to improve the response rate:

People respond well to requests from people they know. If appropriate, have the invitation for the survey come from the respondent’s leader.

Relevance helps a lot. Be sure that the purpose of the survey matches the interests of your target population. People are a lot more likely to answer questions about things that are meaningful to them.

Lay out the details, like the purpose of the survey, how long it will take, when the survey will close, and what you’ll do with the results.

Fewer words win. Keep the invitation short, sweet, and to the point. If you have a lot of information you need to communicate, write a Frequently Asked Questions (FAQ) document and include the link to the survey in the email.

Share the results

Once the survey is closed and all the data is collected, it’s time to analyze and share the results. Here are some guidelines to consider when doing so:

Socialize the results with leaders first and then have them reshare. It's often useful to share survey results with leaders first, making sure they have the full picture, and then have them deliver the results to their teams. Make the results widely available, but give leaders the opportunity to put them in context.

Share comparative results. It’s often useful for teams to see how their results stack up to the organizational average or how this year's results compare to results from the prior year. This can help identify organization-wide challenges and group-specific opportunities to improve or successes to be shared.

Share the good, the bad, and the ugly. When reporting the results, it's helpful to share them completely and objectively, no matter what they say. Selectively sharing results or omitting “bad” responses can destroy trust and your ability to get truthful answers.

Plan how to take action and share the plan. Sharing a plan to take action at the same time as results helps the organization immediately contextualize the data and demonstrates a commitment to action.