Adopt an analytics mindset

Introduction

Years ago, Google tested 41 different shades of blue on the Google Toolbar to find which color would optimize click-through rates. Launching experiments and learning from user behavior to improve products is ingrained in Google's product development and business practices. Indeed, many organizations use this type of experimentation in their product development, customer engagement, and business practices. The people analytics mindset is about applying this same data-driven approach to people issues.

“All people decisions should be informed by data and analytics.” ~Google People Analytics motto

The People Analytics team at Google was founded to ensure all people decisions are informed by data. The team didn’t start with fancy forecasting algorithms or advanced predictive tools. Instead, the team began by understanding the people problems that needed to be addressed and the organizational context. Today, the People Analytics team not only develops and maintains a wide set of data and metrics but also tests hypotheses, runs experiments, reviews academic research, builds models, and uses science to make work better for Googlers.

Ask the right questions

Don’t lead with data and metrics. Think of the questions you have about your organization and work backwards to figure out what data you need to inform answers. Spending time upfront to clearly define the problem statement is essential.

How do you identify the problems and questions most important to the organization? Understand your business and organizational imperatives and how your people policies and programs may best address them. The types of questions that people analytics can best answer align with the triple aim framework used to improve US healthcare. It focuses on three elements:

  1. Effectiveness. Are your people programs, policies and processes yielding the right outcomes? For example, if we are talking about the hiring process, improving effectiveness would result in hiring better performers over time.
  2. Efficiency. Can we get to the same outcomes in a shorter time span or by spending less money or with fewer people? In the hiring example, higher efficiency would result in a lower cost per hire.
  3. Experience. Since we are talking about people after all, can we improve how individuals experience these programs and processes? In hiring, this might include measures of how candidates perceived their interviewing experience and their interactions with recruiters.

Depending on your organizational context, any one of these elements -- effectiveness, efficiency, and experience -- may be more important than the others. The best solutions are those that improve effectiveness, efficiency, and experience all at once. But there are often trade-offs across these elements: be cognizant of how improving one might affect the others and how you would mitigate undesired effects.

Understand the analytics value chain

Once you have defined the problem statement, use the analytics value chain to address it: think of the analytics process as a chain in which each step up requires additional work but yields additional value. Moving up the analytics value chain from opinion to informed action requires a thoughtful approach to understanding, measuring, and analyzing the problem.

For every idea there is a spectrum of how much data can be used to support it, from no data (an opinion) to a data-backed hypothesis. In the absence of data, people tend to leverage their own opinions. As Jim Barksdale, the former Netscape CEO, said, “If we have data, let’s look at data. If all we have are opinions, let’s go with mine.”

Opinions themselves are not bad. But data should inform your opinion and, importantly, help you make a more convincing argument when suggesting action. An opinion could be “Employees spend too much time on expense reports.” But without data, this isn’t a very convincing, or useful, opinion. Data could say “Employees spent more than 100,000 hours on expense reports last year.” While still far from action, it’s a far more convincing and useful place to begin building your solution.

With data, you can begin to build useful metrics to better define the problem and conduct an analysis to see if a possible solution or insight lies within your data. One critical insight can be the basis for a hypothesis to be tested and a potential action to solve the problem.

Choose your data and metrics

It's tempting to start with the data you have rather than the data you need. At Google, the People Analytics team tries to understand the challenge before choosing what to measure to solve it. Asking the right questions and developing clear hypotheses are critical before thinking about the right data and metrics.

What’s the distinction between data and metrics? The number of hires made in a quarter is a piece of data. The cost of making all the new hires in a quarter is another piece of data. You can combine the two (dividing the latter by the former) to create a “metric” called cost per hire. The metric is more valuable information than the individual pieces of data because it can be tracked over time and compared across groups to study trends and patterns. For example, you can see some of the data and metrics the Workforce Analytics team at the Gap collects and how they use them to draw insights for their organization.
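To make the arithmetic concrete, here is a minimal Python sketch of how the two pieces of data combine into the cost-per-hire metric, which can then be tracked across quarters. The figures are made up for illustration, not benchmarks.

```python
# Minimal sketch: combining two pieces of data (hires, total hiring cost)
# into a metric (cost per hire) that can be tracked over time.
# The quarterly figures are illustrative, not real benchmarks.

hiring_data = {
    "Q1": {"hires": 40, "total_hiring_cost": 180_000},
    "Q2": {"hires": 55, "total_hiring_cost": 220_000},
    "Q3": {"hires": 35, "total_hiring_cost": 175_000},
}

for quarter, d in hiring_data.items():
    cost_per_hire = d["total_hiring_cost"] / d["hires"]  # the metric
    print(f"{quarter}: cost per hire = ${cost_per_hire:,.0f}")
```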

Some of these data and metrics might be readily available from your HR systems. For other information, you may need to actively collect the data. For example, you may ask your employees about their attitudes, perceptions and beliefs. At Google, the annual employee survey “Googlegeist” captures a snapshot of how Googlers feel about their managers, teams, the organization, and our culture. Learn more about designing surveys.

Make inferences using statistics

To make data and metrics useful, you need to be able to draw inferences from them. Statistics can help you interpret your data and support those conclusions. From simple averages to t-tests to regression analysis, a working knowledge of statistics is critical. Brush up on your stats knowledge with the free materials in Khan Academy's math and probability subject area.
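As a rough illustration of one of those tools, the sketch below simulates survey scores for two groups and compares them with an independent-samples t-test using SciPy. The groups, score distributions, and effect size are assumptions invented for the example.

```python
# Minimal sketch: comparing two groups' scores with an independent-samples t-test.
# The engagement scores are simulated; in practice they would come from survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=3.8, scale=0.6, size=120)  # e.g., employees whose managers took a training
group_b = rng.normal(loc=3.6, scale=0.6, size=115)  # e.g., employees whose managers did not

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference in means is unlikely to be chance alone,
# but it says nothing about practical importance or causation.
```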

Statistical analysis can be powerful, but is not without its pitfalls. Alex Reinhart, a statistics Ph.D. student, highlights some popular missteps (even among scientists) in Statistics Done Wrong. Below are two common pitfalls to keep in mind:

Correlation doesn’t equal causation.

Two variables might be correlated (i.e., they move in consistent directions in relation to each other) but that doesn’t mean that one causes the other. The blog Spurious Correlations showcases some examples.
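A small simulation makes the point. In the sketch below, headcount and coffee spending are both driven by the same assumed growth trend, so they correlate strongly even though neither causes the other; all numbers are invented.

```python
# Minimal sketch: two variables that share an underlying driver (company growth)
# are strongly correlated even though neither causes the other.
import numpy as np

rng = np.random.default_rng(1)
growth = np.linspace(100, 200, 48)               # shared upward trend over 48 months
headcount = growth * 5 + rng.normal(0, 20, 48)
coffee_spend = growth * 30 + rng.normal(0, 300, 48)

r = np.corrcoef(headcount, coffee_spend)[0, 1]
print(f"correlation between headcount and coffee spend: r = {r:.2f}")
# r comes out close to 1, yet buying more coffee does not cause hiring (or vice versa);
# both simply track the company's growth.
```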

Regression to the mean.

This statistical concept explains why very tall parents tend to have children shorter than themselves: extreme observations tend to be followed by ones closer to the average. What might look like improvement (e.g., a low performer transfers to a new team and improves) or even decline (e.g., a star team gets worse over time) may just be this tendency to return to the average.
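To see the effect numerically, the sketch below simulates performance scores as a stable underlying ability plus period-to-period noise; the bottom 10% in one period scores noticeably higher in the next with no intervention at all. The distributions are assumptions chosen only to illustrate the effect.

```python
# Minimal sketch: regression to the mean with simulated performance scores.
# Each observed score = stable "true" ability + random noise for that period.
import numpy as np

rng = np.random.default_rng(2)
true_ability = rng.normal(3.0, 0.3, size=1_000)
period_1 = true_ability + rng.normal(0, 0.5, size=1_000)
period_2 = true_ability + rng.normal(0, 0.5, size=1_000)

bottom = period_1 < np.percentile(period_1, 10)  # "low performers" in period 1
print(f"period 1 mean (bottom 10%): {period_1[bottom].mean():.2f}")
print(f"period 2 mean (same people): {period_2[bottom].mean():.2f}")
# The same group scores noticeably higher in period 2 without any intervention:
# extreme scores were partly bad luck, and the luck washes out.
```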

Tell a story with your data

Cold, hard facts, even when they’re accompanied by compelling statistics, rarely stimulate action on their own. Action comes from a compelling story: backed by data, tailored to your audience, and quickly understood.

Know your audience: It’s key to know what each audience member is passionate about and how they prefer to receive information. For example, bring data relevant to the audience’s purview along with your recommendations (e.g., hiring data for staffing leaders, healthcare data for benefits leaders). And think about sending materials ahead of time for review to make the most of your meeting or presentation.

Keep it short: Boil your message down to three minutes or less to prepare for two situations:

  1. The classic elevator speech, where a leader asks what you’re working on.
  2. Meetings with decision makers, where the time for your topic gets cut down dramatically.

Storyboard: Start with a blank document or even a sheet of paper. Don’t start by trying to build a presentation. Remember three basic components: context, findings, and call to action.

Organize your horizontal and vertical logic: Once you’ve got your storyboard and you plan to create a presentation on your analysis and recommendations, remember that leaders may just read the titles of each slide while flipping through the document. They may only read in detail when they need convincing, so it's important that your story’s horizontal and vertical logic flow well.

  • Horizontal logic means that the headings of each slide convey your story including the context, findings, and a call to action. There should be a clear flow from one topic to the next. Create an executive summary by using the titles of each slide as bullets and use active titles that describe the main takeaway.

  • Vertical logic means that everything on a given slide reinforces the title. Resist the urge to include anything off-topic, ancillary, or non-critical. Classic vertical logic also has a takeaway at the bottom of the slide, reinforcing the topic and providing closure.

Take action on your findings

The final step of any analysis, experiment, or survey is to take action on the results. It is often helpful to have an action plan ready for when the results come in. Here’s a way to turn an insight or finding into change.

Determine your action plan basics (a simple way to track these is sketched after the list):

  1. An owner who is responsible for making things happen
  2. A schedule for what is supposed to happen by when
  3. A metric so you know that a task is done (e.g., 100% of a population attend a training)
  4. A schedule for updates to your stakeholders (e.g., monthly or quarterly)
  5. A plan for communicating results to your employees
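One lightweight way to keep these basics actionable is to record each item in a structured form that can be tracked and reported on. The sketch below uses a hypothetical Python dataclass; every field name and value is an invented illustration, not a prescribed template.

```python
# Minimal sketch: the action plan basics as a structured, trackable record.
# All names, dates, and metrics below are hypothetical illustrations.
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    owner: str               # who is responsible for making it happen
    description: str         # what is supposed to happen
    due: date                # by when
    completion_metric: str   # how you know the task is done
    update_cadence: str      # e.g., "monthly" updates to stakeholders
    done: bool = False

plan = [
    ActionItem(
        owner="L&D lead",
        description="Roll out manager feedback training",
        due=date(2025, 6, 30),
        completion_metric="100% of the target population attends the training",
        update_cadence="monthly",
    ),
]

open_items = [item for item in plan if not item.done]
print(f"{len(open_items)} open action item(s)")
```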

When proposing a change to your organization, you might encounter some resistance. Try to identify potential barriers to action in advance and plan how to overcome them.

Some challenges can include:

  1. Denial: “These results are just a one-off.”
  2. Analysis paralysis: “Let’s get some more data. Can we run another experiment?”
  3. Resistance to change: “This change is really hard. Our program is good enough right now.”

Involve the right leaders of your organization throughout the process of defining, investigating, and taking action on findings. Not only will they be more supportive of your proposed actions, but they can also be powerful advocates on your behalf in overcoming some of these challenges.