When possible, it's valuable to record the outcomes of formal people decisions in a database, over time, accessible to the appropriate analysts. Here are some metrics Google started to collect:
Hiring (e.g., applicant, feedback, interview score, offer data)
Performance (e.g., ratings)
Promotion (e.g., nominations, decisions)
Pay (e.g., base pay, merit increases, bonus)
The Google People Analytics team analyzes aggregate trends in each of these data sources by gender and, if the sample size is large enough, by ethnicity. It's important to incorporate as many relevant control variables as possible so that the results of these analyses can be interpreted meaningfully. The team works to compare people in the same jobs and at similar levels, and to account for other factors that could affect each outcome variable.
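The controlled comparison described above can be sketched as a simple stratified analysis: group people by job and level first, then compare outcomes by gender within each group. This is a minimal illustration, not Google's actual method; the field names and ratings below are made-up assumptions.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-person records; fields and values are illustrative only.
records = [
    {"job": "SWE", "level": 3, "gender": "F", "rating": 3.8},
    {"job": "SWE", "level": 3, "gender": "F", "rating": 4.0},
    {"job": "SWE", "level": 3, "gender": "M", "rating": 3.9},
    {"job": "SWE", "level": 3, "gender": "M", "rating": 3.7},
    {"job": "SWE", "level": 4, "gender": "F", "rating": 4.1},
    {"job": "SWE", "level": 4, "gender": "M", "rating": 4.0},
]

def stratified_gaps(records, outcome="rating"):
    """Mean female-minus-male outcome within each (job, level) stratum,
    so the comparison controls for job and level."""
    strata = defaultdict(lambda: defaultdict(list))
    for r in records:
        strata[(r["job"], r["level"])][r["gender"]].append(r[outcome])
    return {
        key: mean(groups["F"]) - mean(groups["M"])
        for key, groups in strata.items()
        if "F" in groups and "M" in groups  # only strata with both groups
    }

gaps = stratified_gaps(records)
```

A real analysis would add more controls (tenure, location, function) and test whether each within-stratum gap is statistically distinguishable from zero given the sample size.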
By monitoring these people decisions regularly, the team was able to identify discrepancies in the promotion process. Google monitors and reports promotion rates every cycle, splitting them by gender to check for differences. In 2010, the team found a difference between male and female software engineers.

In Engineering, Googlers can self-nominate for promotion when they feel ready to move to the next job level. In one cycle, Google's data showed that junior female software engineers were not being promoted at the same rate as their male counterparts. When the People Analytics team dug into the data, it found that the gap stemmed from differing self-nomination rates: men, who in many cultures are typically more comfortable self-promoting, were nominating themselves at higher rates than their equally qualified but, on average, less self-promoting female peers. To address this, a respected senior leader shared the data with Googlers, encouraged all engineers to self-nominate when ready, and told managers to keep their eyes open for promotion-ready Googlers. Following the nudge, promotion rates evened out.
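The mechanism behind the gap can be shown with simple arithmetic: when only self-nominees are considered for promotion, the overall promotion rate is the self-nomination rate times the success rate among nominees. The rates below are made-up numbers for illustration, not Google's actual figures.

```python
# Illustrative arithmetic only; these rates are assumptions, not real data.
def overall_promotion_rate(self_nomination_rate, rate_given_nomination):
    """Share of the eligible population promoted in a cycle, when only
    self-nominees are considered for promotion."""
    return self_nomination_rate * rate_given_nomination

# Same 50% success rate for every nominee, but different nomination rates:
men_overall = overall_promotion_rate(0.60, 0.50)    # 0.30
women_overall = overall_promotion_rate(0.40, 0.50)  # 0.20
```

Even with identical treatment of every nominee, the lower self-nomination rate alone produces a lower overall promotion rate, which is why a nudge that equalizes nomination behavior can close the gap without changing promotion decisions themselves.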