All the talk about big data and analytics tends to obscure the most obvious and important question: Does it work? Cade Massey, practice professor of operations and information management at Wharton, recently provided some fantastic insight into this essential question on the Knowledge@Wharton site.
Even if you're just getting started in exploring how analytics can affect your human resources processes, you'll find that Massey's comments form a helpful framing of the subject.
In fact, the questions he raises--regarding how to evaluate job candidates and employee performance--are relevant to all organizations, even if you're light years away from hiring a data scientist.
1. Evaluate how effective your interview process is. Massey cites the example of Google, which used analytics to evaluate whether job interviews were necessarily predictive of strong employee performance. "They were spending hours of their managers' time interviewing candidates--eight interviews, nine interviews, 10 interviews. And they discovered they do not do much [to predict future performance]," Massey tells Knowledge@Wharton. "[Google said] let's just cut that back. Let's cut it down to the bare minimum. Let's have three or four interviews."
What's interesting about Google's conclusion is that it runs counter to what many top-tier professional services firms have discovered. For example, Capital Group, the legendary investment-management firm, has as many as 20 different employees interview serious candidates. While some Capital associates wonder about the productivity of such an exhaustive evaluation, most feel that the lengthy process is the best way to identify which candidates are engaged by the work itself, as opposed to (merely) the compensation.
The point here is not to jump to an a priori conclusion about whether fewer interviews (like Google) or more (like Capital Group) is better for your organization. The idea is to use analytics--or even a simple spreadsheet--to assess your interview process. Who are your best/worst employees? How many interviews did they go through before you hired them? Run the numbers. Figure it out.
If you're a growing business, you won't need big data or fancy analytics software to answer this question. Ask your in-house Excel expert to create a spreadsheet exploring this subject, and see what she finds.
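The spreadsheet exercise above can be sketched in a few lines of code. The numbers here are hypothetical, invented purely to illustrate the mechanics: pair each hire's interview count with a later performance rating, then compare average ratings across interview counts.

```python
# A minimal sketch of the "run the numbers" exercise. All data below is
# hypothetical -- substitute your own hiring records.
from statistics import mean

# Hypothetical records: (number_of_interviews, performance_rating_1_to_5)
hires = [
    (3, 4.2), (4, 3.9), (8, 4.0), (9, 3.8),
    (3, 4.5), (10, 4.1), (4, 4.4), (8, 3.7),
]

# Group performance ratings by how many interviews each hire went through.
by_count = {}
for interviews, rating in hires:
    by_count.setdefault(interviews, []).append(rating)

for interviews in sorted(by_count):
    print(f"{interviews} interviews -> avg rating {mean(by_count[interviews]):.2f}")
```

If the averages barely move as the interview count climbs, that's a data point in favor of Google's "cut it to three or four" conclusion; if they climb steadily, Capital Group's approach looks better for your shop.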
2. Evaluate your unconscious job-candidate biases. In addition to assessing your interview process, you can assess your entire candidate-evaluation process. How much weight, for example, are you giving to the interview, as opposed to experience and references?
It's possible, for example, that you have an experience bias. Mukul Pandya, Executive Director/Editor-in-Chief of Knowledge@Wharton, cites an NPR story about an experience bias that Xerox discovered. "One of the counterintuitive things they discovered was that, if somebody had a lot of experience working for different call centers, it was not necessarily a good thing," he says. "It might just be that they had a high burnout rate. It was, in fact, a predictor of potentially bad performance."
Again, the idea is not to jump to a conclusion. It's to ask the overarching question--How is my organization evaluating candidates?--and to use numbers to formulate an answer.
For instance, Massey describes how this process might work when it comes to B-school admissions. "You can go in and find out, whether you know it or not, if you have been implicitly putting 20 percent weight on GPA and 50 percent weight on the prestige of the company they work for and 30 percent on [other factors]," he says. "An organization might not know that its process put that much weight [on a particular factor]. Maybe that is okay. But maybe [after seeing the data] they decide they would rather put weight on something else."
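One way to make the implicit weights Massey describes visible is to treat past overall scores as a weighted sum of candidate attributes and solve for the weights. The candidates and scores below are invented for illustration; in practice you would fit many records by least squares, but with three exactly-determined records a direct solve shows the idea.

```python
# Hypothetical sketch: recover the implicit weights an evaluation process
# put on GPA, employer prestige, and "other" factors from scores it assigned
# to past candidates. All names and numbers are illustrative.

# Each row: (gpa, prestige, other) on a 0-100 scale, plus the overall score.
candidates = [
    ((80.0, 90.0, 70.0), 82.0),
    ((60.0, 70.0, 90.0), 74.0),
    ((90.0, 60.0, 80.0), 72.0),
]

# Solve the 3x3 linear system A w = b by Gaussian elimination with
# partial pivoting, where w holds the implicit weights.
A = [list(attrs) for attrs, _ in candidates]
b = [score for _, score in candidates]
n = 3
for i in range(n):
    p = max(range(i, n), key=lambda r: abs(A[r][i]))  # pivot row
    A[i], A[p] = A[p], A[i]
    b[i], b[p] = b[p], b[i]
    for r in range(i + 1, n):
        f = A[r][i] / A[i][i]
        for c in range(i, n):
            A[r][c] -= f * A[i][c]
        b[r] -= f * b[i]
w = [0.0] * n
for i in reversed(range(n)):
    w[i] = (b[i] - sum(A[i][c] * w[c] for c in range(i + 1, n))) / A[i][i]

# For this toy data the recovered weights are 0.2, 0.5, and 0.3 --
# i.e., the process was implicitly putting half its weight on prestige.
print([round(x, 2) for x in w])
```

Seeing 50 percent of the weight land on employer prestige is exactly the kind of surprise Massey describes: maybe that's okay, and maybe you'd rather put the weight somewhere else.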
3. Evaluate your performance evaluations. In the investment world, where performance-based bonuses are a big deal, those bonuses are often tied to overall fund performance. But is fund performance necessarily indicative of a strong performance by an investment analyst?
"There have been studies in some places that sometimes--I am not going to say all places, all times--the relationship between a fund manager's performance in one year is unconnected to his performance the next year," says Massey. "The idea is, if that is true, then there is a lot of chance in this process and these differences are not functions of skill. If they are not functions of skill, then maybe we should not be rewarding them heavily each year [their fund performs well]."
Analytics are important in a case like this because of the prevailing culture in the investment world. In many settings, overall fund performance is entrenched as a key performance indicator (KPI); therefore, it would be very difficult for an executive to anecdotally make the case to decouple bonuses from a historically significant KPI. "But if you bring data and you run the numbers and you do a thorough study, you might be able to convince them," notes Massey. "You can actually figure out how much of this is...skill-based, and how much of it is chance."
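The skill-versus-chance study Massey describes can start with something as simple as correlating each manager's year-one result with the same manager's year-two result. The returns below are made up, deliberately constructed so the two years are uncorrelated, to show what a "chance dominates" result looks like.

```python
# Hypothetical sketch: correlate each fund manager's year-1 return with
# that same manager's year-2 return. The returns are invented for illustration.
from statistics import mean, stdev

year1 = [12.0, 8.0, 15.0, -3.0, 6.0, 10.0]  # hypothetical % returns, year 1
year2 = [7.0, 1.0, 5.0, 5.0, 8.0, 4.0]      # same managers, year 2

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson(year1, year2)
print(f"year-over-year correlation: {r:.2f}")  # prints 0.00 for this toy data
```

A correlation near zero, sustained over enough managers and years, is the kind of evidence that can persuade an organization to stop rewarding what may be chance.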
And the closer your KPIs and bonuses get to rewarding skill (rather than chance), the closer you'll be to capturing a true assessment of employee merit.
And that's an idea that any employee or leader will support.