According to research by Jobscan, 99 percent of Fortune 500 companies use some type of applicant tracking system (ATS) -- Oracle's Taleo, PeopleFluent, Avature, or Greenhouse, among a handful of others. Among smaller enterprises, Capterra found that 75 percent of recruiters and talent managers use some form of applicant tracking software. So it's safe to say that most positions being filled today are being filled with the aid of these systems.
Those who use them do so with good intentions and for very good reason. ATS systems enable HR leaders and recruiters to sift through large volumes of applications in a very short period of time by eliminating candidates that do not meet certain criteria. And since, according to Glassdoor, each online job posting attracts, on average, 250 applicants, some way of separating the wheat from the chaff is absolutely required.
Applicant tracking systems have, for years, been that tool, enabling talent managers and recruiters to achieve far higher levels of efficiency than they could without them. Primarily through keywords, recruiters and talent managers tell the ATS what they are looking for, and the system takes over, surfacing résumés that meet the desired criteria and discarding those that don't. The machine does in minutes what might have taken hours of manual work, poring over résumés one by one. But here's the problem: This process can be inherently discriminatory, and it is very likely screening out the kind of leaders people actually want to follow. And A.I. is making a bad situation worse.
Here's why: Remember, ATS systems utilize keywords assigned by the recruiter to find ideal candidates, so any unconscious biases held by the recruiter are programmed into the algorithm. What's more, the machine is taught to look for the ideal and ignore whatever doesn't meet it -- it works by negative elimination. So if the best schools are required, anyone who attended anything else is eliminated. If continuous employment is required, anyone with gaps is eliminated. If certain job experience is sought, a summer or second job in an off industry, taken to make ends meet or pay back loans, gets a candidate bounced. If geography is specified, where you live can be a killer. You get the idea.
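To make the negative-elimination logic concrete, here is a minimal sketch of how such a screen behaves. This is not any vendor's actual code; every name, field, and threshold below is a hypothetical stand-in for the recruiter-chosen criteria described above.

```python
# Minimal sketch of keyword-based negative elimination. A résumé is modeled
# as a dict of fields; every rule below is a hypothetical recruiter criterion.

REQUIRED_SCHOOLS = {"top tier university"}   # "best schools" filter
MAX_GAP_MONTHS = 3                           # "continuous employment" filter
ALLOWED_LOCATIONS = {"metro area"}           # geography filter

def passes_screen(resume: dict) -> bool:
    """Return True only if the résumé survives every elimination rule."""
    if resume["school"].lower() not in REQUIRED_SCHOOLS:
        return False   # wrong school: eliminated
    if resume["longest_gap_months"] > MAX_GAP_MONTHS:
        return False   # employment gap: eliminated
    if resume["location"].lower() not in ALLOWED_LOCATIONS:
        return False   # geography: eliminated
    return True        # no rule tripped: candidate advances

candidates = [
    {"school": "Top Tier University", "longest_gap_months": 0,
     "location": "Metro Area"},
    {"school": "State College", "longest_gap_months": 0,
     "location": "Metro Area"},
    {"school": "Top Tier University", "longest_gap_months": 12,
     "location": "Metro Area"},
]
survivors = [c for c in candidates if passes_screen(c)]
print(len(survivors))  # 1 -- only the candidate matching every rule survives
```

Notice that the screen never asks why a gap exists or whether the "wrong" school actually matters; any single mismatch ends the candidacy, which is exactly the dynamic at issue.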
Foremost, these systems can produce patently discriminatory results. According to Headstart, a "study of over 20,000 applicants shows that legacy ATS platforms enable inequitable hiring processes that lead to severe discrimination." This happens not because the operators are inherently racist or sexist, but because the keywords they choose screen out work histories and demographic markers typical of female and minority applicants: job gaps for child care, second-tier schools, certain geographies, second jobs taken to pay for school, certain majors popular with minorities, and so on. Because candidates with these backgrounds don't have the desired traits, they are eliminated out of hand.
As a result, the ATS will drive toward a homogenous workforce of nearly alike, non-diverse hires that meet exacting criteria believed by the hiring manager to deliver success in the roles they're trying to fill. In other words, the system gives hiring managers exactly what they ask for.
The same Capterra study referenced above found that only 5 percent of those surveyed believe their ATS has had a negative impact on their operation. So, nearly unanimously, these folks are blissfully unaware of the discriminatory implications of these systems. And it doesn't stop with race and gender.
Take soft skills. Typically, soft skills are not included in ATS algorithms. Hard skills and action/outcome words like "results," however, are. Accordingly, more empathic leaders are almost always overlooked by ATS systems in favor of hard-charging type-As. So, if a caring, empathic leader doesn't also list hard skills matching the micromanaging, autocratic model HR leaders look for, well, they're out. Likewise, recruiters will often use school tier and GPA, when they can find it, as proxies for intelligence, which they still believe is a predictor of success. As a result of this over-reliance on bias for action and intelligence at the expense of soft skills, ATS systems deliver an indistinguishable mix of executive leaders that few actually want to work for -- in fact, the exact sorts of leaders that people are running away from to the tune of 4-plus million each month in the Great Resignation. And don't expect A.I. to make things better.
I spoke to Sergio Suarez, Jr., CEO at TackleAI, about this dilemma. Sergio has been writing code since he was 11, first to help his family's business and now as the leader of a startup artificial intelligence firm focused on data processing. Sergio confirmed what I'd learned about ATS systems and the unfortunate, albeit unintentional, outcomes they are driving today. He said, "Selectiveness is important in hiring, but the way hiring algorithms behave borders on discriminatory, and refuses to give deserving talent legitimate consideration."
Sergio also helped me understand that A.I. is likely to make things worse, not better, as it is most likely to be employed to look at data from historical hires to create algorithms for forward hires. The system will continue to hire more of the wrong people, but, according to Sergio, "because the system will have taken control of the decision-making process, the dangerous part is you won't know why you are hiring who you are hiring." When the system finds a candidate that aligns with historical patterns (built on prior bias-fed algorithms, mind you), the system is going to believe it did something right -- it will confirm its own biases and seek to repeat them. In short, it will likely make the problem worse.
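The self-reinforcing loop Sergio describes can be illustrated with a toy sketch. A model "trained" only on historical hires learns whatever pattern past (biased) screening produced, then rewards candidates who match it. The data, names, and scoring here are entirely hypothetical; real A.I. screeners are far more complex, but the feedback dynamic is the same.

```python
# Toy sketch of the historical-hiring feedback loop: the system learns the
# dominant trait of past hires and scores new candidates against it.

from collections import Counter

# Historical hires, already shaped by a biased keyword screen:
# nearly everyone came from the same school tier.
historical_hires = ["tier_1", "tier_1", "tier_1", "tier_1", "tier_2"]

def learn_preference(hires):
    """'Train' on past hires by finding the trait they most have in common."""
    return Counter(hires).most_common(1)[0][0]

def score(candidate_tier, learned_tier):
    """Candidates matching the historical pattern score highest."""
    return 1.0 if candidate_tier == learned_tier else 0.1

learned = learn_preference(historical_hires)
print(learned)                   # tier_1 -- the old bias, now "learned"
print(score("tier_1", learned))  # 1.0: pattern confirmed and reinforced
print(score("tier_2", learned))  # 0.1: the diverse candidate is penalized,
                                 # so the next cycle feeds in more tier_1 hires
```

Each hiring cycle adds more matching candidates to the training data, so the learned preference only hardens over time, and nobody can point to the line of logic that excluded anyone.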
So, what's the answer? It's not about kicking these systems to the curb. It's about telling them what you want -- and getting that part right more often.
According to Sergio -- whose company has seen no post-lockdown departures during the Great Resignation -- it's about curating the data correctly on the front end, whether you are using an ATS or heading into the open frontier of A.I. He advises against letting A.I. curate your hiring data and instead suggests starting over: Tell the system not the things you think will make a hire successful, but the things you know will, based on the people working for you now. That includes capturing critical soft skills.
Sergio also stresses the importance of getting associate performance reviews done right. "If associate appraisals are unfair (in either direction), the hiring data is going to be biased," he says. The key, then, to emulating TackleAI's success in hiring and keeping great people is to remove bias from the system by using what you already know makes a great hire.
It's also a matter of shifting your thought process to what actually matters. It means considering that associate engagement matters more than time to hire. It could mean believing that lower turnover matters more than where your people went to school. It might also mean that candidate satisfaction should matter to you as well. Finally, it should include a recognition that people should matter more than anything -- more than the system you use, more than some arbitrary hiring deadline, more than your desire to tell the gang at the club you're using A.I., and certainly more than your own built-in biases.
So, get it right on the front end. Follow the example of Sergio Suarez, Jr. and the team at TackleAI: To find more great people, simply tell your system about the ones you already have.