R.I.P. résumés

AI-assisted screening of job applications reduces sexism, racism and ageism in the workplace.

Four years ago, much to his mother’s chagrin, Jahanzaib Ansari dropped out of university to pursue his entrepreneurial dreams.

His first effort, a bespoke tailoring business founded with two friends, was a success, but Ansari left. He had personal reasons but also wanted to try his hand at selling technology  —  in this case, a device that could create drinking water from moisture in the air. When that didn’t pan out, he went to Amsterdam seeking to work with another tech startup.

Nothing materialized and, by November 2015, Ansari was back in Canada, broke and looking for a job. To his surprise, he did not get a single response to the many résumés he sent to prospective employers.

“I’ve got a good soft-skills set, I’ve gone to school in Canada  —  I felt like I should have been getting a lot more job interviews,” he recalls.

Then he came across studies showing that members of minorities who anglicized their names were more likely to get a second look.

Jahanzaib became Jay, and “I got so many job interviews, I was astounded.”

One of them paid off, but not for long.

Dismayed at what he’d had to do just to get the job, Ansari decided to leave it and try to change the system.

He and two partners, Maaz Rana and Faisal Ahmed, founded Knockri, and joined the ranks of those now using cutting-edge technology to help employers find the best person for a job  —  without bias getting in the way.

One reason that racism, sexism and all the other “isms” are so pervasive in society is that they aren’t necessarily intentional: nearly everyone is guilty of unconscious bias. Even the most determinedly open-minded can’t fight brain science, according to psychologist Timothy Wilson.

Wilson, who teaches at the University of Virginia, says that we are constantly exposed to millions of bits of information, but the brain can consciously process only about 40 of them. To cope, “it creates shortcuts and uses past knowledge to make assumptions,” he told Fast Company magazine in 2014.

In other words, our experiences and our culture can influence how we assess others, and we don’t even realize it.

Why your résumé isn’t giving you a foot in the door

Artificial intelligence can help solve the problem by providing a bias-free screening tool. It saves employers valuable time and spares them the need to sift through résumés, which are not always a reliable reflection of an applicant.

“All the data on that résumé doesn’t actually predict success,” explains Caitlin MacGregor, chief executive officer of Plum.io, a Kitchener-based company with a different approach to applicant screening.

In fact, she says, résumés introduce a whole slew of biases  —  where the applicant went to school, past experience, name, gender. Only when “you don’t use that as your short-listing factor” do you have a set of criteria “that matters.”

Plum, like Knockri, works closely with clients to identify what they are really searching for in a job candidate. These qualities are programmed into the AI, and job seekers are filmed as they respond to questions that have been formulated with the help of psychologists to tease out whether they possess the desired attributes.

For each assessment answer that is provided, Knockri’s proprietary technology performs a contextual behavioural analysis of candidate transcripts. The algorithm is blind and has been trained not to account for skin colour, race, ethnicity, appearance or sexual preference as measures of desirability for a hire.

The AI, Ansari adds, “automatically classifies work behaviours from the transcript, identifies how important those behaviours are for the skill being measured and calculates a candidate quality score. It was mapped out to be an inclusive technology from its inception, allowing leaders to build a diverse and talented workforce.”
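
Knockri’s models are proprietary, but the general shape of the pipeline Ansari describes (classify the behaviours present in an answer transcript, weight them by how much they matter for the skill being measured, then roll them up into a score) can be sketched in a few lines of Python. The behaviour labels, weights and keyword-based classifier below are illustrative assumptions rather than the company’s actual implementation; the point is that nothing about the candidate’s identity enters the calculation.

```python
# Illustrative sketch only: Knockri's real models are proprietary.
# The behaviour labels, weights and keyword cues below are assumptions
# made for demonstration, not the company's implementation.

from typing import Dict, List

# Hypothetical behaviours relevant to a "collaboration" skill, each with a
# weight reflecting how strongly it is assumed to predict that skill.
BEHAVIOUR_WEIGHTS: Dict[str, float] = {
    "acknowledges_others": 0.4,
    "shares_credit": 0.35,
    "resolves_conflict": 0.25,
}

# Toy keyword cues standing in for a trained text classifier.
BEHAVIOUR_CUES: Dict[str, List[str]] = {
    "acknowledges_others": ["my teammate", "our group", "we decided"],
    "shares_credit": ["thanks to", "couldn't have done it without"],
    "resolves_conflict": ["disagreed", "compromise", "found common ground"],
}


def classify_behaviours(transcript: str) -> Dict[str, bool]:
    """Flag which behaviours appear in a candidate's answer transcript."""
    text = transcript.lower()
    return {
        behaviour: any(cue in text for cue in cues)
        for behaviour, cues in BEHAVIOUR_CUES.items()
    }


def quality_score(transcript: str) -> float:
    """Weighted sum of detected behaviours, scaled to 0-100.

    Note what is *not* an input: name, photo, age, gender or ethnicity.
    The score depends only on the words in the transcript.
    """
    detected = classify_behaviours(transcript)
    raw = sum(BEHAVIOUR_WEIGHTS[b] for b, present in detected.items() if present)
    return round(100 * raw / sum(BEHAVIOUR_WEIGHTS.values()), 1)


if __name__ == "__main__":
    answer = ("We disagreed at first, but my teammate and I found common "
              "ground, and thanks to our group the launch went well.")
    print(quality_score(answer))  # prints 100.0 for this toy answer
```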

AI as job matchmaker

Obviously, all of these features benefit whoever is doing the hiring, but they could also benefit applicants by helping them avoid jobs for which they’re simply a bad fit. There is evidence that as many as 70 percent of workers may be in positions for which they are not well suited.

To Jamie Schneiderman, that is a “huge problem,” and it’s one of the reasons he launched Toronto-based Clearfit Analytics. “We set out with a mission to get people into the right jobs,” he says, adding that the end result is “happier people and a more productive company.”

Schneiderman knows from personal experience how important job satisfaction can be. After earning a master’s degree in business administration from Harvard, he worked with some large, well-known companies, but, he says, “I found my career unbelievably frustrating.”

Neither employers nor employees, he adds, want to make poor decisions, but avoiding them requires that they do things differently. Clearfit uses artificial intelligence to play matchmaker. Every applicant (the company says it has worked with 5,000 companies in the past 11 years and collected data on 1.5 million people) answers a series of questions. “Then our system can combine people together to build ‘success profiles’ on a role-company-industry basis,” Schneiderman explains. “I can take a person who’s applying to a company and automatically match them with the job for them.”
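
Clearfit has not published the details of its matching, but the idea Schneiderman describes (average the questionnaire answers of people who have succeeded in a role into a ‘success profile’, then score each applicant against every profile) can be sketched roughly as follows. The trait names, the averaging step and the cosine-similarity match are assumptions made for illustration, not Clearfit’s actual method.

```python
# Rough illustration of the "success profile" idea described above.
# Clearfit's actual method is unpublished; the trait names, profile-building
# by averaging, and cosine-similarity matching are assumptions.

import math
from typing import Dict, List

TRAITS = ["persuasion", "detail_focus", "autonomy", "teamwork"]


def build_success_profile(top_performers: List[Dict[str, float]]) -> List[float]:
    """Average the questionnaire scores of people who succeeded in a role."""
    return [
        sum(person[t] for person in top_performers) / len(top_performers)
        for t in TRAITS
    ]


def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Similarity between two trait vectors, ignoring who the people are."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def best_match(applicant: Dict[str, float],
               profiles: Dict[str, List[float]]) -> str:
    """Return the role whose success profile the applicant most resembles."""
    vector = [applicant[t] for t in TRAITS]
    return max(profiles, key=lambda role: cosine_similarity(vector, profiles[role]))


if __name__ == "__main__":
    # Hypothetical questionnaire data, scored 0-1 per trait.
    sales_stars = [
        {"persuasion": 0.9, "detail_focus": 0.3, "autonomy": 0.7, "teamwork": 0.6},
        {"persuasion": 0.8, "detail_focus": 0.4, "autonomy": 0.8, "teamwork": 0.5},
    ]
    qa_stars = [
        {"persuasion": 0.2, "detail_focus": 0.9, "autonomy": 0.5, "teamwork": 0.7},
        {"persuasion": 0.3, "detail_focus": 0.95, "autonomy": 0.6, "teamwork": 0.8},
    ]

    profiles = {"sales": build_success_profile(sales_stars),
                "qa_analyst": build_success_profile(qa_stars)}

    applicant = {"persuasion": 0.25, "detail_focus": 0.85,
                 "autonomy": 0.55, "teamwork": 0.75}
    print(best_match(applicant, profiles))  # "qa_analyst" for this applicant
```

In this toy version, the job the applicant matches best need not be the one they applied for, which is the scenario the next paragraph describes.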

In some cases, that may not be the job the person has applied for, but ultimately the best match is what’s important for everyone. And for Clearfit’s large clients, who have to fill thousands of jobs every year, it’s a huge benefit. Some want applicants to see only openings for which they are suited; others show them a list of all potential jobs and then keep people who qualify on file for when an opening appears. Employers also receive information they can use to make better decisions throughout an employee’s career. Clearfit’s patented “predictive analytics platform” can formulate custom career paths so companies can provide employees the training they need “to have a more fulfilling and successful career,” Schneiderman says.

AI helps employers truly see their candidates

Also, like Knockri and Plum, Clearfit says its process increases diversity because applicants are shortlisted strictly on the basis of how they perform during the screening.

“Our system doesn’t know how old you are, what sex you are, your ethnicity, your religion,” Schneiderman says. “It can’t see you. It can’t judge you. It’s solely looking at every single person in an equal, unbiased way.”

Plum’s MacGregor describes AI-assisted screening as “the top of the funnel”  —  at the other end, employers see only a list of applicants who meet, or are close to meeting, the criteria they have set out. In some cases, qualified candidates are right under their noses.

The psychometric assessments that are used can bring to the fore people already on staff who would otherwise escape notice. In such cases, companies avoid the time, energy and financial burden of employee turnover. “We have candidates who were previously overlooked surfacing,” MacGregor says. “The short list of candidates ends up being far more diverse.”

Of course, when humans take over, bias can enter the process, but at least the pool of available talent is better.

AI alone has not made these advancements in hiring possible, as MacGregor notes. Instead, to work properly, the technology has to be married to the lessons of industrial-organizational psychology  —  the study of how we behave in the workplace  —  or the machine will just replicate the errors of the past.

Rather than simply automating something that is faulty and outdated, she says, “our belief is you have to redesign the system to get a better result.” Just as training can combat unconscious bias in the workplace, AI “provides us the opportunity” to overhaul the hiring process.

Schneiderman agrees and says that, because it can learn, AI can adapt by adjusting on the basis of the information it is given. Like a child, “it doesn’t have inherent biases.”

The potential catch, then, is that AI still depends on the humans who input the data. But the companies using it say they strive to ensure that their criteria remain as objective as possible. There is no point in promising to find the best person for the job if someone like Jahanzaib Ansari is disqualified simply because of his name.