
LinkedIn’s job-matching AI was biased. The company’s solution? More AI.

More and more companies are using AI to recruit and hire new employees, and AI can factor into almost any stage in the hiring process. Covid-19 fueled new demand for these technologies. Both Curious Thing and HireVue, companies specializing in AI-powered interviews, reported a surge in business during the pandemic.

Most job hunts, though, start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their résumés, browse job postings, and apply to openings.

The goal of these websites is to match qualified candidates with available positions. To organize all these openings and candidates, many platforms employ AI-powered recommendation algorithms. The algorithms, sometimes referred to as matching engines, process information from both the job seeker and the employer to curate a list of recommendations for each.

“You typically hear the anecdote that a recruiter spends six seconds looking at your résumé, right?” says Derek Kan, vice president of product management at Monster. “When we look at the recommendation engine we’ve built, you can reduce that time down to milliseconds.”

Most matching engines are optimized to generate applications, says John Jersin, the former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information the user provides directly to the platform; data assigned to the user based on others with similar skill sets, experiences, and interests; and behavioral data, like how often a user responds to messages or interacts with job postings.
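The three signal categories Jersin describes could, in the simplest case, be blended into a single match score per candidate-job pair. The sketch below is purely illustrative: the field names, weights, and linear combination are assumptions for clarity, not anything disclosed by LinkedIn or Monster.

```python
from dataclasses import dataclass

@dataclass
class CandidateSignals:
    """Hypothetical per-candidate signals, one per category in the text."""
    profile_match: float    # information the user provides directly
    peer_similarity: float  # inferred from users with similar skills/interests
    engagement: float       # behavioral data, e.g. message response rate

def match_score(s: CandidateSignals,
                weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Illustrative weighted blend of the three signal categories.

    The weights are arbitrary placeholders; a real engine would learn
    them from data rather than hard-code them.
    """
    w_profile, w_peer, w_engage = weights
    return (w_profile * s.profile_match
            + w_peer * s.peer_similarity
            + w_engage * s.engagement)
```

A candidate with a strong profile match but little platform activity, e.g. `CandidateSignals(0.8, 0.6, 0.4)`, would score `0.66` under these placeholder weights; shifting weight toward the behavioral signal is exactly where the gendered engagement patterns described below can leak into rankings.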

In LinkedIn’s case, these algorithms exclude a person’s name, age, gender, and race, because including these characteristics can contribute to bias in automated processes. But Jersin’s team found that even so, the service’s algorithms could still detect behavioral patterns exhibited by groups with particular gender identities.

For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to apply only for jobs whose requirements match their qualifications. The algorithm interprets this variation in behavior and adjusts its recommendations in a way that inadvertently disadvantages women.

“You might be recommending, for example, more senior jobs to one group of people than another, even if they’re qualified at the same level,” Jersin says. “Those people might not get exposed to the same opportunities. And that’s really the impact that we’re talking about here.”

Men also tend to list more skills on their résumés, even at lower levels of proficiency, than women do, and they often engage more aggressively with recruiters on the platform.

To address such issues, Jersin and his team at LinkedIn built a new AI designed to produce more representative results and deployed it in 2018. It was essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. Before the recommendation system surfaces the matches curated by the original engine, the new AI ensures that they include an even distribution of users across genders.
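The corrective layer Jersin describes is a form of post-processing re-ranking: take the original engine's ranked output and interleave candidates from each group so that no group dominates the top of the list. The sketch below shows the general technique only; LinkedIn has not published its implementation, and the function and group labels here are assumptions.

```python
from itertools import zip_longest

def rerank_evenly(ranked: list, group_of) -> list:
    """Illustrative fairness re-ranker (not LinkedIn's actual code).

    Splits an already-ranked candidate list into groups, preserving
    each group's internal order, then interleaves the groups so every
    "tier" of the output draws one candidate from each group.
    """
    groups: dict = {}
    for candidate in ranked:
        groups.setdefault(group_of(candidate), []).append(candidate)
    reranked = []
    for tier in zip_longest(*groups.values()):
        # A tier is one candidate per group; skip padding for
        # groups that have run out of candidates.
        reranked.extend(c for c in tier if c is not None)
    return reranked
```

For instance, if the original engine ranks candidates `["a1", "a2", "b1", "a3", "b2"]` and groups are keyed on the first character, the re-ranked list alternates groups: `["a1", "b1", "a2", "b2", "a3"]`. The design choice to post-process, rather than retrain the original engine, lets the corrective layer sit on top of an existing system, which matches the article's description of a separate algorithm.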

Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations but doesn’t correct for bias in the same way that LinkedIn does. Instead, the marketing team focuses on getting users from diverse backgrounds signed up for the service, and the company then relies on employers to report back and tell Monster whether or not it passed on a representative set of candidates. 

Irina Novoselsky, CEO at CareerBuilder, says she’s focused on using data the service collects to teach employers how to eliminate bias from their job postings. For example, “When a candidate reads a job description with the word ‘rockstar,’ there is materially a lower percent of women that apply,” she says.



Copyright © 2021 Vitamin Patches Online.