

While Illinois has regulated the use of AI analysis of video interviews since 2020, New York City’s law is the first in the country to apply to the hiring process as a whole. It aims to address concerns from the U.S. Equal Employment Opportunity Commission and the U.S. Department of Justice that “blind reliance” on AI tools in the hiring process could cause companies to violate the Americans with Disabilities Act.

“New York City is looking holistically at how the practice of hiring has changed with automated decision systems,” Julia Stoyanovich, Ph.D., a professor of computer science at New York University and a member of the city’s automated decision systems task force, told HR Dive. “This is about the context in which we are making sure that people have equitable access to economic opportunity. What if they can’t get a job, but they don’t know the reason why?”

Looking beyond the ‘model group’

AI recruiting tools are designed to support HR teams throughout the hiring process, from placing ads on job boards to filtering resumes from applicants to determining the right compensation package to offer. The goal, of course, is to help companies find someone with the right background and skills for the job. Unfortunately, each step of this process can be prone to bias.

That’s especially true if an employer’s “model group” of potential job candidates is judged against an existing employee roster. Notably, Amazon had to scrap a recruiting tool - trained to assess applicants based on resumes submitted over the course of a decade - because the algorithm taught itself to penalize resumes that included the term “women’s.”

“You’re trying to identify someone who you predict will succeed. You’re using the past as a prologue to the present,” said David J. Walton, a partner with law firm Fisher & Phillips LLP. “When you look back and use the data, if the model group is mostly white and male and under 40, by definition that’s what the algorithm will look for. How do you rework the model group so the output isn’t biased?”

AI tools used to assess candidates in interviews or tests may also pose problems. Measuring speech patterns in a video interview may screen out candidates with a speech impediment, while tracking keyboard inputs may eliminate candidates with arthritis or other conditions that limit dexterity.
