
New Study Raises Questions About Hiring Algorithms

As more companies integrate AI into their hiring processes, a question naturally arises: how fair are hiring algorithms when it comes to assessing applicants?

ShotPrime Studio / Shutterstock.com


Companies are increasingly depending on AI to hire new employees. But how fair are these hiring algorithms?

In an ideal world, the decision to hire an employee would be based entirely on their ability to do the job. But the world is far from perfect, and hiring decisions are rarely objective.

As a result, organizations increasingly hand off the process to tech companies that use machine learning to screen applicants. Perhaps the machines will do a better job at hiring than humans, the thinking goes.

Now a new study from a team of researchers at Cornell University has raised a question about the automated screening process:

Are Hiring Algorithms Unbiased?

To answer this question, the Cornell University team scoured publicly available information.

They selected 19 vendors that specialize in algorithmic pre-employment screening and combed through the available material, including webinars, company websites, and documents, comparing the vendors' claims with their practices.

Intellectual property laws shield algorithmic models. That means tech companies don’t have to disclose information about their models.

However, some vendors offer some insight into how their tools work.

While this insight provided a better understanding of the hiring tools, it made no mention of efforts to combat bias. In other words, tech companies may not be considering how to evaluate or mitigate algorithmic bias in their hiring tools.

Manish Raghavan, a doctoral student in computer science and first author of the study, said:

“Plenty of vendors make no mention of efforts to combat bias, which is particularly worrying since either they’re not thinking about it at all, or they’re not being transparent about their practices.”

The researchers also noted that some tech companies used terms like "bias" and "fairness" vaguely in their documents. For example, a vendor could claim that its hiring algorithm is "fair" without revealing how the company defines fairness.

Raghavan points out that calling an algorithm "fair" appeals to our intuitive understanding of the term. In reality, the automated screening tool probably satisfies a much narrower definition of fairness than the word implies.
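To see why the definition matters, here is an illustrative sketch (not taken from the study, and the numbers are hypothetical): one concrete way to define "fairness" in hiring is the adverse-impact ratio from US employment guidelines, which compares each group's selection rate to that of the most-selected group. Ratios below 0.8, the so-called "four-fifths rule," are commonly treated as evidence of disparate impact.

```python
def selection_rates(outcomes):
    """outcomes maps group name -> list of 0/1 hiring decisions."""
    return {g: sum(d) / len(d) for g, d in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Each group's selection rate divided by the highest group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical screening decisions for two applicant groups
decisions = {
    "group_a": [1, 1, 0, 1, 0, 1, 1, 0],  # 5 of 8 selected
    "group_b": [1, 0, 0, 0, 1, 0, 0, 0],  # 2 of 8 selected
}
print(adverse_impact_ratios(decisions))
# group_b's ratio is (2/8) / (5/8) = 0.4, well under the 0.8 threshold
```

A vendor whose tool passes this test could still fail other fairness definitions (equal error rates, for instance), which is exactly why an unqualified claim of "fairness" tells applicants very little.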

Now you’re wondering:

Are Hiring Algorithms Bad For Applicant Screening?

According to Raghavan, the answer is a definite no.

Years of empirical evidence suggest that humans suffer from various biases when assessing employment candidates. While a hiring algorithm may not be perfect yet, it can be improved to overcome these biases.

Raghavan said:

“Despite their many flaws, algorithms do have the potential to contribute to a more equitable society. And further work is needed to ensure that we can understand and mitigate the biases they bring.”

With so many AI hiring vendors out there, it’s essential to understand how the companies address bias and discrimination.

Read More: 5 Ways Automation is Changing Job Applications




Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
