What will a robot make of your résumé? The bias problem with using AI in job recruitment



by Melika Soleimani, Ali Intezari, David J Pauleen and Jim Arrowsmith

Credit: Pixabay/CC0 Public Domain

Artificial intelligence (AI) is spreading to almost every facet of people's professional and personal lives—including job recruitment.

While artists fear for their livelihoods or worry about simply being replaced, business and management are becoming increasingly aware of the possibilities of greater efficiencies in areas as diverse as supply chain management, customer service, product development and human resources (HR) management.

Soon all business areas and operations will be under pressure to adopt AI in some form or another. But the very nature of AI—and the data behind its processes and outputs—mean human biases are being embedded in the technology.

Our research looked at the use of AI in recruitment and hiring—a field that has already widely adopted AI to automate the screening of résumés and to rate video interviews by job applicants.

AI in recruitment promises more objective decisions during the hiring process by eliminating human biases and enhancing fairness and consistency in decision making.

But our research shows AI can subtly—and at times overtly—heighten biases. And the involvement of HR professionals may worsen rather than alleviate these effects. This challenges our belief that human oversight can contain and moderate AI.

Magnifying human bias

Although one of the reasons for using AI in recruitment is that it is meant to be more objective and consistent, studies have found the technology is, in fact, often biased. This happens because AI learns from the datasets used to train it. If the data are biased, the AI will be too.

Biases in data can be made worse by the human-created algorithms supporting AI, which can carry the assumptions of the people who design them.

In interviews with 22 HR professionals, we identified two common biases in hiring: “stereotype bias” and “similar-to-me bias.”

Stereotype bias occurs when decisions are influenced by stereotypes about certain groups, such as preferring candidates of the same gender, leading to gender inequality.

“Similar-to-me” bias happens when recruiters favor candidates who share similar backgrounds or interests to them.

These biases, which can significantly affect the fairness of the hiring process, are embedded in the historical hiring data that are then used to train AI systems. This leads to biased AI.

So, if past hiring practices favored certain demographics, the AI will continue to do so. Mitigating these biases is challenging because algorithms can infer hidden personal attributes from other, correlated information.

For example, in countries with different lengths of military service for men and women, an AI might deduce gender based on service duration.
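The following sketch (again synthetic data and hypothetical feature names, using scikit-learn) illustrates that proxy effect: gender is never given to the model, but because length of military service is strongly correlated with it, a model trained on biased historical decisions still disadvantages one gender.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

# Gender is recorded for the audit but deliberately excluded from the model's inputs.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# Service length differs sharply by gender, so it acts as a near-perfect proxy.
service_months = np.where(gender == 0,
                          rng.normal(24.0, 2.0, size=n),
                          rng.normal(6.0, 2.0, size=n))

# Historical hiring decisions were biased against gender 1.
hire_prob = 1 / (1 + np.exp(-(1.5 * skill - 1.2 * gender)))
hired = rng.random(n) < hire_prob

# Only skill and the proxy are used as features; gender itself is dropped.
X = np.column_stack([skill, service_months])
model = LogisticRegression(max_iter=1000).fit(X, hired)

recommended = model.predict(X)
for g in (0, 1):
    rate = recommended[gender == g].mean()
    print(f"gender {g}: recommended rate = {rate:.2f}")

Dropping the protected attribute is not enough: the model simply reconstructs it from the correlated feature.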

This persistence of bias underscores the need for careful planning and monitoring to ensure fairness in both human and AI-driven recruitment processes.

Can humans help?

As well as HR professionals, we also interviewed 17 AI developers. We wanted to investigate how an AI recruitment system could be developed that would mitigate rather than exacerbate hiring bias.

Based on the interviews, we developed a model wherein HR professionals and AI programmers would go back and forth in exchanging information and questioning preconceptions as they examined data sets and developed algorithms.

However, our findings reveal that the difficulty of implementing such a model lies in the educational, professional and demographic differences that exist between HR professionals and AI developers.

These differences impede effective communication, cooperation and even the ability to understand each other. While HR professionals are traditionally trained in people management and organizational behavior, AI developers are skilled in data science and technology.

These different backgrounds can lead to misunderstandings and misalignment when working together. This is particularly a problem in smaller countries such as New Zealand, where resources are limited and professional networks are less diverse.

Connecting HR and AI

If companies and the HR profession want to address the issue of bias in AI-based recruitment, several changes need to be made.

Firstly, the implementation of a structured training program for HR professionals focused on information system development and AI is crucial. This training should cover the fundamentals of AI, the identification of biases in AI systems, and strategies for mitigating these biases.

Additionally, fostering better collaboration between HR professionals and AI developers is important. Companies should look to create teams that include both HR and AI specialists. Such teams can help bridge the communication gap and better align their efforts.

Moreover, developing culturally relevant datasets is vital for reducing biases in AI systems. HR professionals and AI developers need to work together to ensure the data used in AI-driven recruitment processes are diverse and representative of different demographic groups. This will help create more equitable hiring practices.
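One practical starting point is a simple audit of the historical data before any model is trained. The sketch below (pandas, hypothetical column names) summarizes how well each demographic group is represented and what its past selection rate was.

import pandas as pd

def audit_dataset(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.DataFrame:
    """Per-group record count, share of the dataset and historical selection rate."""
    summary = df.groupby(group_col)[outcome_col].agg(count="size", selection_rate="mean")
    summary["share_of_dataset"] = summary["count"] / len(df)
    return summary

# Toy historical hiring records; a real audit would use the organization's own data.
records = pd.DataFrame({
    "demographic_group": ["A", "A", "A", "A", "B", "B", "C"],
    "hired":             [1,   0,   1,   1,   0,   0,   1],
})
print(audit_dataset(records, "demographic_group", "hired"))

Large gaps in representation or selection rate are a signal that the data need rebalancing or closer scrutiny before they are used to train a recruitment model.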

Lastly, countries need guidelines and ethical standards for the use of AI in recruitment that can help build trust and ensure fairness. Organizations should implement policies that promote transparency and accountability in AI-driven decision-making processes.

By taking these steps, we can create a more inclusive and fair recruitment system that leverages the strengths of both HR professionals and AI developers.

Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
What will a robot make of your résumé? The bias problem with using AI in job recruitment (2024, June 10)
retrieved 10 June 2024
from

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.







