Not long ago, the terms artificial intelligence (AI) and machine learning were both vague buzzwords. Today, these data-driven processes automate tasks across industries and in homes worldwide. AI is increasingly being used in the hiring process, both to build new platforms and to expand diverse datasets. But do these AI hiring tools, intentionally or unintentionally, discriminate against job candidates based on their age, race, and gender?
The Hiring Process and AI
Business adoption of artificial intelligence has grown 270% in just five years, and the market is expected to reach $266.92 billion by 2027. Perhaps surprisingly, the human resources sector is helping drive this expansion. As the future of work shifts toward digital and remote arrangements, talent acquisition departments are swamped with resumes and applications. According to 52 percent of talent acquisition leaders, the most difficult part of hiring is identifying the right candidates from a large applicant pool; clearly, they could use some help.
That is where AI and machine learning software come in. AI in the hiring process lets Talent Acquisition (TA) managers automate a variety of time-consuming tasks: screening, hiring, and onboarding candidates faster; completing menial administrative work; and developing and refining standardized job-matching procedures.
What AI Hiring Apps Can’t Fix
Many AI hiring software programs are available today to help with every stage of hiring: recruiting, onboarding, retention, and everything in between. Using training data, the initial dataset a machine learning model learns from, these apps determine which applicants are the most qualified for open positions. To counteract bias, the tools must learn from a wide variety of applications and hiring practices.
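To make the idea concrete, here is a minimal sketch in Python of how such a tool might learn from training data. The dataset, column names, and features are all hypothetical, invented for illustration rather than drawn from any real product.

```python
# Minimal sketch of a resume-screening model trained on historical
# hiring decisions. All data and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: past applicants and whether they were hired.
history = pd.DataFrame({
    "years_experience": [1, 3, 5, 8, 2, 10, 4, 7],
    "skills_matched":   [2, 4, 6, 7, 3, 8, 5, 6],
    "hired":            [0, 0, 1, 1, 0, 1, 1, 1],
})

model = LogisticRegression()
model.fit(history[["years_experience", "skills_matched"]], history["hired"])

# Score a new applicant. The model reproduces whatever patterns,
# fair or unfair, exist in the historical decisions it learned from.
new_applicant = pd.DataFrame({"years_experience": [6], "skills_matched": [5]})
print(model.predict_proba(new_applicant)[0, 1])  # estimated "hire" probability
```

The key point is in the last comment: the model can only reproduce the patterns, fair or unfair, that exist in the decisions it was trained on.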
Kurt and Angela Edwards founded Pyxai because they believed cultural traits and soft skills are important metrics that businesses should prioritize. With this uniform metric, recruiters and hiring managers can compare candidates fairly. Kurt Edwards is adamant that it leads to better hiring decisions: “While soft skills are more difficult to learn, hard skills can be. That is what you should check for and gauge.”
Types of Employer Bias in AI Hiring Tools
Hiring tools can recognize human patterns thanks to training data, machine learning, and natural language processing. However, if the decision-makers have biases, the data may already be tainted.
Kurt Edwards believes this underappreciated bias damages the talent pipeline. Unconscious bias is so prevalent in the recruiting sector that some qualified candidates simply never enter the funnel. The halo effect, for instance, can lead interviewers to concentrate on a candidate’s positive traits while overlooking the negative ones. AI hiring software applications can carry multiple biases of their own.
Misidentifying People of Color
Facial recognition software is sometimes used in hiring and interview processes to screen candidates for personality and fit. However, if these AI hiring tools cannot accurately read a candidate’s skin tone, gender, expression, and body language, they may unfairly lower that candidate’s score.
Reinforcing Gender Stereotypes
Gender stereotypes have long been pervasive in corporate America, primarily to the detriment of women. AI applications that rely on word associations may produce results that discriminate against women for manager and C-suite roles. Through occupational segregation, such tools can keep women in lower-level, less senior positions.
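As a rough illustration of how word associations are measured, the sketch below computes cosine similarities between job titles and gendered pronouns. The tiny vectors are made up for demonstration; real audits apply the same test to embeddings trained on large text corpora.

```python
import numpy as np

# Hypothetical 3-dimensional embeddings; real ones have hundreds of
# dimensions and are learned from large text corpora.
vectors = {
    "he":      np.array([0.9, 0.1, 0.0]),
    "she":     np.array([0.1, 0.9, 0.0]),
    "manager": np.array([0.8, 0.2, 0.3]),
    "nurse":   np.array([0.2, 0.8, 0.3]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# If a job title sits closer to "he" than to "she" in embedding space,
# tools built on those embeddings can inherit that association.
for title in ("manager", "nurse"):
    print(title,
          "he:", round(cosine(vectors[title], vectors["he"]), 2),
          "she:", round(cosine(vectors[title], vectors["she"]), 2))
```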
Removing a Whole Age Group
Most people do not list their birthdate on a resume, but age and years of employment typically correspond. Programming AI hiring software to treat this metric as a measure of skill can age entire demographics out of consideration.
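A short sketch of how this happens: a seemingly neutral cap on years of employment, a hypothetical rule here, quietly acts as an age filter.

```python
# Hypothetical applicant records; the names and the cutoff are invented.
applicants = [
    {"name": "A", "years_employed": 4},
    {"name": "B", "years_employed": 12},
    {"name": "C", "years_employed": 28},
]

# A cap intended to target "recent" skills disproportionately screens
# out older applicants, because years employed tracks age.
MAX_YEARS = 15
shortlist = [a for a in applicants if a["years_employed"] <= MAX_YEARS]
print([a["name"] for a in shortlist])  # ['A', 'B']: applicant C is aged out
```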
Ignoring the Nuances of Disability
AI hiring software may score a candidate’s enunciation, body language, and facial expressions. Numerous disabilities can affect a candidate’s performance in a phone or video interview, and the software may disqualify them from consideration as a result.
Fortunately, this algorithmic bias has a name, the “coded gaze,” thanks to Joy Buolamwini, founder of the Algorithmic Justice League. Fair, equitable, and inclusive AI hiring practices can help eradicate it for good.
Fixing AI Hiring Problems
According to Pyxai co-founder Angela Edwards, there has been a paradigm shift in recruiting. The incoming workforce expects more from businesses: modern training models, improved DEI initiatives, and higher ethical standards overall. This shift could be exactly what the human resources sector needs to give underrepresented groups more equitable opportunities.
In her TEDx talk, Buolamwini argues that companies need to practice more inclusive coding. Inclusive coding reflects the people who write it and the purposes it is written for. HR professionals can be deliberate about the training data that goes into AI hiring applications, and when configuring the tools, they can decide which candidate profile attributes, such as age, race, and gender, to consider or leave out.
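For instance, a minimal sketch of that kind of configuration might look like the following. The field names and the strip_protected helper are hypothetical, and simply dropping protected columns does not remove proxy variables (such as years employed), so this is a starting point rather than a complete fix.

```python
# Fields an HR team might choose to withhold from a scoring model.
PROTECTED_FIELDS = {"age", "race", "gender", "birthdate"}

def strip_protected(profile: dict) -> dict:
    """Return a copy of the candidate profile without protected fields."""
    return {k: v for k, v in profile.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "J. Doe",
    "age": 52,
    "gender": "female",
    "skills": ["python", "sql"],
    "years_employed": 25,
}

print(strip_protected(candidate))
# {'name': 'J. Doe', 'skills': ['python', 'sql'], 'years_employed': 25}
```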