Using AI in Hiring: How to Stop Bias from Foiling the Benefits

Human resource professionals and hiring managers have used technology for years to assist in hiring and employee onboarding. Today’s most robust solutions, those that employ artificial intelligence (AI), add new features and tools daily, making talent acquisition far more efficient.

While AI contributes substantial benefits to the hiring process, it’s critical in today’s environment to guard against the biases that can permeate the algorithms used in job candidate evaluations and hiring decisions. Such bias defeats many of the benefits of AI-powered hiring technologies and can also inadvertently run afoul of discrimination laws.

Let’s start with the benefits  

There are numerous ways that AI streamlines and enhances talent acquisition and related processes. Here are nine of the most powerful ones:  

  1. Analysis of external perceptions: AI tools can evaluate external perceptions of and sentiments around a company’s brand across social media and other online platforms. Companies use these insights to improve employer branding and talent recruiting strategies.  
  2. Virtual assistance: AI-assisted interactions, supported by virtual assistants and chatbots, greet new job candidates, answer their questions, and guide them through the application process.  
  3. Resume screening: In this initial applicant screening step, AI algorithms analyze resumes for relevant data, identify keywords matching those in the job description, and give hiring managers a jump start on candidate evaluations (a simplified keyword-matching sketch follows this list).
  4. Interview scheduling: AI eases this very time-consuming process by coordinating availability and scheduling (and adjusting) interviews between hiring managers and job candidates.  
  5. Virtual interview video analysis: Specialized AI tools analyze all types of verbal and non-verbal expressions, providing insights into whether a job candidate is well-suited for a particular job.  
  6. Candidate matching analytics: Predictive analytics help identify candidates who are more likely to succeed in a role, based on factors like work history, skillsets, and cultural fit. Predictive modeling is also used to help identify candidates who are most likely to stay in their new jobs, aiding in talent retention.  
  7. Cognitive and skills testing: AI-powered testing helps hiring managers assess a job candidate’s thinking and problem-solving abilities, personality traits, and required technical skills.  
  8. Bias reduction: As noted, AI can be biased. Fortunately, it can also be designed and trained to focus on objective data during the candidate evaluation process to ensure a more diverse and inclusive workforce.  
  9. Personalized employee onboarding: AI assists in the new employee onboarding process by creating personalized plans with relevant information and answers to specific questions about the job, team, facility, and company.
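
To make the resume-screening step (item 3) concrete, here is a minimal keyword-matching sketch in Python. It is a deliberately simplified stand-in for what commercial screening tools do; the keyword_score function, the stop-word list, and the sample job description and resume are all hypothetical.

```python
import re

def keyword_score(resume_text: str, job_description: str) -> float:
    """Score a resume by the share of job-description keywords it mentions."""
    def tokenize(text):
        return set(re.findall(r"[a-z]+", text.lower()))
    # Ignore very common words so the score reflects meaningful terms.
    stop_words = {"and", "or", "the", "a", "an", "with", "for", "of", "in", "to"}
    job_keywords = tokenize(job_description) - stop_words
    resume_words = tokenize(resume_text)
    if not job_keywords:
        return 0.0
    return len(job_keywords & resume_words) / len(job_keywords)

# Hypothetical inputs, for illustration only.
job = "Payroll specialist with experience in onboarding, benefits, and HRIS reporting"
resume = "Managed payroll and benefits administration; built HRIS reports for onboarding"
print(f"Keyword match: {keyword_score(resume, job):.0%}")  # prints "Keyword match: 57%"
```

Real screening tools go well beyond simple word overlap (resume parsing, semantic matching, and so on), but even this toy version shows how the wording of a job description drives which resumes surface.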

How does AI promote unconscious bias?  

“The biggest risk inherent with the use of AI in hiring is the perpetuation and amplification of biases and discrimination,” says Jon Hyman, who provides businesses and HR professionals with news and updates on labor and employment law. “AI algorithms learn from existing data. If that existing data is biased or reflects systemic or otherwise existing inequalities, AI systems can inadvertently reinforce those biases.”  

In other words, if the data an AI hiring solution learns from to produce a desired hiring outcome is fundamentally biased, its results will also be biased. Similarly, if the data used in an AI hiring model does not accurately represent the population (in gender, age, ethnicity, and other factors), the model may produce biased results.

For example, if historical hiring data sets reflect gender, racial, or other biases present in human decision-making, an AI model and its algorithms may absorb and amplify those biases when making recommendations or predictions, producing discriminatory outcomes that disadvantage certain groups of people, including the most qualified candidates.
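
To illustrate that feedback loop (and not any vendor's actual model), the short Python sketch below does no learning at all: it simply ranks candidates by how closely they resemble the average past hire. All of the data is invented, and "school_a" stands in for any attribute that happens to correlate with a protected characteristic in the historical record.

```python
# Hypothetical illustration only: a "recommend candidates who resemble past hires" rule.

past_hires = [
    {"years": 5, "school_a": 1},   # historical hires skew heavily toward school A
    {"years": 6, "school_a": 1},
    {"years": 4, "school_a": 1},
    {"years": 7, "school_a": 0},
]

candidates = {
    "candidate_1": {"years": 6, "school_a": 1},
    "candidate_2": {"years": 6, "school_a": 0},  # same experience, different school
}

def typical_hire(rows):
    """Average feature values across past hires -- the 'typical hire' profile."""
    return {key: sum(row[key] for row in rows) / len(rows) for key in rows[0]}

def similarity(candidate, profile):
    """Negative squared distance to the profile (higher means more similar)."""
    return -sum((candidate[key] - profile[key]) ** 2 for key in profile)

profile = typical_hire(past_hires)
ranking = sorted(candidates, key=lambda name: similarity(candidates[name], profile),
                 reverse=True)
print(ranking)  # ['candidate_1', 'candidate_2'] -- the school alone decides the order
```

Real systems are far more sophisticated, but the failure mode is the same: any feature correlated with past, biased decisions can quietly steer new recommendations, which is why representative data and regular audits matter.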

What can you do to prevent bias in AI-assisted hiring?  

What’s important to recognize about using artificial intelligence to assist in hiring, or in any other application, is that it must start, progress, and improve through human interaction. Ideally, this calls for a diverse team that brings a variety of perspectives and experiences into the development process. And it’s imperative to approach development with a commitment to ethical practices.

In short, AI models and machine learning (ML) algorithms need to be developed and trained by humans using diverse and accurately representative data sets to mitigate biases. It also involves ongoing monitoring and transparency to maintain fair and unbiased outcomes.  

Here are four key measures you can employ to avoid discriminating against job candidates in the hiring process and progressively improve your AI model:  

  1. Establish ethical guidelines for the development and use of your solutions. Ensuring fairness and nondiscrimination is easier when you follow industry best practices.
  2. Since machine learning algorithms learn from existing data, design your model to be transparent, keep representative data current, and use AI tools to conduct regular audits that identify and correct patterns of bias. As with any business strategy, update your model to address emerging opportunities and challenges, and as with any form of IT, keep your solutions up to date.
  3. Engage appropriate professionals or counsel to ensure legal compliance so that your AI-assisted hiring practices, processes, and outcomes adhere to anti-discrimination laws and prevent inadvertent discrimination.  
  4. Improve AI-powered hiring processes in multiple ways: Encourage feedback from your HR and hiring team to identify algorithm problems and unconscious bias and improve performance. Create and use fairness metrics to evaluate the impact of your AI model on various demographic groups (a simple example follows this list). Provide regular training for your HR professionals and hiring managers on how your specific AI solutions work and how to identify bias, so they can make better, more informed hiring decisions.
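
As one example of the fairness metrics mentioned in point 4, the sketch below compares selection rates across demographic groups and computes the ratio between the lowest and highest rates, the figure often checked against the EEOC's four-fifths (80%) rule of thumb for adverse impact. The group labels and screening outcomes are invented for illustration.

```python
from collections import defaultdict

def selection_rates(outcomes):
    """Selection rate per group: candidates advanced / total candidates in that group."""
    advanced, totals = defaultdict(int), defaultdict(int)
    for group, selected in outcomes:
        totals[group] += 1
        advanced[group] += int(selected)
    return {group: advanced[group] / totals[group] for group in totals}

def impact_ratio(rates):
    """Lowest group's selection rate divided by the highest group's rate."""
    return min(rates.values()) / max(rates.values())

# Invented screening outcomes: (demographic_group, advanced_to_interview)
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = selection_rates(outcomes)
print(rates)                                        # {'group_a': 0.75, 'group_b': 0.25}
print(f"Impact ratio: {impact_ratio(rates):.2f}")   # 0.33, well below the 0.80 rule of thumb
```

A ratio this far below 0.80 would not prove discrimination on its own, but it flags the screening step for closer review by your team and your counsel.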

Partner with a DEI-focused talent recruiter  

It’s important to stay current on developments in artificial intelligence for hiring because this field of IT is continually evolving – including in the critical area of data privacy.  

Most important of all is keeping real people involved in your talent recruitment and hiring processes. In conjunction with AI technology, it’s also a good idea to partner with an established and knowledgeable recruiting firm, one that can help you source and onboard the best candidates for essential roles, while also helping you perfect your hiring processes.  

Contact Goodwin Recruiting for more insights into the importance of diversity, equity, and inclusion (DEI) in today’s workforces.