
Using AI to Improve DEI&B in Recruiting

Artificial intelligence (AI) brings numerous benefits to recruitment teams, including data-informed decision-making, greater efficiency, better candidate matching, and personalization, all of which strengthen the hiring process and its outcomes. In fact, 65 percent of recruiters use AI, and 67 percent feel it has improved the hiring process.

Eliminating bias in the hiring process

The Pew Research Center explored how Americans felt about using AI when hiring and evaluating workers and reported the following statistics:

  • 79 percent say biased treatment based on a candidate’s race or ethnicity in hiring is a problem. Of this group, 53 percent think the unfair treatment will improve as more employers use AI in hiring.
  • 64 percent of Black adults, 49 percent of Asian adults, 41 percent of Hispanic adults, and 30 percent of White adults believe bias and unfair treatment based on race or ethnicity is a significant problem.
  • 47 percent believe AI is better than humans at treating all job applicants equally.

Using AI to improve diversity, equity, inclusion, and belonging (DEI&B) in recruiting is a promising approach that can help eliminate bias and promote fair hiring practices. Here are some ways to leverage AI to achieve these goals:

Job description optimization: Seventy-seven percent of companies use ChatGPT to assist in writing job descriptions. AI can analyze and optimize job descriptions to ensure they are inclusive and appealing to diverse candidates. Research has found that women don’t apply for jobs unless they meet 100 percent of the requirements, while men apply when they meet only 60 percent. AI can identify and remove biased language or gender-specific terms and keywords to attract a more diverse pool of applicants. It can also analyze the tone of the job description to ensure it isn’t overly negative or demanding.
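To make this concrete, here is a minimal Python sketch of the kind of keyword check a job-description tool might run. The word lists are illustrative assumptions only; commercial augmented-writing tools rely on much larger, research-backed lexicons and contextual language models.

```python
import re

# Illustrative (not validated) word lists; real tools use research-backed lexicons.
MASCULINE_CODED = {"aggressive", "competitive", "dominant", "rockstar", "ninja"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "empathetic"}

def flag_coded_language(text: str) -> dict:
    """Return the masculine- and feminine-coded terms found in a job description."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

if __name__ == "__main__":
    jd = "We want an aggressive, competitive rockstar who is also collaborative."
    print(flag_coded_language(jd))
    # {'masculine': ['aggressive', 'competitive', 'rockstar'], 'feminine': ['collaborative']}
```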

Resume screening: Resumes with Caucasian-sounding names get 50 percent more interview callbacks than those with Black-sounding names. AI can anonymize resumes by removing personally identifiable information—such as names, addresses, and photos—to prevent unconscious bias during the initial screening process. This allows recruiters to evaluate candidates solely on objective qualifications and skills rather than subjective factors. In fact, blind hiring improves the likelihood that a woman is hired by 25-46 percent.
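A simplified sketch of what that anonymization step could look like is shown below. The regular expressions and the candidate_name parameter are assumptions for illustration; production systems typically combine structured application data with named-entity recognition to catch names, addresses, and photos reliably.

```python
import re

# Simple patterns for obvious PII; real systems also use named-entity recognition.
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
PHONE = re.compile(r"\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def anonymize_resume(text: str, candidate_name: str) -> str:
    """Replace the candidate's name, email address, and phone number with placeholders."""
    text = re.sub(re.escape(candidate_name), "[CANDIDATE]", text, flags=re.IGNORECASE)
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

if __name__ == "__main__":
    resume = "Jane Doe | jane.doe@example.com | (555) 123-4567 | 8 years in data engineering"
    print(anonymize_resume(resume, "Jane Doe"))
    # [CANDIDATE] | [EMAIL] | [PHONE] | 8 years in data engineering
```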

Skills and competency assessment: Traditional recruitment frequently depends on subjective evaluations of applicants, which may be influenced by unconscious bias. AI-powered assessments and simulations can help objectively assess candidates’ skills and competencies. Tools like The Predictive Index (PI) are used during the hiring process to assess the behavioral and cultural fit of the candidate. These assessments can be designed to focus on specific skills and abilities rather than personal characteristics, reducing bias in the evaluation process.

Candidate sourcing: AI can help expand candidate sourcing by searching for potential applicants from a wide range of platforms and databases. Casting a wider net increases the likelihood of finding candidates from underrepresented groups. AI can also proactively identify potential candidates who might not have actively applied for a position but possess the desired skills and qualifications.

Initial interview and analysis: Pre-recorded one-way video interviews—used globally by 61 percent of employers—ask all candidates the same questions in the same manner and order. AI can analyze interview recordings or video submissions to evaluate candidates objectively against predetermined criteria. This helps minimize biases introduced by nonverbal cues (such as facial expressions and body language), tone of voice, or physical appearance.
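As a rough illustration, the sketch below scores transcribed answers against one shared rubric, so every candidate is measured on the same predetermined criteria. The criteria, weights, and keyword matching are hypothetical placeholders; real tools use trained language models rather than keyword counts.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    keywords: tuple        # evidence phrases to look for (illustrative only)
    weight: float

# One shared rubric applied to every candidate's transcribed answer.
RUBRIC = [
    Criterion("collaboration", ("paired", "teammate", "stakeholder"), 0.4),
    Criterion("problem solving", ("root cause", "debug", "trade-off"), 0.6),
]

def score_answer(transcript: str, rubric: list) -> float:
    """Weighted share of rubric criteria evidenced in the transcript."""
    text = transcript.lower()
    earned = sum(c.weight for c in rubric if any(k in text for k in c.keywords))
    return round(earned / sum(c.weight for c in rubric), 2)

answer = "I paired with a teammate to find the root cause and weighed each trade-off."
print(score_answer(answer, RUBRIC))  # 1.0
```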

Bias detection and mitigation: AI algorithms can be trained to identify and flag potential bias in the recruitment process. By analyzing data patterns, language usage, or decision-making processes, AI can alert hiring managers and recruiters to potential bias and prompt them to take corrective actions.
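One widely used check of this kind is the “four-fifths rule” from adverse-impact analysis, sketched below with made-up selection counts. A real monitor would pull the numbers from the applicant tracking system and pair the ratio with statistical significance tests.

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants from a group who pass a given hiring stage."""
    return selected / applicants

# Hypothetical screening outcomes for two applicant groups.
rate_a = selection_rate(selected=48, applicants=120)   # 0.40
rate_b = selection_rate(selected=27, applicants=100)   # 0.27

# Four-fifths rule: flag the stage if the lower rate is < 80% of the higher rate.
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"impact ratio = {impact_ratio:.2f}")            # impact ratio = 0.68
if impact_ratio < 0.8:
    print("Below the four-fifths threshold: flag this stage for human review.")
```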

Monitoring and reporting: AI can help track and analyze diversity metrics throughout the hiring process. It can provide insights and reports on the demographic makeup of candidate pools, shortlisted candidates, and hires. This data helps organizations identify areas for improvement and measure the effectiveness of their DEI&B efforts, supporting continuous progress toward a more equitable and representative work environment.
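A small sketch of such a funnel report appears below. The stage names, demographic labels, and counts are hypothetical; in practice the data would come from self-reported fields in the applicant tracking system.

```python
from collections import Counter

# Hypothetical (stage, self-reported demographic group) events from an ATS export.
events = [
    ("applied", "group_x"), ("applied", "group_y"), ("applied", "group_y"),
    ("applied", "group_x"), ("shortlisted", "group_x"), ("shortlisted", "group_y"),
    ("hired", "group_x"),
]

def stage_breakdown(events):
    """Percentage share of each demographic group at each hiring stage."""
    by_stage = {}
    for stage, group in events:
        by_stage.setdefault(stage, Counter())[group] += 1
    return {
        stage: {g: round(100 * n / sum(counts.values()), 1) for g, n in counts.items()}
        for stage, counts in by_stage.items()
    }

print(stage_breakdown(events))
# {'applied': {'group_x': 50.0, 'group_y': 50.0}, 'shortlisted': {'group_x': 50.0, 'group_y': 50.0}, 'hired': {'group_x': 100.0}}
```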

The relationship between AI and humans

While AI can assist in reducing bias by bringing efficiency and objectivity to the hiring process, human oversight and intervention are crucial. Human involvement is necessary to ensure fairness, ethical considerations, and alignment with organizational values. Recruitment professionals and hiring managers are needed to select the data used to develop AI models, interpret the results, make informed decisions, and address any potential biases that may arise during the hiring process.