When it comes to compliance, AI should enhance equality

Craig Leen, former OFCCP Director at the DOL, breaks down what talent teams need to know about regulations governing AI in HR.


Talent leaders shouldn’t worry about AI harming equal employment opportunity. In fact, they should know that the right AI only enhances it. 

Craig Leen, former Director of the Office of Federal Contract Compliance Programs (OFCCP) at the U.S. Department of Labor, is a proponent of using AI in hiring, especially when finding equal opportunities for all workers.

The OFCCP focuses on systemic discrimination by looking for disparities across protected classes, including race, ethnicity, gender, and disability status. During Leen’s tenure, he says the agency found many instances of discrimination caused by unconscious bias, and, in many of those cases, AI could have intervened. 

Unlike AI recruiting platforms built on aptitude tests, body language, or facial recognition, all of which have a history of discrimination, Leen says good AI should be able to assess thousands of resumes and identify applicants who may be overlooked because of human bias. AI is also helpful in identifying skills and the potential to perform certain roles. 

We recently caught up with Leen to discuss how organizations can make sense of new AI regulations, how the OFCCP flags discrimination, and why the ability to explain compliance shortcomings remains critical.

AI validates disparities that are hard to explain

It may be surprising that up to half of OFCCP audit cases are flagged as disparate impact — or unintentional discrimination. However, only 2 to 5 percent of those organizations are ultimately penalized for intentional employment discrimination. 

The difference is intent. And the only way for an organization to show that a disparity was unintentional is to provide evidence of a lack of intent.

Audits, both internal and external, require organizations to answer questions like “Why did that disparity occur?” or “Is there evidence of specific intent to discriminate against a group based on a protected class, such as race or gender?”

AI can surface this kind of evidence against intentional discrimination across roles at scale. For instance, a construction role may have a 150-pound lifting requirement that could disproportionately impact women and people with disabilities. Likewise, for positions requiring a doctorate in physics, the talent pool disproportionately skews male.
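To make that concrete, here is a minimal sketch of how a disparity like the lifting-requirement example might get flagged in the first place. It uses the four-fifths (80 percent) rule, a common benchmark in U.S. adverse-impact analysis, with hypothetical applicant counts; it is an illustration, not the OFCCP’s actual audit methodology.

```python
# Hedged illustration: how a disparity might be flagged during an audit.
# The four-fifths (80%) rule is a common adverse-impact benchmark; the
# candidate counts below are made up for the example.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a group who pass the screen."""
    return selected / applicants if applicants else 0.0

def impact_ratio(protected_rate: float, comparison_rate: float) -> float:
    """Ratio of the protected group's rate to the comparison group's rate."""
    return protected_rate / comparison_rate if comparison_rate else 0.0

# Hypothetical outcomes for a role with a 150-pound lifting requirement.
men = {"applicants": 200, "selected": 120}    # 60% pass the screen
women = {"applicants": 100, "selected": 30}   # 30% pass the screen

ratio = impact_ratio(
    selection_rate(women["selected"], women["applicants"]),
    selection_rate(men["selected"], men["applicants"]),
)

# Below 0.80, most auditors would treat this as a disparity the employer
# must be able to explain (for example, that the requirement is job-related).
verdict = "flag for review" if ratio < 0.8 else "within guideline"
print(f"Impact ratio: {ratio:.2f} -> {verdict}")
```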

“If you’re seeing a higher proportion of a particular race or gender selected, you need to determine why that is happening at this level,” Leen says. “Is this happening because of something explainable, or is it tied to race, ethnicity, or gender bias? You will have to explain it either way. So I ask companies, ‘Do you want to have AI there to help you or not?’”

“There’s this idea that AI is going to cause harm to certain people,” Leen continues. “The less you can explain a particular AI’s outcome, the more valid that concern. The opposite is also true though, particularly for good AI.”


Audit as you go: Reevaluate job requirements by focusing on skills over qualifications

When audits find unintentional discrimination, organizations must reevaluate job requirements to align with business needs.

Understanding how a job is done today, compared to how it was done in the past and how it may change in the future, helps determine which skills and qualifications are relevant and which ones to remove. 

“The ability to audit in real-time is really important,” Leen says. “You want a system that assists you in proving a requirement is job-related. If you are audited, you can immediately go to the system and explain what caused that disparity if it’s based on a particular skill requirement.”

Is that 150-pound lifting requirement still relevant? If it becomes a 75-pound lifting requirement, the talent pool should include more qualified people. Likewise, is a Ph.D. required for the job, or could someone with a master’s degree qualify? 
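As a rough illustration of what “audit as you go” could look like in practice, the sketch below keeps every job requirement tied to a documented, current justification, so a disparity can be traced back to a defensible, job-related rule. The data model and field names are hypothetical, not a description of any particular system.

```python
# A minimal sketch of auditing requirements as you go: any requirement
# without a documented, reviewed justification is flagged for reevaluation.
# The Requirement fields here are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Requirement:
    name: str
    justification: Optional[str]   # why the requirement is job-related
    last_reviewed: Optional[str]   # when it was last validated for the role

def audit_requirements(requirements: list[Requirement]) -> list[str]:
    """Return requirements that lack a documented, current justification."""
    return [
        r.name for r in requirements
        if not r.justification or not r.last_reviewed
    ]

job_requirements = [
    Requirement("Lift 150 pounds", justification=None, last_reviewed=None),
    Requirement("Forklift certification", "Operates a forklift daily", "2024-01"),
]

# Anything returned here is a requirement you cannot yet defend in an audit.
print(audit_requirements(job_requirements))  # ['Lift 150 pounds']
```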

Leen says to focus on skills over qualifications. Equal employment opportunity increases dramatically when organizations remove qualifications like an Ivy League degree, a doctorate, or a certificate requirement. 

“Recognize that many combinations of experiences may lead to the development of a certain skill, and by looking for that skill only, it gives someone a chance to compete for the job,” Leen says.

“Whereas in previous times, before you had AI that could help identify skills, it was hard to figure this out. So you might say, we’re just going to require that you have this certification or MBA. Or we’re only going to look at these five schools that we’ve had a good experience with in terms of candidates.”
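For illustration, here is a minimal sketch of the difference between screening on a credential and screening on the underlying skills. The skill lists, candidates, and matching threshold are all hypothetical.

```python
# Credential filtering vs. skills-based screening, with made-up data.

required_skills = {"financial modeling", "forecasting", "stakeholder reporting"}

candidates = [
    {"name": "A", "degree": "MBA", "skills": {"financial modeling", "forecasting"}},
    {"name": "B", "degree": "none", "skills": {"financial modeling", "forecasting",
                                               "stakeholder reporting"}},
]

# Credential filter: candidate B is never considered.
credential_pass = [c["name"] for c in candidates if c["degree"] == "MBA"]

# Skills filter: both candidates compete on what the job actually needs.
def skill_coverage(candidate_skills: set[str]) -> float:
    """Fraction of required skills the candidate already has."""
    return len(candidate_skills & required_skills) / len(required_skills)

skills_pass = [c["name"] for c in candidates if skill_coverage(c["skills"]) >= 2 / 3]

print(credential_pass)  # ['A']
print(skills_pass)      # ['A', 'B']
```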


Related: Learn more about how OneTen is closing the opportunity gap for Black Talent in America, creating 1 million family-sustaining jobs for individuals without a college degree.

What to look out for in new AI regulations

“You definitely don’t want AI to harm equal employment opportunity,” says Leen. “You also want AI to enhance it. Good AI is more likely to identify people who have historically been overlooked by humans or have been subject to unconscious bias.”

Leen broke down the emerging AI regulation most recently passed in New York City: AI tools cannot demonstrate bias that affects how people are hired.

Just as organizations should audit in real-time to correct adverse impacts in talent programs, AI tools must also constantly screen for bias.
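Here is a hedged sketch of what constantly screening for bias could look like: periodically recompute recommendation rates by group from the tool’s decision log and raise an alert when the impact ratio falls below the four-fifths benchmark. The log format and groups are hypothetical, and real bias audits involve more than this single metric.

```python
# Recurring bias screen over an AI tool's recommendations (hypothetical log).

from collections import defaultdict

decision_log = [
    {"group": "women", "recommended": True},
    {"group": "women", "recommended": False},
    {"group": "men", "recommended": True},
    {"group": "men", "recommended": True},
]

def recommendation_rates(log: list[dict]) -> dict[str, float]:
    """Share of candidates in each group the tool recommends."""
    totals, recommended = defaultdict(int), defaultdict(int)
    for entry in log:
        totals[entry["group"]] += 1
        recommended[entry["group"]] += entry["recommended"]
    return {g: recommended[g] / totals[g] for g in totals}

def bias_alerts(rates: dict[str, float], threshold: float = 0.8) -> list[str]:
    """Flag any group whose rate falls below the threshold vs. the top group."""
    top = max(rates.values())
    return [g for g, r in rates.items() if top and r / top < threshold]

rates = recommendation_rates(decision_log)
print(rates)               # {'women': 0.5, 'men': 1.0}
print(bias_alerts(rates))  # ['women']
```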

Leen says that AI recruiting platforms should be neutral with regard to protected classes, and organizations should assess whether vendors enhance equal employment opportunity. 

They should also ask potential AI providers these questions: What are they doing about bias? Can the AI provider show that there is no bias here? If there are two candidates, will they be treated equally regardless of gender or race? 

Focusing on skills over qualifications is another safeguard. How does a skills-based approach enhance diversity programs? Removing exclusionary qualifications opens up a broader talent pool, increasing the likelihood of attracting and hiring a more diverse workforce.

Organizations should also introduce anonymous screening to the recruitment process by masking profiles. 

“I would generally recommend profile masking so that when you make that selection of who to interview, you’re not making that decision in any way based on race, ethnicity, gender, disability status, or sexual orientation,” Leen says.
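As a simple illustration of profile masking, the sketch below removes protected-class signals from a candidate record before the interview-selection step. The field names are hypothetical, and a real system would also need to handle proxies such as names, photos, and graduation dates.

```python
# Minimal profile-masking sketch; field names are hypothetical.

MASKED_FIELDS = {"name", "gender", "ethnicity", "disability_status",
                 "sexual_orientation", "photo_url"}

def mask_profile(profile: dict) -> dict:
    """Return a copy of the profile with protected-class signals removed."""
    return {k: v for k, v in profile.items() if k not in MASKED_FIELDS}

candidate = {
    "name": "Jordan Smith",
    "gender": "female",
    "skills": ["welding", "blueprint reading"],
    "years_experience": 7,
}

print(mask_profile(candidate))
# {'skills': ['welding', 'blueprint reading'], 'years_experience': 7}
```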

AI-powered recruitment also enhances equal employment opportunity by bolstering affirmative-action programs. It helps organizations proactively identify more applicants from underrepresented groups. 

Related: With targeted campaigns, NextRoll increased its representation of BIPOC talent in people management roles from 11 to 35 percent. Learn about how NextRoll defines and meets diversity goals with Eightfold Talent Intelligence.

Organizations might wonder what’s next for AI and if a solution to systemic discrimination is on the horizon.

“I think in 10 years or sooner, AI will be the standard of care,” Leen says. “You’ll have to use AI. You can’t just rely on humans making decisions that are affected by unconscious bias.”

Listen to Eightfold AI’s podcast, The New Talent Code, to hear the entire conversation.

