
Artificial intelligence, Equal Employment Opportunity (EEO) laws, and employer responsibility




Artificial intelligence (AI) is seeping into nearly every aspect of recruiting and hiring, from automated chatbots that engage and evaluate candidates, to resume- and internet-scanning technology that automatically builds ranked profiles of applicants, to tools that auto-generate job descriptions and interview questions. AI is everywhere, and that makes the question of responsibility increasingly important.


So, who is responsible if one of those AI solutions turns out to be in violation of the EEO protection laws? Is it the vendor who created the technology or the employer who uses it?


The short answer – it’s the employer!


That’s right: as with many other tools, the regulations focus on how the technology is used, not on who made it.


But before jumping into why, let’s start with a little background on the ‘what’.


Introducing the EEOC and Title VII


The Equal Employment Opportunity Commission (EEOC) enforces federal equal employment opportunity laws prohibiting employment discrimination based on race, color, national origin, religion, sex (including pregnancy, sexual orientation, and gender identity), disability, age, and genetic information; for the purposes of this explanation, these are the ‘protected classes’. These laws include Title VII, which prohibits not only intentional employment discrimination but also, among other things, the use of tests or selection procedures that disproportionately exclude protected classes if those tests or procedures are “not job related for the position in question and consistent with business necessity” (known as “adverse impact”).


And it is here, under the umbrella of ‘tests or selection procedures’, that artificial intelligence lives. In other words, the EEOC treats AI-driven hiring tools as “selection procedures”, and therefore AI is covered under Title VII.
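To make “adverse impact” concrete, here is a minimal sketch (in Python, with made-up group names and numbers) of the EEOC’s “four-fifths” rule of thumb: if one group’s selection rate is less than 80% of the highest group’s rate, the selection procedure may be flagged for potential adverse impact.

```python
# Minimal sketch of the EEOC "four-fifths" rule of thumb for adverse impact.
# All group names and counts below are made up for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who passed the selection step."""
    return selected / applicants

# Hypothetical outcomes of an AI screening step, per demographic group.
outcomes = {
    "group_a": {"applicants": 200, "selected": 60},   # rate 0.30
    "group_b": {"applicants": 150, "selected": 27},   # rate 0.18
}

rates = {g: selection_rate(o["selected"], o["applicants"]) for g, o in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest  # compare against the most-selected group
    flag = "POTENTIAL ADVERSE IMPACT" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} -> {flag}")
```

In this made-up example, group_b’s selection rate is only 60% of group_a’s, well below the four-fifths threshold, which is exactly the kind of disproportionate exclusion the regulation is concerned with. (Keep in mind the four-fifths rule is a rule of thumb, not a legal bright line.)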



Employer responsibilities: it’s not the tool that matters, it’s what you do with it


As with many other products, solutions, and processes, what matters is less what the tool is and more how it’s used.


For example, a pencil on its own has no positive or negative connotation; it just is. But if someone were to use that pencil to write hate speech, or to disclose government secrets, it becomes a tool of a hate crime or of treason. If someone uses it to stab another person, it becomes a tool of a violent crime. It’s an extreme example, but it makes the point clear. Tools are just that – tools. They can be used or misused by their users.

And, according to the EEOC, AI is no exception.


For example, if an employer gives discriminatory instructions to an AI sourcing tool (“Find me candidates that have college degrees from this school with low demographic representation”), the employer has caused the AI to produce a discriminatory result. If an employer feeds an AI solution data that reflects a discriminatory pattern (such as a data set of only male engineers as examples of a ‘good engineer’), then the employer has caused the AI to build a discriminatory pattern.


And even if the employer has not fed the AI discriminatory instructions or data, the fault for EEO Title VII violations still lies with the employer.


But… the technology made me do it


“The technology made me do it” doesn’t work for questionable purchases prompted by an ‘other things you may like’ algorithm, and it doesn’t work for suggested people to hire either.

According to the EEOC, choosing an AI tool and relying on its output for candidate engagement and employment decisions puts the employer at fault for any discriminatory results. In other words, employers are responsible for validating that the AI solution they implement behaves within the bounds of equal employment opportunity.


With the responsibility resting on the employer, does this mean employers should avoid all AI solutions?


No, of course not. It means that employers have the responsibility of selecting the right solutions, understanding what the solutions are doing, and ensuring they are using the solutions appropriately.


With great power comes great responsibility


With the responsibility on the shoulders of the employer, there are steps you, as an employer, can take to ensure that the solutions you use (AI or otherwise) meet your needs while complying with regulations.


Start with the why


“We need to hire this person – they’re amazing!”


“Great, to do what?”


“Umm…”


While we’d all like to be in a position where we can hire people just because they are amazing, the reality is that most of us work in organizations with limited resources, which means we have to limit our hiring to those who can fill the positions we need.


The same goes for technology. Before evaluating any amazing, fancy, new, or just plain cool technology, you must first have a need to fill.


What problem must you solve with the technology? How will you know the technology is successful and/or adding the value you expect? What is the measurement of success that will warrant the price tag of acquiring, configuring, and adopting the new technology?


Clearly defined needs and measurements of success will allow you to effectively evaluate any technology and protect you from the distractions of shiny features and big promises. They will also protect you from the liability of misuse, because you can focus your technology evaluation on your organization’s true use cases.


Look behind the curtain to understand how it works


“Trust me, I’m amazing at what I do. There’s no need to waste your time evaluating me.”

… If a candidate said that, would you believe them? “Stop the interview – there is no need. They just said they are ‘amazing’. Hire them!”


Yeah… that is completely ridiculous. So don’t do that with your Talent Acquisition technology either.


Understand how the technology works. Have the vendor walk through the science behind it: what it’s doing, why it works, and how they know it works. And don’t let the vendor make excuses either. If the vendor hides behind ‘it’s proprietary’ or camouflages the answer in a cloud of big words, that is a strong sign this isn’t the vendor for you. Remember, it is your responsibility as an employer to select and use the right technology.


Once you understand how it works, don’t just take the vendor’s word for it; request proof. This usually comes in the form of data such as performance metrics, result sets, and validation studies demonstrating that the results are accurate and predictive.


Then prove it yourself through a trial or proof-of-concept pilot. Implement the solution in a realistic scenario, either using a real open job or using a simulation that is as close as you can get to the real thing (such as data from a recently posted job), and see whether the results are what you expect.


Not sure how? No problem – we have you covered. Check out our blog post on technology evaluation.


Verify that it’s the right solution for you


Have you ever found that amazingly talented candidate who is super skilled in exactly what you are looking for, but doesn’t give two diddly squats about your mission and has no interest in your organization’s values? Or who refuses to move to your location, or requires a much larger salary and rank than you have to offer? Even the most skilled candidates may not make the best employees for you.


The same goes for technology solutions.


The technology may be amazing, but the vendor may not be a good partner for your organization, the total cost of the solution may be out of your price range, or the vendor may not have the flexibility, responsiveness, or support that your team needs. Just as candidates are more than their skills, vendors are more than their technology. Evaluate the whole vendor.


Not sure how? No problem – we have you covered. Check out our blog post on vendor alignment.


Measure, monitor, and adjust


Once you hire someone, you stop paying attention and assume they are doing what they were hired to do successfully, and consistently, right?


No, of course not. Hiring talent is only the beginning of that talent’s journey. For many organizations, there are entire departments and processes for talent management and employee engagement.


And so it goes with technology.


Once you select and adopt (or modify existing) Talent Acquisition solutions, you are at the beginning of the story, not the end. Employers have the responsibility to use these tools appropriately. This means ensuring that the solutions are not being misused or abused, whether on purpose or by accident.


Set up dashboards or other forms of metrics collection and reporting so you always know what your solution is doing and can catch anomalies or potential issues before they become liabilities.


For example, measuring the demographic representation of your candidate pool at every step (ideally anonymized) will alert you to any missing demographics or shifts in demographics throughout the process.
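As a minimal sketch of what that monitoring might look like, here is a Python example that compares each group’s share of the pool at each stage against the applicant baseline (the stage names, group names, and counts are all hypothetical):

```python
# Minimal sketch of per-stage demographic monitoring for a hiring funnel.
# Stage names, group names, and counts are hypothetical.

funnel = {
    "applied":     {"group_a": 500, "group_b": 400},
    "ai_screened": {"group_a": 250, "group_b": 120},
    "interviewed": {"group_a": 60,  "group_b": 25},
}

def shares(stage_counts):
    """Each group's share of the candidate pool at one stage."""
    total = sum(stage_counts.values())
    return {group: count / total for group, count in stage_counts.items()}

baseline = shares(funnel["applied"])  # the applicant pool is the reference point

for stage, counts in funnel.items():
    for group, share in shares(counts).items():
        drift = share - baseline[group]
        # Flag any stage where a group's share drops noticeably vs. the applicant pool.
        alert = "  <-- investigate" if drift < -0.05 else ""
        print(f"{stage:>12} {group}: {share:.0%} (drift {drift:+.0%}){alert}")
```

A real implementation would feed live data from your applicant tracking system into a dashboard, but the principle is the same: establish a baseline, watch every stage, and investigate any group whose representation drops.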


Bringing it all together


According to the EEOC, the responsibility for ensuring that AI-based processes adhere to EEO regulations falls on the employer, not the vendor. As such, it is the employer’s responsibility to choose wisely, use appropriately, and validate constantly that the output is what you expect and need, and that it adheres to the EEO regulations.

You can do it. We can help.


If you need a little guidance – from understanding the concepts around AI, to technology selection, to setting up meaningful metrics – we are here to support you.


Drop us a quick message at info@career.place or through our website at www.career.place. We’d love to hear from you.



