Why Is Artificial Intelligence Biased Against Women?

Last Updated: December 16, 2021

Job.com’s founder, Arran Stewart, explores one of AI’s biggest faults – its problem with diversity – and what can be done, especially by him and people like him, to fix the issue.

San Francisco, home to the largest tech hub in the world, has banned the use of facial recognition technology by law enforcement and other municipal agencies as part of a broader anti-surveillance ordinance. Activists and politicians who pushed for the ordinance cited studies showing that facial recognition technology, which is powered by AI and deep learning, is less accurate at identifying women and people of color. This can lead to misidentification, false arrests, and the imprisonment of innocent people. However, this reduced efficacy for such a large segment of the population is a built-in part of artificial intelligence – albeit an unintentional one. It’s the result of tech’s unconscious bias.

Unconscious bias is “a prejudice that happens automatically, is outside of our control and is triggered by our brain, making quick judgments and assessments of people and situations, influenced by our background, cultural environment, and personal experiences.” Prejudices such as these are just as harmful as deliberate acts of discrimination – if not more damaging. Because we aren’t actively fostering these biases, we often overlook addressing them and working to counteract them. It also makes us less likely to recognize, or even acknowledge, the way our biases affect the groups that are disadvantaged by them.

Bias Based on Data

With the rapid growth of machine learning, artificial intelligence, and their related technologies, technologists are just beginning to reckon with the repercussions of our unconscious biases.

Amazon, for example, had to scrap a four-year-old recruitment matching tool because it had taught itself to favor male applicants over female ones. Equally qualified female candidates were ranked lower than their male counterparts, with some graduates of all-female colleges losing whole points due to their alma mater. The system was trained on data submitted over a 10-year period by applicants who were overwhelmingly male (73% of Amazon’s leadership is male). Despite the company building the technology to be neutral, it still taught itself to be biased based on the data it was given by the people who built it, data which reflected their reality – a (majority white) male-dominated industry.
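To make that mechanism concrete, here is a minimal, hypothetical sketch (Python with scikit-learn, using entirely synthetic data; it is not Amazon’s system): a model that is never shown a gender field still learns to penalize candidates through a correlated proxy, such as attendance at a women’s college, because the historical hiring labels it is trained on already encode the bias.

# Hypothetical illustration, not Amazon's tool: a classifier trained on
# historically skewed hiring outcomes reproduces the skew through a proxy
# feature, even though "gender" is never an input.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic history: candidates' skill is unrelated to gender, but past
# hiring decisions penalized a gender proxy (a women's-college credential).
skill = rng.normal(size=n)
womens_college = (rng.random(n) < 0.15).astype(float)
hired = (skill + rng.normal(scale=0.5, size=n) - 1.0 * womens_college) > 0.5

# Train a "neutral" model on those biased labels.
X = np.column_stack([skill, womens_college])
model = LogisticRegression().fit(X, hired)

# The learned coefficient on the proxy feature comes out strongly negative:
# the model has faithfully absorbed the bias in its training data.
print(dict(zip(["skill", "womens_college"], model.coef_[0].round(2))))

The point of the sketch is that the negative weight on the proxy feature is learned entirely from the labels; removing the obvious demographic column does not remove the bias.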

AI’s race and gender biases are a result of who has the power in the backroom. Over 70% of all computer programmers are white males, and despite our best attempts at neutrality, we were raised in a society that inherently devalues women and POC, teaching us both explicitly and implicitly that they are less capable than white men. This colors our worldview and, in turn, the technology we create; we aren’t necessarily actively misogynistic or racist, but our environment allows us to perpetuate, unchallenged, the biases ingrained in us by society.

Learn More: Future of Work: 10 Key Trends for the Next 10 Years

Acting Against the Passive Biases

It’s up to tech companies to do the work behind the scenes to make sure that AI and its relatives are as equitable as they can be. It’s not enough to simply acknowledge that there is a problem, especially when it’s a problem we can fix. We (and I include myself, a white male at the top of a rising AI-based recruitment platform) must amplify the voices of the women and POC who are being actively disenfranchised by our passive biases. We must make conscious decisions to elevate the POC and women around us to roles where they are part of the decision-making process. We have to listen when they tell us about the ways our privilege is clouding our judgment, and advocate for and work with them to fix the issues. We need to make sure our hiring strategies are deliberately diverse, because right now they’re passively biased, and that isn’t helping anyone.

Equality isn’t the same as equity. The balance of power in the field is shifted too far in one direction for us to simply wait for someone else to make the change – we need to work on eliminating the barriers altogether. Not just for the women and POC we know personally, who tell us their stories, but so that we, the privileged, can be better ourselves. Isn’t that why we became technologists in the first place – to make the world a better place?

Learn More: How to Define Your Recruitment Chatbot’s Personality

Arran Stewart

Co-owner, Job.com

Co-owner of Job.com, Arran is passionate about recruitment and technology. A dedicated disruptor with a relentless focus on making things better for the candidate, he has committed his entire career to creating products that improve the job-seeking process for the masses. He has worked in all areas of traditional recruitment: job boards, aggregators, applicant tracking systems, talent management systems, multi-posting technology, semantics, and, of course, matching technology. Beyond this, Arran’s other obsessions are his wonderful family and a childhood love for cars and Formula 1 motorsport.