Recruitment in academia: 4 challenges AI can help overcome

AI-powered hiring technology can help universities improve recruitment in academia by reducing bias, broadening talent pools and surfacing top talent.

Hiring in academia is under constant scrutiny.

The hiring process at universities continues to be less than fair to all candidates, despite the outward appearance of a level playing field. Though universities are working to overcome the conscious and unconscious biases that influence candidate selection, bias is difficult to eliminate entirely without technology that reduces the human element early in the selection process.

One such innovation that universities need to adopt into their recruiting processes is AI-powered hiring technology. Such tools can help search committees overcome some of the most common biases that affect hiring in academia.

Below are four of the hiring challenges that this kind of technology can help academic institutions overcome, thereby leveling the playing field for all qualified candidates and creating a better pool of applicants for each open role.

The prevalence of hidden-criteria searches

Brian Leiter, professor of jurisprudence and director of the Center for Law, Philosophy and Human Values at the University of Chicago, wrote the seminal piece on hidden search criteria in academic hiring. In it, he argues that there are two different criteria for some positions in academia: the criteria listed on the job posting, and the unspoken criteria that candidates rarely know about.

That second set of criteria, Leiter explains, defines what the search committee actually wants to see but cannot legally advertise. Examples of hidden criteria include decisions made in advance that the hire needs to be a woman, or that the chosen candidate must be the spouse of someone in the department. Those job announcements are then often framed to subtly signal exactly what kind of candidate the university will be hiring, in order to encourage the “right” candidates, Leiter notes.

But not all candidates will pick up on those hints in the announcement. These kinds of criteria essentially lead to a false search, in which qualified applicants are lied to for the sake of appearances and to legitimize the hiring of a preselected candidate, Leiter notes.

Such was the experience of Messiah College history professor John Fea early in his career, when a position he applied for, and interviewed for, had been slated to go to a woman before the process even began. “Everyone knew I had no chance,” Fea writes of the experience. “The faculty in the department and the administration I met with were just going through the motions.”

The remedy to these practices is “stymieing and removing the perpetrators of unlawful bias from the process,” Leiter says.

The pervasiveness of unconscious bias

Unconscious biases are personal predispositions that influence decision-making. These are biases that people often don’t even know that they have, but they can have powerful effects on the hiring process in academia.

Marie Chisholm-Burns, dean of the College of Pharmacy and professor of pharmaceutical sciences at the University of Tennessee Health Science Center, describes several circumstances in which unconscious biases creep into academic hiring. These include biases toward pre-selected candidates, against candidates who are deemed a threat because they are too similar to peers at the university, and against candidates who do not fit someone’s mental image of “the right person for the job.”

These implicit biases inhibit a university’s chances of finding and hiring the most qualified candidates.

The first, and often most difficult, step toward remedying these biases is to recognize that they exist. Once those biases are acknowledged, universities can look for solutions to help overcome them in the hiring process — one of the most powerful of which is AI-powered hiring software.

Such tools can reduce unconscious bias by collecting and analyzing only the data points relevant to the position and to a candidate’s likely success in it. They work by evaluating candidates’ qualifications without processing identifying information.
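To make that idea concrete, here is a minimal sketch of how such screening might work. It is illustrative only: the field names, weights and scoring logic are hypothetical assumptions, not taken from any specific hiring product.

```python
# Illustrative sketch only: field names, weights and scoring logic are
# hypothetical, not drawn from any particular hiring tool.

# Data points the committee has decided are relevant to the role.
RELEVANT_FIELDS = {"peer_reviewed_publications", "years_teaching", "grants_awarded"}

def strip_identifiers(candidate: dict) -> dict:
    """Keep only the job-relevant data points; identifying fields are dropped."""
    return {k: v for k, v in candidate.items() if k in RELEVANT_FIELDS}

def score(candidate: dict) -> float:
    """Score a candidate from anonymized, job-relevant data only."""
    data = strip_identifiers(candidate)
    return (
        2.0 * data.get("peer_reviewed_publications", 0)
        + 1.0 * data.get("years_teaching", 0)
        + 1.5 * data.get("grants_awarded", 0)
    )

applicant = {
    "name": "Dr. Jane Doe",          # never used in scoring
    "gender": "female",              # never used in scoring
    "peer_reviewed_publications": 12,
    "years_teaching": 6,
    "grants_awarded": 2,
}
print(score(applicant))  # 33.0 -- computed without any identifying information
```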

The practice of name-recognition hiring

There’s a lot of demographically identifying information to be inferred from a person’s name, including their gender and ethnicity. This information can easily feed into a selection committee’s biases and knock a great candidate out of the running for a job.

But it isn’t just the person’s name that gets scrutinized in name-recognition hiring. Selection committees also carry biases for or against certain journals and universities, which can color their perceptions of candidates who published in those journals or attended those institutions.

Maggie Kuo, a former writer for Science Magazine, reports that hiring committees at some research-intensive universities place the most weight on whether a candidate has published in big-name journals and on the reputations of the candidate’s institution and adviser. This aligns with a study by Aaron Clauset, Samuel Arbesman, and Daniel B. Larremore, which found that only one-quarter of academic institutions produce 71 to 86 percent of all tenure-track faculty.

The amount of paperwork involved in an academic application facilitates the practice of name-recognition hiring. “Many of the standard materials used in academic hiring are famously rife with biases,” claims Chad Orzel, a professor of physics and astronomy at Union College in Schenectady, NY. Between the CV, letters of recommendation, cover letters, published writing samples and examples of course syllabi, the application process is riddled with opportunities for name-recognition biases.

Hiring software can help eliminate some of this bias by removing all names from the data so a committee can focus on a candidate’s skills and qualifications. By taking an anonymous approach to hiring, universities create a process that targets the most qualified people, no matter their gender, race or where they earned their degree.
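As a rough sketch of what removing names from application materials could look like, the example below redacts an applicant’s name from a cover letter. The names and text are hypothetical, and real products use far more robust entity recognition than a simple pattern match.

```python
import re

# Hypothetical example: the applicant's name, collected once at submission
# time and never shown to the committee.
applicant_name = "Jane Doe"

cover_letter = (
    "Dear committee, my name is Jane Doe and I completed my PhD in 2015. "
    "Jane Doe has taught six courses in the department's core curriculum."
)

def redact_name(text: str, name: str) -> str:
    """Replace every occurrence of the applicant's name with a neutral tag."""
    pattern = re.compile(re.escape(name), flags=re.IGNORECASE)
    return pattern.sub("[CANDIDATE]", text)

print(redact_name(cover_letter, applicant_name))
# The committee reads the letter with "[CANDIDATE]" in place of the name.
```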

The preference for academic inbreeding

Academic inbreeding is a systemic issue in academic hiring. The practice refers to universities intentionally hiring faculty members who graduated from the hiring institution itself. It is a bias that significantly shrinks the pool of qualified, acceptable candidates.

Researchers Philip G. Altbach, Maria Yudkevich and Laura E. Rumbley show how “fundamentally problematic” academic inbreeding is and why it may be difficult to overcome in the hiring of new faculty. But that difficulty should not dissuade universities from making the effort to reduce the practice.

AI can help universities overcome this bias. By using AI-powered tools in the hiring process, selection committees can set aside school data and greatly broaden the talent pool. This helps institutions find the best candidates regardless of which universities they graduated from.
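Below is a minimal sketch, under assumed field names, of what “setting aside school data” could look like in practice: the degree-granting institution is dropped from each record before the shortlist is ranked on qualifications alone.

```python
# Hypothetical candidate records; field names and scores are illustrative only.
candidates = [
    {"id": 1, "phd_institution": "Home University", "publications": 4, "fit_score": 0.71},
    {"id": 2, "phd_institution": "Elsewhere State", "publications": 9, "fit_score": 0.88},
    {"id": 3, "phd_institution": "Home University", "publications": 2, "fit_score": 0.55},
]

def build_shortlist(pool, top_n=2):
    """Rank candidates on qualifications with institution data removed."""
    anonymized = [
        {k: v for k, v in c.items() if k != "phd_institution"} for c in pool
    ]
    return sorted(anonymized, key=lambda c: c["fit_score"], reverse=True)[:top_n]

print(build_shortlist(candidates))
# Candidate 2 tops the shortlist even though they did not graduate from the
# hiring institution.
```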

It’s no secret that the hiring processes at some universities are rife with biases. As a result, those institutions struggle to find and hire the people who are truly the best candidates for a vacancy. AI-powered hiring technologies are among the best solutions for overcoming these issues and improving recruiting practices.

