New research from Monash Business School reveals that women perceive artificial intelligence (AI) assessments in job recruitment as reducing bias, while men fear losing an advantage. Professor Andreas Leibbrandt, from the Department of Economics, led the study, which examined how AI in recruitment could address the biases that prevent underrepresented groups from securing the roles they want.
According to Professor Leibbrandt, people from minority groups often face poorer labour-market outcomes, earning less and struggling to find or retain jobs. Understanding these barriers is crucial to dismantling them. With the rise of AI, the recruitment process is evolving, as a majority of organisations now incorporate AI into their hiring practices. This shift raises the question of how both applicants and recruiters respond to AI involvement.
Levelling the playing field
In the study’s first experiment, over 700 applicants for a web designer position were informed whether their application would be assessed by AI or a human. The results showed that women were significantly more likely to complete their applications when AI was involved, whereas men were less likely to apply.
The second experiment examined the behaviour of 500 tech recruiters. When recruiters knew an applicant's gender, women were consistently rated lower than men; when the applicant's gender was hidden, this bias disappeared. Notably, when recruiters had access to both an AI score and the applicant's gender, no gender difference in scoring emerged. This suggests that the AI score acts as an anchor, helping to eliminate gender bias in recruitment assessments.
Professor Leibbrandt’s research distinguishes itself by focusing on human interaction with AI, rather than the algorithms themselves. His aim is not just to reduce bias, but to build a workplace where everyone has an equal opportunity to succeed.
Looking ahead, Professor Leibbrandt plans to investigate the impact of informing job applicants about potential biases in AI training data. He also intends to explore issues such as ‘narrative discrimination,’ where unconscious stereotypes influence hiring in the tech industry, as well as bias in remote work environments. Through these initiatives, he seeks to further workplace inclusion and ensure fairer recruitment practices.