Gender biases often appear in the most unusual places, and new research from Rutgers University shows that stock image libraries are no different. After analyzing images portraying computer programmers, nurses, civil engineers and librarians across various stock libraries, the researchers found significant stereotyping and gender bias at work.
The researchers assessed images of people in each of the four professions across Shutterstock, Wikipedia, Twitter and the New York Times website, comparing these with gender representation statistics from the US Bureau of Labor Statistics.
The analysis clearly shows that gender stereotypes are strong, with images of women dominating portrayals of librarians and nurses, and men dominating images of computer programmers and civil engineers. This was especially so when the images were curated by automated systems, as is often the case on platforms such as Twitter.
Challenging stereotypes
These stereotypes were more likely to be challenged when humans had a role in the curation of images, as is often the case on Shutterstock and the New York Times. Search results on NYTimes.com, for instance, would regularly produce images of male nurses and female civil engineers. Indeed, they would often do so more commonly than the labor statistics would suggest.
There is clearly an under-representation of women in male-dominated professions, and this is exacerbated on various digital media platforms. The researchers highlight that there has at least been some progress in the equal representation of the sexes in such images, with improvements made between 2018 and 2019.
“Gender bias limits the ability of people to select careers that may suit them and impedes fair practices, pay equity and equality,” the researchers say. “Understanding the prevalence and patterns of bias and stereotypes in online images is essential, and can help us challenge, and hopefully someday break, these stereotypes.”
The researchers hope that the work will go a little way towards overcoming any biases that persist, and especially in encouraging the developers of the AI algorithms that perpetuate these biases to improve their systems. It might also persuade those who manage such systems of the merits of human curation. Either way, they hope that a fairer representation of the sexes will result in the coming years.