Just finished watching a romantic comedy on Netflix? You'll probably notice similar movies lined up in your list, all thanks to the way Netflix suggests things to you.
When you use websites like Amazon, Facebook, Instagram, and Netflix, special computer programs called recommendation algorithms keep track of what you do. They use this information to suggest things you might like, which keeps you from being overwhelmed by endless options and makes your search results more relevant.
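To make that concrete, here is a minimal sketch of the basic idea behind such a recommender: score items by how often they show up alongside things you have already chosen. Everything in it, from the viewing data to the recommend function, is invented for illustration and is not how any real platform works.

```python
# Illustrative only: a toy item-similarity recommender, not any platform's real system.
from collections import Counter

# Invented example data: which titles each (anonymous) user has watched.
watch_history = {
    "user_a": {"RomCom 1", "RomCom 2", "Thriller 1"},
    "user_b": {"RomCom 1", "RomCom 3"},
    "user_c": {"Thriller 1", "Thriller 2"},
}

def recommend(user, history, top_n=2):
    """Suggest unseen titles that co-occur with what `user` already watched."""
    seen = history[user]
    scores = Counter()
    for other, titles in history.items():
        if other == user:
            continue
        overlap = len(seen & titles)       # how similar the other user's taste is
        for title in titles - seen:        # only score titles this user hasn't seen
            scores[title] += overlap
    return [title for title, _ in scores.most_common(top_n)]

print(recommend("user_b", watch_history))  # e.g. ['RomCom 2', 'Thriller 1']
```

The point is simply that your past behavior becomes the input that decides what you see next.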
But a study from the University of Utah found something interesting: the suggestions you get may be subtly biased depending on your gender.
Biased suggestions
“Everything you’re consuming online is filtered through some kind of recommendation system,” the researchers explain, “and what we’re interested in understanding is whether there are subtle biases in the types of information that are presented to different people and how this affects behavior.”
The researchers say that gender bias is relatively easy to study because Facebook provides information about people's genders. It's not surprising that algorithms pick up biases from human language, since those biases are embedded in how we use words. The bigger questions are how widespread this is and what it leads to.
In their study, the researchers first showed that the gender biases found in everyday language also end up in algorithms. For instance, the algorithms associated women with negative traits such as impulsiveness, poor money management, and acting without thinking.
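How do biases in language show up inside software? One common way researchers check for this, shown here only as a simplified sketch with made-up numbers rather than the study's actual method, is to measure whether gendered words sit closer to negative trait words than to positive ones in a word-embedding space.

```python
# Illustrative only: toy word vectors and a simple association score,
# not the embeddings or the measurements used in the study.
import numpy as np

# Made-up 3-dimensional "embeddings" for a handful of words.
vectors = {
    "woman":       np.array([0.9, 0.1, 0.3]),
    "man":         np.array([0.1, 0.9, 0.3]),
    "impulsive":   np.array([0.8, 0.2, 0.4]),
    "responsible": np.array([0.2, 0.8, 0.5]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def association_gap(trait):
    """How much more strongly the trait leans toward 'woman' than toward 'man'."""
    return cosine(vectors["woman"], vectors[trait]) - cosine(vectors["man"], vectors[trait])

# Positive numbers mean the trait is associated more with women in these toy vectors.
print("impulsive:  ", round(association_gap("impulsive"), 2))
print("responsible:", round(association_gap("responsible"), 2))
```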
The group then made a small change to an advertisement: they swapped a single word, running one version with "responsible" and another with "irresponsible," and observed who was shown each ad. Surprisingly, the versions carrying negative traits were delivered to women more often, even though nothing about the audience justified it.
A vicious cycle
The researchers found that this creates a feedback loop. People who don't look closely at what they're shown perpetuate the bias: by clicking on these ads and following the suggestions they receive, they unknowingly train the algorithm to keep the gender bias going.
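A rough way to picture the loop, as a toy simulation with invented numbers rather than anything from the study, is an ad system that re-allocates impressions toward whichever group clicked more in the previous round. Even when men and women click at exactly the same rate, an initial skew in who sees the ad keeps feeding itself, because the system only learns from the clicks its own skewed delivery produced.

```python
# Illustrative only: a toy simulation of a click-driven feedback loop.
import random

random.seed(0)

# Invented starting point: the negative-trait ad is shown to women slightly more often.
show_rate = {"women": 0.55, "men": 0.45}
CLICK_RATE = 0.05   # assumed identical for both groups

for round_number in range(1, 6):
    clicks = {}
    for group, rate in show_rate.items():
        impressions = int(1000 * rate)
        # More impressions alone produce more clicks, even with identical behavior.
        clicks[group] = sum(random.random() < CLICK_RATE for _ in range(impressions))
    total_clicks = sum(clicks.values())
    # The system re-allocates next round's impressions in proportion to clicks.
    show_rate = {group: clicks[group] / total_clicks for group in clicks}
    print(round_number, show_rate)
```

The imbalance never corrects itself on its own, because the system keeps treating its own skewed history as evidence about what each group wants.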
“There are actual consequences of this bias in the marketplace,” they explain. “We’ve shown that people are split into different kinds of consumption bubbles and that influences your thoughts and behaviors and reinforces historical biases.”
The study carries important lessons for online tech companies: they should take more active steps to reduce gender bias in the algorithms that serve us ads and recommendations. Advertisers might want to test their ads beforehand to catch hidden bias that could wrongly target certain groups, and internet users should stay alert to these biases as they browse and treat the ads and suggestions they see with some caution.
How these systems work is often opaque to most people, since the big online companies reveal little about how their algorithms operate. Amazon, however, appears to be giving its users more information about the suggestions they receive.
While this research focused on gender bias, the researchers believe similar biases likely exist for other attributes such as age, sexual orientation, and religion.