Does Language Undermine Women In Science?

Societal stereotypes hold women back in many ways, undermining equal opportunities across society.  Nowhere is this more pernicious than in STEM fields, where female participation continues to lag despite well-meaning efforts to rectify matters.

New research from Carnegie Mellon highlights how languages themselves can fundamentally undermine attempts to create greater gender equality in these domains.  The researchers set out to gauge whether language affects the kind of career-related stereotypes that so often dissuade women from entering STEM fields.

“Young children have strong gender stereotypes as do older adults, and the question is where do these biases come from,” the researchers say. “No one has looked at implicit language – simple language that co-occurs over a large body of text – that could give information about stereotypical norms in our culture across different languages.”

Implicit biases

The researchers specifically looked at how words co-occur with women as opposed to men.  For instance, “woman” is often associated with “children”, “home”, and “family”, whereas “man” is commonly associated with “business” and “career”.

“What’s not obvious is that a lot of information that is contained in language, including information about cultural stereotypes, [occurs not as] direct statements but in large-scale statistical relationships between words,” the researchers say. “Even without encountering direct statements, it is possible to learn that there is [a] stereotype embedded in the language of women being better at some things and men at others.”
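The kind of large-scale statistical relationship the researchers describe can be illustrated with a toy sketch. The corpus and scoring function below are invented purely for illustration, not the study's actual method, which relied on word embeddings trained over far larger bodies of text:

```python
from collections import Counter

# Hypothetical miniature corpus, invented for illustration only.
corpus = [
    "woman home family children",
    "woman family career",
    "man business career",
    "man career business money",
]

# Count how often each word co-occurs with "woman" vs "man"
# within the same sentence.
co_counts = {"woman": Counter(), "man": Counter()}
for sentence in corpus:
    words = sentence.split()
    for target in ("woman", "man"):
        if target in words:
            for w in words:
                if w != target:
                    co_counts[target][w] += 1

def association(word):
    """Positive -> word leans 'woman'; negative -> leans 'man'."""
    return co_counts["woman"][word] - co_counts["man"][word]

print(association("family"))  # 2 - 0 = 2  -> leans "woman"
print(association("career"))  # 1 - 2 = -1 -> leans "man"
```

Even though no sentence in this tiny corpus states a stereotype directly, the aggregate co-occurrence counts already encode one, which is the point the researchers are making about real language at scale.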

The researchers found that when languages have a stronger embedded gender association, there appeared to be stronger career stereotypes at play in those societies.  This is important, because it’s well known that gender stereotypes emerge in children by around the age of two.

When the researchers examined statistics on the gender associations embedded in 25 different languages alongside international data on gender bias, they found that the age of a country's population appeared to influence the findings: countries with older populations appeared to have stronger biases.

“The consequences of these results are pretty profound,” the researchers say. “The results suggest that if you speak a language that is really biased then you are more likely to have a gender stereotype that associates men with career and women with family.”

The researchers believe their findings highlight the importance of ensuring that things like children’s books are written in a way that removes any potential for gender bias.  There is also a risk that gender bias embedded in language can find its way into computer algorithms trained on text.

“Our study shows that language statistics predict people’s implicit biases — languages with greater gender biases tend to have speakers with greater gender biases,” the researchers conclude. “The results are correlational, but that the relationship persists under various controls [and] does suggest a causal influence.”
