No, duh, rings the echo of a legion of GFPs who remember all the research from Invisible Women that showed just this:
A 2015 study identified the top five words used to refer to people in human–computer interaction papers published in 2014 and found that they are all apparently gender neutral: user, participant, person, designer and researcher. Well done, human–computer interaction academics! But there is (of course) a catch. When study participants were instructed to think about one of these words for ten seconds and then draw an image of it, it turned out that these apparently gender-neutral words were not perceived as equally likely to be male or female. For male participants, only ‘designer’ was interpreted as male less than 80% of the time (it was still almost 70% male). A researcher was more likely to be depicted as of no gender than as a female. Women were slightly less gender-biased, but on the whole were still more likely to read gender-neutral words as male, with only ‘person’ and ‘participant’ (both read by about 80% of male participants as male) being about 50/50. (IW, p.9)
Still, more data is always welcome, and this research is an interesting example of both the benefits and the dangers of AI. Benefits, because this scale of analysis is made possible by AI (and one of the things I’ve been most excited by in my research for the AI episode of the podcast is the potential for us to use the statistical power of algorithms to help us CLOSE the data gap and even maybe fix some biases), but also dangers, because as lead author April Bailey points out, “the same collection of texts scoured by this research is used to train a range of AI tools that will inherit this bias, from language translation websites to conversational bots.”
“It learns from us, and then we learn from it,” says Bailey. “And we’re kind of in this reciprocal loop, where we’re reflecting it back and forth. It’s concerning because it suggests that if I were to snap my fingers right now and magically get rid of everyone’s own individual cognitive bias to think of a person as a man more than a woman, we would still have this bias in our society because it’s embedded in AI tools.”