Online search algorithms reflect – and perpetuate – gender bias

Image searches for the term “person” showed more men in countries with worse gender equality.

25 August 2022

By Emily Reynolds

The problem of bias in artificial intelligence (AI) is of growing concern. Algorithms have, for example, misclassified the faces of Black women when detecting gender; wrongly predicted that Black defendants would reoffend at nearly twice the rate at which they wrongly predicted this for White defendants; and shown far more men than women in Google image searches for the term "CEO". That last study also found that biased image searches can shape people's beliefs about how many men and women hold particular occupations.

Now a study in PNAS goes further, finding not only that internet search algorithms reflect gender biases, but that those biases can themselves reinforce wider social inequality.

In the first two studies, the researchers ran a Google image search for the word "person" in 153 countries, translating the word into each country's dominant language. They then recorded the proportion of men and women in the first 100 results of each search. They also measured the level of gender inequality within each country using the Gender Gap Index, which scores societal-level gender inequality based on political representation, professions, educational attainment, and other factors.

The researchers found that countries with higher national levels of gender inequality, as reflected in their Gender Gap Index scores, also tended to show a greater proportion of men in the Google image results for "person". In Hungary and Turkey, for example, around 90% of the people depicted were men and only 10% were women; in Iceland and Finland, which have low gender inequality scores, the split was roughly 50:50. Gender inequality in a society, in other words, is mirrored in its search algorithms.
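The paper does not spell out its statistical pipeline here, but a minimal sketch in Python shows one way such an association could be tested: count the men among the first 100 "person" results per country, convert to a proportion, and correlate that with the Gender Gap Index (GGI). All country figures below are invented placeholders, not the study's data.

```python
# A minimal sketch, not the authors' actual analysis: correlate the
# proportion of men in the first 100 "person" image results with each
# country's Gender Gap Index (GGI) score. All numbers are invented
# placeholders for illustration only.
from scipy.stats import pearsonr

# Hypothetical data: {country: (men_in_first_100_results, ggi_score)}.
# A higher GGI score means a more gender-equal country.
observations = {
    "Iceland": (52, 0.91),
    "Finland": (50, 0.86),
    "Hungary": (88, 0.69),
    "Turkey": (91, 0.64),
}

proportions_men = [men / 100 for men, _ in observations.values()]
ggi_scores = [ggi for _, ggi in observations.values()]

r, p = pearsonr(ggi_scores, proportions_men)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
# A strongly negative r would mean: the more gender-equal the country,
# the smaller the share of men in its "person" search results.
```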

In the next studies, the team designed fake screenshots of Google image search results for professions participants were unlikely to have heard of: chandlers, drapers, perukers, and lapidaries. Participants were first asked whether a man or a woman was more likely to do each job, and to estimate the salary, friendliness, and intelligence of a typical member of the profession. They then saw the screenshots, which showed either male-dominated or gender-balanced search results.

After seeing the images, participants answered the same questions again. Finally, they were asked whether a man or a woman was more likely to be hired for each role, and chose between a male and a female applicant for each profession.

Before seeing the screenshots, participants displayed an overall bias towards men, judging the professions likely to be male-dominated. After seeing the images, those who saw male-dominated search results were not only more likely to expect men to be in the roles, but also more likely to select male applicants. Those who saw gender-balanced results, by contrast, were more likely to think of a typical member of the profession as a woman, and chose to hire more women. Male-dominated image searches, then, seem to reinforce existing societal biases against women.
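Again as an illustration rather than the authors' method, the key behavioural comparison here (do the two screenshot conditions produce different hiring choices?) could be tested with a simple contingency-table analysis. The counts below are hypothetical.

```python
# A minimal sketch, assuming invented counts: test whether the search
# condition (male-dominated vs gender-balanced screenshots) shifts how
# many participants choose the female applicant. Not the study's code.
from scipy.stats import chi2_contingency

# Rows: condition; columns: [chose female applicant, chose male applicant].
# All counts are hypothetical placeholders.
contingency = [
    [22, 78],  # saw male-dominated search results
    [54, 46],  # saw gender-balanced search results
]

chi2, p, dof, _ = chi2_contingency(contingency)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
# A significant result would indicate that the images participants saw
# changed their hiring choices, not just their stated expectations.
```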

This suggests that the search results we see for particular professions can affect not only our assumptions and biases about the roles that men and women hold, but also the way we behave. Striking as these results are on their own, the team points out that the real-world effect of gender-biased algorithms is likely to be even greater: we are exposed to many different algorithms, each of which is likely to have its own gendered (and racialised) biases built in.

In a society that relies so heavily on technology, individual awareness of such biases is unlikely to make the difference that is needed. The responsibility for making AI more ethical therefore lies with those who create these systems.