Study: Image results for the Google search ‘ugly woman’ are disproportionately black
Search engines can reveal some unattractive biases when you squint at them closely. Check out what Google autofills, or displays in its top results, and you’ll see just how entrenched our collective prejudices are when it comes to things like “feminine” work or black criminality.
Now, thanks to a new paper by a trio of researchers at Brazil’s Universidade Federal de Minas Gerais, you can add a new prejudice to the list: Google’s depictions of female beauty, they found, are both ageist and racist.
The paper — which has been submitted to the International Conference on Social Informatics but has not yet been published — looked at how Google and Bing represent female beauty in their image search results, particularly when it comes to different age and racial groups. To do that, a graduate student and two professors, Virgilio Almeida and Wagner Meira Jr., scraped the top 50 images for “beautiful woman” and “ugly woman” across dozens of international versions of Google and Bing. They then passed those 2,000-plus images through a program called Face++, which estimates subject age, race and gender with 90 percent accuracy.
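Once each image has an estimated race and age attached, the per-country breakdowns the paper reports come down to simple aggregation. Here is a minimal sketch of that counting step in Python. The record format and values below are invented for illustration — they stand in for Face++-style attribute output and do not reflect the researchers’ actual code or data.

```python
from collections import Counter

def breakdown(faces):
    """Compute the share of each race and the age range across a set of
    detected faces. Each face is a dict with hypothetical 'race' and
    'age' keys standing in for a face-analysis API's attribute output."""
    total = len(faces)
    races = Counter(face["race"] for face in faces)
    shares = {race: count / total for race, count in races.items()}
    ages = [face["age"] for face in faces]
    return shares, (min(ages), max(ages))

# Toy sample, not real study data: four detections from one query.
faces = [
    {"race": "white", "age": 22},
    {"race": "white", "age": 25},
    {"race": "black", "age": 31},
    {"race": "asian", "age": 27},
]
shares, age_range = breakdown(faces)
# shares -> {'white': 0.5, 'black': 0.25, 'asian': 0.25}
# age_range -> (22, 31)
```

Run per country and per query (“beautiful woman” vs. “ugly woman”), a tally like this is enough to produce the kinds of percentages and age ranges quoted below.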
Sadly, the race and age breakdown of the “beautiful” pictures versus the “ugly” ones is probably what you’d expect. For almost every country analyzed, white women appear more in the “beautiful” results, and black and Asian women appear more in the “ugly” ones.
Blackness is considered less attractive in 86 percent of the countries surveyed on Google, including countries such as Nigeria, Angola and Brazil, where there’s a predominance of people with black or brown skin.
Likewise, beauty is associated almost exclusively with youth — extreme youth, in some countries. In Japan and Malaysia, for instance, queries for “beautiful woman” don’t turn up ladies much older than 23.
In the United States, searches for “beautiful” women turn up pictures that are 80 percent white, of women roughly between the ages of 19 and 28. Searches for “ugly” women are roughly 60 percent white and 30 percent black, and fall into the 30-to-50 age range.
This sort of bias has been observed on search engines before, of course — although, in a typical chicken/egg quandary, it remains unclear whether the results shape society, or society shapes the search results. A 2013 paper by Harvard professor Latanya Sweeney found that searches for black names surface more ads related to criminal record history, whether or not the person in question was associated with any crimes. Likewise, a 2015 study from researchers at the University of Washington and the University of Maryland found that Google almost exclusively displays pictures of men for queries like “construction worker.”
In 2013, Paul Baker and Amanda Potts, linguistic researchers from Lancaster University in Great Britain, conducted a survey of Google Instant’s autofill results — you know, the zany suggestions Google comes up with while you’re still typing your search term — and observed that racist and homophobic language frequently made it into Google’s suggested answers. They don’t fault Google, though — rather, they conclude that the search engine started predicting racist, homophobic stuff because its users kept entering it. Google is, in other words, rather like our ex-friend Tay: a product of her sorry circumstances.
The authors of this latest paper aren’t quite so comfortable absolving Google and Bing of blame. Yes, acknowledged Almeida, one of the co-authors of the paper and a current visiting professor at Harvard: Pre-existing social biases absolutely shape image search results — someone had to upload and tag and post these photos in the first place, after all. But “the way search engines index and rank images” could also contribute to the creation, or at least the enforcement, of stereotypes, he explained by email.
“We do not have [enough] information about the techniques used by search engines to rank images and photos,” he said.
It’s unlikely that researchers such as Almeida and his co-authors will ever get access to that type of information: Google and Bing guard their ranking algorithms like state secrets. But Almeida says another one of his and Meira’s graduate students is working on “techniques to increase [the] transparency of platforms,” and that both computer scientists remain interested in auditing, and even reverse-engineering, them.
Until they have more complete information, however, the researchers are hopeful that companies such as Google and Microsoft will begin re-evaluating their own algorithms.
“Given the importance of search engines as source[s] of information,” they write, “we suggest that they analyze the … prominent presence of negative stereotypes and find algorithmic ways to minimize the problem.”
Source: WP
