Google has modified its autocomplete suggestions following complaints that they led users to sexist and racist entries.
Earlier this month, the Observer reported that typing “are women” into the search engine led to the suggested completion “evil”. According to Metro, the top result for that search term included a highlighted statement that “every woman has some degree of prostitute in her” from a website called sheddingoftheego.com.
For “are Jews”, it also suggested “evil”, and for “are Muslims” it suggested “bad”. Top results for these searches included links to articles such as “Most Muslims Are Bad People” and “Top 10 Major Reasons Why People Hate Jews”.
“I feel like I’ve fallen down a wormhole, entered some parallel universe where black is white and good is bad,” wrote journalist Carole Cadwalladr in her Observer article. “We are inside a machine and we simply have no way of seeing the controls. Most of the time, we don’t even realise that there are controls.”
On Monday, searches for women and Jews no longer returned such results, but the “are Muslims bad” autocomplete suggestion remained.
“We took action within hours of being notified on Friday of the autocomplete results,” Google said in a statement.
“Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs – as a company, we strongly value a diversity of perspectives, ideas, and cultures.
“We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn’t an exact science and we’re always working to improve our algorithms.”
In April, Google came under fire over the image results for “unprofessional hairstyles for work”, which showed mostly black women with natural hair, while a search for “professional” hairstyles returned pictures of mostly white women with combed hair.