Seemingly innocuous Google search yields antisemitic allusion to Holocaust

Google says it’s looking into the search results and wants to improve them. But according to researchers, the results may not be an accident.

A Google search page is seen through a magnifying glass in this photo illustration taken in Berlin, August 11, 2015 (photo credit: REUTERS/PAWEL KOPCZYNSKI)
The Google results are shocking: Do an image search for “Jewish baby strollers” and you’ll see row upon row of portable ovens — an offensive allusion to the Holocaust.
Google says it’s looking into the search results and wants to improve them. But according to researchers, the results may not be an accident. It’s possible that they’re the result of a coordinated extremist campaign on a fringe website to yield those specific images.
The Network Contagion Research Institute, which studies the way hate speech spreads online, located a series of posts on the 4chan message board, dating back to 2017, that purposefully pair images of ovens on wheels with the term “Jewish baby stroller.” At least a dozen such images turned up in one search, dating from August and September 2017, which means the results may have been in place for years before they drew attention on Friday.
Posting that specific term next to the image may have manipulated Google’s search algorithm so that it promoted those images when users searched for the term, says Joel Finkelstein, the institute’s director.
“What happens is they trick Google into putting that stuff up top,” Finkelstein said. “They paste the image with the words so that when you search those words, the image comes to the top.”
Oven references are relatively common among antisemites, who use them to suggest that Jews belong in the crematoria the Nazis used to incinerate the bodies of Jews killed in the Holocaust.
Google told the Jewish Telegraphic Agency in a statement that the images are “disturbing” and are the result of an algorithm. The statement did not clearly explain how such search results might be prevented.
“We understand these are disturbing results, and we share the concern about this content,” the statement said. “It does not reflect our opinions. When people search for images on Google, our systems largely rely on matching the words in your query to the words that appear next to images on the webpage. For this query, which is for a product that doesn’t actually exist, the closest matches are web pages that contain offensive and hateful content. We’ve done considerable work in improving instances where we return low quality content, and we’ll look at this situation to see how we can return more helpful results.”
Google rarely removes individual search results or makes adjustments for one specific search term. A spokesperson said the company looks for “broader systematic improvements that can make Search better for other queries like it.” In particular, the spokesperson said, the challenge here is a “data void” where the only content available for a search term is “offensive [or] of low quality.”
Network Contagion Research Institute researchers say there could be another possibility: that an antisemitic meme, also from 2017, led Google’s search algorithm to mistake the portable oven for a stroller because the two look somewhat similar. Like the 4chan posts, the meme is a picture of a portable oven above the text “Jewish baby stroller” in all caps.
“It’s either a raid from 4chan trolls or it’s a meme that circulated on the web,” said Alex Goldenberg, the institute’s lead intelligence analyst. “The Google search algorithm is driving it to the top for some reason, or the item in the meme is tricking the Google algorithm.”
Goldenberg added, “It’s notable that Google Image search didn’t pick that up.”
If it was a coordinated action by online antisemites, known as a “raid,” it wouldn’t be the first one. In a 2016 “raid” called Operation Google, extremists tried to undermine a new Google tool for spotting and filtering out racial and ethnic slurs. They did this by replacing the slurs in their comments with the names of tech companies: for example, they used the word “Google” instead of the n-word and the word “Skype” to refer to Jews. They hoped that doing so would force Google to censor its own name, which did not happen.
This also isn’t the first time Google has yielded antisemitic search results. According to MEMRI, a media research organization, and the World Zionist Organization, the search term “oy vey” has yielded antisemitic results as well. In 2016, Google made changes so that its search function no longer suggested the search term “Jews are evil.”
Goldenberg noted that part of the goal of “raids” is to generate media coverage, such that antisemitic terms spread more widely.
“The nature of these raids is to attract attention to the antisemitism,” he said.
Google later took to Twitter to issue an apology.

However, Google said it would not remove the images:

"Some have asked when these results will be removed. We only remove web-based results for extremely limited reasons...In cases like this, where we don't have policy that covers removal, we work to see if there are ways to surface more helpful content...."