Online antisemitism peaks in moments of national tension, helped by Russian trolls

White supremacists clash with counter protesters at a rally in Charlottesville, Virginia, U.S., August 12, 2017 (photo credit: REUTERS/JOSHUA ROBERTS)
Soon after the 2017 far-right rally in Charlottesville, Virginia, a Twitter user going by the name “jojoh888” knew who was to blame for the clashes between neo-Nazis and the antifa activists who opposed them: George Soros, the Jewish billionaire and progressive philanthropist.
“George Soros is the puppet master. He’s funding both sides,” the account tweeted, echoing a false claim expressed frequently by far-right activists.
In fact, “jojoh888” was a Russian troll account controlled by Vladimir Putin’s government; Twitter suspended it in 2018 as part of an effort to root out disinformation from the platform.
To the Network Contagion Research Institute, which studies how hate manifests on social media, the tweet was a perfect encapsulation of how antisemitism is now showing up online.
A new report by the institute illustrates how antisemitism has been a feature of the widely publicized Russian disinformation campaigns surrounding American elections and other moments of social tension, with an eye toward sowing division in the United States by way of social media. The anti-Jewish slurs from Russian accounts dovetail with increased online antisemitism around significant dates like Election Day or Inauguration Day, or during periods of crisis like the Charlottesville rally or the ongoing pandemic.
“These tropes are used very reliably by anti-democratic forces at very specific times,” Joel Finkelstein, the institute’s co-founder, told the Jewish Telegraphic Agency. “They’re used in order to sow doubt into the institutions of democracy: elections, inaugurations, foreign interventions and during social unrest, and in viruses and epidemics.”
Tracking such posts can help social media platforms create an “early warning system in response to surges in anti-Jewish disinformation,” the report said.
The study examines some 250 million posts on four online extremist communities: the social network Gab; the message board 4chan; the now-defunct pro-Donald Trump Reddit community The_Donald; and the ecosystem of Russian trolls on Twitter. By analyzing the rhetorical context in which extremists online use anti-Semitic buzzwords, it maps how and where anti-Jewish hatred appears in fringe communities.
“The disinformation by which Jews have been presented throughout history as the scourge of humanity has been expressed in terms of threats to religion, science, power, dominant ethnic groups, nation-states, and in more modern iterations, as threats to human rights and racial justice,” the report said. “With the advent of social media, these ancient, ‘viral’ conspiracy themes are able to spread more quickly and more easily than at any point in history.”
The report, like previous studies on antisemitism, shows how the hatred largely works through code words rather than explicit mentions of Jews: Instead of writing about Jews as a whole, online anti-Semites target individuals like Soros or the Rothschild family; place three parentheses around the names of people who are Jewish or, ostensibly, controlled by devious Jews; or apply age-old anti-Jewish smears such as the blood libel to the State of Israel.
The report also shows that anti-Semitic tropes are an animating feature of extremist activism and tend to peak online, along with other divisive rhetoric, during tense moments in the United States.
For example, while some 2,000 to 3,000 tweets about Soros are posted on an average day, according to the report, that number spiked to more than 14,000 in the days before the 2018 shooting at Pittsburgh’s Tree of Life synagogue, when President Trump and others were blaming Soros, without evidence, for undocumented immigration. It was around that time that the Pittsburgh shooter demonized Soros on social media before committing the attack.
During the racial justice protests this spring, there were 500,000 tweets about Soros in one day.
Similarly, comments on 4chan about “(((Kushner)))” and “(((Trump)))” spiked around the time of Trump’s April 2017 bombing of Syria, and then experienced another bump when Trump moved the U.S. Embassy in Israel to Jerusalem. The implication was that Kushner, a Jew, was behind the decisions to bomb Syria and move the embassy, and that Trump is controlled by Jews.
“Antisemitism is becoming extremely prominent in politically extreme communities,” Finkelstein said. “It gives them a critical edge. And the result is it makes them more insidious and better organized.”
Russian trolls, Finkelstein said, are especially willing to demonize whichever Jewish figure is most likely to rile people. Around the Syria attack, Russian trolls mentioned Kushner on Twitter nearly 600 times. When the Charlottesville protest happened months later, there were 1,200 tweets from Russian trolls about Soros.
“When there’s a foreign intervention, they say that Israel and Kushner are controlling the behavior of the United States in the foreign arena,” said Finkelstein, adding that when unrest or a terror attack occurs within the United States, Russian trolls advance the notion that “that’s Soros, that’s an outsider who’s trying to agitate things internally.”
He added, “They’re willing to use whatever tune of antisemitism is most adaptive to exacerbate ethnic tensions.”
Russian trolls, the Network Contagion Research Institute found, account for a minority of anti-Semitic posts online. Most posts are written by actual people. But the Russian trolls and other anti-Semites feed off each other, sharing and spreading the same noxious ideas.
“State actors are now working essentially in tandem with extremist communities,” Finkelstein said. “They’re sharing content. They’re mutually inspiring one another.”
Over the past few months, platforms like Facebook, Twitter, YouTube and TikTok have taken steps to ban Holocaust denial, the anti-Semitic, pro-Trump conspiracy theory QAnon and other forms of hate. Finkelstein called QAnon “a force multiplier for mainstreaming Jewish hatred in the form of conspiracy,” but said banning anti-Semitic invective should not be the only answer.
On one hand, he said, anti-Semitic conspiracy theories will originate on extremist sites like 4chan and then migrate to mainstream networks like Twitter. But too many bans could drive extremists to spend more time on their own platforms — and in an increasingly toxic environment of hate that could motivate violent attacks. The Pittsburgh shooter, he noted, posted his hateful messages on the far-right platform Gab.
Getting kicked off mainstream platforms “is like a badge of honor for people who are disaffected with grievances, and that’s like a cesspool for antisemitism,” he said.
The institute hopes that its latest report will show tech platforms what to look for — and when to look for it. If an event is approaching that may widen social rifts, for example, that would be the time to watch for tweets about Soros. The report aims to help platforms catch anti-Semitic rhetoric the moment it is posted, or anticipate it before it surges.
If platforms let the hate go unchecked, Finkelstein said, tropes about Soros, Kushner or Israel could easily lead people to more explicit anti-Semitic ideas, as well as violence.
“The hatred is structural. They don’t even have to realize it’s happening,” he said. “If you have a billion people screaming about how Soros is trying to take over the world with the Rothschilds, you are an inch away from saying Jews. … We acclimate people towards condoning things that are increasingly less stomachable.”