Beyond the physical war between Hamas and Israel, another war is raging. This other war is taking place in social media; it’s caught major players like Facebook unprepared, and it’s caused government officials around the world to quietly express concerns about vilification on social media potentially fuelling violence on the streets in places far removed from the zone of conflict.
In the Jewish Daily Forward, Jay Michaelson recently lamented the unwillingness of people to think critically about online material regarding the current conflict. He highlights how activists on all sides refuse to consider valid arguments, and inconvenient truths, from the other side. His criticism was directed equally at left-wing Jews, right-wing Jews, and Palestinian activists. While Michaelson’s own political views shine through in his article, and many will disagree with them, the silo mentality he describes is very real, especially in a time of emergency.
The problems go deeper still. Some who support the Palestinian cause, and see Jews supporting Israel, are now overtly spreading antisemitism. Others have expressed the view that antisemitism is somehow legitimate at this time. I’ve also seen people post that opposing antisemitism is wrong, or is a distraction from the “real issue”. Anyone claiming to support human rights, but who takes this line, loses all credibility. Racism, in all its forms, is wrong at all times and in all places. This is no exception. These people need to be called out by those who otherwise agree with them; after all, they will simply ignore anyone else.
The last part of this war is the abuse of the reporting systems on Facebook. People on both sides have been encouraging their supporters to flag as spam anything they disagree with. Facebook appears to handle spam reports quite differently from reports of hate speech. While it can be very difficult to get Facebook to remove hate speech, such as antisemitism, when enough people report something as spam, Facebook seems to automatically hide it and warn the user that their content has breached the community standards. Where this happens to a page repeatedly, the page ends up being suspended for 30 days.
I’ve seen reports of pages, and users, being suspended on both sides of the conflict. I’ve seen both sides claim this is a reflection of bias at Facebook. While I don’t believe Facebook has a bias in the conflict, they are clearly ill-prepared for the current gaming of their systems. Some of the posts that have triggered suspensions are clearly not a breach of the community standards. The Jewish Internet Defense Force received a 30-day suspension from Facebook for posting images of IDF planes and asserting that Jews can defend themselves (see image). Efforts to appeal the decision to Facebook staff appear to have fallen on deaf ears.
On the Palestinian side, Hamdi Abu Rahma, a photographer based in Gaza, has also received a 30-day suspension of his page. While it’s possible this suspension relates to the posting of photographs containing graphic violence (which the community standards prohibit), it’s equally possible it resulted from flaws in the system. The lack of explanation as to which part of the community standards has led to content being removed, and ultimately to suspensions, is a problem Facebook ought to fix. Without this feedback the suspensions appear arbitrary, and Facebook remains open to accusations of bias.
Facebook’s often poor response to reports of hate speech also contributes to the impression of bias. Content that should be immediately removed often remains online for long periods. Earlier today Facebook thankfully removed a page called “Death to Jews Save Gaza”, yet a similar page calling for “death to Israel” remains online. Facebook needs to recognise that such content crosses the line into antisemitism, and that the blood-drenched cartoons such pages promote hark back to ancient blood libels and demonization.
Facebook urgently needs a process to review the suspension of accounts that have resulted from the current conflict. They also need to become more aware of new antisemitism, and of content that crosses the line from legitimate criticism or political opposition into vilification. As Facebook users, we all need to be a little more careful about what we post, and we need to try to hold our anger in check. It won’t be easy, but Jay Michaelson’s four questions can help: the first asks what the likely impact of your post will be, the second asks if you are posting out of anger, the third asks if your post demonizes “the other side”, and the last asks if you are sharing facts or spin.
Share the news, share the facts, and share insightful commentary but please, don’t fan the flames of hate. Social media can be a powerful tool, especially in times of crisis, but please, use it wisely.
Dr Andre Oboler is CEO of the Online Hate Prevention Institute and Co-Chair of the Working Group on Antisemitism on the Internet and in the Media of the Global Forum to Combat Antisemitism.