Social media platforms have a long record of failure, from platform to platform, in dealing with antisemitism and even terrorist content: ISIS recruitment on Twitter, incitement to violence on Facebook, white nationalist content on Telegram, Gab and Twitter, and Holocaust denial on YouTube, Twitter and TikTok. But last week, at least three internet platforms – Zoom, Facebook and YouTube – finally got it right.
Several weeks ago, a student group at San Francisco State University (SFSU) announced a student event, “Whose Narratives? Gender, Justice, & Resistance,” featuring Leila Khaled, a PFLP terrorist responsible for two plane hijackings – TWA Flight 840 from Rome to Tel Aviv in 1969, and El Al Flight 219 in 1970. Obviously, having a known PFLP terrorist lecture students at an American university is problematic, but when a coalition of Jewish organizations spoke out against the event, SFSU doubled down, defending Khaled’s “free speech.”
The problem is that allowing a member of a US-designated terrorist organization to speak and promote her activities (such as hijacking planes) is a potential violation of federal law – whether the event is held on campus or on Zoom. Due to the novel coronavirus pandemic, the event was initially scheduled to be hosted on Zoom. That’s when the Lawfare Project notified the videoconferencing giant that hosting Khaled could expose the company to charges of providing material support or resources to a terrorist organization – a massive legal liability for Zoom.
Zoom acted promptly and professionally, immediately canceling the event. In a statement, Zoom wrote, “In light of the speaker’s reported affiliation or membership in a US-designated foreign terrorist organization, and SFSU’s inability to confirm otherwise, we determined the meeting is in violation of Zoom’s Terms of Service.”
Naturally, the anti-Israel campus group claimed that refusing to platform a member of a designated terrorist organization was “censorship” and doubled down, rescheduling the event for Facebook Live and launching a (largely unsuccessful) social media campaign to “support” Leila Khaled. But the group’s problems were only beginning.
When the event finally took place last week, the organizers opted to host it on YouTube Live as a last resort, only to have it shut down 20 minutes in for the same reasons. Ironically, proving all three platforms right, the last thing Khaled said before the stream was cut was, “people have the right to fight those who occupy their land by any means possible, including weapons.” Following the event, YouTube confirmed it had removed the stream due to “content praising or justifying violent acts carried out by violent criminal or terrorist organizations.”
In a testament to the abysmal state of US campuses when it comes to fighting terrorism, SFSU not only failed to cancel the event; the president of the university, Lynn Mahoney, actually criticized the platforms’ response, stating in a letter that the university disagreed with Zoom’s decision but recognized its right as a private company to enforce its policies. “We worked hard to prevent this outcome and have been actively engaging with Zoom,” she wrote.
While social media platforms still have a long way to go – and US campuses, apparently, even further – the actions of YouTube, Facebook and Zoom were not only morally right; they demonstrated the necessity of strong laws against terrorism and its promotion, both online and offline. When social media content promotes terrorism or violence against protected groups, it is more important now than ever to have laws in place to hold these networks accountable and to fight antisemitism and the promotion of terrorism.
The writer is the CEO of Social Lite Creative LLC.