The specter of bots and trolls

An attendee uses a new iPhone X during a presentation for the media in Beijing, China October 31, 2017. (photo credit: REUTERS/THOMAS PETER)
Since the announcement of the 2019 Israeli elections, national publications and news websites have all warned of the possible influence of trolls and bots on the elections’ outcome. Some have argued that Israeli political parties may deploy bots – automated software designed to imitate human behavior – to sway the electorate one way or another.
Others have asserted that foreign countries might attempt to influence the elections through the use of trolls – individuals who inundate news sites and social media platforms with emotionally charged content and conspiracy theories. Even the former head of the Mossad, Tamir Pardo, recently stated that Russia and other nations could determine the outcome of the elections through digital means without anyone being the wiser. This is because bots and trolls are difficult to identify and even harder to neutralize. A specter is thus haunting Israel – the specter of bots and trolls.
Concerns over digital disinformation first rose to prominence during the Crimean crisis of 2014. According to reports published by the NATO Strategic Communications Centre of Excellence, Russia strategically deployed trolls in both Ukraine and neighboring countries to rally support for its annexation of Crimea. Russian trolls were used to disseminate fake news stories, from alleging that Ukrainian nationalists had crucified and murdered a Russian child in eastern Ukraine, to implicating the interim Ukrainian government in building concentration camps for Russian ethnic minorities in Crimea.
At the same time, Russian bots were deployed to redirect web traffic toward news sites that spin fiction and peddle conspiracy theories. Since then, reports suggest Russia has deployed trolls and bots to infiltrate and influence the 2016 US elections, the Brexit referendum, the recent elections in Germany and France and even the polarizing referendum in Catalonia.
Russia is, of course, not the only country to strategically use bots and trolls to affect political processes in foreign countries. It was recently discovered that Iran operated a global fake-news empire spanning hundreds of fictitious social media accounts, bots and fabricated news sites. The network was meant to influence public opinion in countries that traditionally oppose Iranian foreign policy, including the US and Saudi Arabia. Alarmingly, this empire also targeted Israelis, exemplified by the case of the Iranian fake Tel Aviv Times news site.
On a more localized scale, other governments use bots and trolls to influence domestic politics. In Brazil, China and Azerbaijan, bots and trolls have aided regimes in silencing political opponents, influencing topics discussed by online publics and inflating leaders’ supposed online popularity. The number of these bots and the scale of their usage have become unprecedented. A recent study found that 400,000 bots disseminated 3.8 million tweets in the final month of the 2016 US elections alone, while in the days leading up to the Brexit referendum, Russian bots published more than 40,000 pro-Brexit messages.
THE POWER of bots and trolls lies not in the number of messages they disseminate, nor in the rhetoric they employ. Rather, bots and trolls can substantially hamper citizens’ ability to gauge the national temperament. This, in turn, can alter their political choices.
Exposure, for instance, to Russian bots during Brexit may have led many British Twitter users to deduce that there was far greater support for Brexit than they had originally assumed. Faced with a daily barrage of pro-leave messages, these users may have concluded that Brexit was a plausible outcome of the referendum, thereby prompting them to vote in favor of leaving the European Union.
Other bots can influence citizens’ choices given that humans, by nature, wish to belong to the majority. If domestic or foreign bots convince Israelis that Prime Minister Benjamin Netanyahu has lost the public’s support, he may soon lose the elections, with Israeli voters choosing to side with the perceived majority of their fellow citizens.
Fear of the possible impact of bots and trolls on the Israeli elections is, therefore, not unfounded as both political parties and foreign actors may try to sway the elections through digital means. However, the recent cry of “Lions and tigers and bots, oh my!” may be exaggerated for three reasons.
First, Israelis do not exist solely online. After they log off Twitter and Facebook, Israelis drive to work and listen to the radio, hold conversations near water coolers and spend holidays with their families. Each of these activities exposes Israelis to various sources of information and different political opinions. Thus, Israelis do not exist within digital filter bubbles and are not as susceptible to bots as one might assume.
Second, Israelis are avid consumers of news broadcasts, which usually obtain the highest television ratings. As such, Israelis are less susceptible to fake news and conspiracy theories that are disseminated by trolls.
Last, studies suggest that consumers of online news often visit websites that run counter to their political affiliation. Moreover, the news websites that attract the most followers are those that cater to both left- and right-wing audiences. This suggests that Israelis typically gather information from diverse sources, again limiting the impact of bots, trolls and fake news.
Digital disinformation is thus not the opium of the masses. In fact, the 2019 elections are more likely to be influenced by the actions of politicians and attorneys-general, than by the musings of fictitious social media accounts.
Ilan Manor is a PhD candidate at the University of Oxford, where Patrick Thewlis is a research officer. Both are members of the Oxford Digital Diplomacy Research Group.