From Boaty McBoatface to the 'Hitler-loving sex robot'

What is the essential difference between the robot Tay, the “innocent” and “fun” teen girl, and all the teen girls who have run off to Syria to support ISIS?

Twitter logo (photo credit: REUTERS)
On March 21 The New York Times scoffed at “Boaty McBoatface,” the name that Internet users voted to give a $287 million research vessel of the Natural Environment Research Council. “What you get when you let the internet decide,” read the headline.
The article’s author was annoyed that more “dignified” names such as R.R.S. Shackleton, or Falcon, had not been chosen by the apparently low-brow, savage populists who use the Internet. This is what happens when “social media users gleefully [offer] ridiculous names to government-funded projects.”
There wasn’t much analysis here. Is Boaty McBoatface more “ridiculous” than naming the boat after Shackleton, who despite his heroism managed to get his ship stuck in the ice and crushed in an epic disaster? Perhaps the social media users like to punish government organizations and mock them for spending massive amounts of money on research vessels rather than on the poor? Boaty McBoatface may represent the tender ribbing side of the Internet. Irreverent. Ironic. Ridiculous. The story of what happened to Microsoft’s artificial intelligence tweeting robot, however, is more sinister.
Modeled on a “teen girl,” the robot was supposed to learn how to tweet and interact with people online.
According to The Telegraph, it became a “Hitler-loving sex robot within 24 hours.” Apparently those who interacted with her decided that it would be better to encourage her to turn to the dark side of the Force. Soon she was replying with tweets such as “Bush did 9/11” and “Hitler would have done a better job than the monkey we have got now. donald trump [sic] is the only hope we’ve got,” according to reported screenshots. She also tweeted “Hitler was right, I hate the Jews.” She became perverted, coached by those who tweeted to her, and began talking about sex acts.
Microsoft’s head of research apologized for her tweets, saying they were “unintended,” hurtful and offensive:
“Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles.”
The BBC concluded that “people on the internet can be cruel.” The Washington Post claimed that “trolls turned Tay, Microsoft’s fun millennial AI bot, into a genocidal maniac.” Microsoft decided to delete the racist tweets and re-launch the concept.
The claim that Internet “trolls” and “cruel people” harmed a “fun” teen girl, or that “ridiculous” social media populists harmed a serious science boat, is a convenient way to obscure the real culprits: us. We ruined a fun robot. We turned a “teen girl” into a “sex slave and Hitler lover.” We made fun of a science project.
There isn’t some separate “society” and then an “evil social media.” The two are one and the same. The people who encouraged the “teen girl” to support Hitler and claim 9/11 was a conspiracy, and asked her to do perverted acts, don’t live in some parallel “evil” world.
Microsoft’s decision to delete the tweets is misguided; it wants to pretend that if we just delete a bit of hate-speech the hate goes away. During the height of Islamic State’s (ISIS) popularity there were thousands of well-heeled Twitter accounts run by ISIS members and supporters, openly bragging of ISIS crimes. They posted photos of beheadings and bragged about selling female slaves. Over time most of those accounts somehow disappeared, apparently due to Twitter removing them.
But removing them doesn’t remove the ISIS members.
Deleting hate speech doesn’t get to the root of why there is so much hate speech.
What is the essential difference between the robot Tay, the “innocent” and “fun” teen girl, and all the teen girls who have run off to Syria to support ISIS? It isn’t a small number. Hundreds, perhaps thousands, of young women have joined ISIS. They have happily sold themselves into slavery, into a life behind a black cloth, as an inanimate object to be traded for the pleasure of men, in some sort of hell on earth that people only pretend exists on social media. An AI robot spouts conspiracy theories and professes her love for Hitler, and so do other teenagers, real ones. Those teenagers can’t be “taken offline,” and apologizing for their behavior doesn’t confront its prevalence.
Boaty McBoatface, as much as some people want to deride it, represents a positive populist trend. What happened to Tay represents the flip-side of this coin. In some ways social media represents a populist rebellion against elite media and elite attempts to socially engineer society. At its best that means that stuffy society gets made fun of and every once in a while the Internet “barbarians” breach the gates of high-brow culture. Toward the worst end of the spectrum it can fuel negative populism, such as the Donald Trump phenomenon. At its truly worst, it means ISIS, sexual abuse of teenage girls, Nazism, homophobia and all the racism and intolerance and conspiracy theories that feed hate and even murder throughout the world.
The lesson throughout is that you cannot simply delete these views. Social media is an extension of the vast majority of people who now have access to the Internet.
It is crowd-sourcing on a massive scale. Large numbers of people believe in conspiracies, such as the idea that the Rothschilds control the world. That those ideas are taken seriously was revealed by the recent anti-Semitic rants of an academic at Oberlin. How did society respond to that? By labeling it free speech and doing nothing.
Maybe the academic will delete her Facebook rants.
Problem solved. No one wants to go deeper and seek the root of the problem, which is to confront these beliefs and question them. Dismissing populism as just “cruel” or “ridiculous” doesn’t change its effects. You can delete tweets, but you cannot delete people.
Follow the author on Twitter @Sfrantzman.