Social media’s bomb of hate goes TikTok

Anytime you have countless underage teens uploading and creating content, you’re going to have a lot of awkward, inappropriate and even offensive material.

Tik Tok logos are seen on smartphones in front of displayed ByteDance logo in this illustration (photo credit: REUTERS)
The new social network TikTok has taken the social media world by storm. Previously known as Musical.ly, the app, developed by the Chinese company ByteDance, allows users to upload lip-synched videos of up to 60 seconds with a variety of creative and interactive features. It is the fastest-growing app in town – especially among teens – and is ranked the seventh-most-downloaded app of the 2010s, despite having been on the market for only 18 months. The latest social media craze has not gone without controversy, and it certainly has a darker side.
Anytime you have countless underage teens uploading and creating content, you’re going to have a lot of awkward, inappropriate and even offensive material. Why? Because children don’t always think things through before they act, and are known for impulsive, immature behavior. That problem is only amplified on social media. Add peer pressure, and any kind of irrational or unhealthy thoughts that every teen deals with at one point or another, and you have a recipe for disaster – a literal TikTok time bomb. The truth is that children should not have global access to the unfiltered thoughts of hormonal, underdeveloped teens (or even the thoughts of younger children), and the fact they do – even with TikTok’s “community” standards – is exceptionally dangerous.
In February, TikTok came under fire for a violent video depicting real-life terror attacks against Israeli citizens. The video, aimed at Palestinian children, glorified the violence and encouraged “martyrdom.” A few days later, TikTok removed the video, stating that the clip “violated their terms.” But what about the children who had already seen it? In 2015-16, Israel faced a massive onslaught of Palestinian so-called “lone wolf” terror attacks, many of them committed by teens and even children. In multiple attacks, the perpetrators stated they had been inspired or influenced by social media, which encouraged them to commit these acts of violence in order to “liberate Palestine.” Palestinian teens saw terrorism as a way to gain fame and adoration on social media – something that became, quite literally, worth killing and dying for. Facebook faced tremendous pressure (and a class action lawsuit) over its “community standards” and its repeated failure to remove antisemitic and violent content in Arabic, an issue that continues to plague all the social media networks. The fact remains that children are seeing this content, disproportionately now on TikTok, and they are far more impressionable than we might think.
THE SAME WEEK the Palestinian video calling for violence went viral on TikTok, an article surfaced revealing that after a Brazilian teen’s suicide was livestreamed on the platform – and the video left up for an hour and a half – the company did not contact the police for at least three hours. What kind of corporate responsibility is that? Two other deaths have also occurred on TikTok’s livestream, including one other suicide. That even one suicide occurred on the platform raises serious questions about the kind of culture the app is fostering and its ability to deal with heavier content that can, and does, emotionally scar viewers.
If you’re wondering what content might push a fragile teen to commit suicide, look no further than TikTok itself. In its own world of memes, inside jokes, hashtags and popularity contests, the app offers near-instant fame if you’re clever or cute, and near-instant bullying if you aren’t. The appalling comments on the awkward and sometimes humiliating videos teens and children post are painful to read – comments calling users ugly, stupid or fat, or even encouraging them to kill themselves. And these clips live on in infamy forever, in YouTube compilations that sometimes draw millions of views. How are those 13-year-olds going to feel in five years? More alarming still, a host of hashtags collect thousands of videos of teens discussing cutting, eating disorders, suicide and more. TikTok’s attempt to deal with cyberbullying has been shockingly misguided: an investigative report showed the company deliberately censored LGBT, fat and disabled users it believed would be “susceptible” to bullying. Yes, you read that right – it censored and punished the video creators, not the bullies.
While it’s true that much of the content on TikTok is funny, even entertaining, it is by no means a place for children – not the way the app is currently run. Consider the sexualization of 12-year-olds dancing provocatively in imitation of their favorite pop stars and celebrities: society should not be applauding this, and there is something stomach-turning about watching a 14-year-old lip-synch lyrics about lewd sexual acts that I have a hard time believing she even understands. The app is also being used to spur teens into mean-spirited, unhealthy behavior in pursuit of notoriety or fame. One TikTok “challenge,” for example, encouraged users to record themselves dirtying another person’s toothbrush, then film that person using it afterward. Teens gleefully dipped family members’ toothbrushes in the toilet, the trash and more. It’s not cute, it’s not funny, and we ought to pay attention to the toxic attitudes and behaviors the app is encouraging.
Ultimately, TikTok is no worse than any other social media platform today. In fact, there’s a lot to love about TikTok when people aren’t being nasty to each other, and when children and teens aren’t misusing it or being urged to misuse it. Unfortunately, we haven’t yet figured out how to deal with that on any social media platform. But TikTok’s popularity with children – and the unfiltered access they seem to have – is particularly disturbing.
The writer is the CEO of Social Lite Creative, a digital marketing firm, and a research fellow at the Tel Aviv Institute.