Online and emoting: Preventing youthful tragedy

What happens when someone posts suicidal thoughts on Facebook, TikTok or Instagram? What if the post’s intentions aren’t clear to the average reader?

Today’s youths prefer to share their feelings and emotions through texts and online, in social media, in chat platforms while gaming and in other similar settings (photo credit: SNAPPY GOAT)
Who’s responsible when someone cries for help on social media?
Max is 12 years old, alone in his room. His mom and sister are in the living room of their small apartment in Ashdod when, suddenly, Max does the worst thing imaginable: He jumps from his window to his death. Max had posted comments about his mental distress on social media just hours earlier, but alas, no one listened.
This tragedy – a parent’s worst nightmare – happened a few months ago to a typical Israeli family. Max’s mom, dad, friends and relatives saw his daily conduct at home as nothing more than typical teenage behavior. This gut-wrenching scenario occurs much too frequently in Israel and around the world. Such tragic stories, particularly of young people who choose to end their own lives, become a headline for a day or two, then recede to the unspoken corners of society.
According to the World Health Organization, mental distress affects one in five teenagers. At November’s Israeli Digital Health Conference, Prof. Arad Kodesh, director of mental health services for the Meuhedet Health Maintenance Organization, reported that in Israel, the number is one in four. In a typical Israeli school classroom of 32 students, that means approximately eight kids are struggling with some sort of mental distress.
Accordingly, Orit Shoshani Merom, director of social services for child development at Meuhedet, says, “It is recommended that you encourage your child to share with you what they have been exposed to.” However, where do most kids actually vent? Recent studies confirm what conventional wisdom has held for years: Today’s youths prefer to share their feelings and emotions through texts and online, in social media, in chat platforms while gaming and in other similar settings.
The pandemic has made the situation even worse. Merom adds, “COVID has created a new normal-but-stressful routine to which we need to become adjusted. It is a time of feeling powerless, with a loss of control for us and for our children.”
What happens when someone posts suicidal thoughts on Facebook, TikTok or Instagram? What if the post’s intentions aren’t clear to the average reader? We’ll come back to that in a moment.
Consider three seemingly irrelevant and unconnected points:
• If you post on social media about buying gym shoes, you will start seeing ads for brand-name gym shoes. There are even cases where people merely spoke about a subject – say, a trip overseas – and ads for flights and hotels appeared!
• Israeli law clearly states that if you are aware of abuse or harm taking place, you must report it, and if you don’t, you are criminally liable and may serve time in jail.
• Social media platforms like Facebook or Twitter have the ability to discern the information you are posting and react to it. If you post something about Holocaust denial or committing a terrorist hate crime, your post will likely be tagged and removed and your account will be blocked.
WHAT’S THE connection between these points? It should be clear to everyone that social media platforms have not only the ability to monitor their users but also a responsibility for them. Their accountability must be similar to that of any Israeli who stands idly by while someone is being abused.
What is currently important to these companies is what they already know how to do: producing targeted marketing and user analysis. However, website owners and social media platforms must be held accountable for doing nothing when someone posts about their mental distress. As terrible as it is to threaten someone else’s life, allowing a teenager to take his or her own life is in some ways even worse. It almost seems as if it’s permitted to hurt oneself, but not others.
Anyone who has been on public forum websites in Israel such as AskPeople or 4Girls can see that these relatively small websites are inundated with posts and that site owners are unable to keep up with the content. People posting on public forums can be ignored, given bad information, or worse, bullied into committing terrible acts with such comments as “I dare you to kill yourself” or “You really are pathetic, why are you even alive?”
Potential solutions exist, including the use of artificial intelligence and natural language processing, to “read and understand” texts, posts and even emojis. In addition, public and private mental health services, including hotlines, are accessible from any phone, such as the Israeli helpline Dial 105, which is run by the National Center for the Protection of Youth Online and staffed by professional mental health workers and police investigators.
Dr. Amir Gefen, former director of prevention services for Dial 105, explains, “When someone calls, we know what to do and do it very well. But what we really need is a better system of prevention and detection.”
Despite this, not one country in the world holds website owners and social media platforms accountable for this type of activity. Social media platforms must be held accountable for suicides that occur on their watch. Israel is a world leader in hi-tech, biotech and med-tech; it is time to be the leader in using technology for another societal good. We need to pressure governments, website owners and social media platforms to take responsibility regardless of what the law says. It is the moral and ethical thing to do.
The writer is a consultant in both philanthropy and social-tech start-ups. He has worked in this field for more than 25 years. [email protected]