- Decode with Adrija
Why Is Social Media Allowing Self Harm Videos To Go Viral?
Content warning: this post discusses suicide and may be triggering.
Lately, my social media feeds have been flooded with suicide videos. I report them. Nothing happens. No takedowns, no warnings—just endless, heartbreaking content left to spread like wildfire.
In December, Atul Subhash, a 34-year-old techie, was found hanging in his Bengaluru home. Before taking his life, he recorded an 80-minute video detailing the harassment he faced. That video wasn’t just passed around in WhatsApp groups of men’s rights activists—it was uploaded to YouTube, where news channels amplified it, sparking debate but also fueling morbid curiosity.
More recently, a TCS employee shared a final video before ending his life. The video showed a noose hanging from his ceiling fan, and, through tears, he begged, “Don’t touch my parents.” The video went viral on X. No warnings, no community notes. Even after I reported it, the video remains online.
Just last week, a video surfaced of a student at an Odisha engineering college who had taken her own life. Unlike the others, she hadn’t recorded it—her classmates did.
They filmed her lifeless body and shared it online. Content creators saw an opportunity and spread it across platforms. And as usual, social media companies did nothing.
The Internet’s Obsession with Death
I wish I could say I was shocked, but I’ve seen this before. A few years ago, I reported on the disturbing reality of the Internet’s hunger for death videos.
Take r/WatchPeopleDie, a now-banned Reddit forum that housed gruesome clips of real-life deaths—beheadings, drownings, electrocutions. Before its takedown in 2019, it had over 425,000 subscribers. And that was just the people who joined. The number of lurkers who consumed this content? Likely in the millions.
Dr. Carole Lieberman, a forensic psychiatrist, once explained to me:
“Just like some people are addicted to pornography, others are addicted to gruesome killings and deaths.”
I’ve spoken to families whose children’s final moments were plastered across the Internet. Andy Parker, whose journalist daughter Alison was murdered on live TV, has spent nearly a decade fighting tech companies to remove her death videos. Even today, when he searches her name on YouTube, he finds clip after clip of her murder—some from the live broadcast, others from the GoPro footage her killer posted, and even manipulated versions spread by conspiracy theorists.
The Christchurch mosque massacre in 2019 forced platforms to ban r/WatchPeopleDie, but the damage was done. The appetite for this content never left. It just went underground.
You can read my previous story on ‘when digital memory is a curse’ here.
The Algorithm’s Double Standards
Some argue that moderating such content is a slippery slope toward censorship. But let’s be honest—social media platforms already censor content all the time. And they do it selectively.
One Mumbai-based artist told Decode that Instagram buries posts about political conflict, hate crimes, and government criticism, while violent and harmful videos spread unchecked. “Even when political content is promoted, it often favors majoritarian perspectives. For example, I frequently come across reels from figures like Jordan Peterson and Ben Shapiro. If the algorithm is sophisticated enough to understand user preferences, why am I still being shown this content?” they asked.
Meta claims it uses AI to detect and remove harmful content. But if that’s true, why do these videos keep spreading?
Why does the algorithm seem so advanced when suppressing political content but so inept when dealing with real-life horror?
The Business of Tragedy
The ugly truth? Social media companies profit from these videos. Content moderation is expensive, and high-engagement posts—especially those that shock or disturb—keep users scrolling. More views mean more ad revenue.
Andy Parker even attempted to copyright his daughter’s death video as an NFT, hoping to use intellectual property laws to force its removal. That’s how desperate families have become—trying to outmaneuver billion-dollar tech companies just to preserve their loved one’s dignity.
Even when platforms remove a video, it’s often too late.
The Internet never forgets. And as long as these companies refuse to take real action, the cycle of trauma will continue.
Meanwhile, we can keep reporting. Keep pushing for better moderation. Keep demanding accountability. But until tech giants recognise that human lives aren’t just content, we’re stuck in an endless loop—where tragedy isn’t just witnessed, but commodified, shared, and consumed.
And that should scare us all.
🔥What’s Trending?
Musk, Online: The New York Times went through Elon Musk’s social media posts over the years and declared them a complete cringe show. You can read how Musk’s tone on Twitter (now X) changed over the years in this delightful article.
Stay Offline For A While: New research finds that blocking mobile internet access for two weeks can significantly improve mental health and overall well-being. Not that we didn’t know staying off screens is a good idea, but research always helps with a little motivation.
Lying Cops: A new video series on the so-called spy cops scandal is coming. The series, made in collaboration with the Guardian, tells the stories of a group of women who were devastated after being deceived into long-term relationships by undercover officers. Over many years, the women used ingenious methods to uncover the true identities of the men who had abused them.
Got a story to share or something interesting from your social media feed? Drop me a line, and I might highlight it in my next newsletter.
See you in your inbox, every other Wednesday at 12 pm!