Ronnie McNutt, army veteran, commits suicide on Facebook livestream as TikTok continues hosting graphic video despite pleas to have it removed.
Alarm has grown after video of an Army veteran shooting himself dead began circulating widely on TikTok, prompting viewers to condemn the app for not doing more to remove it.
Ronnie McNutt, 33, an Iraq War veteran from Mississippi, broadcast himself shooting himself in the head on Facebook Live on August 31.
McNutt, who worked for Toyota in Blue Springs, near New Albany, had recently broken up with his girlfriend and also suffered from post-traumatic stress disorder.
The footage was posted on TikTok and viewed so many times that some users said it appeared on the app’s ‘For You’ homepage.
TikTok said it is banning anyone sharing the video, which has been described by users as ‘extremely gory and terrifying.’
Army vet’s final message before livestream suicide was cry for help & love
(( TW: SUICIDE
If you see this video don’t watch it. Report it. This video is Ronnie McNutt’s suicide. In the video McNutt kills himself with a hunting rifle. I REPEAT: DO NOT WATCH IT, DO NOT SEARCH FOR IT, IF YOU FIND IT REPORT IT. This isn’t something to joke about. […] pic.twitter.com/2vc9KyjZNQ
— 𝐉𝐨𝐬𝐮𝐤𝐞 𝐇𝐢𝐠𝐚𝐬𝐡𝐢𝐤𝐚𝐭𝐚 ★ (@JosukePT4) September 7, 2020
‘How are videos like that allowed?’
Users have slammed TikTok, and some have claimed people have been editing the videos to include shots of cats to trick viewers into watching, the Daily Mail reports.
‘TikTok is one messed up app,’ Twitter user Elise wrote. ‘People are posting the video of Ronnie Mcnutt, a guy who committed suicide on Facebook Live. My thoughts are with his family.’
Another tweeted: ‘I was just scrolling through TikTok and a video of someone committing suicide was on my For You page. How are videos like that allowed. I’m so just idk [I don’t know] disgusted and sad and just freaked out.’
Friends of McNutt spoke of their heartbreak after the Mississippi man livestreamed his death.
One wrote online: ‘Please say a prayer right now for the family of Ronnie McNutt. He just killed himself live on Facebook and I cannot unsee this.
‘I tried but apparently it wasn’t quick enough to reach him. I wasn’t quick enough.
‘Dear God, I wish I could have got to him.’
rip ronnie mcnutt 🙁 my heart hurts so bad, i wish more people would talk about this. the video is so mf heartbreaking bro pic.twitter.com/fCUM8I8MZE
— 🧜🏼♀️ (@mayorga_asia) September 8, 2020
Scramble to take down footage
Since the former vet’s suicide, social media sites have been trying to take the footage down as those who have witnessed it urge others to avoid TikTok.
One wrote on Twitter: ‘If you see this guy on your For You page, please scroll up immediately.
‘It’s very gruesome and I highly suggest you stay away from TikTok for a while.’
Others are posting screenshots of the video’s beginning to make people aware of which clips to avoid.
A TikTok spokesman said: ‘On Sunday night, clips of a suicide that had been livestreamed on Facebook circulated on other platforms, including TikTok.
‘Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.
‘We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.
‘If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Centre.’
Did Facebook fail to stop the video being widely shared?
Facebook said in a statement: ‘We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time. Our thoughts remain with Ronnie’s family and friends during this difficult time.’
A friend of McNutt said he tried desperately to prevent the suicide — and accused the social media giant of failing to stop the video from being shared online.
Josh Steen, who co-hosted a podcast with McNutt, told Heavy: ‘He didn’t seem to be the same guy that left for Iraq once he exited the service. I spent many a late night in our studio, via text message, and in person talking with him about life and his struggles.
‘Mental health issues are very, very real, and I honestly think that there are a lot of people who struggle with all areas of mental illness who let it go untreated. Or treat it with other things, it seems.’
Steen added that he’d been alerted to the video when it was being broadcast and had attempted to call McNutt several times.
Steen said that he did not believe that McNutt had set out to kill himself but that he was ‘incredibly drunk, and that plus his recent relationship issues led to the end result.’
Steen told Heavy: ‘I tried multiple times to call him, from my cell phone and our phone at the theater; both numbers he would easily recognize. I watched him pick his phone up, think for a second, and then decline my calls.
Social media platforms’ complicity
‘I really thought that if I could just get him to break his focus for just a second it would be alright. His laugh always made me laugh, and I’m glad that I have our archives to go back through to hear it whenever I want to now.’
Steen now blames Facebook, claiming ‘Facebook could’ve stopped this and didn’t,’ and adding: ‘Facebook, Twitter, Instagram, and other social platforms could ban accounts, IPs, and stop the spread of this video. YouTube can flag you for using two seconds of a copyrighted song, but can’t seem to filter out my friend ending his life.’
This isn’t the first time Facebook has been criticized for its response to livestreams showing violent content. In 2019, the company came under fire for its failure to respond to the massacre in Christchurch, New Zealand, during which a terrorist live-streamed himself killing 50 Muslim worshippers.
TikTok, whose China-based owner ByteDance has been ordered by President Donald Trump to sell its U.S. operations, has been criticized for its content moderation policies in the past, especially over the circulation of graphic content.