TikTok has been trying to remove a suicide video uploaded to the platform on Sunday. Addressing the video, which a group of users has repeatedly re-uploaded or clipped into their own content, TikTok management announced that the video will continue to be removed and that the offending accounts will be closed.
TikTok has been in the headlines for the past few days over a disturbing video. A suicide video posted on the platform on Sunday went viral within hours, and a group of users has been deliberately trying to spread it further. In response, TikTok management took action and issued important warnings.
TikTok spokesperson Hilary McQuaide stated that the video was removed immediately after it was shared, but that some users have uploaded it over and over again. Noting that some users splice clips of the suicide into their own content, McQuaide said such content is being carefully reviewed and these videos will also be removed. Moreover, according to the spokesperson, the accounts of users who share the video are at risk as well.
Suicide footage is automatically detected and removed by TikTok’s algorithms
In their statements, TikTok officials noted that footage of the suicide, statements praising it, and content encouraging suicide all violate the platform’s rules. Pointing out that every piece of content uploaded to the platform is screened by algorithms, the officials said that even if such videos are uploaded repeatedly, they will be removed instantly.
Accounts can be closed
TikTok spokesperson Hilary McQuaide said that the accounts of users who repeatedly try to share the recently circulating suicide video will be closed immediately. Noting that the majority of users have already reported accounts sharing the video, McQuaide thanked them and expressed gratitude for their efforts to protect the platform.
Incidentally, this is not the first video of its kind; users have often turned to Facebook Live for such broadcasts. A 2017 investigation by BuzzFeed News found at least 45 sensitive incidents on Facebook Live, launched in 2015, within a span of just two years, including suicide, shootings, murder, torture, and child abuse. Although Facebook says its algorithms stop such content, some of it evidently could not be blocked.