As TikTok continues to grow in popularity, questions about online safety remain open. On Friday, the company announced additional tools that will automatically delete videos that violate its community guidelines.
Right now, TikTok has a system in place that flags content that could be considered offensive or unsafe for minors, but each flagged case is then reviewed by the company's safety team. If a reviewer finds a problem with the posted content, the user gets a notification and the video is deleted. Over the coming weeks, some content will instead be checked and, in the case of a violation, removed automatically. Targeted violations include graphic content, illegal activities, adult nudity, and more. In practice, this means removals will come in two forms: those approved by the safety team and those handled automatically.
According to the company, this will free its team to focus on more complicated cases involving online harassment, bullying, the spread of misinformation, and similar issues.
TikTok also said that a user will be notified after their first violation of the company's policy and will receive a warning. If the user violates the policy again, their account will be deleted.
TikTok seems to be taking its platform's problems more seriously after the backlash it received for failing to act on misinformation and hate speech.
What do you think about the automatic removal feature? Will it be effective, or will it delete innocent videos without explanation? Tell us what you think in the comments below!