TikTok is rolling out new features to combat online bullying and harassment
TIKTOK: Social networking platforms have been cracking down on cyberbullying with tools that automatically detect offensive comments. Instagram, for example, launched an artificial intelligence-powered feature in 2019 that checks for abusive comments and notifies users before they post them.
TikTok announced two similar features today that it says are aimed at promoting “kindness” on the platform. The first is meant to warn you before you post an “inappropriate or unkind” comment. It appears as a pop-up prompt asking you to reconsider and edit your comment before posting, and it will also warn you if TikTok detects “words that may violate” its community guidelines. However, the prompt does not actually block offensive comments, since you can still choose to post them anyway.
The other feature allows creators to control what types of comments appear on their content. You can turn on the “Filter All Comments” option so that unapproved comments won’t show up on your videos. It can be enabled in the comment filters menu, which already lets you filter spam and offensive comments as well as block specific keywords.
In addition to these new features, TikTok also announced a partnership with the Cyberbullying Research Center as part of its efforts to promote an anti-bullying environment for users. The goal is to support its community members, develop new programs that foster a supportive platform, and gain more insight into bullying both within and beyond TikTok.