YouTube’s new feature to warn users before they post toxic comments
In a move to curb bullying and hate on its platform, YouTube has introduced a new feature that warns users when they are about to post a potentially offensive comment, prompting them to be more respectful before the comment goes live.
According to Mashable, Johanna Wright, Vice President of Product Management at YouTube, said in a blog post, “We know that comments play a key role in helping creators connect with their community, but issues with the quality of comments are also one of the most consistent pieces of feedback we receive from creators.”
Google employee Sarah posted on the Google support forum, “Today we’re introducing a new reminder in comments to help encourage respectful interactions. Now on Android, the reminder may pop up before posting a comment that may be offensive to others, giving the commenter the option to reflect before posting.”
However, users can still go ahead and post the comment if they choose to. Beyond the new reminder, YouTube has also made a few more updates aimed at making the platform more inclusive.
As per Mashable, the blog post stated that YouTube will be testing a new filter in YouTube Studio for inappropriate and hurtful comments. Such comments will be automatically held for review so that creators never have to read them unless they choose to. YouTube said it will also streamline its comment moderation tools to make the overall process easier for creators.
YouTube will also invest in technology to improve its system for detecting and removing hateful comments, taking into account factors such as the topic of the video and the context in which a comment is made.
This story has been published from a wire agency feed without modifications to the text. Only the headline has been changed.