YouTube announced on Thursday that it will begin warning users before they post comments that may be regarded as offensive, part of the tech giant's effort to "keep comments respectful."

YouTube to Warn Commenters When AI Algorithms Detect Offensive Wording

According to YouTube, it will start asking users to reconsider a comment before it goes live if Google's artificial intelligence (AI) flags the post as potentially disrespectful. The new prompt suggests that commenters review the site's community guidelines if they are unsure whether the post is offensive, and then offers the option to either edit the comment or post it anyway.

The video-sharing site is adding new options to its platform to support its many communities and encourage respectful engagement. It will caution users when a comment they are about to submit could be offensive, giving them the chance to reconsider before posting.

The warnings will appear before a comment is published: the Google-owned site will prompt users to reflect whenever a potentially insulting message is detected.

The feature will not block comments outright; instead, it gives users the choice to edit their post or publish it as is.

YouTube acknowledged that the platform "may not always get it right," as the new system is continuously learning.

To catch potentially disrespectful comments before they are posted, YouTube will rely on real-time analysis by AI algorithms that detect inappropriate words or phrases.
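YouTube has not disclosed how that detection works beyond attributing it to AI, so the following is only a rough, hypothetical sketch of the flow the article describes: score a draft comment before it is posted, and prompt the user to reconsider only when the score crosses a threshold. The word list, scoring function, and threshold below are invented placeholders, not YouTube's actual system.

```python
# Hypothetical illustration of a pre-post comment screening flow.
# Nothing here reflects YouTube's real model, API, or thresholds.

OFFENSIVE_TERMS = {"idiot", "stupid", "trash"}  # placeholder word list


def offensiveness_score(comment: str) -> float:
    """Toy scorer: fraction of words found on the placeholder list.
    A real system would use a trained classifier instead."""
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in OFFENSIVE_TERMS)
    return hits / len(words)


def screen_comment(comment: str, threshold: float = 0.15) -> str:
    """Return 'post' if the comment looks fine, or 'prompt_review' so the
    UI can ask the user to edit the comment or post it anyway."""
    return "prompt_review" if offensiveness_score(comment) >= threshold else "post"


if __name__ == "__main__":
    print(screen_comment("Great video, thanks for sharing!"))  # -> post
    print(screen_comment("what an idiot, this is trash"))      # -> prompt_review
```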

"To encourage respectful conversations on YouTube, we're launching a new feature that will alert users when their comment may be offensive to others, giving them the option to reflect before posting," Google said in a blog post, as reported by Tekdeeps.

The firm said it will also start asking users to provide demographic information so it can identify patterns of hate speech that may affect some communities more than others.

According to YouTube, "We'll then look closely at how content from different communities is treated in our search and discovery and monetization systems. We'll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others," reported The Epoch Times.

The video site announced the feature, alongside other measures to bolster inclusivity, in the same blog post, Seattle Pi reported.

The comments feature is currently available on Android.

YouTube is also testing a new filter in YouTube Studio for potentially inappropriate and hurtful comments, which are automatically held for review so that channel owners do not have to read them unless they choose to.

According to YouTube, the platform is working to close existing gaps in how its products and policies work for everyone, particularly the Black community.
