YouTube warns users before posting potentially offensive comments

YouTube is introducing new features to its platform to support diverse communities and encourage respectful interactions. The streaming platform now warns users when a comment they are about to post may be offensive to others, giving them a chance to reconsider before posting. YouTube is also testing a new filter in YouTube Studio that automatically holds potentially inappropriate and hurtful comments for review, so channel owners don’t have to read those comments unless they choose to.

Announcing the updates in a blog post, YouTube said it was working to close existing gaps in how its products and policies work for everyone, especially the Black community.

The new Community Guidelines reminder for potentially offensive comments is rolling out on Android. If commenters still want to post the comment after seeing the popup notification, they can go ahead, or they can edit or delete it instead. Users who think their comment has been flagged incorrectly can let YouTube know through the same popup.


YouTube warns users when a comment they want to post may offend others

According to a YouTube support page, the system learns from content that has been repeatedly reported by users. YouTube is also streamlining its comment moderation tools so that creators don’t have to read or engage with potentially inappropriate and hurtful comments.

To identify gaps in the system that could affect a creator’s ability to reach their full potential, YouTube will ask creators to voluntarily share their gender, sexual orientation, race, and ethnicity starting next year. The video streaming platform wants to use this information to examine how content from different communities is treated across its systems, such as search and discovery and monetisation.

The Google-owned company said it will also look into possible patterns of hate, harassment, and discrimination that may affect some communities more than others.

YouTube said it has invested in technology that helps its systems better identify and remove hateful comments by taking into account the topic of the video and the context of a comment.

Meanwhile, earlier this week YouTube launched three new features designed to help creators, artists, and publishers make Premieres even more interactive. Two of them, Live Redirect and Trailers, have been rolled out for eligible creators, while the third, Countdown Themes, will be available in the coming months.

Live Redirect enables creators to host a live stream as a pre-show just before a Premiere airs; the live audience is automatically redirected to the upcoming Premiere right before it starts. Trailers, on the other hand, let creators upload a pre-recorded hype video that plays on the watch page ahead of the Premiere. This clip can be anywhere from 15 seconds to three minutes long. Countdown Themes let creators choose a custom countdown for their Premiere from a range of themes, vibes, and moods.


