YouTube creators love to connect with their audience through the platform’s comments section, but it’s not much fun when posts left by viewers are offensive, bullying, or just plain nasty.
YouTube has implemented various measures over the years to try to keep such comments off its video-sharing platform, or at least push them down the list to reduce their visibility. This week, however, it’s launching a new feature aimed at encouraging people to think twice before posting a potentially offensive comment.
When YouTube’s algorithms detect such a comment, they’ll show the poster a note saying “Keep comments respectful,” adding, “If you’re not sure whether your comment is appropriate, review our Community Guidelines.” The prompt also asks the poster to let YouTube know if they think its algorithm has made a mistake in singling out the comment as potentially offensive. Indeed, YouTube notes that its computer systems are continuously learning and may not always get it right, especially in the early stages.
“The reminder may pop up before posting a comment that may be offensive to others, giving the commenter the option to reflect before posting,” YouTube explains on a Help page. “From the reminder, the commenter can move forward with posting the comment as is, or take a few extra moments to edit the comment before posting it.”
The new feature comes first to Android, with iOS and the web hopefully following soon.
The pop-up message follows similar features introduced by Instagram last year. Instagram said the feature “gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification.” Twitter has also been testing a similar feature to try to reduce the number of nasty and hurtful posts landing on its platform.
Google-owned YouTube says it’s also testing a system for YouTube Studio that will filter out potentially offensive comments that have been automatically held for the creator to review, enabling the creator to avoid them entirely if they wish to do so.
Finally, YouTube says it’s continuing to invest in technology designed to help its systems better detect and remove hateful comments by taking into account the topic of the video and the context of a comment.
And the company says its efforts are making a difference, noting that since early 2019, it’s increased the number of daily hate speech comment removals by 46 times, though it doesn’t give a precise figure.