Taking part in risky stunts — whether or not a camera is rolling — can occasionally end in disaster for those involved.
YouTube certainly wants no part of such shenanigans and updated its guidelines to ram home the point.
In a Q&A section posted on Tuesday, January 15, introducing its revamped guidelines, YouTube acknowledged that the video-streaming site is “home to many beloved viral challenges and pranks, like Jimmy Kimmel’s Terrible Christmas Presents prank or the water bottle flip challenge,” but said it had to make sure that “what’s funny doesn’t cross the line into also being harmful or dangerous.”
Harmful or dangerous? Ah, that would be stunts such as the so-called “Bird Box challenge,” where some folks, inspired by the recent Netflix Original starring Sandra Bullock, have been attempting a range of activities while wearing a blindfold — like driving a car or walking along a train track. We can throw 2018’s Tide pod challenge into the same category, too — a rather risky endeavor that involved eating the contents of laundry detergent packs. “Anything for a thumbs up” appeared to be the mantra behind the madness, though creators are also after views to boost their ad revenue.
YouTube already bans content showing dangerous activities, but the new rules go into more detail regarding “dangerous challenges and pranks.” It tells creators that it doesn’t even permit videos where someone believes they’re in some kind of physical danger, even if the situation is actually safe. The company offers examples such as home-invasion setups or drive-by shooting pranks. Stunts like fake bomb threats fit neatly into that category, too, though something as daft as that can also get you jailed.
YouTube was also keen to make it clear that it doesn’t allow pranks “that cause children to experience severe emotional distress, meaning something so bad that it could leave the child traumatized for life,” adding that it’s been working with psychologists to develop guidelines around the kinds of setups that go too far.
The company said it’s giving creators two months to review and clean up their content. During this time, challenges and pranks that violate its guidelines will be removed if its team finds the offending content first, but the channel will not receive a strike. If a creator disagrees with a strike, they can appeal it.
A strike disappears after 90 days, but creators are warned that if their account receives three strikes for violations within a 90-day period, it will be terminated.
YouTube has received a barrage of criticism in recent years for hosting offensive content, an issue it says it’s tackling with machine-learning algorithms and the addition of more human reviewers.