YouTube cracks down on disturbing content featuring children after backlash
Video hosting giant YouTube has begun clamping down on content showing children in vulnerable situations as well as disturbing material aimed at children. The move followed a wave of criticism against the video streaming service.
YouTube has “expanded its enforcement guidelines” concerning the removal of content featuring minors “that may be endangering a child, even if that was not the uploader's intent,” Johanna Wright, vice president of product management, said in a statement. She added that the service has also expanded its age-restriction policies to cover content featuring family entertainment characters in videos “containing mature themes or adult humor.”
The statement, published on YouTube's official blog Wednesday, also said the service had discovered a large number of videos that pretend to be family-friendly but are “clearly not.” “While some of these videos may be suitable for adults, others are completely unacceptable,” the company said, adding that it is in the process of removing such content.
More than 50 YouTube channels and “thousands” of videos have already been removed under the stricter guidelines over the last week, the statement said, adding that the service continues “to work quickly to remove more every day.”
The video hosting giant further said that, since June, it had removed ads, which allow YouTubers to monetize the content they post, from as many as 3 million videos “depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behavior.”
The service, which is a unit of Alphabet Inc.'s Google, also pledged to increase the number of experts it works with to assess its content and to double the number of Trusted Flaggers, regular users who voluntarily monitor the site for harmful content and report it to administrators. YouTube also decided to crack down on “inappropriate sexual or predatory comments on videos featuring minors,” announcing that it would disable the comment section entirely on any video of minors where such comments are found.
The wide-ranging measures from YouTube are a response to mounting criticism recently leveled against the Google-owned giant by the media as well as activist groups. This week, BuzzFeed reported on “hundreds of disturbing videos showing children in distress.” The media outlet also contacted YouTube regarding some of its verified accounts featuring such videos, each of which allegedly had millions of subscribers.
Earlier, the New York Times reported that some videos with content inappropriate for children may have slipped through YouTube's automated filters and found their way into YouTube Kids, an app meant to be child-friendly. A British writer, James Bridle, also listed some of the questionable videos in a recently published online essay.
A group of concerned activists also created a forum on Reddit called ElsaGate, named after a Disney cartoon princess often seen in controversial YouTube videos. The activists compiled a list of YouTube channels posting disturbing content either featuring minors or aimed at children.
After being contacted by BuzzFeed, YouTube management deleted all videos featured in its report and suspended the accounts in question for violating the service's regulations, according to the media outlet. However, the report also said that, before the video hosting giant took action, live-action child exploitation videos had been rampant and easy to find on its network.
Earlier, the Telegraph also reported that even YouTube's own Trusted Flaggers complained that the service failed to respond to their reports. In August, a volunteer told the BBC that “there is no reliable way for a concerned parent, child in danger, or anyone else to reliably report and get action on a predatory channel.”
Activists welcomed YouTube's announcement of the policy changes but said further action is urgently needed. “Thank you but this can't just be some press release. We need real action,” one commenter said. Others questioned why the video hosting giant had taken so long to tackle the problem.