Faced with criticism over objectionable content reaching YouTube Kids, the video platform has launched a new policy that age-restricts such content in the main YouTube app when it is flagged.
Earlier in the year, YouTube Kids came under fire for hosting videos with profanity and violent themes on its kid-friendly platform. In one video, for example, Peppa Pig is shown drinking bleach; because a cute cartoon animal is the main character, the algorithm whitelisted the video.
In response to the backlash, YouTube will no longer rely solely on algorithms to keep inappropriate content off the platform: if a video featuring recognizable children's characters is flagged in YouTube's main app, which has a far larger audience than the Kids app, it will be sent to the policy review team. That team comprises thousands of people working around the clock in different time zones to review flagged content. When reviewers find a video in violation of the new policy, they apply an age-restriction warning on the main YouTube app, which automatically blocks the video from showing up in the Kids app.
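To make the described flow concrete, here is a minimal sketch of that pipeline in Python. It is purely illustrative: the names (Video, handle_flag, and so on) are hypothetical and do not correspond to any actual YouTube system or API; they simply model the steps YouTube has described, in which a flag routes a video to human review, a confirmed violation triggers an age restriction, and an age restriction excludes the video from the Kids app.

```python
from dataclasses import dataclass

# Hypothetical model of the review flow described above; not a real YouTube API.

@dataclass
class Video:
    title: str
    has_childrens_characters: bool
    flagged: bool = False
    age_restricted: bool = False

def handle_flag(video: Video, review_queue: list) -> None:
    """A flagged video with recognizable children's characters goes to human review."""
    video.flagged = True
    if video.has_childrens_characters:
        review_queue.append(video)

def review(video: Video, violates_policy: bool) -> None:
    """A reviewer who finds a policy violation applies an age restriction on the main app."""
    if violates_policy:
        video.age_restricted = True

def eligible_for_kids_app(video: Video) -> bool:
    """Age-restricted videos are automatically blocked from the Kids app."""
    return not video.age_restricted

# Example: a flagged Peppa Pig parody is reviewed, restricted, and excluded from Kids.
queue: list = []
clip = Video(title="Peppa parody", has_childrens_characters=True)
handle_flag(clip, queue)
review(queue.pop(), violates_policy=True)
print(eligible_for_kids_app(clip))  # False
```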
YouTube is adjusting the algorithm, but some critics feel more work needs to be done. For its part, the platform has demonetized millions of videos and revoked certain creators' ability to include external links in end slates.
According to YouTube, it generally takes a few days for content to migrate from YouTube to YouTube Kids, and the goal is for users and YouTube's volunteer moderators to flag anything potentially disturbing to children within that window. The new policy should be in place within a few weeks.