Researching YouTube Story Made Reporter 'Physically Ill'

Videos of young girls were inappropriately recommended
By John Johnson, Newser Staff
Posted Jun 4, 2019 11:47 AM CDT
Updated Jun 8, 2019 7:00 AM CDT
A YouTube sign is shown across the street from the company's offices in San Bruno, Calif. (AP Photo/Jeff Chiu)

Earlier this year, YouTube made a change in policy designed to curb pedophiles—it disabled comments on most videos featuring children after learning that predators were making use of the comment section. Now, however, a report in the New York Times suggests that the pedophile problem is far from over. The details:

  • Algorithm issue: Harvard researchers found that YouTube's algorithm was recommending innocent videos of children inappropriately, raising fears that predators could be viewing them. The issue involves the feature—all-important in terms of traffic to YouTube and its content providers—that recommends a video to watch next as another video is ending.
  • Sickening, literally: In tweeting about the story, the Times' Max Fisher first summed it up this way: "YouTube's algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found." Later, he added this: "I found reporting this emotionally straining, far more so than I'd anticipated. Watching the videos made me physically ill and I've been having regular nightmares. I only mention it because I cannot fathom what this is like for parents whose kids are swept up."

  • The user experience: The Times explains a potential path: a user who watched erotic YouTube videos might then be pointed to videos of younger-looking women, then women in children's clothing. "Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed, or doing a split. On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable."
  • In response: YouTube said it would limit recommendations of videos with children if it determines the kids are in "risky situations." But the company will not make a more aggressive move recommended by researchers: turning recommendations off entirely on videos of children. The company says doing so would be unfair to those who create family-themed videos and need the clicks.
  • Other changes: On the day the story came out, YouTube announced policy changes aimed at protecting children in a blog post. For instance, the site will now prohibit children from live-streaming unless accompanied by an adult. In regard to the limits on recommendations, the company said it "already applied these changes to tens of millions of videos."
  • A fine line: YouTube is working with what the Verge describes as a "double-edged sword" here. "YouTube needs to ensure that its creators are protected from bad actors, but it also wants to promise its extensive creator base that they can continue to operate."
  • An example: For a tangible example of the problem, the Times talks to a woman whose 10-year-old daughter had uploaded a video of herself and a friend in the pool. Soon, it had 400,000 views, because the algorithm began recommending it to people who had viewed other videos of young girls who were partially clothed. "I got scared," the mother says.
