Max Fisher
YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found. YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation
The site’s automated recommendation system, at times drawing on home movies of unwitting families, created a vast video catalog of prepubescent children.
The New York Times @nytimes
Max Fisher Jun 3
Replying to @Max_Fisher
Each video might appear innocent on its own, a home movie of a kid in a two-piece swimsuit or a nightie. But each has three common traits:
• the girl is mostly unclothed or briefly nude
• she is no older than age 8
• her video is being heavily promoted by YouTube’s algorithm
Max Fisher Jun 3
Replying to @Max_Fisher
Any user who watched one kiddie video would be directed by YouTube's algorithm to dozens more — each selected out of millions of otherwise-obscure home movies by an incredibly sophisticated piece of software that YouTube calls an artificial intelligence. The families had no idea.
Max Fisher Jun 3
Replying to @Max_Fisher
We talked to one mother, in Brazil, whose daughter had posted a video of her and a friend playing in swimsuits. YouTube’s algorithm found the video and promoted it to users who watched other partly-clothed prepubescent children. Within a few days of posting, it had 400,000 views
Max Fisher Jun 3
Replying to @Max_Fisher
We talked to child psychologists, sexual trauma specialists, psychologists who work with pedophiles, academic experts on pedophilia, network analysts. They all said YouTube has built a vast audience — maybe unprecedented — for child sexual exploitation, with grave risks for kids.
Max Fisher Jun 3
Replying to @Max_Fisher
YouTube, to its credit, said it has been working nonstop on the problem since a similar issue was first reported in February. YT also removed some of the videos immediately after we alerted the company, though not others that we did not specifically flag.
Max Fisher Jun 3
Replying to @Max_Fisher
YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together. Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
Max Fisher Jun 3
Replying to @Max_Fisher
I asked YouTube: why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically. The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.
Max Fisher Jun 3
Replying to @Max_Fisher
Initially, YouTube gave me a comment saying that they were trending in that direction. Experts were thrilled, calling it potentially a hugely positive step. Then YouTube “clarified” their comment. Creators rely on recommendations to drive traffic, they said, so recommendations would stay on. 📈
Max Fisher Jun 3
Replying to @Max_Fisher
On a personal note, I found reporting this emotionally straining, far more so than I'd anticipated. Watching the videos made me physically ill and I've been having regular nightmares. I only mention it because I cannot fathom what this is like for parents whose kids are swept up.
Max Fisher Jun 3
Replying to @kbennhold
As I reported last year with @kbennhold, YouTube’s algorithm does something similar with politics. We found it directing large numbers of German news consumers toward far-right extremist videos, with real-world implications.
Max Fisher Jun 3
Replying to @Max_Fisher
For more on what happens when YouTube and other social networks route an ever-growing global share of human social relations through engagement-maximizing algorithms, read our essay on “the Algorithmification of the Human Experience”:
Max Fisher Jun 6
Replying to @Max_Fisher
Wow: Senator Hawley has introduced legislation, based on our story, that would prohibit video-hosting services like YouTube from recommending videos that feature minors.
Max Fisher Jun 6
Replying to @Max_Fisher
When I asked YouTube about doing exactly this last week, the company did not deny that it was technically capable, and even hinted it might do so voluntarily. Later, YT said it would not do this because it would hurt traffic. Sen. Hawley’s bill would force it to.
Max Fisher Jun 7
Replying to @Max_Fisher
More momentum building: Senators Blumenthal and Blackburn send an open letter to YouTube’s CEO about our story, demanding to know, among other things, why YouTube will not turn off recommendations on videos of children.
Max Fisher Jun 17
Replying to @Max_Fisher
Samantha Bee did a very good segment on YouTube’s troubles, leading with our story on algorithm-assisted pedophilia:
Max Fisher Jun 19
Replying to @Max_Fisher
"YouTube executives reportedly mulling removing all children’s content." "The reports ... have also prompted a federal investigation by the Federal Trade Commission ... that could result in Google receiving a fine over its inability to protect children."