Monday, June 3, 2019

“I saw the video again and I got scared by the number of views”

NYT:
Christiane C. didn’t think anything of it when her 10-year-old daughter and a friend uploaded a video of themselves playing in a backyard pool.

“The video is innocent, it’s not a big deal,” said Christiane, who lives in a Rio de Janeiro suburb.

A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to 400,000.

...

YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.

YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.

...

Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations.

So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then

...

The company said that because recommendations are the biggest traffic driver, removing them would hurt “creators” who rely on those clicks.