Christiane C. didn’t think anything of it when her 10-year-old daughter and a friend uploaded a video of themselves playing in a backyard pool.
“The video is innocent, it’s not a big deal,” said Christiane, who lives in a Rio de Janeiro suburb.
A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to 400,000.
YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.
YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.
Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations.
So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then videos of partially clothed children.
The company said that because recommendations are the biggest traffic driver, removing them would hurt “creators” who rely on those clicks.
I cover YouTube culture (more so than the platform), and I want to add a little insight into this from a creators angle. Here's a little intro into this thread: https://t.co/s5ocs459X6 pic.twitter.com/en7hc8VUvv
— Julia Alexander (@loudmouthjulia) June 3, 2019

YouTube creators picked up on this. Jake Paul literally invited a family to live with him so he had kids to film with. Here's what he said about his decision to shoot with kids: pic.twitter.com/7NyJLMPXzQ
— Julia Alexander (@loudmouthjulia) June 3, 2019

Kids became an easy way for creators to guarantee ads, so people found kids to throw in videos. These videos attracted advertisers. Viewership skyrockets, and suddenly it's not just a content strategy but a safe and vibrant business plan. But these videos also attracted predators
— Julia Alexander (@loudmouthjulia) June 3, 2019
TWITTER RECOMMENDATION ALGORITHM: would you like to see some porn your friends like
FACEBOOK RECOMMENDATION ALGORITHM: this terrible thing happened a year ago
AMAZON RECOMMENDATION ALGORITHM: buy five more TVs
YOUTUBE RECOMMENDATION ALGORITHM: would you like to become a nazi
— samurai psychologist (@AliceAvizandum) July 8, 2018