Researchers created four fake “user accounts” – two identified as nine-year-old boys and two identified as 14-year-old boys – and used them to watch exclusively gaming-related videos, albeit not always strictly age-appropriate ones, in an attempt to come up with an accurate cross-section of what a real child and teenager would be looking at.

For the nine-year-old, that included videos for games such as Roblox and Lego Star Wars, but also the horror game Five Nights at Freddy’s, set in a parody of the Chuck E Cheese restaurant chain. For the 14-year-old, the playlist “consisted primarily of videos of first-person shooter games like Grand Theft Auto, Halo and Red Dead Redemption”.

After using the accounts to watch video gaming content, the researchers logged and analysed the videos the algorithm recommended, with one in each group passively tracking the recommendations and the other actively clicking and viewing them.

All four test accounts were disclosed to YouTube as being those of children, with consistent birthdates and, for the under-13s, a linked “parental” account in accordance with the video site’s policies.

Despite that, the recommendation algorithm showed the users videos that were not only wildly age-inappropriate, but apparently in violation of YouTube’s terms of service altogether. For all the accounts, YouTube’s algorithm pushed content related to weapons, shootings and murderers, but for those who actively watched the material it recommended such footage at much higher volumes.

The report said: “Many of these videos violated YouTube policies, which prohibit showing ‘violent or gory content intended to shock or disgust viewers’, ‘harmful or dangerous acts involving minors’ and ‘instructions on how to convert a firearm to automatic’.

“YouTube took no apparent steps to age-restrict these videos, despite stating it has the option to do so with content that features ‘adults participating in dangerous activities that minors could easily imitate’.”

TTP was clear that video games were not to blame for violent behaviour. “For more than two decades, politicians have pointed to violent video games as the root cause of mass shootings in the United States, even though researchers have found no evidence to support that claim,” the report said.

“But TTP’s study shows there is a mechanism that can lead boys who play video games into a world of mass shootings and gun violence: YouTube’s recommendation algorithm.”

TTP’s executive director, Michelle Kuppersmith, said: “It’s bad enough that YouTube makes videos glorifying gun violence accessible to children. Now, we’re discovering that it recommends these videos to young people. Unfortunately, this is just the latest example of Big Tech’s algorithms taking the worst of the worst and pushing it to kids in an endless pursuit of engagement.”

In a statement, a YouTube spokesperson said: “We offer a number of options for younger viewers, including a standalone YouTube Kids app and our Supervised Experience tools which are designed to create a safer experience for tweens and teens whose parents have decided they are ready to use the main YouTube app.

“But in reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. For example, the study doesn’t provide context of how many overall videos were recommended to the test accounts, and also doesn’t give insight into how the test accounts were set up, including whether YouTube’s Supervised Experiences tools were applied.

“We welcome research on our recommendations, and we’re exploring more ways to bring in academic researchers to study our systems.”