
That YouTube's machine learning-driven recommendations can frequently surface edgy or even radicalizing results isn't much of a question anymore. YouTube itself has introduced tools that it says give users more control over their feed and more transparency about certain recommendations, but it's difficult for outsiders to know what impact those tools are actually having. Now, after spending much of the past year collecting data from its RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has more information on what people see when the algorithm makes the wrong choice, and it has released a detailed report (PDF).

The extension launched in September 2020, taking a crowdsourced approach to find…
