Mozilla’s RegretsReporter data shows YouTube keeps recommending harmful videos

Illustration by Alex Castro / The Verge

It’s no longer much in question that YouTube’s machine learning-driven recommendation feed can surface edgy or even radicalizing videos. YouTube itself has promoted tools that it says give users more control over their feed and more transparency about certain recommendations, but it’s difficult for outsiders to know what impact those tools are having. Now, after spending much of the last year collecting data via the RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has more information on what people see when the algorithm makes the wrong choice, and has released a detailed report (pdf).

The extension launched in September 2020, taking a crowdsourced approach to find...
