Hello,

From conspiracy theories to propaganda, YouTube’s algorithm promotes a minefield of controversial videos. Have you ever wondered why it keeps recommending the same clickbait content even after you click “dislike” or “do not recommend”? So have we – and we looked into it. What we found is alarming: YouTube’s user controls do not always work and largely fail to stop unwanted recommendations.

Sign Mozilla's petition and call on YouTube to give users real control over their video recommendations!

For our report “Does This Button Work? Investigating YouTube's Ineffective User Controls” we studied YouTube’s feedback mechanisms with the help of 20,000 Mozilla supporters who donated their data through Mozilla’s RegretsReporter browser extension. Our main findings are:

- Users are not in control of their experience on the video platform
- Even after using the feedback mechanisms, YouTube’s algorithm recommends unwanted videos
- YouTube can fix this problem
This calls into question two of YouTube’s favorite marketing claims: that people can shape their own experience, and that the algorithm optimizes for user satisfaction.

Our findings are even more worrying in the context of our previous YouTube research: in our 2021 “YouTube Regrets” study, we found that YouTube’s algorithm promotes videos containing misinformation, hate speech and violence. A recommender system that has been found to regularly recommend dangerous content and that does not consistently listen to user feedback desperately needs to be fixed.

It’s time for YouTube to fix its feedback tools and put people in the driver’s seat. Add your name to our petition and tell YouTube to fix them.

Thank you,

Christian Bock
Head of Supporter Engagement
Mozilla Foundation