According to a new study by Mozilla, even when users tell YouTube they are not interested in certain types of videos, similar recommendations keep appearing.
Using video recommendation data from more than 20,000 YouTube users, Mozilla researchers found that controls such as "Not interested", "Dislike", "Don't recommend channel", and "Remove from watch history" were largely ineffective at preventing similar content from being suggested.
The report shows that even at their best, these buttons still let more than half of unwanted recommendations through; at their worst, they barely block similar videos at all.
Research data shows YouTube's controls perform poorly
To collect data from real users, the researchers asked participants to install RegretsReporter, a browser extension that adds a "Stop recommending" button to YouTube videos the participants watch.
From more than 500 million recommended videos, research assistants created over 44,000 video pairs, each consisting of one rejected video and one video YouTube subsequently recommended. Researchers then evaluated the pairs manually, or used machine learning, to decide whether the recommendation was too similar to the video the user had rejected.
Compared to a baseline control group, sending "Dislike" and "Not interested" signals was only "marginally effective" at preventing bad recommendations, blocking just 12% and 11.5% of them, respectively.
The "Don't recommend channel" and "Remove from watch history" buttons were somewhat more effective, blocking 43% and 29% of bad recommendations, but the researchers say the tools the platform provides are still not enough to keep unwanted content away.
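As a rough illustration of how a "blocked share" figure like those above can be derived (this is a hypothetical sketch, not Mozilla's actual analysis code, and the rates used are invented), the effectiveness of each button can be expressed as the reduction in the rate of too-similar recommendations relative to the control group:

```python
# Hypothetical sketch: estimating what fraction of "bad" (too-similar)
# recommendations a feedback button prevents, compared with a control
# group that sent no feedback signal. The rates below are illustrative.

def blocked_share(control_rate: float, treatment_rate: float) -> float:
    """Fraction of bad recommendations prevented versus the control baseline."""
    return 1 - treatment_rate / control_rate

# If the control group saw too-similar recommendations 40% of the time and
# the "Don't recommend channel" group saw them 22.8% of the time:
print(round(blocked_share(0.40, 0.228) * 100))  # 43
```

A relative measure like this is what allows the study to speak of a button "blocking 43% of bad recommendations" rather than quoting raw recommendation counts, which vary between users.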
YouTube should respect the feedback users give about their experience, treating it as a meaningful signal about how people want to spend their time on the platform, the researchers say.
YouTube speaks out
YouTube spokesperson Elena Hernandez said the behavior is intentional, because the platform does not try to block all content related to a topic. Hernandez also criticized the report for not considering how YouTube's controls are actually designed.
"It is important that our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, such as creating echo chambers," she added.
"We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla's report does not take into account how our systems actually work, so it is difficult for us to glean many insights," the YouTube representative said.
Hernandez said the report's definition of "similar" does not account for how YouTube's recommendation system works. The "Not interested" option removes a specific video, while the "Don't recommend channel" button prevents that channel from being recommended in the future. The company said it does not aim to stop recommendations of all content related to a topic, viewpoint, or speaker.
Beyond YouTube, platforms such as TikTok and Instagram have introduced more and more feedback tools that let users steer the algorithm toward content that suits them. But users often complain that even after flagging content they do not want to see, similar recommendations keep appearing.