20 April, 2024
YouTube

The study indicates YouTube’s ‘dislike’ and ‘not interested’ buttons don’t work.

According to a Mozilla study, the majority of unwanted recommendations could not be prevented by using the platform’s feedback buttons.

According to a new study by Mozilla, even when users tell YouTube they aren’t interested in certain kinds of videos, the service continues to recommend similar ones.

Using video recommendation data from more than 20,000 YouTube viewers, Mozilla researchers found that the “not interested,” “dislike,” “don’t recommend channel,” and “remove from watch history” buttons are largely ineffective at stopping similar content from being recommended. Even at their most effective, the study found, these buttons still let through more than half of recommendations similar to content a user had explicitly rejected; at their least effective, they blocked only a small fraction of unwanted recommendations.

To collect data on real videos and viewers, Mozilla researchers enlisted volunteers who installed the organization’s RegretsReporter browser extension, which overlays a universal “stop recommending” button on participants’ YouTube videos. Users were randomly assigned to one of five groups, each of which sent a different signal to YouTube when the button was clicked: “dislike,” “not interested,” “don’t recommend channel,” “remove from watch history,” or no feedback at all (the control group).
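Purely as an illustration, a randomized assignment like this might be implemented along the following lines. The Python below is a sketch, not Mozilla’s actual RegretsReporter code, and every name in it (assign_arm, send_feedback, on_stop_recommending_clicked) is hypothetical.

```python
# Illustrative sketch only -- not Mozilla's RegretsReporter code.
# Each participant is deterministically hashed into one of five arms,
# and a click on the universal button is translated into that arm's signal.
import hashlib

ARMS = ["dislike", "not interested", "don't recommend channel",
        "remove from watch history", "control"]  # control sends no feedback

def assign_arm(participant_id: str) -> str:
    """Place a participant into one of the five experimental arms."""
    digest = hashlib.sha256(participant_id.encode()).hexdigest()
    return ARMS[int(digest, 16) % len(ARMS)]

def send_feedback(video_id: str, signal: str) -> None:
    """Stand-in for whatever transport the real extension uses."""
    print(f"sending {signal!r} for video {video_id}")

def on_stop_recommending_clicked(participant_id: str, video_id: str) -> None:
    """Map the universal 'stop recommending' button to the arm's signal."""
    arm = assign_arm(participant_id)
    if arm == "control":
        return  # the click is logged but no signal reaches YouTube
    send_feedback(video_id, signal=arm)

on_stop_recommending_clicked("participant-123", "dQw4w9WgXcQ")
```

Hashing the participant ID, rather than drawing a random number on each click, keeps a given user in the same arm for the whole experiment.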

Research assistants used data from more than 500 million recommended videos to build over 44,000 pairs, each consisting of a “rejected” video and a video YouTube later recommended. Researchers then evaluated the similarity of each pair to determine whether the recommendation was too close to the video the user had previously rejected.
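Mozilla’s report describes a trained model for judging similarity; as a loose stand-in, here is a minimal sketch that scores a pair using TF-IDF cosine similarity over video titles. The threshold and the example titles are invented for illustration.

```python
# Loose stand-in for the study's similarity judgment: the real analysis used
# a trained model; this sketch uses TF-IDF cosine similarity over titles.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def is_bad_recommendation(rejected_title: str, recommended_title: str,
                          threshold: float = 0.5) -> bool:
    """Flag a recommended video as too similar to one the user rejected."""
    vectors = TfidfVectorizer().fit_transform([rejected_title, recommended_title])
    score = cosine_similarity(vectors[0], vectors[1])[0, 0]
    return score >= threshold

# One invented pair: a rejected video and a later recommendation.
print(is_bad_recommendation("cute cat compilation 2022",
                            "best cute cat compilation 2022"))  # True
```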

Compared with the control group, sending a “dislike” or “not interested” signal prevented only 12% and 11% of bad recommendations, respectively. The “don’t recommend channel” and “remove from watch history” buttons fared better, preventing 43% and 29% of bad recommendations, but the researchers say the platform’s options remain inadequate for steering unwanted content away.
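One plain way to read “prevented X% of bad recommendations” is as the relative reduction in an arm’s bad-recommendation rate versus the no-feedback control group. The sketch below uses made-up rates, chosen only so that the outputs match the article’s figures.

```python
# Back-of-the-envelope reading of 'prevented X% of bad recommendations':
# the relative reduction in an arm's bad-recommendation rate versus the
# no-feedback control group. The rates below are invented for illustration;
# only the resulting percentages match the figures in the article.
control_rate = 0.400  # hypothetical bad-recommendation rate, control group

arm_rates = {
    "dislike": 0.352,
    "not interested": 0.356,
    "don't recommend channel": 0.228,
    "remove from watch history": 0.284,
}

for arm, rate in arm_rates.items():
    prevented = 1 - rate / control_rate  # relative reduction vs control
    print(f"{arm}: {prevented:.0%} of bad recommendations prevented")
```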

The researchers argue that YouTube should take these user signals seriously, treating them as “important indications” of how viewers want to spend their time on the site.

YouTube spokesperson Elena Hernandez said this behavior is intentional, because YouTube does not try to filter out all content related to a topic. She also argued that the study fails to account for the deliberate design of YouTube’s controls.

“It’s important that our controls do not filter out entire topics or opinions, as this might have detrimental implications for viewers, like creating echo chambers,” Hernandez told The Verge. She added that YouTube recently expanded Data API access through the YouTube Researcher Program to support more academic study of the service, and that Mozilla’s report offers limited insight because it ignores how YouTube’s systems actually work.

Hernandez also contends that Mozilla’s definition of “similar” is flawed because it does not account for how YouTube’s recommendation engine actually functions. Selecting “not interested” removes a specific video from future recommendations, she says, while “don’t recommend channel” prevents that channel from being recommended again. The company maintains that it does not try to suppress all recommendations related to any particular viewpoint, position, or speaker.
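Taken at face value, the behavior Hernandez describes amounts to per-video and per-channel filtering of the candidate pool, with no topic-level filtering. A hypothetical sketch, with invented names and structures:

```python
# Hedged sketch of the behavior described above: 'not interested' removes one
# specific video from the candidate pool, while 'don't recommend channel'
# removes every video from that channel. All names here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Candidate:
    video_id: str
    channel_id: str

def apply_controls(candidates, not_interested_videos, blocked_channels):
    """Filter per-video and per-channel; note there is no per-topic filter."""
    return [c for c in candidates
            if c.video_id not in not_interested_videos
            and c.channel_id not in blocked_channels]

pool = [Candidate("v1", "chanA"), Candidate("v2", "chanA"), Candidate("v3", "chanB")]
print(apply_controls(pool, not_interested_videos={"v1"}, blocked_channels={"chanB"}))
# Only v2 survives: v1 was rejected individually, chanB was blocked wholesale.
```

Nothing in this model blocks similar videos from other channels, which is consistent both with YouTube’s stated design and with Mozilla’s finding that similar content keeps getting through.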

YouTube is not alone: platforms such as TikTok and Instagram have introduced a growing number of feedback tools that ostensibly let users train the algorithm to show them relevant content. Yet users often keep seeing unwanted recommendations even after flagging them. According to Mozilla researcher Becca Ricks, platforms aren’t transparent about how feedback is taken into account, and it isn’t always clear what the different controls actually do.

“I think that in the case of YouTube, the platform is balancing user engagement with user satisfaction,” Ricks told The Verge in an email. In other words, YouTube must weigh recommending content that keeps users on the site longer against content the algorithm predicts they will actually enjoy. It is ultimately up to the platform to decide how much weight each of these signals gets in the algorithm, but Mozilla’s research suggests that user feedback is not always the most important one.
