The YouTube algorithm doesn’t care if you ‘thumbs down’ videos


A picture of a YouTube screen with the mouse hovering over the Dislike button.

YouTube already stopped displaying the number of dislikes a video receives, but apparently giving a video a thumbs down doesn’t change how many similar videos the platform recommends.
Photo: Wachiwit (Shutterstock)

My YouTube recommendations are filled with old reruns of Gordon Ramsay’s Kitchen Nightmares. That might be partly my own fault for getting drunk one night and watching a full episode. Let me tell you, if there’s one thing I don’t want on my feed anymore, it’s the famous British blowhard tearing into another chef while the world’s most grating sound effects (braaa-reeeee) play in the background. I disliked many of these videos, but now Hell’s Kitchen is all over my page, and I increasingly feel like a “raw” steak that Ramsay is prodding and scolding.

But apparently I’m not alone in my YouTube recommendation woes. A report from the Mozilla Foundation released Monday argues, based on a survey and crowdsourced data, that the “dislike” and “don’t recommend channel” feedback tools don’t actually change video recommendations.

Really, there are two points here. One, users constantly feel that the controls Google-owned YouTube provides don’t actually make a difference. Two, based on data collected from users, the controls have a “negligible” impact on recommendations, meaning “the most unwanted videos keep slipping through.”

The foundation drew on data from its own RegretsReporter browser extension, which lets users block certain YouTube videos from appearing in their feed. The report says its analysis was based on 2,757 survey respondents and 22,722 people who gave Mozilla access to more than 567 million video recommendations collected from late 2021 through June 2022.

Although the researchers admit the respondents are not a representative sample of YouTube’s large and diverse audience, a third of respondents said that using YouTube’s controls did not seem to change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam, only to see them return to their feed later. Respondents often said that blocking one channel would simply lead to recommendations from similar channels.

YouTube’s algorithm recommends videos users don’t want to see, and the content is often far worse than old Ramsay cable reruns. A 2021 report from Mozilla, also based on crowdsourced user data, found that people browsing the video platform are regularly recommended violent content, hate speech, and political disinformation.

In this latest report, Mozilla researchers pointed to video pairs showing that when users rejected a video, such as a Tucker Carlson screed, the algorithm would simply recommend another video from the Fox News YouTube channel. Based on a review of 40,000 video pairs, they found that when a channel is blocked, the algorithm often just recommends very similar videos from similar channels. Using the “Dislike” and “Not Interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “don’t recommend channel” and “remove from watch history” buttons were more effective at correcting user feeds, but only by 43% and 29%, respectively.
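To make those percentages concrete, here is a minimal sketch of how a “percent of unwanted recommendations prevented versus a control group” figure can be computed. This is an illustration, not Mozilla’s actual methodology, and every rate below is hypothetical, chosen only so the outputs echo the report’s figures.

```python
# Sketch: compare the rate of unwanted ("bad") recommendations in each
# feedback group against a control group that sent no feedback at all.
# All rates below are hypothetical.

def reduction_vs_control(bad_rate_group: float, bad_rate_control: float) -> float:
    """Fraction of unwanted recommendations prevented relative to control."""
    return 1.0 - bad_rate_group / bad_rate_control

bad_rate_control = 0.20          # hypothetical: 20% of recs unwanted with no feedback
groups = {
    "Dislike": 0.176,            # hypothetical rates picked to reproduce the
    "Not interested": 0.178,     # report's 12%, 11%, 43%, and 29% reductions
    "Don't recommend channel": 0.114,
    "Remove from watch history": 0.142,
}

for name, rate in groups.items():
    print(f"{name}: {reduction_vs_control(rate, bad_rate_control):.0%} prevented")
```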

“In our data analysis, we determined that YouTube’s user control mechanisms are inadequate as tools to prevent unwanted recommendations,” the Mozilla researchers wrote in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an emailed statement that “Our controls don’t filter out entire topics or views, as this could have negative effects on viewers, such as creating echo chambers.” The company said it doesn’t prevent all content on related topics from being recommended, but it also claims to promote “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube’s vice president of engineering, wrote that the company’s system is “constantly evolving,” but that providing transparency about its algorithm “isn’t as simple as listing a formula for recommendations,” since its systems take into account clicks, watch time, survey responses, shares, likes, and dislikes.
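Goodrow’s list of signals hints at why a single thumbs down can get drowned out. Below is a toy sketch of a multi-signal ranking score; YouTube’s real system is not public, and every signal name and weight here is invented purely for illustration.

```python
# Toy illustration (NOT YouTube's actual formula): a recommendation score
# combines many weighted signals, so one negative signal rarely dominates.

SIGNAL_WEIGHTS = {              # hypothetical weights
    "predicted_click": 1.0,
    "expected_watch_time": 2.5,
    "survey_satisfaction": 3.0,
    "shares": 1.5,
    "likes": 1.0,
    "dislikes": -1.0,           # a dislike is just one signal among many
}

def score(signals: dict[str, float]) -> float:
    """Weighted sum of per-video signal values (each normalized to 0..1)."""
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

# A video the user disliked can still rank highly if other signals are strong.
print(score({"predicted_click": 0.9, "expected_watch_time": 0.8,
             "survey_satisfaction": 0.5, "shares": 0.1,
             "likes": 0.0, "dislikes": 1.0}))
```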

Of course, just like every social media platform out there, YouTube has struggled to build systems that can combat the full breadth of bad or even predatory content uploaded to the site. An upcoming book shared exclusively with Gizmodo claimed YouTube came close to forgoing billions of dollars in ad revenue in order to deal with the weird and creepy videos being recommended to kids.

While Hernandez said the company has expanded its Data API, the spokesperson added, “Mozilla’s report doesn’t take into account how our systems actually work, so it’s hard for us to gather a lot of information.”
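For reference, the Data API Hernandez mentions is YouTube’s public Data API v3, which exposes per-video statistics. Here is a minimal sketch of querying it, assuming the google-api-python-client package and an API key of your own; notably, public dislike counts were removed from these responses in late 2021.

```python
# Minimal sketch: fetch a video's public statistics from the YouTube Data API v3.
# Requires: pip install google-api-python-client, plus an API key from Google Cloud.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder; supply your own key

youtube = build("youtube", "v3", developerKey=API_KEY)
response = youtube.videos().list(
    part="statistics",
    id="dQw4w9WgXcQ",  # any public video ID
).execute()

for item in response.get("items", []):
    # Since late 2021, "dislikeCount" is no longer returned for most videos;
    # expect viewCount, likeCount, favoriteCount, and commentCount instead.
    print(item["id"], item["statistics"])
```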

But that’s a criticism Mozilla lays right back at Google’s feet, saying the company doesn’t provide enough access for researchers to gauge what affects YouTube’s secret sauce, AKA its algorithms.
