YouTube to Tweak Its Recommendation Algorithm

YouTube said it will retool the recommendation algorithm that suggests new videos to users in order to prevent the promotion of conspiracy theories and false information, reflecting a growing willingness to quell misinformation on the world’s largest video platform.

In a blog post, the company said it has been taking a “closer look” at how it can reduce the spread of content that “comes close to but doesn’t quite cross the line” of violating its rules. YouTube has been criticized for directing users to conspiracies and false content after they started watching legitimate news.

YouTube said the change will be a six-month effort that starts small: it will initially apply to less than one percent of the content on the site and will affect only English-language videos, meaning the unwanted content will still surface in other languages.

The company went on to say that none of these videos will be deleted from YouTube; they will remain findable for people who wish to watch or subscribe to them.

The blog post read: “We think this change will strike a balance between the maintenance of a platform for free speech and living up to our responsibility to users.”

YouTube does not prohibit conspiracy theories or other forms of false information, though it does ban hate speech.

YouTube’s recommendation feature suggests new videos to users based on videos they have previously watched. The algorithm takes “watch time” and view counts into account before suggesting videos: if a video is frequently watched to the end, the software treats it as a quality video and automatically promotes it to others. Since 2016, the company has also incorporated satisfaction, likes, dislikes, and other metrics into its recommendation systems.
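To make those ranking signals concrete, here is a minimal, hypothetical sketch in Python. The scoring formula, weights, and field names are illustrative assumptions only, not YouTube’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int                 # total view count
    avg_watch_fraction: float  # 0.0-1.0, average share of the video watched
    likes: int
    dislikes: int

def recommendation_score(v: Video) -> float:
    """Toy ranking score combining the signals the article mentions:
    watch time, view count, and like/dislike 'satisfaction' metrics.
    The weights below are arbitrary illustrative choices."""
    satisfaction = (v.likes - v.dislikes) / max(v.likes + v.dislikes, 1)
    # Videos watched to the end get a large boost, echoing how
    # high watch time was treated as a quality signal.
    return (
        0.6 * v.avg_watch_fraction
        + 0.3 * min(v.views / 1_000_000, 1.0)  # cap the view-count signal
        + 0.1 * satisfaction
    )

candidates = [
    Video("news clip", views=50_000, avg_watch_fraction=0.9, likes=800, dislikes=40),
    Video("clickbait", views=2_000_000, avg_watch_fraction=0.2, likes=500, dislikes=450),
]
for v in sorted(candidates, key=recommendation_score, reverse=True):
    print(f"{v.title}: {recommendation_score(v):.3f}")
```

In a sketch like this, a video that holds viewers to the end can outrank one with far more raw views, which mirrors the watch-time emphasis described above.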

But the algorithm can take a sharp turn away from mainstream videos and suggest extremist ideas. The Washington Post reported in December that YouTube continued to recommend hateful and conspiratorial videos, fueling racist and anti-Semitic content.

YouTube also recently developed software intended to stop conspiracy theories from going viral. That effort followed an incident in February, when a video claiming that a teenage survivor of the Parkland school shooting was a “crisis actor” became the top trending item on YouTube.

YouTube’s search feature has also been called out for promoting conspiracies and false content. In January, a search for “RBG,” the initials of Supreme Court Justice Ruth Bader Ginsburg, returned a large number of far-right videos peddling conspiracies and very little authentic news-related content.

Six months ago, YouTube began recruiting human evaluators who review content against a set of guidelines. The company uses the evaluators’ feedback to train the algorithms that generate recommendations.
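As a rough sketch of how such evaluator feedback might feed back into ranking, the snippet below trains a simple classifier on hypothetical evaluator labels and uses it to downrank borderline videos. The features, labels, and library choice (scikit-learn) are assumptions for illustration, not YouTube’s pipeline.

```python
# A minimal sketch, assuming evaluator ratings are turned into binary
# labels (1 = borderline/misleading, 0 = fine) with simple numeric features.
from sklearn.linear_model import LogisticRegression

# Hypothetical feature vectors: [avg_watch_fraction, dislike_ratio, report_rate]
X = [
    [0.9, 0.05, 0.001],  # rated fine by evaluators
    [0.8, 0.10, 0.002],  # fine
    [0.7, 0.45, 0.030],  # borderline
    [0.6, 0.50, 0.050],  # borderline
]
y = [0, 0, 1, 1]

clf = LogisticRegression().fit(X, y)

def adjusted_score(base_score: float, features: list) -> float:
    """Downrank a video in proportion to the model's estimate that
    evaluators would flag it as borderline content."""
    p_borderline = clf.predict_proba([features])[0][1]
    return base_score * (1.0 - p_borderline)

print(adjusted_score(0.8, [0.65, 0.48, 0.04]))
```

The key design idea the article describes is that human judgments, rather than engagement alone, steer what gets demoted; the model here simply stands in for that learned filter.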
