Google-owned YouTube has announced that it will no longer recommend videos that come close to violating its community guidelines. In a blog post, YouTube said that the videos the site recommends, usually after a user has finished watching one, would no longer lead only to similar videos and would instead "pull in recommendations from a wider set of topics." YouTube also stated that the change will affect "less than one percent" of videos, though that may still amount to millions of clips. There is speculation that this change will strike a balance between freedom of speech and the authenticity of the videos on the platform.
This update will not affect a video's accessibility: if a user is subscribed to a channel that, for example, creates conspiracy content, or searches for such videos, he or she will still see the related recommendations. Guillaume Chaslot, a former Google engineer who said he helped build the artificial intelligence used to curate recommended videos, praised the changes. "It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable," Chaslot wrote.
According to Chaslot, the primary goal of YouTube's artificial intelligence was to keep users on YouTube as long as possible in order to show them more advertisements. As users watched various videos, the artificial intelligence not only became biased toward the content those users found most engaging, but also recorded those viewing patterns so it could replicate them with other users.
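To make the feedback loop Chaslot describes concrete, here is a purely illustrative toy sketch (not YouTube's actual system, whose details are not public): a recommender that scores each candidate topic by the average watch time observed so far, so whatever already holds users' attention gets recommended even more.

```python
from collections import defaultdict

class ToyRecommender:
    """Toy engagement-maximizing recommender (illustrative only)."""

    def __init__(self):
        # topic -> list of watch times (in minutes) observed across users
        self.watch_times = defaultdict(list)

    def record_view(self, topic, minutes_watched):
        """Log how long a user stayed on a video about `topic`."""
        self.watch_times[topic].append(minutes_watched)

    def recommend(self, candidates):
        """Pick the candidate topic with the highest average watch time."""
        def avg_watch(topic):
            times = self.watch_times.get(topic, [])
            return sum(times) / len(times) if times else 0.0
        return max(candidates, key=avg_watch)

rec = ToyRecommender()
rec.record_view("cooking", 3)
rec.record_view("conspiracy", 12)  # sensational content holds attention longer
rec.record_view("conspiracy", 15)
# The recommender now favors the higher-engagement topic:
print(rec.recommend(["cooking", "conspiracy"]))  # conspiracy
```

Because the scores are shared across users, engagement data from one viewer biases what everyone else is shown, which is the pattern-replication effect described above.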
What Would Be the Impact of YouTube's Modification?
YouTube's spokesperson wrote, "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users." YouTube is continually making changes to its recommendation system to rank authentic videos more prominently and to restrict conspiracy videos and similar content.
YouTube also stated that it wants to recommend videos that help users spend their time wisely rather than leaving them frustrated or restless. It has also been working to broaden recommendations so that users see a variety of videos instead of more of the same.
YouTube did not say much about how it would identify which videos would be removed from its recommendations; machine-learning algorithms will decide which specific videos are affected. YouTube said it would roll out the change gradually, starting with a small set of videos in the United States, but intends to introduce the modification globally as the system becomes more accurate.
This new policy shows that YouTube is taking a more aggressive stance toward offensive content, even when that content does not violate YouTube's community guidelines.