
YouTube Automated Tool Goes Wrong

Much as the company had feared, a new YouTube tool failed in a highly public way on April 15, 2019, wrongly linking video of the burning collapse of the tower at Notre Dame Cathedral in Paris to the September 11, 2001, terrorist attacks.

As images of the iconic tower falling spread across newscasts around the world, YouTube channels showing those newscasts displayed boxes below the videos with information about the collapse of the World Trade Center after the terrorist attacks, which killed thousands of people.

The details about the World Trade Center were posted automatically because of the visual resemblance between the two events; the computer could not tell the two incidents apart. YouTube had begun showing such information panels, including the one with factual data about the 9/11 tragedy, only in the past few months.

The failure highlights the enduring limits of computerized tools for identifying and fighting misinformation. While top companies have hired thousands of human moderators to catch these kinds of issues, Silicon Valley executives have said that computers are faster and more efficient at detecting problems. But the incident exposed the real weaknesses of computerized systems. It came just a few months after YouTube and Facebook struggled to detect and block videos of a mass shooting at a New Zealand mosque that Internet users were posting and reposting.

YouTube later accepted the fault, but the mistake fed a wave of unjustified rumors on social media that the fire was a terrorist attack. It did not stop there: some users spread the false claim that the fire had been started by Muslim terrorists.

The tool was one of the central ideas YouTube unveiled last year in the aftermath of the school shooting in Parkland, Florida. After that incident, a video suggesting one of the teenage survivors was a “crisis actor” rose to the top of YouTube’s “trending” videos.

Addressing the fire at the Notre Dame Cathedral, YouTube said in a statement, “We are deeply saddened by the ongoing fire at the Notre Dame cathedral. Last year, we launched information panels with links to third-party sources like Encyclopaedia Britannica and Wikipedia for subjects subject to misinformation. These panels are triggered algorithmically, and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.”

A spokesperson later said the company was “reviewing and taking action in line with our rules.”

YouTube and other major technology companies have had success using artificial intelligence to identify some images and content that users upload to their platforms, including child pornography and imagery from extremist terrorist groups, which relies on familiar flags, logos, and certain violent images.
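The article does not describe how these systems work internally, but matching uploads against a database of previously flagged material is commonly done by comparing compact image “fingerprints.” Below is a minimal sketch of that idea in Python using a simple average hash; the file names, the KNOWN_HASHES database, and the distance threshold are all hypothetical, and production systems rely on far more robust hashing and machine-learned classifiers.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale image, then set one bit per
    pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known flagged imagery.
KNOWN_HASHES = {average_hash("known_flagged_frame.jpg")}

def matches_known_content(path: str, threshold: int = 10) -> bool:
    """Flag an upload if its fingerprint is close to any known one."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

This kind of matching works well precisely because the targets (flags, logos, reused footage) look nearly identical each time they are uploaded, which is also why it breaks down on novel events.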

But automated tools have struggled with unexpected problems, like the visual similarity between two different collapses. These systems have also struggled with video content involving hateful theories, violent material, and adult images; in one recent incident, a clip encouraged children to commit suicide.

Pedro Domingos, a machine-learning researcher and University of Washington professor, said the algorithm’s failure on Monday “doesn’t surprise me at all.”

If the automated algorithm saw a video of a tall structure wreathed in smoke, it would struggle to tell the two events apart. The panel system simply read the similarities between the two incidents and served up the factual information about the 9/11 tragedy.
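To make this failure mode concrete, here is a toy sketch of how a similarity score between frame embeddings can cross a trigger threshold for two visually alike but unrelated events. This is not YouTube’s actual system, whose internals are not public; the feature vectors and the threshold are invented purely for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two feature vectors, 1.0 meaning identical direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical feature vectors; a real system would produce these with a
# neural network that encodes video frames as embeddings.
reference_911 = np.array([0.9, 0.8, 0.1])  # e.g. "tall structure", "smoke", "flames"
notre_dame    = np.array([0.8, 0.9, 0.2])  # visually close, but an unrelated event

THRESHOLD = 0.9  # assumed trigger threshold for attaching a panel

if cosine_similarity(reference_911, notre_dame) >= THRESHOLD:
    # The two vectors score ~0.99 here, so the panel fires: the wrong call.
    print("Attach 9/11 information panel")
```

Because the score measures only visual resemblance, nothing in this pipeline encodes the context that one event is a fire at a cathedral and the other a terrorist attack.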

The incident showed that algorithm-driven panels are not prepared for breaking news events; they can become a source of misinformation themselves. The algorithms lack comprehension of human context and common sense.

Source URL: https://www.washingtonpost.com/technology/
