
Facebook’s Flood Of Languages Leaves It Struggling To Monitor Content

Facebook Inc’s battle against hate speech and other kinds of harmful content is being hampered by the company’s inability to keep up with a flood of new languages, as mobile phones reach every corner of the globe.

The company offers its 2.3 billion users features such as menus and prompts in 111 officially supported languages. A news review found another 31 widely spoken languages in use on Facebook that do not have official support.

Facebook’s detailed rules, known as ‘Community Standards’, which prohibit users from posting offensive material such as hate speech and celebrations of violence, had been translated into only 41 of the 111 supported languages as of March 2019.

Facebook’s 15,000-strong content moderation workforce speaks around 50 languages, though the company said it hires professional translators when needed. Automated tools for detecting hate speech work in around 30.

The language gap has become a major barrier in the fight against harmful content, and the shortfall could hurt the company itself. Countries such as Australia, the U.K., and Singapore are introducing new regulations that include jail terms and fines for executives if offensive posts are not removed.

According to a spokesperson, the rules are translated case by case, depending on whether a language has a critical mass of usage and whether Facebook has a reliable source of speakers. The spokesperson also said there is no exact figure that defines critical mass.

According to earlier reports, hate speech on Facebook helped fuel ethnic violence in Myanmar, going unchecked in part because the company was slow to add moderation tools and staff for the local language. Facebook says it now offers its guidelines in Burmese and has more than 100 speakers of the language on its workforce.

A Facebook spokesperson said the company’s work on safety measures to protect users from destructive and violent content involves a level of language coverage that exceeds that of almost any other technology company.

Even so, human rights experts say Facebook risks repeating its Myanmar problems in other countries divided by violent conflict, where its language capabilities have not kept pace with the impact of social media.

Phil Robertson, deputy director of Human Rights Watch’s Asia Division, said the community standards are the rules of the road that both users and regulators should be able to see and understand if a social media platform is to protect people effectively. “Failure to do so opens the door to serious abuses,” he added.

Abuse In Fijian:

Mohammed Saneem, Fiji’s supervisor of elections, said he felt the impact of the language gap during last November’s polls. He said he dedicated a staffer to emailing posts and translations to a Facebook employee in Singapore in the hope of getting them removed.

Facebook, however, said it did not request translations, and it gave Reuters a post-election letter from Saneem praising its “timely and effective assistance.”

Saneem told the news agency that he appreciated the help but had expected more proactive measures from Facebook. “If they are allowing users to post in their language, there should be guidelines available in the same language,” he said.

According to data from the language encyclopedia Ethnologue, around 652 million people speak languages that Facebook supports but for which the rules are available only in English. Another 230 million or more speak one of the 31 languages with no official support.

Automated software is a critical tool in Facebook’s effort to block offensive content. The company said it relies on a branch of artificial intelligence known as machine learning; these tools can detect hate speech in around 30 languages and terrorist propaganda in 19.

Guy Rosen, the Facebook vice president who oversees automated policy enforcement, said artificial intelligence requires massive amounts of data to train computers, and a shortage of text in many languages makes it hard to build the tools quickly.
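Facebook has not published details of its classifiers, but a minimal sketch of a supervised text classifier helps illustrate Rosen’s point: without enough labeled examples in a given language, there is little for the model to learn from. The toy texts, labels, and library choices below are illustrative assumptions, not Facebook’s actual system.

```python
# Minimal sketch of a supervised text classifier (NOT Facebook's system).
# The toy data below is hypothetical; real tools need large labeled corpora
# in each language, which is exactly the scarcity problem described above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training posts: 1 = policy-violating, 0 = benign.
texts = [
    "we should drive them all out of the country",
    "lovely weather at the market today",
    "they do not deserve to live among us",
    "congratulations on the new baby",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a common baseline for text classification.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Scoring a new post. With so few examples, and none in a low-resource language,
# the probabilities are unreliable -- the practical effect of missing training text.
print(model.predict_proba(["they should all be removed from here"]))
```

A model like this only works in languages where labeled text exists in volume, which is why coverage lags for less widely digitized languages.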

Growth Region:

Beyond automation and a small number of official fact-checkers, Facebook depends on users to report offensive content, a significant problem in places where the Community Standards are not understood or not even known to exist.

In March, Ebele Okobi, Facebook’s director of public policy for Africa, said the continent had the world’s lowest rates of user reporting. Many people, she said, do not know the Community Standards exist.

She added that Facebook has bought radio ads in Nigeria and is working with local organizations to address the language problem.

At the same time, Facebook is working with wireless carriers and other groups to expand internet access in countries including Uganda and the Democratic Republic of Congo, where widely spoken local languages still lack official support.

The company announced in February that it would soon have its first 100 content moderators based in sub-Saharan Africa, working at an outsourcing facility in Nairobi. They will join existing teams reviewing content in Somali, Oromo, and other languages.

Despite Facebook’s ban on terrorist groups, Somali-language posts celebrating the militant group Al Shabaab remained on the platform for an extended period last year.

Ability To Derail:

This month, posts written in Amharic attacked the Oromo and Tigray ethnic groups with a derogatory term, violating Facebook’s ban on violent speech directed at ethnic groups.

Facebook removed those posts and said it had mistakenly allowed one of them, from December 2017, to stay online after an earlier user report.

For officials like Saneem in Fiji, Facebook’s efforts to improve content moderation and language support are moving too slowly. He said he had warned Facebook months before the election; most Fijians use Facebook, he estimated, with half writing in English and half in Fijian. “Social media can derail the elections,” Saneem said.

Other social media platforms face the same issue, differing only in the numbers. Facebook-owned Instagram offers its 1,179-word community rules in 30 of its 51 supported languages, while WhatsApp’s community guidelines are available in nine of its 58 supported languages.

Researchers found that YouTube’s community guidelines are available in 40 of its 80 supported languages, and Twitter Inc’s rules in 37 of 47. This language gap is a major channel through which misinformation spreads around the world.
