
Reuters’ Report Prompts Facebook To Remove Burmese Translation Feature

Facebook recently removed its translation feature for Burmese posts and comments after a Reuters report showed that the tool was producing bizarre results. The feature let users translate Burmese-language content, but Facebook switched it off once its failures came to light.

On August 15, Reuters published an investigative report documenting Facebook’s failure to contain Burmese-language posts attacking Myanmar’s Rohingya Muslims. Around 700,000 Rohingya have fled Myanmar during the past year amid a military crackdown and ethnic violence. In August last year, United Nations investigators said that Facebook had been “a useful instrument for those seeking to spread hate” against the Rohingya Muslim minority group.

The Reuters article highlighted how badly the translation feature was malfunctioning. It cited an anti-Rohingya post that said, in Burmese, “Kill all the kalars that you see in Myanmar; none of them should be left alive.” Kalar is a pejorative for the Rohingya community. Facebook had translated the post into English as “I shouldn’t have a rainbow in Myanmar.”

On August 28, Facebook announced that it had removed the Burmese translation feature, citing the Reuters report and user feedback as the reasons for switching it off.

“We are working on ways to improve the quality of the translations and until then, we have switched off this feature in Myanmar,” a company spokeswoman said in an email.

Facebook has also struggled with other problems interpreting Burmese, the main local language of Myanmar. Back in April, the company posted a Burmese translation of its internal “Community Standards” enforcement guidelines.

Many of the passages were garbled. A sentence that in English said “we take our role in keeping abuse off our service seriously” was rendered in Burmese as “we take our role seriously by abusing our services.”

The Reuters investigation found more than 1,000 examples of hate speech on Facebook, including posts calling the Rohingya and other Muslims dogs, maggots, and rapists, and suggesting that they be fed to pigs, shot, or exterminated. Facebook’s rules specifically prohibit attacking ethnic groups with “violent or dehumanizing speech” or comparing them to animals.

Within hours of the Reuters article’s publication, Facebook issued a statement acknowledging that it had been “too slow to prevent misinformation and hate” in Myanmar and saying it was taking action, including investing in technology such as artificial intelligence to detect posts that violate its rules.
