Facebook responds to Myanmar genocide report, bans accounts that reached millions

After being accused of responding too slowly to posts that contributed to ethnic violence in Myanmar, Facebook removed more than a dozen accounts operated by military officials and military-related organizations, minutes after a UN report accused the country's military of genocide. In total, Facebook has now banned 20 people and organizations from the network following the UN report.

Facebook removed 18 Facebook accounts, one Instagram account, and 52 pages, which combined had a reach of nearly 12 million people, the social network said today, August 27. The ban of 20 people and organizations includes some that didn't already have a presence on Facebook, the company added. Those banned include Senior General Min Aung Hlaing, the commander-in-chief of Myanmar's armed forces, as well as Myawady, a military television network.

Facebook says the ban is intended to "prevent them from using our service to further inflame ethnic and religious tensions." The company also removed accounts engaged in "coordinated inauthentic behavior" relating to the conflict in the country.

A UN report released today accuses the country's military of genocide, war crimes, and crimes against humanity directed at the Rohingya people. The Rohingya are a minority in Myanmar (formerly Burma); according to the BBC, they are the largest group of Muslims in the country, which is predominantly Buddhist.

The BBC reports that the Rohingya have faced persecution in the country since the 1970s, and the government considers them illegal immigrants. Reports suggest that 25,000 people have been killed and at least 700,000 have fled the country. The UN's report calls for six military leaders to stand trial and for further investigation by a UN-created independent group.

Facebook admits it was too slow to respond to the crisis. A year ago, six organizations urged the company to take action, saying that fake news posts on the network were inciting violence. In 2016, as 3G service began spreading across the country, a report suggested farmers were already using Facebook, not for social networking, but as a source of news. Facebook says that more people in Myanmar rely on the network for information than in almost any other country.

Facebook has since drastically expanded its number of Burmese-speaking content moderators and improved the reporting tools available to users. The company is also building artificial intelligence tools to help recognize bad content, Facebook said earlier this month. It has also worked with organizations to run campaigns teaching users how to spot fake news in a country that only recently gained widespread internet access through the rapid adoption of smartphones.

Facebook says it is preserving the content from the banned accounts, presumably so the information could be used in an investigation. "We continue to work to prevent the misuse of Facebook in Myanmar — including through the independent human rights impact assessment we commissioned earlier in the year," Facebook wrote today. "This is a huge responsibility given so many people there rely on Facebook for information — more so than in almost any other country given the nascent state of the news media and the recent rapid adoption of mobile phones. It's why we're so determined to do better in the future."