Facebook facing mounting legal fights over Myanmar genocide

07 December 2021 - 07:50 By Naomi Nix

Facebook parent Meta Platforms Inc. is facing mounting legal challenges by Rohingya refugees who blame the social media company for inciting genocidal violence in 2017 against the Muslim minority in Myanmar.

The company allowed an “out-of-control spread of anti-Rohingya content” despite repeated warnings from civil society groups and human rights activists about its deadly consequences, according to a complaint filed Monday in state court in California.

Separately, members of the Rohingya community from Myanmar who now live in the UK and in refugee camps in Bangladesh told Meta they intend to pursue a lawsuit in the UK's High Court over its failure to take action against hate on its platform.

The legal challenges add to the public scrutiny facing Meta following a series of critical media reports based on internal documents disclosed by former Facebook product manager-turned-whistle-blower Frances Haugen. The company is battling accusations that it has prioritised the growth of its platforms at the expense of fighting hate speech, disinformation and violent extremism.

“The last five years, and in fact just the last five months, have made it abundantly clear that Facebook’s path to promote the very worst of humanity was not the result of a bug but rather a carefully designed feature,” according to the complaint in San Mateo County Superior Court, near where Meta is based.  


The plaintiff, who isn't named in the lawsuit, is seeking more than $150 billion in damages on behalf of an estimated 10,000 Rohingya Muslims in the US who fled Myanmar after June 2012 to escape the threat of violence. The lawsuit is seeking class-action status.

In the UK legal challenge, the plaintiffs will argue that Facebook used algorithms that amplified hate speech against the Rohingya people and failed to invest in enough content moderators who spoke Burmese or Rohingya, according to the letter they sent the court. 

“Despite Facebook’s acknowledgment of its role in such real-world harms and its proclaimed position as a positive force in the world, no meaningful compensation has been offered to any survivor,” the legal notice said.

On Thursday, 16 Rohingya young people and advocates in a refugee camp in Bangladesh plan to submit a complaint against Facebook to the Irish national contact point of the Organisation for Economic Co-operation and Development, arguing its social network incited violence against their community.

Meta didn’t immediately respond to a request for comment.

Facebook instituted reforms after a company-commissioned study in 2018 found that its platform was being used to co-ordinate violent repression in Myanmar. More broadly, the company in recent years has stepped up its use of artificial intelligence and human-powered systems to rid its networks of problematic speech.

The company bans user-posted content that directs attacks against people on the basis of their race, country of origin, religion, sexual orientation and other sensitive attributes. It also bars users from posting messages that include calls for violence. 

Facebook became so popular in Myanmar that for the majority of its digitally connected residents the social network became synonymous with the internet itself. Facebook dominated the developing market because it partnered with local mobile operators who agreed not to charge for the data used to support a cheap basic version of the app, and because it supported Myanmar fonts better than other tech platforms, according to the California suit.

By 2012, Myanmar's military-dominated government and everyday users began spreading fearful and dehumanising messages about Rohingya Muslims. By 2017, the site was being used to recruit and train "civilian death squads" to perpetrate violence, according to the complaint. In the end, tens of thousands of Rohingya were murdered while hundreds of thousands saw "indescribable violence and misery that they will carry with them for the rest of their lives," the refugee alleged.

Lawyers for the refugee argue that, unlike the US Communications Decency Act, which shields internet platforms from lawsuits over user-generated content, "Burmese law does not immunise social media companies for their role in inciting violence and contributing to genocide," according to the complaint.

Facebook is participating in an international investigation of the Myanmar genocide led by the West African nation Gambia, but was faulted by a judge in Washington, D.C. earlier this year for resisting disclosure of internal company records.

  • ©2021 Bloomberg L.P.
