“Facebook profited from our suffering. The survivors have no option other than a lawsuit against Facebook. It will be an injustice if Rohingya survivors are not compensated for their losses,” he told the Thomson Reuters Foundation.
Meta did not respond to a request for comment.
In an earlier statement in response to the lawsuit, a Meta spokesperson said the company was “appalled by the crimes committed against the Rohingya people in Myanmar.”
“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw (Myanmar military), disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content.”
A day after the lawsuit was filed, Meta said it would ban several accounts linked to the Myanmar military, and on Wednesday it said it had built a new artificial intelligence system that can adapt more quickly to take action on new or evolving types of harmful content.
It was a sign that the tech giant was rattled, said Debbie Stothard, founder of the Alternative ASEAN Network on Burma (ALTSEAN), an advocacy group.
“The timing of these announcements shows the lawsuit is a wake-up call. The lawsuit itself is quite a bold move, but the Rohingya clearly felt there were sufficient grounds,” she said.
“Strategic litigation like this - you never know where it can go. In recent times we have seen climate-change litigation becoming more commonplace and getting some wins,” she added.
More than 730,000 Rohingya Muslims fled Myanmar’s Rakhine state in August 2017 after a military crackdown that refugees said included mass killings and rape. Rights groups documented killings of civilians and burning of villages.
Myanmar authorities say they were battling an insurgency and deny carrying out systematic atrocities.
United Nations human rights investigators said in 2018 that the use of Facebook had played a key role in spreading hate speech that fuelled the violence against the Rohingya.
A Reuters investigation that year, cited in the US complaint, found more than 1,000 examples of posts, comments and images attacking the Rohingya and other Muslims on Facebook.
But in the United States, platforms such as Facebook are protected from liability over content posted by users by a law known as Section 230.
The Rohingya complaint says it seeks to apply Myanmar law to the claims if Section 230 is raised as a defence.
“Based on the precedents, this case should lose,” said Eric Goldman, a professor of law at Santa Clara University School of Law. “But you’ve got so much antipathy towards Facebook nowadays - anything is possible.”
While the technology industry and others have long held that Section 230 is a crucial protection, the statute has become increasingly controversial as the power of internet companies has grown.
Earlier this year, Meta chief executive Mark Zuckerberg laid out steps to reform the law, saying that companies should have immunity from liability only if they follow best practices for removing damaging material from their platforms.
The lawsuit is a good test case for courts to limit how much immunity platforms are afforded, said David Mindell, a partner at Edelson PC, one of the law firms that brought the suit.
“This case is about what happens when a powerful company has this unchecked power over the world,” he said.
Goldman and Mindell said that recent whistleblower complaints from inside Facebook, which allege the company did not act even when it knew its platform was being used for human rights abuses, could buttress the lawsuit, as could the company’s admission that it was “too slow” to contain the abuse.
The lawsuit highlights that “a company can apologise all they like, but at the end of the day, people were harmed,” said David Kaye, a human rights lawyer who chairs the board of the Global Network Initiative, a group that includes Facebook and other tech firms.
“And those stateless people can’t go to the government of Myanmar for remedy. And if they can’t go to the company - what’s the remedy?”
The International Criminal Court has opened an investigation into accusations of crimes against the Rohingya. In September, a US federal judge ordered Facebook to release records of accounts connected to anti-Rohingya violence in Myanmar that the social media giant had shut down.
The progress of the lawsuit would be keenly watched not just by the Rohingya but also by other groups and individuals who have been harmed by online hate speech, said Stothard.
“Refugees, migrants, LGBT people, other minorities - they have all suffered serious harm,” she said.
“The question to ask is not, will the lawsuit succeed, but why was it necessary? It’s about making social media companies accountable,” she said.