Facebook Faces Compensation Lawsuit over Flagrant Violations of Rohingya Rights
A lawyer for the Muslim victims said the Rohingya are demanding £150 billion in compensation from the US company “Facebook”, accusing the social network of contributing to the genocide they suffered in Myanmar in 2017 by spreading hatred against them.
According to our sources, legal action has been launched in the United Kingdom and the United States to seek compensation for the Rohingya refugees, in one of the largest class actions on behalf of victims of a crime against humanity ever brought before a domestic court anywhere in the world.
Technical investigations showed that Facebook’s algorithms broadcast and amplified hate speech against the Rohingya Muslims of western Myanmar, who were treated with racism and contempt by the country’s Buddhist majority.
Facebook, now a subsidiary of Meta, has acknowledged its shortcomings in monitoring anti-Rohingya content, yet it failed to hire moderators who could read Burmese or Rohingya or who understood Myanmar’s fraught political landscape.
At the crux of the complaint are revelations that Facebook was willing to trade Rohingya lives for better market penetration in a small Southeast Asian country.
We consider Facebook a robot programmed with a single mission: growth. The undeniable reality is that Facebook’s growth, fueled by hatred, division and disinformation, has destroyed the lives of hundreds of thousands of Rohingya.
In a letter of complaint sent to Meta, the owner of Facebook, the lawyers wrote: “Our clients have been subjected to serious acts of violence, killings and/or other serious human rights violations committed as part of the campaign of genocide and crimes against humanity carried out against them in Myanmar, as widely acknowledged and reported. This campaign was fueled by the widespread dissemination and amplification of material on Facebook.”
According to testimonies obtained from Rohingya citizens, men, women and children were shot, stabbed and burned by Myanmar soldiers and local Buddhist civilians before being buried in mass graves; some attackers even used acid to melt the faces of the dead in what appeared to be deliberate attempts to prevent identification.
We believe that Facebook failed to remove posts inciting violence or to close pages promoting hate speech, despite repeated warnings from human rights organisations and press reports since 2013, and that this content fueled further hatred and violence.
We stress that such content must be reviewed by human intelligence, not artificial intelligence, because AI algorithms are not capable enough of discerning human meanings and feelings, and it is dangerous to let automated systems decide the fate of human beings; this portends a catastrophic situation with unimaginable consequences.