"Facebook profits from hateful content generated against the Rohingya. That is why they are so hesitant to take it down," said Kawalpreet Kaur, an advocate representing two Rohingya refugees who have filed a Public Interest Litigation (PIL) in the Delhi High Court against "online hate campaigns" against their community.
The PIL was filed on 3 January by Mohammad Hamim and Kawsar Mohammed – two Rohingya refugees living in New Delhi after having escaped persecution in Myanmar's Rakhine state in 2017. They are among at least 22,000 Rohingya in India registered with the United Nations High Commissioner for Refugees.
According to the petition, there has been a proliferation of "hate campaigns" against the Rohingya in India over the years, with people referring to them as "terrorists" and "insects" who have entered the country illegally.
"The Rohingya, who are at the receiving end of Islamophobic content, feel threatened. The community decided that this is the time to say no to the rise of hate campaigns against them. That is one of the motives behind filing this petition," Kaur told The Quint.
She further said that the plea was filed keeping in mind that 2024 is an election year, and the community fears that online hate will increase manifold in the run-up to the Lok Sabha polls.
"This Islamophobic content is motivated and there is deliberate inaction against them. In the run-up to the 2019 elections, there were comments made by several prominent people as well on Facebook which were not taken down," Kaur said.
The plea further cited a 2019 study into hate speech on Facebook in India by advocacy group Equality Labs. The group had found that six percent of Islamophobic posts in the country were targeted against the Rohingya, even though the community comprised only 0.02 percent of India's Muslim population at the time.
The plea alleged that one of the reasons behind Facebook's "inaction" in combatting hate speech against the Rohingya was the lack of content moderators for Indian languages.
The Quint reached out to Facebook for comment but has not yet received a response.
The plea cited measures that Facebook had taken to combat hate speech in the English language and asked for the same standards to be applied to Indian languages as well.
For instance, a mechanism called 'Break the Glass' (BTG) is designed for such critical situations, as it can curb the amplifying effect of Meta's engagement-driven algorithms. The petition stated that Meta had employed BTG measures in the US during the 2020 presidential elections and the Capitol Hill riots of 6 January 2021.
"Facebook has to change its algorithms. It has to show how many content moderators it has for Indian languages. The way Facebook operates is not neutral – despite the fact that it likes to present itself in that way," Kaur told The Quint.
She further said that Facebook does not need a court order or any other direction to take down hate speech, as is mentioned in its Community Guidelines. However, she alleged that the company refuses to do so as it "profits" from hate campaigns against the Muslim community in India.
"We believe that Facebook's algorithms specifically promote this kind of content in the name of engagement, in the name of generating more revenue. That is why they are so hesitant to take it down," she said.
The advocate also pointed out that the screenshots of anti-Rohingya hate content included in the PIL are all in the public domain and violate the company's Community Standards.
This isn't the first time the issue of hate content against the Rohingya community has been highlighted.
In September 2022, Amnesty International said that Facebook should pay reparations to the hundreds of thousands of Rohingya displaced from their homes in Myanmar. This came after several rights advocates and victims' associations claimed that hateful content against the community had surged because of Facebook's algorithms.
"Many Rohingya tried to report anti-Rohingya content via Facebook's 'report' function but to no avail, allowing these hateful narratives to proliferate and reach unprecedented audiences in Myanmar," Amnesty International had said.
In another instance, rights group Global Witness tested Facebook's moderation systems by submitting advertisements containing calls to incite violence against the Rohingya. While the group retracted the ads before they were published, the fact that Facebook had approved them for publication highlighted the company's failure to detect hate speech, the report said.