
'Facebook Profits From Hate': Lawyer for Rohingya Refugees in Online Abuse Case

The community fears that online hate will increase manifold in the run-up to the 2024 Lok Sabha elections.

Sakshat Chandok
India

Rohingya refugees. Image used for representational purposes only. 

(Photo: Erum Gour/Altered by The Quint)


"Facebook profits from hateful content generated against the Rohingya. That is why they are so hesitant to take it down," said Kawalpreet Kaur, an advocate representing two Rohingya refugees who have filed a Public Interest Litigation (PIL) in the Delhi High Court against "online hate campaigns" against their community.

The PIL was filed on 3 January by Mohammad Hamim and Kawsar Mohammed – two Rohingya refugees living in New Delhi after having escaped persecution in Myanmar's Rakhine state in 2017. They are among at least 22,000 Rohingya in India registered with the United Nations High Commissioner for Refugees.

The plea has been filed under Article 226 of the Indian Constitution, seeking the protection of the right to life of community members in Delhi and other parts of the country. The case is expected to be listed for hearing on Tuesday, 30 January.

Proliferation of 'Hate Speech' Ahead of Lok Sabha Elections

According to the petition, there has been a proliferation of "hate campaigns" against the Rohingya in India over the years, with people referring to them as "terrorists" and "insects" who have entered the country illegally.

"The Rohingya, who are at the receiving end of Islamophobic content, feel threatened. The community decided that this is the time to say no to the rise of hate campaigns against them. That is one of the motives behind filing this petition," Kaur told The Quint.

She further said that the plea was filed keeping in mind that 2024 is an election year, and the community fears that online hate will increase manifold in the run-up to the Lok Sabha polls.

The petition states that several anti-Rohingya posts had gone viral before the 2019 general elections, with no ramifications for the users who posted them. To drive its point home, the plea included dozens of screenshots displaying hateful content against the Rohingya in India – which referred to them as "cockroaches" and "rapists" and accused them of undertaking forceful conversions.

"This Islamophobic content is motivated and there is deliberate inaction against them. In the run-up to the 2019 elections, there were comments made by several prominent people as well on Facebook which were not taken down," Kaur said.

The plea further cited a 2019 study of hate speech on Facebook in India by advocacy group Equality Labs. The group had found that six percent of Islamophobic posts in the country targeted the Rohingya, even though the community comprised only 0.02 percent of India's Muslim population at the time.

Facebook 'Unable' or 'Unwilling' to Take Down Hate Speech?

The plea alleged that one of the reasons behind Facebook's "inaction" in combatting hate speech against the Rohingya was the lack of content moderators for Indian languages.

The Quint reached out to Facebook for comment but has not yet received a response.

The plea cited measures that Facebook had taken to combat hate speech in the English language and asked for the same standards to be applied to Indian languages as well.

For instance, the petition pointed to a mechanism called ‘Break the Glass’ (BTG), which is designed for critical situations and can curb the amplifying effects of Meta’s engagement-centric algorithms. The petition stated that Meta had deployed BTG measures in the US during the 2020 presidential elections and the Capitol Hill riots of 6 January 2021.


"Facebook has to change its algorithms. It has to show how many content moderators it has for Indian languages. The way Facebook operates is not neutral – despite the fact that it likes to present itself in that way," Kaur told The Quint.

She further said that Facebook does not need a court order or any other direction to take down hate speech, as mentioned in its Community Standards. However, she alleged that the company refuses to do so as it "profits" from hate campaigns against the Muslim community in India.

"We believe that Facebook's algorithms specifically promote this kind of content in the name of engagement, in the name of generating more revenue. That is why they are so hesitant to take it down," she said.

The advocate also pointed out that all the hate-related content against the Rohingya included as screenshots in the PIL is in the public domain and violates the company's Community Standards.

"What results from this kind of hate is real violence against the community that resides here peacefully. Rumours spread that Rohingya form relationships with Hindu women and convert their religion. Such misinformation affects them in real time and has real consequences. They feel that their life is actually threatened."
Advocate Kawalpreet Kaur to The Quint

Online Hate Against Rohingya a Long-Pending Issue

This isn't the first time the issue of hate content against the Rohingya community has been highlighted.

In September 2022, Amnesty International had said that Facebook should pay reparations to hundreds of thousands of Rohingya displaced from their homes in Myanmar. This came after several rights advocates and victims' associations claimed that hateful content against the community had been amplified by Facebook's algorithms.

"Many Rohingya tried to report anti-Rohingya content via Facebook's 'report' function but to no avail, allowing these hateful narratives to proliferate and reach unprecedented audiences in Myanmar," Amnesty International had said.

In another incident, rights group Global Witness claimed that Facebook failed to detect calls to incite violence against the Rohingya on its platform.

In a report submitted to the Associated Press (AP), the rights group said it had sent eight ads to Facebook for approval, each containing a different form of hate speech against the Rohingya. All eight ads were approved by the platform for publication.

While the rights group retracted the ads before they were published, the fact that they were approved by Facebook for publication highlighted the company's failure to detect hate speech, the report said.

