FB Oversight Board Picks 6 Cases on Hate Speech, Nudity, Violence

The Oversight Board comprises 20 independent expert members globally, including Sudhir Krishnaswamy from India.

The Quint
“A bold experiment” is how Dr Krishnaswamy, the only Indian member of the Facebook Oversight Board, described it in a conversation with The Quint.
(Image: Aroop Mishra/The Quint)

The Facebook Oversight Board has selected six cases from the more than 20,000 referred to it since user appeals opened in October 2020. The cases comprise five user appeals and one case referred by Facebook itself.

While five cases deal with hate speech issues and incitement to violence, one deals with the topic of nudity around breast cancer awareness on Facebook.

The Oversight Board, comprising 20 independent expert members from around the world, including Sudhir Krishnaswamy from India, is an independent body that will adjudicate on cases related to content moderation.

“As the Board cannot hear every appeal, we are prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook's policies,” the Board said.

Announced on 7 May as a measure to hold Facebook accountable on issues of hate speech, privacy and misinformation, the Board will have the last word on what content stays up and (eventually) what content comes down. The Board, however, is yet to begin hearing cases.

Each of the cases announced today has been assigned to a five-member panel, and the Board expects to decide each case, with Facebook acting on that decision, within 90 days.

“Once the Board has reached a decision on these cases, Facebook will be required to implement our decisions, as well as publicly respond to any additional policy recommendations that the Board makes,” the Board said in its blog.

1. Hate Speech Issue

Case referred by user

A user posted a screenshot of two tweets by former Malaysian Prime Minister, Dr Mahathir Mohamad, in which the former Prime Minister stated that "Muslims have a right to be angry and kill millions of French people for the massacres of the past" and "[b]ut by and large the Muslims have not applied the 'eye for an eye' law. Muslims don't. The French shouldn't. Instead the French should teach their people to respect other people's feelings."

The user did not add a caption alongside the screenshots. Facebook removed the post for violating its policy on hate speech.

2. Hate Speech Issue

Case referred by user

A user posted two well-known photos of a deceased child lying fully clothed on a beach at the water's edge. The accompanying text (in Burmese) asks why there is no retaliation against China for its treatment of Uyghur Muslims, in contrast to the recent killings in France relating to cartoons. The post also refers to the Syrian refugee crisis.

Facebook removed the content for violating its hate speech policy. The user indicated in their appeal to the Oversight Board that the post was meant to disagree with people who think that the killer is right and to emphasise that human lives matter more than religious ideologies.

3. Hate Speech Issue

Case referred by user

A user posted alleged historical photos showing churches in Baku, Azerbaijan, with accompanying text stating that Baku was built by Armenians and asking where the churches have gone. The user stated that Armenians are restoring mosques on their land because it is part of their history. The user said that the "т.а.з.и.к.и" are destroying churches and have no history.

The user stated that they are against "Azerbaijani aggression" and "vandalism". The content was removed for violating Facebook's hate speech policy.


4. Nudity & Sexual Activity Issue

Case referred by user

A user in Brazil posted a picture on Instagram with a title in Portuguese indicating that it was to raise awareness of signs of breast cancer.

Eight photographs within the picture showed breast cancer symptoms with corresponding explanations of the symptoms underneath. Five of the photographs included visible and uncovered female nipples.

The remaining three photographs included female breasts, with the nipples either out of shot or covered by a hand.

Facebook removed the post for violating its policy on adult nudity and sexual activity. The post has a pink background, and the user indicated in a statement to the Oversight Board that it was shared as part of the national "Pink October" campaign for the prevention of breast cancer.

5. Dangerous Individuals and Organisations Issue

Case referred by user

A user in the US was prompted by Facebook's "On This Day" function to reshare a "memory" in the form of a post that the user made two years ago. The user reshared the content. The post (in English) is an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, on the need to appeal to emotions and instincts, instead of intellect, and on the unimportance of truth.

Facebook removed the content for violating its policy on dangerous individuals and organisations. The user indicated in their appeal to the Oversight Board that the quote is important as the user considers the current US presidency to be following a fascist model.

6. Violence & Incitement Issue

Case referred by Facebook

A user posted a video and accompanying text within a Facebook group related to COVID-19. In the video and text, there is a description of an alleged scandal about the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products) purportedly refusing authorisation for use of hydroxychloroquine and azithromycin against COVID-19, but authorising promotional mail for remdesivir.

The user criticises the lack of a health strategy in France and states that “[Didier] Raoult’s cure” is being used elsewhere to save lives. The video was viewed approximately 50,000 times and shared fewer than 1,000 times.

Facebook removed the content for violating its policy on violence and incitement and in its referral indicated to the Oversight Board that this case presents an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.

