This may seem like a no-brainer for those who know and salute the life of the 23-year-old revolutionary leader Shaheed Bhagat Singh, hanged on March 23, 1931. But for Meta’s Oversight Board, which adjudicates content moderation decisions on Facebook and Instagram, whether or not to retain the word ‘shaheed’ is a big question.
According to the Oversight Board’s latest newsletter, ‘shaheed’ accounts for more content removals under Meta’s rules than any other word! So the Oversight Board has invited public comments on whether the word ‘shaheed’ should be removed from posts on Facebook and Instagram. Public comments are being sought up to April 10, 2023.
According to the Oversight Board, the request to adjudicate on the use of the word ‘shaheed’ came from Meta:
“Meta says it removes content referring to designated dangerous individuals as “shaheed” because it translates the word as “martyr.” It therefore considers it a form of praise. Praising a designated individual is prohibited under the Dangerous Individuals and Organizations policy. However, the company acknowledges that the meaning of “shaheed” varies.
In its request, Meta describes the word “shaheed” as an “honorific” term, used by many communities around the world, across cultures, religions, and languages. The company says the term has “multiple meanings” and is “used to describe someone dying unexpectedly or prematurely, at times referring to an honourable death, such as when one dies in an accident or in a conflict or war.” Meta states that the common English translation is “martyr,” and assumes this meaning for the purposes of content moderation, in all contexts. However, it notes that “there is no direct equivalent to the term in the English language.”
This is what the newsletter says:
How should Meta moderate the Arabic term “shaheed”?
Did you know that the Arabic term “shaheed” (translated as “martyr”) accounts for more content removals under Meta’s rules than any other word or phrase on Facebook and Instagram?
Two weeks ago, the Board accepted a request from Meta to review its approach to moderating the Arabic term “shaheed,” when used to refer to individuals it classifies as dangerous, including terrorists. In its request, Meta asks the Board whether it should continue to remove content using “shaheed” to refer to individuals designated as dangerous under its Dangerous Individuals and Organizations policy.
We are committed to bringing diverse perspectives from third parties into our case review process. That’s why we ask users and subject matter experts to share information through our public comment process that may help the Board deliberate specific cases. If you or your organization feel that you can contribute a valuable perspective to our new case, you can submit your contributions via the link below.
The public comment window for the policy advisory opinion will be open until 15:00 UTC on Monday, 10 April.
This timeline is longer than the public comment period for new cases as the policy advisory opinion process does not have the same time constraints as case decisions.
Additionally, public comments can be up to six pages in length. For this policy advisory opinion, comments can be submitted in Bengali, as well as any of the languages available on the Board’s website, which include Arabic, Farsi, Urdu, Bahasa Indonesia and Turkish.
Facebook censorship
Censorship has dogged Facebook, Instagram and WhatsApp, and compliance with government takedown requests is high (see here and here for a sample of how bad it is). Meta’s rules are tricky and problematic in the best of times. Meta routinely takes down (read censors) a range of posts and accounts for any number of reasons, from nudity to pro-Palestine posts. In the past, it has taken down content or blocked accounts over sexual harassment, Kashmir, caste-based violence and alleged violation of its ‘real name policy’. In 2016, Facebook restored the iconic ‘Napalm girl’ photo after taking it down over nudity, following a global outcry.
Facebook faced a lot of flak for its arbitrary content moderation decisions, and in 2020, after a two-year global consultation process, it set up the Oversight Board, an independent body to which users can appeal. The board examines the validity of Meta’s decisions to take down or retain content in different cases. In 2021, Meta accepted some of the Oversight Board’s recommendations.
Journalists have long struggled to arrive at a uniform policy for describing political activists in conflict regions, especially those who advocate the use of violence in separatist movements. They can be described as militants, insurgents or terrorists, depending on whether the description comes from civil society or the State and its security forces. In conflict areas like Kashmir, for instance, they can also be described in local newspapers as freedom fighters. For those killed, the use of the term ‘martyr’ or ‘shaheed’ is routine. For journalists, these labels are an ever-present dilemma, exposing them to attack from both sides.
It remains to be seen how the Oversight Board decides on this issue.
Read the Oversight Board’s detailed document on the public comments here.