Case summary
The Oversight Board has overturned Meta’s original decision to leave up a Facebook post that
mocks a target of gender-based violence. While Meta has since recognized that this post broke its
Bullying and Harassment rules, the Board has identified a gap in Meta's existing rules
that appears to allow content normalizing gender-based violence by praising, justifying,
celebrating or mocking it (for example, where the target is not identifiable, or the
image is of a fictional character). The Board recommends that Meta undertake a policy
development process to address this gap.
About the case
In May 2021, a Facebook user in Iraq posted a photo with a caption in Arabic. The photo
shows a woman with visible marks of a physical attack, including bruises on her face and
body. The caption begins by warning women about making a mistake when writing to their
husbands. The caption states that the woman in the photo wrote a letter to her husband,
which he misunderstood because of a typographical error.
According to the post, the husband thought the woman asked him to bring her a “donkey,”
while in fact, she was asking him for a "veil." In Arabic, the words for "donkey" and "veil" look
similar ("حمار" and "خمار"). The post implies that because of the misunderstanding caused by
the typographical error in her letter, the husband physically beat her. The caption then states
that the woman got what she deserved as a result of the mistake. There are several laughing
and smiling emojis throughout the post.
The woman depicted in the photograph is an activist from Syria whose image has been
shared on social media in the past. The caption does not name her, but her face is clearly
visible. The post also includes a hashtag used in conversations in Syria supporting women.
In February 2023, a Facebook user reported the content three times for violating Meta's
Violence and Incitement Community Standard. If a report is not reviewed within 48 hours, it is
automatically closed, as happened in this case. The content remained on the platform for
nearly two years without ever being reviewed by a human moderator.